CODE:
import numpy as np
import matplotlib.pyplot as plt
# Generate synthetic data for linear regression (same as before)
np.random.seed(42)
X = 2 * np.random.rand(100, 1)           # 100 random points for X
y = 4 + 3 * X + np.random.randn(100, 1)  # y = 4 + 3x + noise

# Plot the generated data
plt.scatter(X, y)
plt.xlabel('X')
plt.ylabel('y')
plt.title('Synthetic Linear Data')
plt.show()

# AdaGrad Gradient Descent Function for Linear Regression
def adagrad_gradient_descent(X, y, learning_rate=0.01, n_epochs=50, epsilon=1e-8):
    m = len(X)
    theta = np.random.randn(2, 1)  # Random initialization of parameters

    # Add bias term (column of ones) to X
    X_b = np.c_[np.ones((m, 1)), X]

    # Initialize accumulated squared gradients to 0
    accumulated_gradients = np.zeros((2, 1))

    for epoch in range(n_epochs):
        gradients = 2 / m * X_b.T.dot(X_b.dot(theta) - y)   # Compute the gradients
        accumulated_gradients += gradients ** 2             # Accumulate the squared gradients
        adjusted_learning_rate = learning_rate / (np.sqrt(accumulated_gradients) + epsilon)  # Adjust learning rate
        theta = theta - adjusted_learning_rate * gradients  # Update parameters (theta)

    return theta
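
# Note (added remark): accumulated_gradients only ever grows, so the effective
# step learning_rate / (np.sqrt(accumulated_gradients) + epsilon) shrinks over
# the run, and it shrinks separately for each parameter.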

# Apply AdaGrad to fit the model
theta_adagrad = adagrad_gradient_descent(X, y, learning_rate=0.1, n_epochs=100)

# Display the resulting parameters (theta)
print(f"AdaGrad estimated parameters: Intercept (theta0) = {theta_adagrad[0][0]:.4f}, Slope (theta1) = {theta_adagrad[1][0]:.4f}")

# Plot the fitted line
X_new = np.array([[0], [2]])             # New data to plot the line
X_new_b = np.c_[np.ones((2, 1)), X_new]  # Add bias term to new data
y_predict = X_new_b.dot(theta_adagrad)   # Predict y based on new X
plt.plot(X_new, y_predict, "r-", linewidth=2, label="Predictions")
plt.plot(X, y, "b.")
plt.xlabel('X')
plt.title('Linear Regression with AdaGrad')
plt.ylabel('y')
plt.legend()
plt.show()
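
For reference, the training loop above implements the standard AdaGrad update. With $g_t$ denoting the gradient at step $t$ and all operations taken element-wise:

$$G_t = G_{t-1} + g_t^2, \qquad \theta_{t+1} = \theta_t - \frac{\eta}{\sqrt{G_t} + \epsilon}\, g_t$$

where $G_t$ corresponds to accumulated_gradients, $\eta$ to learning_rate, and $\epsilon$ to epsilon in the code.

As a quick sanity check (an assumed addition, not part of the original listing), the AdaGrad estimates can be compared against the closed-form least-squares fit; since the data were generated as y = 4 + 3x + noise, both should land near an intercept of 4 and a slope of 3:

# Sanity check (assumed addition): compare AdaGrad's estimates with the
# closed-form least-squares solution on the same data.
X_b = np.c_[np.ones((len(X), 1)), X]
theta_exact, *_ = np.linalg.lstsq(X_b, y, rcond=None)
print(f"Closed-form parameters: {theta_exact.ravel()}")
print(f"AdaGrad parameters:     {theta_adagrad.ravel()}")
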
Comments