To truly understand how gradient descent works, you need some background knowledge: derivatives and partial derivatives, vectors, directional derivatives, and the gradient. These concepts feel abstract when you first study them, so it is best to reinforce them with concrete examples and visualizations as you learn.
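As a quick, concrete check on the gradient concept, a finite-difference approximation can be compared against the analytic partial derivatives. This is a minimal sketch; the helper `numeric_grad` and the step size `h` are my own illustrative choices, not part of the original post:

```python
import numpy as np


def f(x, y):
    return x ** 2 + y ** 2


def numeric_grad(f, x, y, h=1e-6):
    # Central differences approximate the partial derivatives df/dx and df/dy
    dfdx = (f(x + h, y) - f(x - h, y)) / (2 * h)
    dfdy = (f(x, y + h) - f(x, y - h)) / (2 * h)
    return np.array([dfdx, dfdy])


# Analytic gradient of x^2 + y^2 is (2x, 2y), so at (3, 4) it is (6, 8)
print(numeric_grad(f, 3.0, 4.0))
```

The numerical result should agree with the analytic gradient to several decimal places, which is a useful sanity check before trusting a hand-derived gradient in the descent loop.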
This post takes the two-variable function z = x² + y² as an example and uses gradient descent to compute its minimum.
```python
from mpl_toolkits import mplot3d
import matplotlib.pyplot as plt
import numpy as np


# Gradient of f: (df/dx, df/dy) = (2x, 2y)
def gradient(x, y):
    return np.array([2 * x, 2 * y])


# The original function
def f(x, y):
    return x ** 2 + y ** 2


x = 10        # initial x coordinate
y = 10        # initial y coordinate
n = 0         # iteration counter
alpha = 0.01  # learning rate

while n < 500:
    grad = gradient(x, y)
    x, y = np.array([x, y]) - alpha * grad  # step against the gradient
    n += 1

print(f(x, y))  # close to the minimum value 0

fig = plt.figure()
ax = plt.axes(projection='3d')
ax.set_xlabel('x')
ax.set_ylabel('y')
ax.set_zlabel('z')

# Draw a line through the origin along the gradient direction at the final point
grad = gradient(x, y)
z = f(x, y)
x_line = np.linspace(-10, 10, 60)
y_line = grad[1] / grad[0] * x_line
ax.plot3D(x_line, y_line, np.full_like(x_line, z), 'r')

# Plot the surface of the original function
x = np.linspace(-10, 10, 60)
y = np.linspace(-10, 10, 60)
X, Y = np.meshgrid(x, y)
Z = f(X, Y)
ax.contour3D(X, Y, Z, 50, cmap='binary')

plt.show()
```
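Because f(x, y) = x² + y² separates into independent coordinates, the update in the loop above is simply x ← x − α·2x = (1 − 2α)·x, so convergence can be checked in closed form: after n iterations, x_n = 10·(1 − 2α)^n. A short sketch comparing the iterative loop with this formula (the variable names here are mine, not from the post):

```python
alpha = 0.01
x = 10.0
for n in range(500):
    x = x - alpha * 2 * x  # one gradient step on f(x) = x^2

# Each step multiplies x by (1 - 2*alpha), so 500 steps give:
closed_form = 10.0 * (1 - 2 * alpha) ** 500

print(x, closed_form)  # the two agree to floating-point precision
```

With α = 0.01 the iterate shrinks by a factor of 0.98 per step, which is why 500 iterations bring f(x, y) very close to the minimum at the origin; a larger α converges faster, but α > 1 (i.e. 1 − 2α < −1 for this function) would diverge.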
From: https://www.cnblogs.com/lingdian92/p/16954199.html