
Since I could not get numpy.gradient() to compute the derivative successfully, I wrote a script to compute it manually. Running the script below will produce a plot of the two functions f(x) = sin(x) and f'(x) = cos(x) over the interval 0 ≤ x ≤ 2π.

import numpy as np
import matplotlib.pyplot as plt

def compute_numerical_derivative(func, x, method='custom'):
    y = func(x)
    if method == 'custom':
        size = len(y0)
        res = np.zeros(size, 'd')  # 'd' for double
        # centered differences
        for idx in range(1, size - 1):
            res[idx] = (y[idx + 1] - y[idx - 1]) / (x[idx + 1] - x[idx - 1])
        # one-sided differences
        res[0] = (y[1] - y[0]) / (x[1] - x[0])
        res[-1] = (y[size - 1] - y[size - 2]) / (x[size - 1] - x[size - 2])
    # elif method == 'numpy':
    #     res = np.gradient(y)
    return res

x = np.linspace(0, 2 * np.pi, 100)
y0 = np.sin(x)
y1_true = np.cos(x)  # exactly d/dx (y0)
y1_cust = compute_numerical_derivative(np.sin, x, method='custom')
# y1_nump = compute_numerical_derivative(np.sin, x, method='numpy')

plt.plot(x, y1_true, 'r', marker='o', linestyle=None, alpha=0.5)
plt.plot(x, y1_cust, 'b', marker='^', linestyle=None, alpha=0.5)
# plt.plot(x, y1_nump, 'k', marker='*', linestyle='-', alpha=0.5)
plt.show()

Can the speed and/or accuracy of this algorithm be improved? Is it proper to use centered differences at interior points and one-sided differences at the boundaries?

asked Apr 6, 2018 at 6:23
  • Did you consider passing x as the second parameter of np.gradient? (Commented Apr 6, 2018 at 8:39)
  • Ahhh, that is varargs... I can try that in a little bit. (Commented Apr 6, 2018 at 8:41)
  • Yes, by default numpy considers each point on the x axis to be spaced one unit apart, unless you tell it otherwise. (Commented Apr 6, 2018 at 8:43)
  • Out of curiosity, is there an advantage to using *varargs as an argument including x rather than having x as an argument by itself? (Commented Apr 6, 2018 at 11:37)
  • *varargs is just the Python notation for a variable number of arguments; just call np.gradient(y, x) and Python will figure it out just fine. (Commented Apr 6, 2018 at 11:42)
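As a quick sanity check of the suggestion in the comments (my own sketch), passing x as the second argument makes np.gradient use the real sample spacing:

```python
import numpy as np

x = np.linspace(0, 2 * np.pi, 100)
y = np.sin(x)

# With x passed as the second argument, np.gradient divides by the
# actual sample spacing instead of assuming unit spacing.
dy = np.gradient(y, x)

print(np.max(np.abs(dy - np.cos(x))))  # small truncation error
```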

2 Answers


You can use np.roll to compute the centered differences as a vectorised operation rather than in a for loop:

res = (np.roll(y, -1) - np.roll(y, 1)) / (np.roll(x, -1) - np.roll(x, 1))

You then still need to account for incorrect values at the boundaries.
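Putting it together, a minimal sketch of the np.roll approach with the wrapped-around boundary values patched afterwards (my own assembly of the pieces above):

```python
import numpy as np

def derivative_roll(func, x):
    """Centered differences via np.roll; one-sided at the boundaries."""
    y = func(x)
    # np.roll wraps around, so res[0] and res[-1] come out wrong here
    res = (np.roll(y, -1) - np.roll(y, 1)) / (np.roll(x, -1) - np.roll(x, 1))
    # overwrite the wrapped boundary values with one-sided differences
    res[0] = (y[1] - y[0]) / (x[1] - x[0])
    res[-1] = (y[-1] - y[-2]) / (x[-1] - x[-2])
    return res

x = np.linspace(0, 2 * np.pi, 100)
print(np.max(np.abs(derivative_roll(np.sin, x) - np.cos(x))))
```

The boundary error of the one-sided differences dominates, but the result still tracks cos(x) closely on this grid.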

Also note that you have a bug on the line

size = len(y0)

It should be len(y). But you don't need it any more, since you can access the last values as y[-1] and x[-2].

Lastly, consider putting your test code under an if __name__ == '__main__' clause.
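For instance (a minimal sketch, with np.gradient standing in for the plotting code):

```python
import numpy as np

def main():
    x = np.linspace(0, 2 * np.pi, 100)
    # np.gradient with x as the second argument uses the actual spacing
    y1 = np.gradient(np.sin(x), x)
    print(np.max(np.abs(y1 - np.cos(x))))

if __name__ == '__main__':
    # executed when run as a script, skipped when the module is imported
    main()
```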

answered Apr 6, 2018 at 9:10

You can vectorize the calculation:

def compute_numerical_derivative3(func, x, method='custom'):
    y = func(x)
    if method == 'custom':
        res = np.zeros_like(y)
        # centered differences
        x_dif = x[2:] - x[:-2]
        y_dif = y[2:] - y[:-2]
        res[1:-1] = y_dif / x_dif
        # one-sided differences at the boundaries
        res[0] = (y[1] - y[0]) / (x[1] - x[0])
        res[-1] = (y[-1] - y[-2]) / (x[-1] - x[-2])
    # elif method == 'numpy':
    #     res = np.gradient(y)
    return res
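A quick self-contained check (my own sketch) that the slicing above reproduces the original loop's centered differences exactly:

```python
import numpy as np

x = np.linspace(0, 2 * np.pi, 100)
y = np.sin(x)

# vectorised centered differences over the interior points
vec = (y[2:] - y[:-2]) / (x[2:] - x[:-2])

# the original element-by-element loop, for comparison
loop = np.array([(y[i + 1] - y[i - 1]) / (x[i + 1] - x[i - 1])
                 for i in range(1, len(y) - 1)])

print(np.allclose(vec, loop))  # → True
```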
answered Apr 6, 2018 at 9:46
