# Imagination is more important than knowledge

## Automatic gradients (auto gradient) by 바죠

Differentiating a function is something the computer can do for us automatically. The examples below use Python's autograd package.

$f(x)$,
$\frac{\partial f(x)}{\partial x}$,
$\frac{\partial^2 f(x)}{\partial x^2}$
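Before the full autograd example, here is a minimal sketch of the idea behind automatic differentiation, using dual numbers (forward mode) in pure Python. The `Dual` class and `grad` helper are illustrative names of my own, not autograd's implementation:

```python
class Dual:
    """A dual number a + b*eps with eps**2 == 0.

    Carrying (value, derivative) pairs through arithmetic is the core
    idea of forward-mode automatic differentiation.
    """
    def __init__(self, value, deriv=0.0):
        self.value = value
        self.deriv = deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __rmul__ = __mul__


def grad(f):
    """Return a function that computes df/dx at a scalar x."""
    return lambda x: f(Dual(x, 1.0)).deriv


# d/dx (x*x + 3*x) = 2*x + 3, so the derivative at x = 2 is 7.
f = lambda x: x * x + 3 * x
print(grad(f)(2.0))  # 7.0
```

Autograd itself uses reverse mode (backpropagation), but the principle is the same: arithmetic on augmented values propagates derivatives alongside the computation.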

```python
from __future__ import absolute_import
import autograd.numpy as np
import matplotlib.pyplot as plt
from autograd import elementwise_grad as egrad

'''
Mathematically we can only take gradients of scalar-valued functions, but
autograd's elementwise_grad also handles numpy's familiar vectorization
of scalar functions, which is used in this example.

To be precise, elementwise_grad(fun)(x) always returns the value of a
vector-Jacobian product, where the Jacobian of fun is evaluated at x and the
vector is an all-ones vector with the same size as the output of fun. When
vectorizing a scalar-valued function over many arguments, the Jacobian of the
overall vector-to-vector mapping is diagonal, and so this vector-Jacobian
product simply returns the diagonal elements of the Jacobian, which is the
(elementwise) gradient of the function at each input value over which the
function is vectorized.
'''

def tanh(x):
    return (1.0 - np.exp(-x)) / (1.0 + np.exp(-x))

x = np.linspace(-7, 7, 200)
# Plot the function together with its first and second derivatives,
# both computed automatically by autograd.
plt.plot(x, tanh(x),
         x, egrad(tanh)(x),          # first derivative
         x, egrad(egrad(tanh))(x))   # second derivative

plt.axis('off')
plt.savefig("tanh.png")
plt.show()
```
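The diagonal-Jacobian claim in the docstring can be checked numerically: for an elementwise function, output i depends only on input i, so every off-diagonal entry of the Jacobian vanishes. A quick finite-difference check in plain Python, independent of autograd (`jacobian_fd` is an illustrative helper of my own):

```python
import math

def elementwise_tanh(xs):
    # A scalar function applied elementwise to a list of inputs.
    return [math.tanh(x) for x in xs]

def jacobian_fd(f, xs, h=1e-6):
    # Finite-difference Jacobian: J[i][j] = d f(xs)[i] / d xs[j].
    y0 = f(xs)
    J = [[0.0] * len(xs) for _ in y0]
    for j in range(len(xs)):
        xp = list(xs)
        xp[j] += h
        yp = f(xp)
        for i in range(len(y0)):
            J[i][j] = (yp[i] - y0[i]) / h
    return J

J = jacobian_fd(elementwise_tanh, [-1.0, 0.0, 2.0])
# Off-diagonal entries are exactly zero; the diagonal holds the
# elementwise derivative (e.g. tanh'(0) = 1).
print(J)
```

This is exactly why the all-ones vector-Jacobian product of elementwise_grad recovers the per-input gradient.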
The second example minimizes the Rosenbrock function, with the gradient supplied to scipy automatically by autograd:

```python
from __future__ import absolute_import
from __future__ import print_function
import autograd.numpy as np
from autograd import value_and_grad
from scipy.optimize import minimize

def rosenbrock(x):
    return 100*(x[1] - x[0]**2)**2 + (1 - x[0])**2

# value_and_grad returns both the function value and its gradient,
# which minimize consumes via jac=True.
result = minimize(value_and_grad(rosenbrock), x0=np.array([0.0, 0.0]),
                  jac=True, method='CG')
print("Found minimum at {0}".format(result.x))
```
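As a sanity check on the gradient autograd would produce here, the partial derivatives of the Rosenbrock function can be derived by hand and compared against central finite differences, in plain Python with no autograd needed (`rosenbrock_grad` and `fd_grad` are illustrative helpers of my own):

```python
def rosenbrock(x, y):
    return 100 * (y - x**2)**2 + (1 - x)**2

def rosenbrock_grad(x, y):
    # Hand-derived partial derivatives via the chain rule.
    dx = -400 * x * (y - x**2) - 2 * (1 - x)
    dy = 200 * (y - x**2)
    return dx, dy

def fd_grad(f, x, y, h=1e-6):
    # Central finite differences for comparison.
    dx = (f(x + h, y) - f(x - h, y)) / (2 * h)
    dy = (f(x, y + h) - f(x, y - h)) / (2 * h)
    return dx, dy

gx, gy = rosenbrock_grad(0.5, 0.5)
fx, fy = fd_grad(rosenbrock, 0.5, 0.5)
print(gx, gy)  # (-51.0, 50.0): analytic gradient at (0.5, 0.5)
```

Both gradients agree, and both vanish at the known minimum (1, 1), which is what the conjugate-gradient run above converges to.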