Continuing Adventures in Machine Learning

In the last post, I wrote about calculating the cost of a linear regression model and using gradient descent to find the parameters that minimize that cost. A quick review of the key equations. Hypothesis: \(h_\theta(x) = \theta_0 + \theta_{1}x\) Parameters: \(\theta_0, \theta_1\) Cost Function: \(J(\theta_0,\theta_1) = \frac{1}{2m} \sum_{i=1}^m(h_\theta(x^{(i)}) - y^{(i)})^2\) Goal: \(\underset{\rm \theta_0,\theta_1}{\rm minimize}\) \(J(\theta_0, \theta_1)\) With these tools, we can perform gradient descent, an optimization algorithm that iteratively searches for the values of \(\theta_0\) and \(\theta_1\) that minimize \(J(\theta_0, \theta_1)\). [Read More]
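The cost function and gradient descent update described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the implementation from the post; the function names and the learning rate `alpha=0.1` are my own choices, and the toy data `y = 2x` is purely illustrative.

```python
import numpy as np

def cost(theta0, theta1, x, y):
    """J(theta0, theta1): mean squared error with the 1/(2m) factor."""
    m = len(x)
    predictions = theta0 + theta1 * x          # h_theta(x)
    return np.sum((predictions - y) ** 2) / (2 * m)

def gradient_descent_step(theta0, theta1, x, y, alpha=0.1):
    """One simultaneous update of both parameters along -grad J."""
    m = len(x)
    errors = (theta0 + theta1 * x) - y         # h_theta(x) - y
    new_theta0 = theta0 - alpha * np.sum(errors) / m
    new_theta1 = theta1 - alpha * np.sum(errors * x) / m
    return new_theta0, new_theta1

# Toy data lying exactly on y = 2x, so the cost can be driven toward 0.
x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])
t0, t1 = 0.0, 0.0
for _ in range(1000):
    t0, t1 = gradient_descent_step(t0, t1, x, y)
```

After the loop, `t0` and `t1` should be close to 0 and 2, recovering the line that generated the data. Note that both parameters are updated from the same `errors` vector before either changes, which mirrors the simultaneous-update requirement of gradient descent.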
math  ML 

Rediscovering Math Through Machine Learning

There are two major, obvious technology trends of interest to me that are being used to solve business problems today: blockchain and machine learning. The promise of AI has long tantalized computer scientists and the general public alike. General human intelligence remains out of reach, but modern advances in machine learning algorithms, coupled with dramatic growth in computational capacity, have yielded powerful tools for addressing discrete problem domains. [Read More]
math  ML