Saad

A Peek into the World of AI

When we hear AI, we think of androids and big evil mainframes: yikes!

Although today's AI might not be on par with the near-human robots in Westworld or the clunky battle droids in Star Wars, computer science has made leaps and bounds in its foray into automation.





Machine Learning is one such leap in the grand AI game. With so many different models now available, systems like email filtering or web search can be programmed to improve themselves from data. Today, we'll be taking a closer look at one of these models, namely linear regression.

To start off, linear regression deals with continuous (non-discrete) values of data. A piece of code is fed training examples (real-world data) and is then used to make predictions about new, unseen examples. To put it in easier terms, let's consider an example: a real estate agency is looking to predict prices for 1,000 houses, each with a different size, number of floors, and number of bedrooms. The agency will feed its linear regression model data from previous house sales so it can hypothesize the prices at which the 1,000 houses will likely be sold.
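As a rough illustration (the numbers below are made-up placeholders, not real sales figures), the agency's training data might be laid out in MATLAB like this, with one row per previously sold house:

% Training examples from previous sales: [size in sq. ft., floors, bedrooms]
% (made-up placeholder values, purely for illustration)
X = [2100 2 3;
     1600 1 3;
      850 1 2];

% Prices the three houses actually sold for
y = [400000; 330000; 170000];

% Add a column of ones so the model can also learn a baseline (intercept) price
m = size(X, 1);        % number of training examples
X = [ones(m, 1) X];    % X is now m-by-4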


How does this work, you ask? To answer that, I'll be presenting some machine learning code (from MATLAB) below:
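(The original snippet isn't reproduced here; the following is a minimal MATLAB sketch consistent with the description that follows, where X is the feature matrix, y holds the observed prices, and theta is the vector of model parameters.)

function J = computeCost(X, y, theta)
% Mean squared error cost for linear regression.
% X: m-by-n feature matrix, y: m-by-1 prices, theta: n-by-1 parameters.
m = length(y);                 % number of training examples
predictions = X * theta;       % the hypothesis: predicted prices
errors = predictions - y;      % how far each prediction is from the real price
J = (1 / (2 * m)) * sum(errors .^ 2);
end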



In this code, ‘X*theta’ is the hypothesis (the part of the code that makes the predictions), and ‘J’ is the cost of the regression model. What does that mean? It's a value indicating how close the model's predictions are to the example data. When ‘J’ is zero or almost zero, the predictions are fully in line with the example data. Consider the following graph:




The three points are real-world data examples: houses with a certain number of bedrooms that sold for those amounts. Notice how the line isn't passing through these points. That's because the cost function ‘J’ is not zero, so the predictions have deviated. Ideally, the line would pass through all the points.
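The graph itself isn't reproduced here, but a plot along those lines can be sketched in MATLAB. The numbers below are made-up placeholders, and for simplicity this version predicts price from the number of bedrooms alone (one feature plus an intercept):

% Three illustrative (made-up) examples: bedrooms vs. sale price
bedrooms = [2; 3; 4];
y = [160000; 240000; 315000];

% Single-feature design matrix with an intercept column, plus placeholder
% parameter values for the fitted line
X = [ones(3, 1) bedrooms];
theta = [10000; 76000];

plot(bedrooms, y, 'rx', 'MarkerSize', 10);   % the real-world data points
hold on;
plot(bedrooms, X * theta, 'b-');             % the model's straight line
xlabel('Number of bedrooms');
ylabel('Sale price ($)');
legend('Training examples', 'Linear regression prediction');
hold off;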

To bring the cost function ‘J’ down towards zero, we use the gradient descent function. This function repeatedly adjusts theta (the parameters of the regression model) until the cost ‘J’ is as close to zero as it can get.
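The update code itself isn't shown here; for a single-feature model like the one in the plotting sketch (so theta has just two entries), a version consistent with this description, and with the note that follows, might look like this (it reuses the computeCost sketch from earlier):

function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
% Repeatedly nudges theta in the direction that lowers the cost J.
m = length(y);
J_history = zeros(num_iters, 1);

for iter = 1:num_iters
    errors = X * theta - y;

    % Compute both new values first so theta(1) and theta(2) are
    % updated simultaneously
    temp1 = theta(1) - alpha * (1 / m) * sum(errors .* X(:, 1));
    temp2 = theta(2) - alpha * (1 / m) * sum(errors .* X(:, 2));

    theta(1) = temp1;
    theta(2) = temp2;

    J_history(iter) = computeCost(X, y, theta);   % record how the cost falls
end
end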




Note: temp1 and temp2 are just intermediate variables, used so that both parameters are updated simultaneously.


In the gradient descent function, 'alpha' is the learning rate: it controls the size of each downward step taken towards a near-zero value of the cost 'J'.


The above graph is an example of a gradient descent curve.
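That plot isn't reproduced here either, but assuming it showed the cost ‘J’ shrinking as the iterations go by, a similar curve can be generated from the sketches above (alpha and the iteration count are illustrative choices, not values from the original code):

% Fit the single-feature model from the plotting sketch above
alpha = 0.01;          % learning rate: the size of each downward step
num_iters = 1500;      % how many update steps to take
theta = zeros(2, 1);   % start both parameters at zero

[theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters);

% A gradient descent curve: the cost J falling with each iteration
plot(1:num_iters, J_history, 'b-');
xlabel('Iteration');
ylabel('Cost J');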


Linear Regression can be used in a variety of applications involving statistics [like in a real estate agency, DUH ;) ].






Now, I can't say the information above is going to help you build your own R2-D2, but it'll certainly give you a head start on your AI journey!



