WEEK-3 (1/01/2021 - 5/02/2021)
This week we covered Generative and Discriminative Models, Joint Probability Distribution, Marginalisation, and Linear Regression.
A generative model learns the joint probability distribution p(x, y). It predicts the conditional probability p(y|x) with the help of Bayes' theorem.
A discriminative model learns the conditional probability distribution p(y|x) directly. Both kinds of model are commonly used in supervised learning problems.
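For reference, Bayes' theorem links the two views, and marginalisation recovers the evidence term p(x) from the joint distribution:

```latex
p(y \mid x) = \frac{p(x \mid y)\, p(y)}{p(x)}, \qquad p(x) = \sum_{y} p(x, y)
```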
Linear Regression:
Simple linear regression is useful for finding a relationship between two continuous variables: a predictor (independent) variable and a response (dependent) variable. It captures a statistical relationship rather than a deterministic one. A relationship is deterministic if one variable can be expressed exactly in terms of the other; for example, a temperature in degrees Celsius determines the Fahrenheit value exactly. A statistical relationship, by contrast, involves inherent scatter, so one variable cannot be predicted exactly from the other; the relationship between height and weight is a typical example.
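In symbols, the simple linear regression model treats the response y as a linear function of the predictor x plus a noise term; the noise ε is what makes the relationship statistical rather than deterministic (β₀ and β₁ are the conventional textbook symbols for intercept and slope, not notation from our course material):

```latex
y = \beta_0 + \beta_1 x + \varepsilon
```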
Linear regression also required us to understand the concept of gradient descent. Gradient descent is an iterative procedure that minimises a function by repeatedly taking steps in the direction of the negative gradient of the cost function.
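Written out, each step updates the parameters θ using the gradient of the cost J(θ) scaled by a learning rate α (again using the conventional symbols):

```latex
\theta \leftarrow \theta - \alpha \, \nabla_{\theta} J(\theta)
```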
After understanding the various concepts involved, we then started writing a NumPy implementation of linear regression on randomly generated data.
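A minimal sketch of what such an implementation might look like is below. The synthetic data, hyperparameters, and variable names are illustrative assumptions, not the exact code we wrote:

```python
import numpy as np

# Illustrative sketch: fit y = w*x + b to synthetic data with gradient descent.
# The data, learning rate, and epoch count below are assumptions for the
# example, not the exact values from our implementation.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 10.0, size=100)
y = 3.0 * x + 2.0 + rng.normal(0.0, 1.0, size=100)  # true slope 3, intercept 2

w, b = 0.0, 0.0   # parameters to learn
lr = 0.01         # learning rate (alpha)
n = len(x)

for _ in range(2000):
    error = (w * x + b) - y                # residuals of the current fit
    grad_w = (2.0 / n) * np.dot(error, x)  # d(MSE)/dw
    grad_b = (2.0 / n) * error.sum()       # d(MSE)/db
    w -= lr * grad_w                       # step opposite the gradient
    b -= lr * grad_b

print(f"learned w = {w:.3f}, b = {b:.3f}")  # should land near 3 and 2
```

With a small enough learning rate, the learned slope and intercept settle near the values used to generate the data.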