Friday, March 14, 2014

Matrix and vector review and application to linear regression

A matrix is just a rectangular array of numbers
  • Take a 2D array with 2 rows and 3 columns for example 
  • It'll be a 2 by 3 matrix, or a 2x3 matrix 
  • The row dimension comes first, then the column dimension (see the NumPy sketch below) 
  • (mnemonic: Matrix is Really Cool, i.e. Rows before Columns)
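
For concreteness, here's a minimal NumPy sketch (the numbers are just made up):

import numpy as np

# A 2x3 matrix: 2 rows, 3 columns (rows before columns)
A = np.array([[1, 2, 3],
              [4, 5, 6]])
print(A.shape)   # (2, 3)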

A vector is just a 1-column matrix (an n x 1 matrix)

Matrix addition and subtraction
  • The matrices have to have the same dimensions 
  • Just add or subtract element by element; the result is a matrix of the same dimensions (see the sketch below)
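
A quick NumPy sketch of element-wise addition and subtraction (the matrices are arbitrary examples):

import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[10, 20],
              [30, 40]])

# Same dimensions, so add/subtract element by element
print(A + B)   # [[11 22] [33 44]]
print(A - B)   # [[-9 -18] [-27 -36]]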

Matrix multiplication
  • The number of columns in the 1st has to equal the number of rows in the 2nd
  • Say 2x3 * 3x4; this results in a 2x4 matrix 
  • Each element is the sum of products (the dot product) between a row of the 1st matrix and a column of the 2nd, as sketched below
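
A sketch of the dimension rule in NumPy (the matrices here are random placeholders):

import numpy as np

A = np.random.rand(2, 3)   # 2x3
B = np.random.rand(3, 4)   # 3x4

# Columns of A (3) match rows of B (3), so the product is defined
C = A @ B                  # same as np.dot(A, B)
print(C.shape)             # (2, 4)

# Each element C[i, j] is the dot product of row i of A with column j of B
print(np.allclose(C[0, 1], A[0, :] @ B[:, 1]))   # True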

Application to linear regression
  • Model:  h(x) = t0 + t1 * x
  • Say we have a bunch of x's and would like to compute the corresponding predictions y
  • One way to program it is to loop:

for i in range(n):
    y[i] = t0 + t1 * x[i]
  • Another way is to do it with matrix multiplication, which is more computationally efficient: 
| 1 x0 |  *  | t0 |  =  | t0 + t1*x0 |
| 1 x1 |     | t1 |     | t0 + t1*x1 |

i.e.: data matrix * parameter vector = prediction vector
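
In NumPy this looks like the sketch below (x0, x1 and the parameter values t0, t1 are made-up numbers):

import numpy as np

t0, t1 = 1.0, 2.0
x = np.array([3.0, 4.0])                    # x0, x1

# Data matrix: one row per example, with a leading 1 for the intercept
X = np.column_stack([np.ones_like(x), x])   # [[1, 3], [1, 4]]
theta = np.array([t0, t1])                  # parameter vector

predictions = X @ theta                     # [t0 + t1*x0, t0 + t1*x1]
print(predictions)                          # [7. 9.]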

What if you have multiple hypotheses?
Then,
data matrix * parameter matrix  = prediction matrix 

Data matrix - each row corresponds to one example's data (with a leading 1 for the intercept)
Parameter matrix - each column corresponds to the set of parameters for a given hypothesis 
Prediction matrix - each column corresponds to the set of predictions from a given hypothesis 
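
A sketch with two hypotheses (the parameter values are invented): each column of the parameter matrix is one hypothesis, and each column of the result holds that hypothesis's predictions.

import numpy as np

x = np.array([3.0, 4.0])
X = np.column_stack([np.ones_like(x), x])   # 2x2 data matrix

# Two hypotheses, one per column: h1(x) = 1 + 2x, h2(x) = 0.5 + 3x
Theta = np.array([[1.0, 0.5],
                  [2.0, 3.0]])

P = X @ Theta    # prediction matrix: column j = predictions of hypothesis j
print(P)         # [[ 7.   9.5]
                 #  [ 9.  12.5]]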

I = identity matrix: 1s on the diagonal and 0s elsewhere 
It's the matrix analogue of 1 in the real numbers; whatever * 1 = whatever 
I * A = A * I = A
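
For example, in NumPy (A here is an arbitrary square matrix):

import numpy as np

I = np.eye(2)                     # 2x2 identity matrix
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

print(np.allclose(I @ A, A))      # True
print(np.allclose(A @ I, A))      # True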

A^-1 = inverse matrix of A (only square, non-singular matrices have one)
A * A^-1 = I 
Just like the reciprocal in the real numbers: whatever * the inverse of itself = 1
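
A NumPy sketch (the matrix is an arbitrary invertible example):

import numpy as np

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])

A_inv = np.linalg.inv(A)
print(np.allclose(A @ A_inv, np.eye(2)))   # True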

A^T = matrix transpose of A
Rows of A become columns of A^T
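
In NumPy the transpose is just .T (same kind of made-up matrix as before):

import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])   # 2x3

print(A.T)         # 3x2: rows of A become columns of A.T
print(A.T.shape)   # (3, 2)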
