Linear Regression in 2 Minutes (using PyTorch)

Written by init_27 | Published 2018/01/14
Tech Story Tags: machine-learning | linear-regression | pytorch | primer | tutorial

You can find all the accompanying code in this GitHub repo

This is Part 2 of the PyTorch Primer Series.

Linear regression is a linear approach to modeling the relationship between the inputs (independent variables) and the predictions (dependent variable).

Source: Wikipedia

We find a ‘linear fit’ to the data.

Fit: we are trying to predict a variable y by fitting a curve (here, a line) to the data. In linear regression, the fitted curve follows a linear relationship between the input x and the dependent variable y, i.e. y ≈ wx + b, where the weight w and the bias b are the parameters the model learns.
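
The accompanying repo defines the training data used below (x_train and y_correct); a minimal sketch that produces equivalent arrays, assuming an illustrative ground-truth line y = 2x + 1 (my choice, not necessarily the repo's), looks like this:

```python
import numpy as np

# Toy training data: points on the line y = 2x + 1 (illustrative choice).
# nn.Linear expects float32 inputs of shape [n_samples, input_dim].
x_train = np.arange(11, dtype=np.float32).reshape(-1, 1)
y_correct = (2 * x_train + 1).astype(np.float32)
```

Any set of (x, y) pairs with an approximately linear relationship works just as well.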

Creating Models in PyTorch

  1. Create a Class
  2. Declare your Forward Pass
  3. Tune the HyperParameters

```python
import torch
import torch.nn as nn
from torch.autograd import Variable


class LinearRegressionModel(nn.Module):

    def __init__(self, input_dim, output_dim):
        super(LinearRegressionModel, self).__init__()
        # Calling the superclass's constructor
        self.linear = nn.Linear(input_dim, output_dim)
        # nn.Linear is defined in torch.nn

    def forward(self, x):
        # Here the forward pass is simply a linear function
        out = self.linear(x)
        return out
```

```python
input_dim = 1
output_dim = 1
```

Steps

  1. Create instance of model
  2. Select Loss Criterion
  3. Choose Hyper Parameters

```python
model = LinearRegressionModel(input_dim, output_dim)

criterion = nn.MSELoss()  # Mean Squared Loss
l_rate = 0.01
optimiser = torch.optim.SGD(model.parameters(), lr=l_rate)  # Stochastic Gradient Descent

epochs = 2000
```
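
For reference, nn.MSELoss() averages the squared difference between the predictions and the targets; a hand-rolled equivalent (illustrative only, the training loop below uses the built-in criterion) would be:

```python
def mean_squared_error(pred, target):
    # Average of the squared residuals, matching nn.MSELoss()'s default reduction.
    return ((pred - target) ** 2).mean()
```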

Training The Model

```python
for epoch in range(epochs):

    epoch += 1
    # increase the epoch count by 1 every iteration

    inputs = Variable(torch.from_numpy(x_train))
    labels = Variable(torch.from_numpy(y_correct))

    # clear gradients, as discussed in the previous post
    optimiser.zero_grad()

    # forward pass to get the predicted values
    outputs = model.forward(inputs)
    loss = criterion(outputs, labels)

    loss.backward()   # back-propagate
    optimiser.step()  # update the parameters

    print('epoch {}, loss {}'.format(epoch, loss.data[0]))
```

Finally, Plot the Predicted Values

```python
import matplotlib.pyplot as plt

predicted = model.forward(Variable(torch.from_numpy(x_train))).data.numpy()

plt.plot(x_train, y_correct, 'go', label='from data', alpha=0.5)
plt.plot(x_train, predicted, label='prediction', alpha=0.5)
plt.legend()
plt.show()

print(model.state_dict())
```
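
As a quick sanity check, the learned weight and bias can be read directly off the linear layer; with the y = 2x + 1 toy data sketched earlier they should end up close to 2 and 1:

```python
# Inspect the learned parameters of the single linear layer.
w = float(model.linear.weight.data[0][0])
b = float(model.linear.bias.data[0])
print('learned weight: {:.3f}, learned bias: {:.3f}'.format(w, b))
```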

If you want to read about Week 2 in my Self Driving Journey, here is the blog post

The next part in the series will discuss Linear Regression further.

You can find me on Twitter @bhutanisanyam1, connect with me on Linkedin here

Subscribe to my Newsletter for a weekly curated list of Deep Learning and Computer Vision Reads


Written by init_27 | 👨‍💻 H2Oai 🎙 CTDS.Show & CTDS.News 👨‍🎓 fast.ai 🎲 Kaggle 3x Expert
Published by HackerNoon on 2018/01/14