Too Long; Didn't Read
Web scraping is the process of collecting data from websites with automated scripts. It's used to gather amounts of data that would be impractical to collect manually. It consists of three main steps: fetch the page, parse the HTML, and extract the information you need. You can collect virtually any data you need, as often as you need it, and store it however you want. That means you can clean the data and create new features as you collect it, avoiding a separate cleanup step later.
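The three steps above can be sketched in Python. This is a minimal illustration, not a production scraper: it uses only the standard library's `html.parser`, and stands in the fetch step with an inline HTML snippet (in practice you would download the page, e.g. with the `requests` library). The page structure, field names, and class names here are all made up for the example.

```python
from html.parser import HTMLParser

# Step 1 (fetch) would normally look something like:
#   import requests
#   html = requests.get("https://example.com/products").text
# We use an inline snippet here so the example runs offline.
html = """
<html><body>
  <div class="product"><h2>Widget</h2><span class="price">9.99</span></div>
  <div class="product"><h2>Gadget</h2><span class="price">19.99</span></div>
</body></html>
"""

class ProductParser(HTMLParser):
    """Steps 2 and 3: parse the HTML and extract (name, price) records."""
    def __init__(self):
        super().__init__()
        self.rows = []       # extracted records
        self._field = None   # which field the parser is currently inside

    def handle_starttag(self, tag, attrs):
        classes = dict(attrs).get("class", "")
        if tag == "h2":
            self._field = "name"
        elif tag == "span" and "price" in classes:
            self._field = "price"

    def handle_data(self, data):
        if self._field == "name":
            self.rows.append({"name": data.strip()})
        elif self._field == "price":
            # Clean as you collect: convert the price string to a float here,
            # so no separate post-processing pass is needed.
            self.rows[-1]["price"] = float(data.strip())
        self._field = None

parser = ProductParser()
parser.feed(html)
print(parser.rows)
```

For real pages, a dedicated parser such as Beautiful Soup is usually more convenient than subclassing `HTMLParser`, but the structure is the same: fetch, parse, extract.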