Too Long; Didn't Read
AI projects require massive amounts of data to be of any use. Roughly 80% of the work in building an algorithm goes into extracting, cleansing, filling, and normalizing data so that simple errors are systematically avoided. Algorithms can systematically “make” unfair decisions without anyone noticing, or even understanding why, which makes ethics more relevant than ever. Teams should therefore make sure that their data is representative of reality, without reproducing reality’s existing prejudices.
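To make the cleansing, filling, and normalizing steps concrete, here is a minimal Python sketch. The field names (`age`, `income`) and the chosen strategies (mean imputation, min-max scaling) are illustrative assumptions, not prescriptions from this article:

```python
def clean_records(records):
    """Fill missing values and normalize a list of record dicts.

    Illustrative field names and strategies; real pipelines should
    pick imputation and scaling methods suited to their data.
    """
    # 1. Fill: replace missing ages with the mean of the known ages.
    known_ages = [r["age"] for r in records if r.get("age") is not None]
    mean_age = sum(known_ages) / len(known_ages)
    for r in records:
        if r.get("age") is None:
            r["age"] = mean_age

    # 2. Normalize: min-max scale income into the [0, 1] range.
    incomes = [r["income"] for r in records]
    lo, hi = min(incomes), max(incomes)
    for r in records:
        r["income"] = (r["income"] - lo) / (hi - lo) if hi > lo else 0.0
    return records

rows = [
    {"age": 30, "income": 40000},
    {"age": None, "income": 60000},
    {"age": 50, "income": 80000},
]
cleaned = clean_records(rows)
```

Even a toy pipeline like this shows where bias can creep in unnoticed: mean imputation, for instance, silently assumes the missing values resemble the observed ones.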