Is Bitcoin the revolution against unequal economic systems, or a scam and money-laundering mechanism? Will artificial intelligence (AI) improve and boost humankind, or terminate our species? These questions present incompatible scenarios, yet you will find supporters for all of them. They cannot all be right, so who is wrong, then?

Ideas spread because they are attractive, whether they are good or bad, right or wrong. In fact, the “truth” is just one of the elements used or avoided in order to build any story or idea. There are different interests behind any statement (e.g. economic or sentimental), and messages are issued and received with huge amounts of human bias.

We’re living in the age of fake news. Fake news consists of deliberate misinformation presented under the guise of authentic news, spread via some communication channel, and produced with a particular objective: generating revenue, promoting or discrediting a public figure, a political movement, an organization, and so on.

During the 2018 national elections in Brazil, WhatsApp was used to spread alarming amounts of misinformation, rumors and false news favoring Jair Bolsonaro. Using this technology, it was possible to exploit encrypted personal conversations and chat groups involving up to 256 people, making these chat groups much harder to monitor than the Facebook News Feed or Google’s search results.

Last year, the two main Indian political parties took these tactics to a new scale by trying to influence India’s 900 million eligible voters, creating content on Facebook and spreading it on WhatsApp (both parties have been accused of spreading false or misleading information, or of misrepresentation online). India is WhatsApp’s largest market (more than 200 million Indian users), and a place where users forward more content than anywhere else in the world.

But these tactics are not confined to the political arena: they are also involved in activities that range from manipulating share prices to attacking commercial rivals with fake customer reviews. How can fake news have such an impact? The answer lies in the way humans process information.

Understanding is Believing

Baruch Spinoza suggested that all ideas are accepted (i.e. represented in the mind as true) prior to a rational analysis of their veracity, and that some ideas are subsequently unaccepted (i.e. represented as false). In other words, the mental representation of a proposition or idea always has a truth value associated with it, and by default this value is true.

The automatic acceptance of representations would seem evolutionarily prudent: if we had to go around checking every percept all the time, we’d never get anything done. Understanding and believing are not two independent stages. Instead, understanding is already believing.

How the Future Looks

Massive amounts of data have given birth to AI systems that are already producing human-like synthetic texts, powering a new scale of disinformation operation. Based on Natural Language Processing (NLP) techniques, several lifelike text-generating systems have proliferated, and they are becoming smarter every day.

This year, OpenAI announced the launch of GPT-3, a tool that produces text so realistic that in some cases it is nearly impossible to distinguish from human writing. GPT-3 can also figure out how concepts relate to each other and discern context. Tools like this one can be used to generate misinformation, spam, phishing, abuse of legal and governmental processes, and even fake academic essays.
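To make concrete how little effort synthetic text now requires, here is a minimal sketch of machine-generated “news-like” continuations. GPT-3 itself is only reachable through OpenAI’s hosted API, so the snippet below uses the openly available GPT-2 model via the Hugging Face transformers library; the prompt and generation settings are illustrative assumptions, not anything described in this article.

```python
# Minimal sketch: generating synthetic "news-like" text with GPT-2.
# Assumes the `transformers` library (plus a backend such as PyTorch) is installed.
from transformers import pipeline

# Load a small, openly available text-generation model.
generator = pipeline("text-generation", model="gpt2")

# A hypothetical prompt, chosen only to illustrate the misuse potential.
prompt = "Breaking news: government sources confirmed today that"

outputs = generator(
    prompt,
    max_length=60,           # total tokens, prompt included
    num_return_sequences=3,  # produce several variants at once
    do_sample=True,          # sample instead of decoding greedily
    top_p=0.95,              # nucleus sampling for more natural variety
)

for i, out in enumerate(outputs, start=1):
    print(f"--- synthetic continuation {i} ---")
    print(out["generated_text"])
```

Each run yields different continuations, which is exactly what makes this kind of tooling attractive for producing misinformation at scale, and why detection has to operate at a comparable scale.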
Deepfakes refer to technologies that make it possible to create evidence of scenes that never happened through faked video, photo and audio. These technologies can enable bullying (placing people into compromising scenarios), boost scams (swindling employees into sending money to fraudsters), damage a company’s reputation, or even pose a danger to democracies by putting words in the mouths of politicians.

(Facial reenactment tech manipulates Putin in real time. Source: RT)

But deepfakes have another impressive effect: they make it easier for liars to deny the truth, in two ways. First, when accused of having said or done something that they did in fact say or do, liars may generate and spread altered sound or images to create doubt. The second way is simply to denounce the authentic as fake, a technique that becomes more plausible as the public becomes more educated about the threats posed by deepfakes.

How can we fight this battle?

Arthur Schopenhauer believed that our knowledge of the world is confined to knowledge of appearance rather than reality, and that is probably still right today. In a world of appearances (with social media as one of its icons), it seems nearly impossible to avoid being deceived. But there is always a way to resist.

Fighting fake news is a double-edged sword: on one side, warning news consumers and promoting tools so they can be aware of and challenge the sources of information is a very positive thing; on the other, we may be producing news consumers who don’t believe in the power of well-sourced news and mistrust everything. If we follow the latter path, we may end up in a general state of disorientation, with news consumers who are uninterested in, or unable to determine, the credibility of any news source.

We need technology to fight this battle. AI makes it possible to find words and patterns that indicate fake news in huge volumes of data, and tech companies are already working on it. Google is working on a system that can detect videos that have been altered, making its datasets open source and encouraging others to develop deepfake detection methods. YouTube declared that it won’t allow election-related “deepfake” videos or anything that aims to mislead viewers about voting procedures and how to participate in the 2020 census.

(A sample of videos from Google’s contribution to the FaceForensics benchmark. Source: Google AI Blog)

As data consumers, we are also equipped to fight back. The social psychologist Daniel Gilbert found that people do have the potential for resisting false ideas, but that this potential can only be realized when the person has (a) logical ability, (b) a set of true beliefs to compare to new beliefs, and (c) motivation and cognitive resources. This means that we can resist false ideas, but also that anyone who lacks any of these characteristics is easy prey for fake news.

Your assumptions are your windows on the world. Scrub them off every once in a while, or the light won’t come in. (Isaac Asimov)

Interested in these topics? Follow me on Twitter or LinkedIn.

Also published at https://medium.com/datadriveninvestor/the-future-of-fake-news-2093f2652ce6