Entropy, Information & Technology

Written by wtfmitchel | Published 2017/06/06
Tech Story Tags: information-technology | information-theory | entropy | computer-science | science


“My greatest concern was what to call it. I thought of calling it ‘information,’ but the word was overly used, so I decided to call it ‘uncertainty.’ When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, ‘You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage.’” — Claude Shannon

[Image: Totally how Claude Shannon looked…]

This may come as a surprise, but while many people in the information technology industry have ‘IT’ in their title, few of them can actually give you a scientific definition of what ‘Information’ or ‘Technology’ means. Of course, they can point to a MacBook Pro or an iPhone and identify them as examples of information technology, but they cannot tell you the common denominator the two must share in order to be classifiable as information technology, nor can they tell you why a book qualifies as well. This isn’t a slight on these people; ignorance of such things does not preclude anyone from doing great things. But it does make things more difficult and more subjective, which I hear can be counter-productive, and failure to fully understand these terms may explain why so many technologies, and technology concepts as a whole, fall flat on their face.

John von Neumann: https://en.wikipedia.org/wiki/John_von_Neumann

Just so that there is no mistake: if you can accept that Information is Entropy, as prescribed by von Neumann and Shannon, then you can also accept that Technology is the method by which we optimize and reduce entropy, including but not limited to entropy in the form of information. Unfortunately, von Neumann was also right when he said that ‘no one really knows what entropy really is,’ and that statement seems just as true today as it was decades ago, though in my opinion only because people have not been shown a good enough reason to learn about it. Needless to say, if you are unwilling to learn about entropy, you are limiting your understanding of information and technology. While a mathematical understanding of it is difficult, a functional understanding is not that hard, and even seems quite intuitive; that should come as no surprise, because this understanding has made just as many waves in the field of cognitive science as it has in computer science.

Claude Shannon: https://en.wikipedia.org/wiki/Claude_Shannon

Information is synonymous with Entropy. While entropy has universal applications, in this context it is simply a measure of uncertainty, counted in discrete units called ‘bits,’ as popularized by Claude Shannon. For instance, if you increase your own entropy, you increase your uncertainty about something; if you decrease your own entropy, you reduce that uncertainty and increase your knowledge of the thing instead. Under most circumstances, misinformation results in an increase in uncertainty, just as true information can result in a decrease in uncertainty. Whether we want to check our mail or our bank balance, leveraging information technology to do so lets us reduce uncertainty far more efficiently than we could without it. As such, Information is entropy, and technology is a method which allows us to reduce entropy and make decisions, often at a prodigious rate in comparison to what we could accomplish without it.
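To make the ‘bits’ idea concrete, here is a minimal Python sketch of Shannon’s entropy formula from A Mathematical Theory of Communication, H = −Σ p·log₂(p); the function name and the example distributions are my own illustration, not anything from Shannon’s paper.

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy of a discrete distribution, in bits:
    H = -sum(p * log2(p)) over all outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin is maximally uncertain for two outcomes: 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so it carries less entropy.
print(shannon_entropy([0.9, 0.1]))   # ~0.469

# A certain outcome carries no uncertainty at all.
print(shannon_entropy([1.0]))        # 0.0 (printed as -0.0)
```

In this sense, learning the result of a fair coin flip delivers exactly one bit of information, the amount needed to resolve the uncertainty; checking your mail or your bank balance ‘reduces entropy’ in the same way.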

Further Reading:

A Mathematical Theory of Communication: http://math.harvard.edu/~ctm/home/text/others/shannon/entropy/entropy.pdf

Information Theory (Wikipedia): https://en.wikipedia.org/wiki/Information_theory

Entropy (Wikipedia): https://en.wikipedia.org/wiki/Entropy

Technology (Wikipedia): https://en.wikipedia.org/wiki/Technology

