šŸ”®šŸ° Doughnut economics; algorithms & management; facial hair & terrible products ++ #109

Written by azeem | Published 2017/04/16
Tech Story Tags: artificial-intelligence | economics | management | facebook | robots


🚀 Issue 109 of Exponential View. Sign up to receive it each week.

21st-century economics to ditch physics; Amazon's leadership; Cambridge Analytica; Algorithmic management and the failure of decency; AVs and the insurance dilemma; Uber losses; Power market turned upside-down; Human heart tissue grown on spinach leaves; Water solidified at boiling point.

Dept of the near future

💭 Jeff Bezos's letter to Amazon shareholders. "Day 2 is stasis. Followed by irrelevance. Followed by excruciating, painful decline. Followed by death. And that is why it is always Day 1." COMPELLING

🔫 We need alternatives to Facebook, argues Brian Bergstein. THOUGHTFUL (See also: "the use of Facebook [is] negatively associated with overall well-being" argue Shakya & Christakis in HBR.)

šŸ© Kate Raworth on doughnut economics. Traditional economics is based on a fallacy, the most pernicious legacy of this fake physics. Instead, we need dynamic, systems approaches. EXCELLENT (See also EV#19 on biologically-inspired economics.)

📣 EV reader Fred Wilson: We need more decentralised, self-organizing systems.

šŸ–‡ļø Why people may prefer unequal societies. ā€œ[P]eople have an aversion toward unfairness [which often] leads them to favour unequal distributions. [We should] focus not on whether the [distribution of wealth] is viewed as fair.ā€ INTERESTING (Long-ish academic paper.)

🚨 Mercedes promises self-driving taxis, but who will carry the insurance for autonomous vehicles? Tesla already offers insurance as an option included in the purchase price of its cars.

šŸ—³ļø Paul-Olivier Dehayeā€™s presentation on Cambridge Analytica, personal data and electioneering is a MUST READ.

Dept of algorithmic management

One element of the United Airlines story intrigues me: how firms come to rely on process so heavily that it becomes an excuse for eliminating employee discretion, common sense or kindness. We know that in the recent case United didn't follow its own processes exactly, but the incident demonstrates the fundamental issue: processes debilitating decency. It's a small example of the risks of algorithmic management.

John Robb sums it up in this EXCELLENT essay:

The entire process was inevitable. It's also not a unique situation. We're going to see much more of this in the future as algorithms and authoritarianism grow.

To which I'll add: while the passenger's treatment resonates viscerally, we are all, as consumers and citizens, at the mercy of black-box processes (aka algorithms). When we engage with a business or a government agency, we are fed through a decision tree. When an insurance claim is processed. When we are triaged by a telenurse. When we make a complaint. These are often rigid, black-box systems.

The humans operating these processes often have limited discretion, whittled away over years in the name of optimisation or standardisation. In many cases that has delivered efficiency, fewer errors and higher quality for us.

But in some instances (especially with quasi-monopolists) it has produced a black-box culture that is irredeemably inflexible and opaque. In many industries, the human is already out of the loop, simply following a script. We've all heard an agent say "the system doesn't allow me to do that."

Will automated systems make this more or less common? The silver lining is that they could be less brittle than current processes. A machine-learning system could learn rapidly from experience, ultimately optimising more efficiently and with greater efficacy than an 'architect once, implement forever' process manual. Human overseers could have the power to override such genuinely automated systems when the machine suggests macabre outcomes.
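To make that override idea concrete, here is a minimal sketch, purely illustrative and assuming a hypothetical decision system, of what a human-override layer could look like: low-confidence or high-impact recommendations get escalated to a person who has the final say. All names (Decision, needs_human_review, resolve) and thresholds are invented for the example, not drawn from any real deployment.

```python
# Illustrative sketch of a human-in-the-loop override layer.
# Nothing here reflects a real airline or vendor system; the thresholds,
# field names and actions are hypothetical.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Decision:
    action: str        # e.g. "deny_claim", "remove_passenger"
    confidence: float  # model's confidence in the action, 0.0 to 1.0
    impact: str        # rough harm estimate: "low", "medium" or "high"

def needs_human_review(d: Decision) -> bool:
    """Escalate anything the model is unsure about, or that could cause real harm."""
    return d.confidence < 0.8 or d.impact == "high"

def resolve(d: Decision, human_override: Callable[[Decision], str]) -> str:
    """Apply the automated action unless a human overseer overrides it."""
    if needs_human_review(d):
        return human_override(d)  # a person gets the final say
    return d.action

# Example: the system recommends removing a passenger; a human can veto it.
decision = Decision(action="remove_passenger", confidence=0.95, impact="high")
print(resolve(decision, human_override=lambda d: "offer_larger_voucher"))
```

The point of the sketch is the escalation rule, not the particular thresholds: a learning system can tighten or loosen that rule as it accumulates experience, which is exactly what an 'architect once, implement forever' manual cannot do.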

The risk is that the opposite will happen: firms will expediently implement poorly designed automated systems, with little consumer redress and little consequence for the business when things go wrong.

Dan Hon has an amusing tweetstorm that plays out this management calamity.

I'd be happy to hear your thoughts on this via Twitter. (Use the hashtag #ev109.)

Elsewhere:

Dept of artificial intelligence

Have fun playing with Google AutoDraw, autocomplete for drawing. (See also Miles Brundage on facial image completion using machine learning.)

Will Knight on the challenges of making artificial intelligence explainable and the progress we are making to do so. EXCELLENT READ

Neural networks made easy. Nice, non-technical introduction.

Accessible & fun history of dueling neural nets.

Nate Soares on ensuring smarter-than-human intelligence has a positive outcome. (Long read.)

AI learns gender biases from text. We've highlighted this risk many times in EV. Read the original research here.

Mammoth overview of AI accelerator and incubator programmes.

Small morsels to appear smart at dinner parties

⛑ Uber: losses of $2.8bn on $20bn of gross bookings; accounting challenges remain. (Airbnb has reached tax agreements with 250 US jurisdictions.)

How Apple and Google are taking aim at diabetes and cancer.

Human heart tissue grown on spinach leaves. (Thanks to my son for finding this.)

🗑 Are you a fan of the Trump board game, Coca-Cola Blāk or Peek? They're displayed in the Museum of Failure.

Neuroscientists are rethinking the rules of memory.

108 years of university graduate facial hair styles. 👨‍🎓 Beards, ahoy.

🌞 Solar and wind now the cheapest power globally.

Male justices interrupt female justices 3x as often as they interrupt each other.

šŸ¼ Infants show racial biases even at six months.

Fear of diversity made people more likely to vote for Trump.

Index funds beat active managers 92% of the time.

💧🔥 Scientists solidify water at boiling point.

🚀 Issue 109 of Exponential View. Sign up to receive it each week.

