Doc Huston


News — At The Edge — 12/15

Worried about being watched 24/7, about governments and businesses winning with AI, and about what AI experts fear? Good; you are right to be worried.


Australia’s Encryption-Busting Law Could Impact the World —

“Australian authorities will be able to compel tech companies…to make backdoors in their [platforms]….

Cryptographers and privacy advocates…[oppose] backdoors on public safety and human rights grounds…[because they are] vulnerable to exploitation by criminals and governments [who]…will inevitably demand the same capability….

[T]he government could even compel the individual or a small group…to carry this out in secret…[and] companies that fail or refuse…will face fines up to about $7.3 million. Individuals who resist could face prison time….

Australia’s intelligence allies — the United States, the United Kingdom, Canada, and New Zealand, known collectively as the Five Eyes — have spent decades lobbying for these mechanisms….Privacy advocates note that the Five Eyes have increasingly used euphemisms like ‘responsible encryption,’ implying some sort of balance…[but this] indicates some doublespeak….

[So] messages would all still be end-to-end encrypted, just between the three of you, instead of the two of you….
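The “three of you instead of two” point can be made concrete with a toy sketch. (This is illustrative pseudo-crypto, not a real protocol: the key-wrapping function, the XOR construction, and the party names are all my own stand-ins.) In multi-recipient end-to-end messaging, the sender wraps one message key for each recipient, so a mandated backdoor can be as simple as silently adding one more recipient:

```python
# Conceptual sketch only (NOT real cryptography): in multi-recipient E2E
# messaging, the sender wraps one message key per recipient. A "ghost user"
# mandate just injects a law-enforcement key into the recipient list -- the
# conversation is still "end-to-end encrypted", only now to three parties.
import hashlib
import secrets

def wrap_key(message_key: bytes, recipient_key: bytes) -> bytes:
    # Stand-in for real public-key wrapping: XOR with a hash of the
    # recipient's key. XOR is its own inverse, so this also unwraps.
    pad = hashlib.sha256(recipient_key).digest()
    return bytes(a ^ b for a, b in zip(message_key, pad))

message_key = secrets.token_bytes(32)
recipients = {"alice": b"alice-key", "bob": b"bob-key"}
recipients["ghost-agency"] = b"agency-key"   # silently injected third party

wrapped = {name: wrap_key(message_key, key) for name, key in recipients.items()}

# Every party, including the ghost, recovers the same message key.
assert all(wrap_key(w, recipients[n]) == message_key
           for n, w in wrapped.items())
```

The sketch shows why critics call this a backdoor despite the marketing: nothing about the encryption changes, only who holds a copy of the key.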

IEEE, the international professional engineering association, said unequivocally…[such] mechanisms would create risks…‘[E]fforts to constrain strong encryption or introduce key escrow schemes into consumer products can have long-term negative effects on the privacy, security and civil liberties of the citizens’….

Australia’s new law has…vagueness about when and how often investigators can make data requests…[and] not clear that companies will be able to effectively resist….

Authoritarian states like China, Russia, and Iran already do this. Now the Five Eyes are closer to it than ever.”

Your Apps Know Where You Were Last Night, and They’re Not Keeping It Secret —

“[A]s smartphones have become ubiquitous and technology more accurate, an industry of snooping on people’s daily habits has spread and grown more intrusive….

[Over] four months, [the NYT found one person’s]…location was recorded over 8,600 times — on average, once every 21 minutes…[and] 75 companies receive…precise location data from apps whose users enable location services to get…[news,] weather or other information….
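As a quick sanity check on those figures (assuming “four months” means roughly 120 days; the article does not give the exact window):

```python
# Average gap between ~8,600 location records over ~four months.
records = 8_600
days = 120                           # assumed length of "four months"

minutes_in_window = days * 24 * 60   # 172,800 minutes
gap = minutes_in_window / records    # average minutes between records

# About 20 minutes -- consistent with the reported "once every 21 minutes".
assert 19 < gap < 22
```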

[Some] businesses claim to track up to 200 million mobile devices in the [U.S.]… to within a few yards and in some cases updated more than 14,000 times a day….These companies sell, use or analyze the data to cater to advertisers, retail outlets and even hedge funds….

Businesses say their interest is in the patterns, not the identities…[but] those with access to the raw data — including employees or clients — could still identify a person without consent….

An app may tell users…their location will help…get traffic information, but not mention that the data will be shared and sold…[because] often buried in a vague privacy policy. ‘Location information can reveal some of the most intimate details of a person’s life — whether you’ve visited a psychiatrist, whether you went to an A.A. meeting, who you might date….[and] a person’s preferences….

Health care facilities are…enticing but troubling areas for tracking…[plus] dozens of schools…[and] time at the playground….

Google’s Android system…[had] about 1,200 apps with such code…[and] about 200 on Apple’s iOS….[Company] Reveal Mobile…had location-gathering code in more than 500 apps… many that provide local news….

‘[T]here are really no consequences’ for companies that don’t protect the data…[and] people either don’t read…policies or [don’t]…understand their opaque language…. There is no federal law limiting the collection or use of such data….

Location data companies pay half a cent to two cents per user per month….Google and Facebook, which dominate the mobile ad market, also lead in location-based advertising.”

AI desperately needs regulation and public accountability, experts say —

“[AI] systems and creators are in dire need of direct intervention by governments and human rights watchdogs, according to a new report from researchers at Google, Microsoft and others at AI Now…[because] tech industry just isn’t that good at regulating itself….

[Already] AI-based tools have been deployed with little regard for potential ill effects or…documentation of good ones…in places where they can deeply affect thousands or millions of people…[like] border patrol…school districts and police departments…[with] no systems in place to stop them…[or] track and quantify that harm….

‘As the pervasiveness, complexity, and scale of these systems grow, the lack of meaningful accountability and oversight…[is an] increasingly urgent concern’….[Recommendations:]

  • We don’t need a Department of AI, but the FAA should be ready to assess the legality of, say, a machine learning-assisted air traffic control system….
  • Facial recognition, in particular…[be] subjected to the kind of restrictions as are false advertising and fraudulent medicine….
  • Public accountability and documentation need to be the rule…not just for basic auditing and justification for using a given system, but for legal purposes should such a decision be challenged….
  • These things need to be taken to court and the people affected need mechanisms of feedback….
  • ‘Expanding the disciplinary orientation of AI research…[for] deeper attention to social contexts, and…potential hazards when these systems are applied to human populations.’”
Find more of my ideas on Medium at
A Passion to Evolve.
Or click the Follow button below to add me to your feed.
Prefer a weekly email newsletter? It’s free, with no ads, no spam, and I never sell the list; email me with “add me” in the subject line.
May you live long and prosper!
Doc Huston
