Let's Guess How AR Will Change The Way We See The World in 2020

by Matt Law, January 24th, 2020

They say that fish cannot see the water.

It's the same with us.

In every area of our lives, there is a dominant model of thinking, a means by which we interact and interpret the world, a lens so thoughtlessly worn, so ubiquitously, that it's almost impossible to see the assumptions we share.

Computers have always had a keyboard. Since Windows brought PCs into homes, screens have always had a graphical representation of a virtual ‘workspace’. This model has carried over to mobile phones, which present us with an even tinier desktop to jab around on.

Fifty years ago, the paradigm was born at the 'Augmented Human Intellect Research Center', a concept so unfamiliar it could only be shown, not described, in what has become known as the “Mother of All Demos”.

Doug Engelbart, 1968:

“If you're in your office, you as an intellectual worker were supplied with a computer display, backed up with a computer, that was alive for you all day and instantly responsive to every action you have, how much value could you derive from that?”

It reached the mainstream here some 17 years later...

Apple desktop, 1985

In this paradigm, you have a means by which you can interact with this digital space.

A bridge between the real and the virtual, in the form of a pointer, the mouse, or, with a smartphone, your finger touching the screen - like Morten Harket touching the mirror to an ‘other’ world in the video for A-Ha’s Take on Me.

The world wide web = shared desktop

The dominant interaction paradigm we’ve had since the dawn of the internet is pages on screens. That is to say, flat paper substitutes, rendered as pages in a representation of the previous generation of technology. In its own time it was revolutionary, and not at all obvious — simply because we are used to it now does not make it any less of a leap. That which is obvious in hindsight rarely is in foresight.

The first coding I ever did was on a now-defunct package called HyperCard. The promise was that these were more than mere analogies of paper cards on a screen: they were “hyper” cards with “hyper” text that could be connected together, to skip and flip trans-dimensionally in a way never seen before.

“Let me introduce the word ‘hypertext’ to mean a body of written or pictorial material interconnected in such a complex way that it could not conveniently be presented or represented on paper.”

— Ted Nelson, 1965

This in itself was a proper mind-bender for people, and again it is hard now to understand the shock of a new paradigm released upon the world.
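To make the linking idea concrete, here is a minimal sketch in TypeScript (purely illustrative; the card names and the follow helper are invented for this example) of what a stack of hypertext cards amounts to: a graph of pages you can jump across in any order, rather than a linear pile of paper.

```typescript
// A minimal, hypothetical model of a HyperCard-style stack:
// cards are nodes, and "hyper" links let you jump between them
// in any order, instead of flipping through them one by one.

interface Card {
  id: string;
  text: string;
  links: string[]; // ids of the cards this card points to
}

const cards: Card[] = [
  { id: "home", text: "Welcome", links: ["recipes", "contacts"] },
  { id: "recipes", text: "Recipes", links: ["home", "contacts"] },
  { id: "contacts", text: "Contacts", links: ["home"] },
];

const stack = new Map(cards.map((c): [string, Card] => [c.id, c]));

// Following a link is a jump across the graph, not a page turn.
function follow(fromId: string, toId: string): Card {
  const from = stack.get(fromId);
  const to = stack.get(toId);
  if (!from || !to || !from.links.includes(toId)) {
    throw new Error(`No link from ${fromId} to ${toId}`);
  }
  return to;
}

console.log(follow("home", "recipes").text); // "Recipes"
```

The point is not the code but the shape: once the material is a graph rather than a sequence, “page order” stops being a property of the material at all.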

And so it has continued for 25 years or more: the device looks into a virtual room, a workspace, or a desk-top, and you have a digital representation of your hand that reaches through the screen, through which you can touch and move things, along with an old-fashioned typewriter stuck to the front, where you can write on a representation of a piece of paper.

Sounds a bit weird when you write it out like that.

It wasn’t always intended to be this way. During the 1990s there was a wild and wonderful selection of “hypermedia” that would soon be travelling down the information superhighway; virtual reality was a promise in 1991 that was coming very soon, even then.

29 years ago.

From one paradigm to the next

Each generation of technology has to jump off from where the previous one left off. The first cars, for instance, were steered with a tiller arm: the only other powered vehicles were boats, which were controlled with a rudder and a tiller, so that was the starting point.

However, we are now seeing the pages paradigm straining and breaking. Not just because of all the other realities we are now promised, but also because data representation and data input methods are moving away from pointing and clicking and towards other kinds of sensory and haptic enhancement.

Ever wanted to see some extra colours?

https://www.cyborgarts.com/

Always know which way is north?

This man has a magnet embedded in his body which buzzes every time he faces north

Talking to your radio?

It’s actually quite weird to constantly be talking to your house.

What about having an extra thumb?

Because why shouldn’t a modern human have four thumbs.

Insert a computer into your head?

Because basically that’s what you’re doing already anyway, right.

Coming to an office near you?

The thing with the future is that it is sometimes coming for a long time.

The vision is clear long before the delivery gets here.

Why is that?

Mode switching

All through her life, and despite my urging and tireless instruction, my mother could never program the VCR. Some of you might be too young to know what a VCR was.

It was one of these — a “Video Cassette Recorder”

Video Cassette Recorder, enemy of the Boomers

I think the reason for the trouble was that many of that generation, who grew up in the 1940s and 1950s, were born into and built their world model in a fundamentally analog age. One feature of this mechanical, rather than electronic, age was that each button or switch had only a single function. You press or turn something, and it moves a real thing in the physical realm to a different state: on or off, up or down, to a different frequency.

VCRs in the 80s and 90s, and most electronic devices since, made use of “mode” buttons: you hold something down and one thing happens, you cycle through some modes, and the exact same button does something completely different.
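As a sketch of why that is disorienting (a hypothetical illustration in TypeScript, not any real VCR’s firmware; the mode names and button methods are invented), here is one physical “set” button whose meaning depends entirely on hidden state:

```typescript
// A hypothetical mode-cycling device: one "mode" button that only
// changes invisible state, and one "set" button whose meaning
// depends entirely on which mode happens to be active.

type Mode = "clock" | "channel" | "timer";

class Vcr {
  private readonly modes: Mode[] = ["clock", "channel", "timer"];
  private current = 0;

  // Pressing "mode" moves nothing physical; it just cycles hidden state.
  pressMode(): void {
    this.current = (this.current + 1) % this.modes.length;
  }

  // The exact same physical "set" button does something different
  // in each mode.
  pressSet(): string {
    switch (this.modes[this.current]) {
      case "clock":
        return "advance the clock by one minute";
      case "channel":
        return "step to the next channel";
      case "timer":
        return "arm the recording timer";
      default:
        return "do nothing";
    }
  }
}

const vcr = new Vcr();
console.log(vcr.pressSet()); // "advance the clock by one minute"
vcr.pressMode();
console.log(vcr.pressSet()); // "step to the next channel"
```

Nothing on the front panel tells you which branch you are in; the entire mapping lives in your head. A purely mechanical switch never asked that of anyone.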

The mental models we have are a shorthand that enables us to process information, to achieve more than we otherwise could, and to act, but they also limit us to what we can understand. My mum got there eventually with the internet, as skeuomorphs like virtual paper, a desktop and hands inside the screen are shortcuts that help us understand.

So one part of the question is having the mental models that can deal with these innovations — to first conceive, then understand, and ultimately to use them.

Earlier generations struggled in the shift from mechanical technology to electronic technology.

Our generational challenge is the shift from the screen paradigm to pervasive, disembodied technology.

But have we got the right technology trigger?

I worked for a while on the mobile internet, serving on the board of the Mobile Marketing Association between 2005 and 2007, and after that on the IPA Mobile Marketing Group. For about ten years, literally every year, the next year was going to be “The Year of Mobile”, when all the promise was finally going to kick in and the potential of the tech would be revealed to the world.

Before you reach the mainstream, you first need a technology trigger; the iPhone, and perhaps more importantly the app ecosystem of iTunes, did that for smartphone adoption in the 2010s.

Right now, we are missing this highly usable version of embodied technology; most of what we are using is a little clunky. However, it’s important we don’t underestimate how far we have come in augmented reality over the past few years: that which was fantastical a few years ago is now commonplace.

We each have a supercomputer permanently attached to our body

It is literally already (!) augmenting your reality.

It is worth considering that we have been living in an age of augmented reality for many years — money, the book, the limited liability company, the telegraph, the telephone are all prototypes of technology mediated reality.

However, the last 20 or so years have shown the beginnings of a move to embodied technology. If the long-term trend is human augmentation via technology, the last 30 or so years have shown the phone and the personal computer bringing it closer to the body. And the next generation will begin to apply it directly to clothing, to the skin, and into the body. The gap is something sitting in the visual field, but this will come at some point.

Look out for the technology trigger for mainstream adoption of stuff you wear and stuff you insert into your body, under the skin.

We are becoming cyborgs of a kind, and I for one, welcome our new cyborg overlords.