Top Stories for Issue 39: August 2017

1. A good listener, but not a friend

Did you know hundreds of thousands of users say “good morning” to their voice-activated assistant every day? Amazon Echo and Google Home are selling in the millions, but why? We could just as easily say good morning to our phones, which feature the same, or similar, voice-recognition technologies.

Surely these new devices are just jumped-up speakers with a voice. They don’t have heads like robots and they don’t move, so why are we confusing technology with people? The reason is partly where these devices are located (generally kitchens) and partly that voice is the only way users can interact with them.

For those who don’t own one, these devices are essentially agents offering convenient home help: they can play songs, set reminders and timers, search the internet, read out recipes and make a variety of routine household tasks hands-free.

The fact these devices are ‘always on’ is one clue as to why large numbers of people are being polite to these machines and even treating them as members of the family. The device is always listening for its name. In the case of Echo, it responds to hearing its name by lighting up a series of LEDs in the direction of the user’s voice.
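That direction-finding trick is less mysterious than it looks. With two or more microphones, the device measures the tiny delay between channels and converts it into a bearing. The following is a minimal sketch in plain Python; the microphone spacing and sample rate are assumptions for illustration, and real devices beamform across six or more microphones rather than two.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s
MIC_SPACING = 0.05      # assumed: 5 cm between two microphones
SAMPLE_RATE = 44100     # assumed: samples per second

def estimate_delay(left, right, max_lag):
    """Find the lag (in samples) that best aligns the two channels,
    using a brute-force cross-correlation."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        score = sum(left[i] * right[i + lag]
                    for i in range(len(left))
                    if 0 <= i + lag < len(right))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

def direction_of_arrival(lag_samples):
    """Convert an inter-microphone delay into a bearing in degrees."""
    delay = lag_samples / SAMPLE_RATE
    # clamp to asin's valid domain to guard against noisy estimates
    x = max(-1.0, min(1.0, SPEED_OF_SOUND * delay / MIC_SPACING))
    return math.degrees(math.asin(x))

# Synthetic example: a short pulse reaching the right mic 3 samples late.
pulse = [0.0] * 20 + [1.0, 0.8, 0.5] + [0.0] * 20
left = pulse
right = [0.0] * 3 + pulse[:-3]
lag = estimate_delay(left, right, max_lag=6)
print(lag, round(direction_of_arrival(lag), 1))  # 3 27.8
```

Once the device has a bearing, lighting the nearest LEDs in the ring is trivial, but the social effect of that small gesture is anything but.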

Such non-verbal behaviour is quite powerful. Contrast this to humanoid robots, which seem to be having a much harder time fitting in. Voice is also a very intuitive interface. Such devices feel a lot more seamless, so the interface almost disappears and the device becomes a character or even a friend.

The voices are all female too. This tends to inspire trust, although it’s been pointed out that it reinforces sexist stereotypes. Many of the roles these devices take on (secretary, assistant, caregiver) have historically been categorised as women’s work. (See New Scientist, ‘Lazy coding is teaching software to be sexist’.)

There’s another big downside to these devices. Every utterance made within range is captured and analysed by Amazon or Google. This is made clear in device privacy policies, but how many people actually bother to read these in detail? The implications are huge, not only for personal privacy, but for data ownership and security too.

These devices will become more ubiquitous, not only at home, but at work and in between. They will become increasingly personal and better able to understand our individual whims, wants and even our fears and dreams - if we allow them to.


2. A global epidemic of attention-whoredom

With the notable exception of the TV series Black Mirror, it’s interesting how most television, film and fiction has largely ignored the internet, and social media in particular. There are other exceptions too.

Jarett Kobek’s novel, I Hate the Internet, makes the timely observation that the internet is a colossal scam, a “bad ideology created by thoughtless men” and “a computer network which people use to remind other people they are awful pieces of s***t”.

Werner Herzog’s documentary Lo and Behold outlines how the sci-fi imaginings of a handful of techno-optimists in Silicon Valley are unravelling, and Dave Eggers’ 2013 novel, The Circle, evokes the totalitarian creepiness of digital technology. Add to the list the writings of Jaron Lanier, Evgeny Morozov and Andrew Keen and there are perhaps the makings of a backlash. Even Zadie Smith, not known for her technological themes, comments that, when humans become data, everything from character to friendship and language shrinks.

So, what’s next? It’s generally assumed the internet is here to stay, but why? Surely online fraud, invasions of privacy and theft of intimate data that ultimately belongs to users could bring the show to an end? 

One interesting recent development is that it is no longer only older people questioning the utopia that was cyberspace. Younger people are now starting to question the motives of Facebook, Twitter, Google and Amazon, and even Apple is looked upon with suspicion. The internet could yet become a failed state.

It is also dawning on people that, with the exception of Apple, Big Tech is actually in the ad business. That’s their model. They collect data, profile users and sell this information or ads around the information. They are, with some exceptions, not saving the planet or even making us nicer people - just selling ads. Worse still, many of these companies are employing their users or customers as unpaid workers.

We are the ones creating the content, but all we get in return is some personalisation and some ads. Moreover, the techno-optimistic myth that hardware and software are neutral is being debunked. Most of these websites and apps are deliberately designed from the get-go to hook people. They are designed to be addictive, at the expense of our mental and physical health. We surrender information without questioning what it might be used for; worse still is our whorish desire to do so.

We want to be famous and adored and think that, by posting meaningless and inconsequential bits of information, our lives will be transformed.

As for Big Data, be very afraid. There are good uses for data, but also bad uses. Because clicks are the only measure that matters, we are in a frantic rush to the lowest common denominator. All that matters now is how popular or how recent things are. Good things, great things and especially old things are seen as irrelevant. 

The Internet is becoming a giant echo-chamber, an endless recycler of well-worn formats, clichés and templates. Anything remotely original or creative is ignored or re-crafted until it fits these templates or, as writer and activist Naomi Klein puts it, the internet promotes “changeless change”.


3. Indoor location analytics – what we do in shops

In the 2002 movie Minority Report, the Tom Cruise character is greeted in person as he enters a Gap store. Iris recognition technology spots him and asks how his last purchase (some assorted tank tops) worked out for him. Creepy? Invasion of privacy? Perhaps, but something similar is already happening.

Traditionally, retailers have used crude lasers stretched across shop entrances to count the number of people coming into their stores. But given the ubiquity of always-on smartphones, there’s now a much better way to find out how many people enter a store and what they do inside it.

If a shopper’s phone is switched on and WiFi-enabled, retailers can track the customer throughout their store. They can tell, for instance, how many people move directly from underwear to alcohol, or how many people entering a changing room then proceed to a checkout. In Australia, shoppers who enter a store within a Westfield shopping mall and then Google a rival retailer while inside can be sent an ad or discount voucher to dissuade them from leaving.
In other words, what happens in a real store is starting to resemble what happens inside a virtual one.
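Under the hood, such systems rest on simple bookkeeping: phones searching for WiFi periodically broadcast probe requests containing a device identifier, and sensors that log those broadcasts can reconstruct paths and dwell times. The sketch below is plain Python with an invented log format and made-up MAC addresses; real deployments use commercial sensors and must cope with complications such as MAC-address randomisation.

```python
from collections import defaultdict

def summarise_visits(probe_log):
    """Group sniffed WiFi probe requests by device MAC address and
    report, per device, the zones it was seen in and its dwell time
    (first to last sighting, in seconds)."""
    sightings = defaultdict(list)
    for timestamp, mac, zone in probe_log:
        sightings[mac].append((timestamp, zone))
    summary = {}
    for mac, events in sightings.items():
        events.sort()  # chronological order
        summary[mac] = {
            "zones": [zone for _, zone in events],
            "dwell_seconds": events[-1][0] - events[0][0],
        }
    return summary

# Hypothetical sniffer output: (unix time, device MAC, nearest zone)
log = [
    (1000, "aa:bb:cc:01", "entrance"),
    (1060, "aa:bb:cc:01", "underwear"),
    (1300, "aa:bb:cc:01", "alcohol"),
    (1005, "aa:bb:cc:02", "entrance"),
    (1020, "aa:bb:cc:02", "entrance"),
]
visits = summarise_visits(log)
print(len(visits))                             # distinct devices seen
print(visits["aa:bb:cc:01"]["zones"])          # path through the store
print(visits["aa:bb:cc:01"]["dwell_seconds"])  # 300
```

Once each device has a zone sequence like this, the underwear-to-alcohol style statistics are a matter of aggregating across devices.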

According to some observers, in-store tracking could become a $US21 billion market by 2021, assuming customers accept it. Both Apple and Google are active in this space. Estimates suggest around one third of the 100 biggest US stores are working with one of these companies to glean data about their customers – a practice termed “indoor location analytics”.


4. An emotional future

Schools around the world have started to teach children computer coding on the basis they will have to program computers at work. It’s a logical but somewhat flawed idea.

First, although few people know it, there are already computers that can code themselves. Code that writes code. We will still need highly skilled top-level human coders in the future, but we might not need that many.
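“Code that writes code” sounds exotic, but a mundane version has existed for decades in the form of metaprogramming: programs that emit and compile source at runtime. A toy sketch in Python follows; the validation-rule format is invented purely for illustration, and this is of course a far cry from the AI-driven code generation the paragraph above has in mind.

```python
def make_validator(rules):
    """Generate the source of a record-validation function from a
    declarative spec (field name -> required type name), then compile
    and return it."""
    lines = ["def validate(record):"]
    for field, type_name in rules.items():
        lines.append(f"    if not isinstance(record.get({field!r}), {type_name}):")
        lines.append("        return False")
    lines.append("    return True")
    source = "\n".join(lines)  # the program we just wrote
    namespace = {}
    exec(source, namespace)    # compile the generated source
    return namespace["validate"]

check = make_validator({"name": "str", "age": "int"})
print(check({"name": "Ada", "age": 36}))    # True
print(check({"name": "Ada", "age": "36"}))  # False
```

Even at this trivial level, the pattern hints at the point: once the machine is producing the routine code, the scarce human skill moves up a level, to deciding what is worth building.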

Second, teaching people how to think logically like machines is illogical given what computers are likely to be capable of one day. We should be doing the exact opposite. We should be teaching people how to behave totally unlike sterile number-crunching computers.

In 1983, the sociologist Arlie Hochschild coined the term ‘emotional labour’ to describe work that requires a high level of emotional intelligence. For example, airline crew know how to deal with nervous or disruptive passengers. In 2015, a study by David Deming, an education economist at Harvard, found that almost all new jobs created in the US from 1980 to 2012 required relatively high social skills.

AI and machine learning will edge into more routine, repetitive and logical work previously done by people. But human work will move further into areas that require empathy and perhaps some imagination, inspiration and persuasion. Machines are good at solving problems, but not especially good at inventing them and are almost useless at managing people or providing inspirational leadership.

Any job that involves caring for people, getting people to trust you or persuading people to do certain things they normally resist will be difficult to automate. Doctors, teachers, law enforcement officials and top-flight lawyers, for example, might be able to relax a tiny bit. 

Even so, we don’t celebrate these kinds of jobs as much as we could. Caring is generally undervalued and underpaid in developed societies. Nurses, childcare and aged care workers often earn very little for their efforts. Indeed, most of the emotional work done in society is done for free, often by women and isn’t even regarded as real work. It also fails to count towards GDP.

We are on the cusp of machines that can give the appearance of being empathetic. Affective computers judge a user’s emotional state using biofeedback and adjust themselves accordingly. This could work, up to a point, but these machines will surely lack nuance or a deep understanding of human nature. Only a person can truly grasp how another person might feel, at a deep level.
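The basic affective-computing loop is easy to caricature: read a biosignal, classify the user’s state, adjust the interface. The sketch below is Python; the thresholds and the heart-rate-only signal are assumptions for illustration, since real systems fuse facial, vocal and skin-conductance cues. It also shows, in miniature, why such machines can respond without understanding: the “empathy” is a lookup table.

```python
def classify_state(resting_hr, current_hr):
    """Crude arousal estimate from heart rate relative to a baseline.
    Thresholds are invented for illustration."""
    ratio = current_hr / resting_hr
    if ratio > 1.25:
        return "stressed"
    if ratio < 0.95:
        return "relaxed"
    return "neutral"

# The machine's entire repertoire of "empathy": a fixed mapping
ADJUSTMENTS = {
    "stressed": "slow down prompts, soften voice, offer to pause",
    "relaxed":  "keep current pacing",
    "neutral":  "keep current pacing",
}

def respond(resting_hr, current_hr):
    state = classify_state(resting_hr, current_hr)
    return state, ADJUSTMENTS[state]

print(respond(60, 80))  # classified as 'stressed'
```

However many signals are fused, the structure stays the same: measurement in, canned adjustment out. Nuance is exactly what the lookup table lacks.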

This all offers a huge economic opportunity. First, start to recognise the true value of emotional work, to the economy and to society. Second, refocus our education systems away from teaching things that smart machines are already capable of (IQ) and towards the things they aren’t (EQ).

Empathy would be top of the list, although invention and inspiration wouldn’t be far behind. Educators and employers would place less emphasis on rote learning, short-term memory and exam success, and spend more time considering how to expand personality, persuasion, compassion and moral character.


5. Why computers can persuade you

Have you ever wondered why your iPhone is addictive? It makes you release dopamine. Your phone has been deliberately designed so you cannot physically leave it alone. The apps on your phone - Instagram, Fitbit, Netflix – are even worse. Remember when you used to watch a TV show, enjoy it, and that was the end? Nowadays you watch House of Cards and when you’ve finished, the next episode starts instantly.

Technology design makes it harder to stop than to carry on, and users have little or no awareness of what’s being fed to them. The aim is to create habit-forming products containing emotional mini-highs. These highs come from instantly playing yet another episode, or from accumulating as many followers, likes or votes of approval as possible.

What’s at the root of all this? The answer is basic human needs for connection, approval and affirmation.  But is it ethical for unknown third parties to design our attitudes and behaviours? In the early days of the internet, there was information and to some extent, enlightenment. Now it’s the commercial imperative to compel and seize the attention of the masses.

Facebook is particularly guilty of attention crimes. Its business model uses our unconscious impulses to compel attention, which is then sold to advertisers for cash. Is it ethical? If you essentially control the psychology of one eighth of the planet, is it responsible?

In Las Vegas gambling machines are designed to ensure people spend as much time as possible on them (known as ‘time on device’). Everything about the way the machines operate is deliberate, even the angle of light emitted from the screen. Gamblers can order food and drink on the same screen without even standing up. 

Is the future a world where we never leave home, chronically addicted to screens? Digital technology designed by a small slice of society may be diminishing our capacity to make free choices. It is even removing us from the people whose love and approval we need. Time to switch off.
