
Apple’s idea for an operating system that knows how you feel could have advertising applications


This article is a cross-posting from Marketing's sister site Macworld Australia.

 

Did you see Spike Jonze’s latest film, Her, in which a colourfully clad, mild-mannered office worker, played by Joaquin Phoenix, becomes enamoured of his operating system (voiced by Scarlett Johansson)? And the admiration seems to be mutual.

It’s an intriguing movie, though one most of us would slot into the fantasy/sci-fi box.

But maybe it’s not so futuristic after all. A couple of Apple-related stories that emerged today suggest the powers that be at the Cupertino, California, company have also seen Her, and that it’s set their old grey cells a-spinning.

The first is a fairly innocuous story from Apple Insider, which reports that Apple has begun testing new Siri voices in the most recent iOS 7.1 beta. The new voices are described as ‘natural-sounding’ and come in a range of flavours – Australian English, UK English, Japanese and Mandarin Chinese. No mention of Scarlett Johansson, but we hear her dance card is pretty full these days.

You don’t get all this off the bat, however, as any new Siri-enabled devices will ship with a ‘compact’ voice; you can download the new voices once your iGadget is connected to Wi-Fi. But it does suggest Apple is looking at making Siri more personal and relatable – talking in an accent just like yours, at the very least.

More fascinating, though, is the other snippet, picked up by Slashdot. As is often the case, this one relates to a new patent filing at the US Patent and Trademark Office. And this is where we jump from Her to fully fledged Aldous Huxley-ville. The patent’s title is ‘Inferring User Mood Based on User and Group Characteristic Data’.

We’re not talking about a cheap fairground ring that changes colour willy-nilly, but, “A computer-implemented method comprising: receiving current mood-associated data associated with a user, the current mood-associated data specifying one or more current mood-associated data items, wherein at least one current mood-associated data item corresponds to a recently consumed content item; obtaining at least one baseline mood profile; applying, via a processor, at least one mood rule to the current mood-associated data and the at least one baseline mood profile to generate an inferred mood for the user.”
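In plain English, the claim boils down to three steps: gather the user’s current mood-related signals, pull up a stored baseline profile, and run one or more rules over the two to produce an inferred mood. Purely to make the legalese concrete, here’s a minimal sketch of that flow in Swift. The types, the signal names and the simple threshold rule are our own illustrative assumptions; nothing below comes from Apple’s filing beyond the three-step shape.

```swift
// Illustrative sketch only: our own types and a single threshold rule,
// loosely modelled on the claim language, not Apple's implementation.

struct MoodDataItem {
    let name: String      // e.g. "heartRate", "appSwitchRate" (hypothetical signal names)
    let value: Double
}

struct BaselineMoodProfile {
    // Typical (baseline) value for each signal, built up over time.
    let typicalValues: [String: Double]
}

enum InferredMood {
    case calm, stressed, unknown
}

// One possible "mood rule": compare current signals against the baseline
// and call the user stressed if enough of them sit well above normal.
func inferMood(current: [MoodDataItem], baseline: BaselineMoodProfile) -> InferredMood {
    if current.isEmpty { return .unknown }
    var elevatedCount = 0
    for item in current {
        guard let typical = baseline.typicalValues[item.name], typical > 0 else { continue }
        if item.value > typical * 1.3 {   // 30% above baseline: arbitrary threshold
            elevatedCount += 1
        }
    }
    return elevatedCount * 2 >= current.count ? .stressed : .calm
}

// Usage: a raised heart rate and frantic app switching versus a calmer baseline.
let baseline = BaselineMoodProfile(typicalValues: ["heartRate": 70, "appSwitchRate": 4])
let now = [MoodDataItem(name: "heartRate", value: 95),
           MoodDataItem(name: "appSwitchRate", value: 9)]
print(inferMood(current: now, baseline: baseline))   // stressed
```

Swap in smarter rules (or machine learning) for that crude threshold and you have the gist of what the claim appears to cover.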

The post at Slashdot is from the appropriately named Soulskill (and we’ll leave it to you to ruminate over whether that implies Soul Skill or Souls Kill…), who, while proclaiming surprise that the story hasn’t been more widely picked up by media outlets, notes the concerns about privacy intrusion it raises.

The post then goes on to quote from Apple’s patent application:

“Mood-associated physical characteristics can include heart rate; blood pressure; adrenaline level; perspiration rate; body temperature; vocal expression, e.g. voice level, voice pattern, voice stress etc; movement characteristics; facial expression etc. Mood-associated behavioural characteristics can include sequence of content consumed, e.g. sequence of applications launched, rate at which the user changed applications etc; social networking activities, e.g. likes and/or comments on social media; user interface (UI) actions, e.g. rate of clicking, pressure applied to a touch screen, etc.; and/or emotional response to previously served targeted content…

“In some cases, a user terminal can be equipped with hardware and/or software that facilitates the collection of mood-associated characteristic data. For example, a user terminal can include a sensor for detecting a user’s heart rate or blood pressure. In another example, a user terminal can include a camera and software that performs facial recognition to detect a user’s facial expressions.”

So basically how you’re feeling can be gauged not just physiologically, but by the way you interact with the online world. And this technology would be able to read and process all that information. And then use it… for what?

The possible applications for such technology are manifold, especially for the advertising industry. But it all sounds very Brave New World to us, and suggests a whole new level to the ‘deep capture’ concept examined in such detail by scholars Jon Hanson and David Yosifon.

But to note any changes in your demeanour, a possible ad system would first have to know your general state of mind. CNET explains: “The ad delivery system would start by compiling a ‘baseline mood profile’ against which it can compare your future moods.”

Apple mood patent filing

 

Or as Apple’s patent application goes on to say, “An individual’s responsiveness to targeted content delivery can be affected by a number of factors, such as an interest in the content, other content the user is currently interacting with, the user’s current location or even the time of day. A way of improving targeted content delivery can be to infer a user’s current mood and then deliver content that is selected, at least in part, based on the inferred mood. The present technology analyses mood-associated characteristic data collected over a period of time to produce at least one baseline mood profile for a user. The user’s current mood can then be inferred by applying one or more mood rules to compare current mood-associated data to at least one baseline mood profile for the user.”
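To make that two-stage idea concrete (compile a baseline over time, then match content to the inferred mood), here’s another rough sketch. Again, the averaging, the mood labels and the ad-matching rule are our own stand-ins for illustration, not detail from the application.

```swift
// Illustrative only: a toy pipeline from historical signals to mood-targeted content.

struct SignalSample {
    let values: [String: Double]   // e.g. ["heartRate": 72, "touchPressure": 0.4]
}

// Stage 1: compile a baseline by averaging each signal over past samples.
func compileBaseline(history: [SignalSample]) -> [String: Double] {
    var totals: [String: Double] = [:]
    var counts: [String: Int] = [:]
    for sample in history {
        for (name, value) in sample.values {
            totals[name, default: 0] += value
            counts[name, default: 0] += 1
        }
    }
    var baseline: [String: Double] = [:]
    for (name, total) in totals {
        baseline[name] = total / Double(counts[name] ?? 1)
    }
    return baseline
}

// Stage 2: infer a coarse mood from the deviation and serve content tagged for it.
struct Ad { let headline: String; let targetMood: String }

func selectAd(current: SignalSample, baseline: [String: Double], inventory: [Ad]) -> Ad? {
    var ratios: [Double] = []
    for (name, value) in current.values {
        if let base = baseline[name], base > 0 {
            ratios.append(value / base)
        }
    }
    let lift = ratios.isEmpty ? 1.0 : ratios.reduce(0, +) / Double(ratios.count)
    let mood = lift > 1.2 ? "agitated" : "relaxed"   // arbitrary cut-off and labels
    return inventory.first { $0.targetMood == mood }
}

// Usage: given a history of samples and some inventory, pick an ad.
let history = [SignalSample(values: ["heartRate": 68]), SignalSample(values: ["heartRate": 74])]
let ad = selectAd(current: SignalSample(values: ["heartRate": 92]),
                  baseline: compileBaseline(history: history),
                  inventory: [Ad(headline: "Breathe easy", targetMood: "agitated"),
                              Ad(headline: "Treat yourself", targetMood: "relaxed")])
print(ad?.headline ?? "no match")   // "Breathe easy"
```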

What do you think? Imagine what an advertiser could do if they were able to ascertain that Scarlett Johansson (or her lookalike) had just dumped you, that you were feeling utterly bereft, and that they had a ton of chocolate products to shift…

The mind boggles.

 
