I have very little interest in the romance and affairs of the British monarchy, not to mention celebrity spotting at high profile events. However, there is one reason I may just open the Sky News app or website and watch a stream of the event.

In a world first, Sky News, in partnership with Amazon, will use artificial intelligence to identify and provide information about famous guests as they enter Windsor Castle throughout the broadcast.

Dubbed ‘Who’s Who Live’, the feature is built on Amazon Rekognition, a cloud-based service that can recognise and compare faces in both still images and video using A.I. The purpose in this case is to enhance the viewer experience by answering a typical viewer question in real time: "who the hell is that?"
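To make this concrete, here is a minimal sketch of how a broadcaster might query Rekognition for each video frame. `RecognizeCelebrities` is a real Rekognition API operation (exposed in Python via boto3 as `recognize_celebrities`), but the helper function, its name and the confidence threshold are illustrative assumptions; Sky News's actual pipeline is not public.

```python
def identify_guests(rekognition_client, image_bytes, min_confidence=90.0):
    """Return names of celebrities recognised in one video frame.

    Hypothetical helper around the real Amazon Rekognition
    RecognizeCelebrities operation. `rekognition_client` would be a
    boto3 client; `image_bytes` is a JPEG/PNG-encoded frame.
    """
    response = rekognition_client.recognize_celebrities(
        Image={"Bytes": image_bytes}
    )
    # Keep only matches the service is confident about.
    return [
        face["Name"]
        for face in response.get("CelebrityFaces", [])
        if face.get("MatchConfidence", 0.0) >= min_confidence
    ]
```

In production this would be called with `boto3.client("rekognition")` and a frame grabbed from the live feed; passing the client in as a parameter keeps the function easy to test against canned responses.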

The tech is a game changer in the broadcast and live event arenas, and will spark new applications across live sporting events, festivals, experiences and space design.

We are already seeing A.I. prototyping in the design of spaces, where facial recognition technology running on any device with a camera can read people’s emotional expressions. Affectiva, an emotion measurement tech company, is leading the way and has collected data on over 5.5 million expressions to date.

Software such as Affectiva’s gives us the tools to analyse images, video and audio of people expressing emotion. In return, facial and vocal emotion metrics can drive real-time decisions that improve the customer journey. For example, in a retail or exhibition space we will be able to analyse data from guests who never touch or engage with a product, which can be upwards of 60% of your traffic. That’s an enormous share of your audience’s feedback going unmeasured!
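The idea above can be sketched in a few lines: per-frame emotion readings are aggregated into a single engagement number that a space designer could act on. The 0–100 scales mirror the kind of metrics emotion-measurement SDKs report, but the frame format, the metric names and the weighting below are illustrative assumptions, not Affectiva's actual API.

```python
from statistics import mean

def engagement_score(frames):
    """Fold hypothetical per-frame emotion metrics (0-100 scales)
    into one engagement number for a zone of a retail/exhibition space.

    `frames` is a list of dicts such as
    {"joy": 12.0, "surprise": 3.5, "attention": 80.0}.
    """
    if not frames:
        return 0.0
    joy = mean(f.get("joy", 0.0) for f in frames)
    surprise = mean(f.get("surprise", 0.0) for f in frames)
    attention = mean(f.get("attention", 0.0) for f in frames)
    # Illustrative weighting: sustained attention dominates,
    # positive emotion adds lift.
    return 0.6 * attention + 0.25 * joy + 0.15 * surprise
```

A low score for a zone full of passers-by who never touch a product is exactly the kind of feedback the paragraph above says currently goes unmeasured.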

Having attended Google I/O, Google’s annual developer conference in Silicon Valley, I’m always keen to get my hands on the tech, meet the developers and better understand what’s on the horizon for A.I. and machine learning.

Our work with Google has already provided opportunities to implement these technologies in a live environment. We designed a ‘doodling booth’, pictured below, where guests have 20 seconds to draw an object that the machine recognises from its previous training. It turned out to be a fun and engaging interaction between human and machine. I recommend having a look at https://experiments.withgoogle.com to spark some other ideas.
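The booth’s 20-second recognition game boils down to a simple timed polling loop. The sketch below shows that loop under stated assumptions: `classify` and `get_canvas` are hypothetical callables standing in for a Quick, Draw!-style recognition model and the drawing surface; the real installation’s internals are not public.

```python
import time

def run_doodle_round(classify, get_canvas, target,
                     time_limit=20.0, poll=0.5):
    """Game loop sketch for a timed doodle-recognition booth.

    Repeatedly snapshots the canvas and asks the classifier whether
    the target object has been recognised before the timer runs out.
    `classify` and `get_canvas` are hypothetical callables.
    """
    deadline = time.monotonic() + time_limit
    while time.monotonic() < deadline:
        guess = classify(get_canvas())  # e.g. a sketch-recognition model
        if guess == target:
            return True   # recognised within the time limit
        time.sleep(poll)  # wait before sampling the canvas again
    return False          # timer expired without a match
```

Injecting the classifier and canvas as parameters keeps the loop testable without any model or hardware attached.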

It’s a very exciting time for consumers, designers and marketers as A.I. develops and provides new opportunities to create meaningful interactions between people and brands. I’m looking forward to the challenge, and to seeing how other brands adopt the technology. For now, I'll be watching closely how people respond to ‘Who’s Who Live’ by Sky News at the royal wedding.