DIGITAL IRIS

Innovative eye hyper tracking technology reads emotions from the eyes

BY LARA VIRIOT

(Published in The Produktkulturmagazin issue 2 2018)

Seeing what others see. Seeing what others perceive. Seeing what others sense. This sounds very futuristic when you first hear it – and futuristic is precisely what it is. Recording and analysing human perception and, for the very first time, even emotions is truly revolutionary. With its data glasses, Vienna-based start-up Viewpointsystem has set itself the goal of establishing new standards regarding the utilisation of technology for the benefit of people. To this end, the company wants to break through the barrier between man and technology with the aim of making the future a reality. 

Mr Berger, the CES Innovation Award for your product, a large EU grant and presentations in the US – you appear to be in the process of conquering the world. How and when did you come up with the idea of venturing into the fiercely competitive data glasses market? 

Strictly speaking, we have been focusing on this idea for a really long time. Our company was established as a spin-off of the University of Vienna, where we had been conducting eye tracking studies in accident prevention and road safety for more than two decades. It was there that the prototype for some pretty revolutionary eye tracking glasses was developed – glasses capable of more than other eye tracking systems.

More than two years ago, we identified the potential of eye tracking in conjunction with smart glasses and mixed reality, because current data glasses have a decisive drawback: they may be technical marvels, but they cannot react intuitively to human behaviour. Users must actively tell their devices what they want, as the devices are unable to read between the lines. With eye tracking, we have found the decisive key to connecting man and machine and – in a manner of speaking – bringing the human into the digital loop. The major Silicon Valley technology giants, which have acquired small, specialised eye tracking companies over the past few years, are also searching for this key. However, we believe we are already a decisive step ahead of them.

Can you please explain in a little more detail the concept and operation of your eye tracking glasses, for which you received the CES Innovation Award last year?

The model currently available on the market is a high-performance eye tracker that is light, pleasant to wear and looks like a normal pair of sports glasses. The glasses include three cameras, one focusing outwards and two inwards. These allow us to record accurate, three-dimensional images and the precise view of the user – without any time lag. In a manner of speaking, we digitalise human seeing behaviour, making it visible on laptops, tablets and smart phones. It is possible to identify exactly where a person is looking – but also, and this is decisive, what they truly register and how they are feeling in the process. For this reason, we also talk about ‘eye hyper tracking’. And it is precisely this technology that will be so exciting for communication between man and machine in the future. 
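
To make the principle a little more concrete: in a typical wearable eye tracker, the pupil positions seen by the inward-facing cameras are mapped onto the image of the outward-facing scene camera after a short calibration. The sketch below illustrates that general idea in Python; the simple affine model, the function names and all coordinate values are assumptions for illustration only, not Viewpointsystem’s actual implementation.

```python
# Illustrative sketch: mapping pupil positions from two inward-facing eye cameras
# onto the image of the outward-facing scene camera. The affine model, names and
# coordinates are assumptions, not the vendor's actual algorithm.
import numpy as np

def fit_gaze_mapping(pupil_xy, scene_xy):
    """Least-squares affine mapping from pupil-camera coordinates to scene pixels."""
    A = np.hstack([pupil_xy, np.ones((len(pupil_xy), 1))])  # rows of [x, y, 1]
    coeffs, *_ = np.linalg.lstsq(A, scene_xy, rcond=None)   # 3x2 coefficient matrix
    return coeffs

def gaze_point(c_left, c_right, pupil_left, pupil_right):
    """Average the per-eye estimates into one gaze point in the scene image."""
    def project(c, p):
        return np.array([p[0], p[1], 1.0]) @ c
    return (project(c_left, pupil_left) + project(c_right, pupil_right)) / 2.0

# Calibration: the wearer briefly fixates a few known targets in the scene image.
pupils_left  = np.array([[310.0, 240.0], [340.0, 238.0], [312.0, 260.0], [338.0, 262.0]])
pupils_right = np.array([[305.0, 241.0], [336.0, 239.0], [308.0, 261.0], [334.0, 263.0]])
targets      = np.array([[200.0, 150.0], [1000.0, 150.0], [200.0, 600.0], [1000.0, 600.0]])

c_left  = fit_gaze_mapping(pupils_left, targets)
c_right = fit_gaze_mapping(pupils_right, targets)

print(gaze_point(c_left, c_right, (325, 250), (320, 251)))  # approximate scene pixel
```

A production system would of course use a full three-dimensional eye model and run this per video frame without noticeable lag, as described above; the sketch only shows why two eye cameras plus one scene camera are enough to pin the gaze to a point in the scene.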

The glasses can read stress and fatigue from the eyes, for example. In which area does this function offer the greatest value added? 

Determining emotional reactions is extremely relevant in the area of security, to mention just one current application – in police training or in the armed forces, wherever precise actions are essential and every split second counts in critical situations. Based on our recordings, training staff can identify exactly where the focus of the officer or soldier lies in each and every second. During a deployment simulation, they can determine where possible weaknesses lie and how well the officer or soldier is able to concentrate under stress. This allows for precisely targeted and tailored training.

Digital Iris, the new technology that you are currently developing and preparing to launch, not only captures what the beholder sees, but it also reads their emotions from their eyes. How does this work? 

The eye is not only our most important sensory organ, it is physiologically also bidirectional. In other words, it not only registers impressions, it also reveals a huge amount of information about us. It discloses how we are doing and what we are feeling. It is not without reason that we refer to the eyes as the windows to the soul. We are virtually unable to manipulate our eye movements, which is why they are an important key to capturing a person’s cognitive and emotional state. Our technology, Digital Iris, measures eye movements and iris contractions very precisely and combines them with the right software. In this way, we are able to determine and interpret the emotional state of a person. We are in the fortunate position of having access to the findings from more than two decades of intensive vision research. The challenge for us lies in combining these findings with the right hardware and software components.
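
A well-documented physiological effect behind this – described here only as a general principle, not as the Digital Iris method – is that the pupil dilates under cognitive load and emotional arousal. A very rough sketch of how such a signal could be turned into an arousal estimate might look as follows; the thresholds, labels and sample values are purely hypothetical.

```python
# Illustrative sketch: a crude arousal proxy from pupil diameter, based on the
# general finding that pupils dilate under load. Thresholds, labels and values
# are hypothetical and not the Digital Iris algorithm.
from statistics import mean

def arousal_index(pupil_mm, baseline_mm):
    """Relative pupil dilation versus a resting baseline."""
    return (mean(pupil_mm) - baseline_mm) / baseline_mm

def label(index):
    if index > 0.15:
        return "high arousal / stress"
    if index > 0.05:
        return "elevated attention"
    return "calm"

baseline = 3.2                       # mm, measured while the wearer is at rest
window = [3.8, 3.9, 3.7, 4.0, 3.9]   # mm, sliding window of recent samples
idx = arousal_index(window, baseline)
print(f"index={idx:.2f} -> {label(idx)}")   # index=0.21 -> high arousal / stress
```

In practice such a measure would have to be combined with gaze patterns, blink behaviour and compensation for changing light levels – pupil size reacts to brightness as well as to emotion – which is precisely where the two decades of research mentioned above come in.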

What new possibilities does identifying human needs and perceptions offer in conjunction with augmented or mixed reality? 

Imagine the following: until now, you have had to actively inform your device of what you want it to do. That would pretty much be like you constantly telling your husband or wife how you are feeling and that you would really like a kiss now. Would it not be so much better if your partner were to look at you and know? A machine that intuitively understands you and that you operate just as intuitively – that is the communication of the future. It enables natural interaction between man and machine. 

This might look as follows: whenever we are unsure in certain situations, the glasses automatically display the right instructions. When lost in an unfamiliar city, we automatically receive navigational support without having to actively request it. These are just a few potential scenarios. Soon, the devices will even be able to identify users’ personal states of mind and preferences and convert this data into personal behavioural predictions. In other words, your device can provide you with help and information without you needing to utter a single word, or without you even being aware that you require help. I must make it clear that we are not there yet. But we are not far away from being able to do this. We are planning to unveil a basic version of Digital Iris at CES in Las Vegas next January.

Coca-Cola, Deutsche Bahn and Ferrero are just three of your well-known clients. What do companies approaching you expect and hope to achieve?

Smart glasses must be functional and robust – this is something we are constantly discovering in our work with companies from all sectors of industry. In the production hall, but also in the field, we often have to deal with changing light conditions and temperature fluctuations, and employees at the workbench or in the warehouse occasionally find themselves in difficult situations. This is where many well-known augmented and mixed reality glasses show their weaknesses: they produce reliable results in the development lab or at the desk, but not in real working environments. Jam-packed with technology, they are so heavy that the wearer quickly develops pressure marks on their nose, and a low power reserve limits many products. Despite people’s enthusiasm for new technology and data, at the end of the day it is the ROI that counts for businesses. The glasses simply have to work. For this reason, our philosophy is not to create whatever is technically possible at any price. We want to create things that truly make sense for our customers and that guarantee full functionality. Our current product, for instance, is extremely robust and pleasant to wear. For this reason, it is frequently used in situations in which AR and MR glasses would otherwise be deployed – for instructions when repairing machines, for example.

What are the specific applications of your product within companies? 

Repair instructions via remote diagnosis, as just mentioned, are one important field of application. Costs soon start to rise whenever machines stop operating during production; each minute of machine downtime costs a huge amount of money. We can also offer help when there is no expert on site to repair the often highly complex systems: by means of live streaming, on-site engineers are connected to specialists at other sites. Using the eye tracking function, the specialists can see exactly which machine part the on-site employee is looking at and can provide precise, fast instructions. The actions are simultaneously recorded for internal documentation purposes or for training other members of staff.

In future, help will increasingly be provided by digital assistants. For example, our system is able to identify what information pilots fail to register in certain situations and display this information on the glasses or even warn pilots, while factors such as fatigue and agitation are also taken into account. This considerably increases safety. 

Do you see potential B2C applications for your technology?

Certainly. I am absolutely convinced that smart glasses will conquer the consumer market in the very near future and replace smart phones. Smart phones restrict our actions and are constantly distracting us. Just imagine: there are traffic signs in South Korea warning drivers about distracted smart phone users! In London and the Netherlands, traffic lights embedded in the ground protect smart phone users against accidents. Looking ahead, the operation of smart glasses will become considerably better and more intuitive – something that our technology will ultimately ensure.

The look of the glasses is, of course, decisive for the consumer market. We all want to look as good as possible, particularly in our free time. It is tremendously important that the glasses find social acceptance. They must be pleasant to wear. We believe we have the perfect offerings here, but we are also dependent on technological progress in areas that we are unable to influence. This applies, for example, to the contrast performance of the displays, the power reserve, the processing power and the connectivity. Many components have to become much smaller, lighter and more functional. Data glasses will only be accepted by people and become a mass market phenomenon once they are no heavier than a standard pair of sunglasses, look cool and work in all situations.

What are your plans for the future? 

We will continue working hard and doing everything we can to surprise the sector with our new product at CES 2019. Over the coming weeks and months, I will be speaking about Digital Iris and the digital loop at industry events in the US. I hope to meet lots of interesting people and have plenty of productive discussions. After all, the entire sector – be it VR, AR, MR or XR – is hunting for the key that will unite man and machine, and the real and digital worlds. It is time we started collaborating more.

viewpointsystem.com

NILS BERGER

Nils Berger is the CEO & co-owner of Viewpointsystem. As an experienced founder and CEO, he has comprehensive knowledge relating to strategic and general management, change management, international business development and business growth acceleration. With Viewpointsystem, he is developing ‘Digital Iris’ technology – a man-machine interface with which mixed reality data glasses will in future be able to intuitively react to human behaviour.

Picture credit © Viewpointsystem

