There’s a lot of talk these days about ‘smart’ things: smartphones, for example. But it’s not always clear what ‘smart’ really means. Sometimes it seems to mean no more than a phone you can also use to surf the web.
In reality, a smartphone is only barely a phone. The ‘phone’ part is more of a marketing ploy than anything else. After all, how many people would queue outside stores to buy a ‘pocket computer’? None, I suspect.
Smart adds ‘brainpower’
No, ‘smart’ is a lot more interesting. What ‘smart’ really means is taking input from somewhere, applying some ‘brainpower’ and then taking action.
Think about glasses, or spectacles, for a moment.
Standard glasses bend light to correct vision
Regular old glasses — you may be wearing a pair now — bend light to correct your vision. Pieces of glass or plastic are shaped in such a way as to make the world appear sharper and clearer. Some kinds of glass or plastic may perhaps change tint in sunshine. It seems clever, but it’s just a physical reaction — these photochromic lenses contain molecules that react to certain kinds of light.
But add a camera and a computer to a pair of glasses and there are all kinds of things you can do. Take a look at these ‘smart’ developments.
Smart glasses process light
Oxford University in the UK has created an early version of an ordinary-looking pair of glasses for people with extremely poor vision.
Video cameras at the corners of the specs feed into a tiny pocket computer. The computer then lights up parts of an LED array in the lenses so the wearer can see objects in more detail. Although this is only a prototype so far, the real thing could include optical character recognition for reading newspaper headlines.
In this case the glasses use cameras and software to interpret the world and essentially put zoomed-in images on a screen in front of the wearer’s eyes.
Smart glasses build a soundscape of the world
Meanwhile, Spanish engineers have equipped a pair of sunglasses with two micro-cameras and headphones in a system called EYE 21. The system makes a 3D model of the space the cameras see and represents it with sounds.
A blind person can then hear the visual space around them and their brain reconstructs its shape. This method takes advantage of the way a person with normal hearing can ‘place’ a sound: near or far, left or right, front or behind.
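The way the brain ‘places’ a sound by loudness and left/right balance can be sketched in a few lines of code. The function below is purely an illustration, not the actual EYE 21 algorithm: it assumes a single object at a known position and turns that position into simple per-ear volume levels, so a nearer object is louder overall and an object off to one side is louder in that ear.

```python
import math

def place_sound(x, z, max_distance=10.0):
    """Map an object's position to crude stereo cues.

    x: sideways offset in metres (negative = left, positive = right).
    z: distance straight ahead in metres.
    Returns (left_gain, right_gain) in the range 0.0 to 1.0.
    """
    distance = math.hypot(x, z)
    # Loudness falls off linearly with distance, silent beyond max_distance.
    volume = max(0.0, 1.0 - min(distance, max_distance) / max_distance)
    # Pan from -1.0 (fully left) to +1.0 (fully right).
    pan = x / distance if distance > 0 else 0.0
    return (volume * (1.0 - pan) / 2, volume * (1.0 + pan) / 2)

# An object 5 m straight ahead sounds equally loud in both ears;
# an object ahead and to the left sounds louder in the left ear.
print(place_sound(0, 5))
print(place_sound(-3, 4))
```

A real system would of course render many objects at once and use far subtler cues (timing differences, filtering), but the principle of encoding position as sound is the same.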
Smart glasses read expressions
A scientist at the University of Cambridge, UK, is using facial expression analysis to help people recognise the emotions of those they’re talking to.
A camera and software in a pair of glasses track 24 feature points on a face and match expressions to a database. An earpiece advises the wearer of the findings, while a small light inside the lens provides quick alerts. Findings can also be displayed on a computer screen. In tests, the glasses identified expressions with 64% accuracy, beating the 54% accuracy of human observers.
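Matching tracked feature points against a database can be sketched as a nearest-neighbour lookup. The tiny ‘database’ below is invented for illustration only (the real system tracks 24 points per face and much richer expressions): it simply returns whichever stored expression lies closest to the measured points.

```python
import math

# Hypothetical database: each expression labelled with a few (x, y)
# feature points, e.g. the corners and centre of the mouth.
EXPRESSIONS = {
    "smile":   [(0.2, 0.8), (0.5, 0.9), (0.8, 0.8)],
    "frown":   [(0.2, 0.6), (0.5, 0.4), (0.8, 0.6)],
    "neutral": [(0.2, 0.7), (0.5, 0.7), (0.8, 0.7)],
}

def total_distance(points_a, points_b):
    """Sum of point-to-point distances between two feature sets."""
    return sum(math.dist(a, b) for a, b in zip(points_a, points_b))

def match_expression(tracked_points):
    """Return the stored expression closest to the tracked points."""
    return min(EXPRESSIONS,
               key=lambda name: total_distance(tracked_points,
                                               EXPRESSIONS[name]))

# Points close to the 'smile' template should match it.
print(match_expression([(0.2, 0.79), (0.5, 0.88), (0.8, 0.81)]))
```

Real expression classifiers use machine learning rather than raw distance, but the core idea, comparing measured features against labelled examples, is the same.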
Now those are all ‘smart’ glasses.
Written by Miraz Jordan for, and reproduced from CommunityNet Aotearoa Panui, August 2011. This article has been modified for publication here.