Detecting position and orientation of faces with iOS

4 October 2017 – I updated the code for Swift 4 and iOS 11. You can find it here.

When an iPhone processes an image from one of its cameras, its main goal is to make the image look good to us. The iPhone has no interest in interpreting the image itself; it just adjusts the brightness and the colours so that we can enjoy the result as much as possible.

There is, however, one exception: the iPhone can detect whether there is a human face in the image. It is not interested in whose face it is, merely that a face is present. It can even keep track of multiple faces.
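To give a feel for what that looks like in practice, here is a minimal sketch using Core Image's CIDetector, the face-detection API that predates the Vision framework and matches the era of the original post. The function name detectFaces is my own, and enabling CIDetectorTracking is one illustrative choice among several:

```swift
import UIKit
import CoreImage

// A minimal sketch: find faces in a still image with CIDetector.
func detectFaces(in uiImage: UIImage) {
    guard let ciImage = CIImage(image: uiImage) else { return }

    // CIDetectorTracking gives each face a trackingID that stays
    // stable across consecutive video frames.
    let detector = CIDetector(ofType: CIDetectorTypeFace,
                              context: nil,
                              options: [CIDetectorAccuracy: CIDetectorAccuracyHigh,
                                        CIDetectorTracking: true])

    for case let face as CIFaceFeature in detector?.features(in: ciImage) ?? [] {
        // Position: a bounding box in Core Image coordinates.
        print("Face bounds: \(face.bounds)")
        // Orientation: in-plane rotation of the face, in degrees.
        if face.hasFaceAngle {
            print("Face angle: \(face.faceAngle)")
        }
        if face.hasTrackingID {
            print("Tracking ID: \(face.trackingID)")
        }
    }
}
```

One thing to keep in mind: Core Image reports bounds in its own coordinate space, with the origin at the bottom-left of the image, so you will usually need to flip the y-axis before drawing those rectangles in UIKit.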

What can your iPhone see?

The short answer is: not much.

Well. Maybe we should first talk about what it means to “see”. Vision is an extremely rich natural phenomenon. Most of us humans have the uncanny ability to turn light into meaning – as do many other species in the animal kingdom. Vision is mainly used for navigation and recognition: we use our eyes to detect objects in our environment and use the shapes and layout of these objects to navigate our way through life.