4 October 2017 – I updated the code for Swift 4 and iOS 11. You can find it here.
When an iPhone processes an image from one of its cameras, its main goal is to make the image look good to us. The iPhone has no interest in interpreting the image itself; it just adjusts the brightness and the colours so that we can optimally enjoy the picture.
There is, however, one exception: the iPhone can detect whether there is a human face in the image. It is not interested in whose face it is, merely that a face is present. It can even keep track of multiple faces at once.
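One way to access this capability is through Core Image's `CIDetector` API, which reports both the position and the in-plane rotation of each detected face. The sketch below is a minimal illustration, not the article's own code; the function name `detectFaces` and the idea of passing in a `UIImage` are my assumptions.

```swift
import CoreImage
import UIKit

// A minimal sketch of face detection with CIDetector (Core Image).
// Assumes a UIImage is passed in; in a real app this might come from
// the camera or the photo library.
func detectFaces(in image: UIImage) {
    guard let ciImage = CIImage(image: image) else { return }

    // CIDetectorAccuracyHigh trades speed for precision.
    let detector = CIDetector(ofType: CIDetectorTypeFace,
                              context: nil,
                              options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])

    let faces = detector?.features(in: ciImage) as? [CIFaceFeature] ?? []
    for face in faces {
        // `bounds` gives the face's position in the image's coordinate space
        // (note: Core Image uses a lower-left origin, unlike UIKit).
        print("Face at \(face.bounds)")

        // `faceAngle` reports the face's in-plane rotation, in degrees,
        // when the detector was able to determine it.
        if face.hasFaceAngle {
            print("Rotated by \(face.faceAngle) degrees")
        }
    }
}
```

Since iOS 11 the Vision framework offers a newer alternative (`VNDetectFaceRectanglesRequest`), but `CIDetector` remains the simpler entry point for this kind of check.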