All eyes on WWDC 2016

Apple’s impressive technology stack currently offers only a limited set of computer vision capabilities. The main one that comes to mind is face detection: you can use it in the Camera app on your iDevice and in the Photos app on your Mac.
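For the curious, here is a minimal sketch of the API behind that capability, Core Image’s CIDetector; the function name and the UIImage it takes are illustrative assumptions, not Apple sample code:

    import UIKit
    import CoreImage

    // A minimal face-detection sketch using CIDetector, the pre-Vision
    // Core Image API. `photo` is a hypothetical input image.
    func detectFaces(in photo: UIImage) -> [CIFaceFeature] {
        guard let image = CIImage(image: photo) else { return [] }
        let detector = CIDetector(ofType: CIDetectorTypeFace,
                                  context: nil,
                                  options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])
        // features(in:) returns generic CIFeatures; faces come back as CIFaceFeature.
        return detector?.features(in: image) as? [CIFaceFeature] ?? []
    }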

Detect position and orientation of faces with iOS

4 October 2017 – I updated the code for Swift 4 and iOS 11. You can find it here. When an iPhone is processing an image from one of its cameras, the main purpose is to make it look good for us. The iPhone has no interest in interpreting the image itself. It just wants to adjust the brightness and the…
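As a taste of what the post covers, here is a hedged sketch of reading a face’s position and orientation with the same CIDetector API; the function and its CIImage argument are assumptions for illustration:

    import CoreImage

    // Report where each detected face sits and how it is rotated.
    func reportFaces(in image: CIImage) {
        let detector = CIDetector(ofType: CIDetectorTypeFace,
                                  context: nil,
                                  options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])
        let faces = detector?.features(in: image) as? [CIFaceFeature] ?? []
        for face in faces {
            // bounds gives the face's position, in Core Image coordinates
            // (origin at the bottom left of the image).
            print("Face at \(face.bounds)")
            if face.hasFaceAngle {
                // faceAngle is the in-plane rotation of the face, in degrees.
                print("Rotated by \(face.faceAngle) degrees")
            }
        }
    }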

What can your iPhone see?

The short answer is: not much. Well. Maybe we first have to talk about what it means to “see”. Vision is an extremely rich natural phenomenon. Most of us humans have the uncanny ability to turn light into meaning – as do many other species in the animal kingdom. Vision is mainly used for navigation and recognition. We use our eyes to detect objects…

Switching to Swift

This week I started learning Swift in earnest. Swift is a new programming language developed by Apple to replace Objective-C. I already like it, and I definitely enjoy the learning process. Some iOS developers I know talk about how much they love Swift. I am not there yet, although I have found three things that may ignite my love.
