Apple ARKit - Another Breakthrough for Accessibility on Mobile Devices?
ARKit is a framework for creating augmented reality apps for iPhone and iPad. It uses the device's camera to place digital objects in the environment around you. On the iPhone X and the newer iPad Pros, ARKit also uses the front-facing TrueDepth camera for face tracking, and it detects facial expressions in real time. Because it can track head movement and facial expressions, the face or head can be used to control movement or items on the screen. This includes eye-tracking. For years, one of the questions I have been asked most often during my workshops is whether there is eye-tracking or eye-gaze detection on the iPad. Well, now there is. While it is not part of iOS itself, several developers are building eye-gaze apps using ARKit. The first iterations are not perfect, but they are pretty cool and fairly responsive if you set them up correctly and control certain conditions. It's getting there. I can't help but think that Apple will eventually incorporate this into its accessibility features, thus providing access to even more users.
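For readers curious what this looks like under the hood, here is a minimal, illustrative sketch of how an app can read face-tracking data from ARKit. The class name and the particular blend shapes shown are my own choices for illustration; this requires a device with a TrueDepth camera (iPhone X or later) and will not run on a simulator.

```swift
import ARKit

// Illustrative sketch: an ARSession delegate that receives face-tracking
// updates from the TrueDepth camera. Blend shapes report facial
// expressions as values from 0.0 to 1.0 in real time.
class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking is only supported on TrueDepth-equipped devices.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // Expression values, e.g. how far the jaw is open
            // or how raised the inner brow is.
            let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0
            let browUp  = faceAnchor.blendShapes[.browInnerUp]?.floatValue ?? 0

            // Eye-direction blend shapes are the kind of signal early
            // eye-gaze apps build on: combining per-eye left/right/up/down
            // values gives a rough estimate of where the user is looking.
            let lookLeft = faceAnchor.blendShapes[.eyeLookOutLeft]?.floatValue ?? 0

            print("jawOpen: \(jawOpen), browUp: \(browUp), lookLeft: \(lookLeft)")
        }
    }
}
```

An app could map these values to on-screen actions, for example treating a jaw-open value above a threshold as a "select" gesture, which is how head- and face-controlled switch access can be built on top of ARKit.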
There are so many applications of ARKit facial detection beyond eye-gaze: emotion detection and training, training for focus and attention, modeling, speech-therapy practice, and so much more. I am so excited to see where this technology will be in a year.
Several articles cover the apps that are already using ARKit facial recognition to provide access for people with limited motor control. Dana Cappel from Beit Issie Shapiro wrote an excellent post on apps currently using ARKit and facial recognition.