Google has announced the launch of Project Gameface, a new tool that lets users control applications and games on Android devices using facial movements and expressions. The tool aims to make devices more accessible for people with motor impairments or conditions such as muscular dystrophy. The source code is now open, so developers can build Android apps that use this control system.

Last year, Google introduced Project Gameface as a way to control a computer cursor through head movements and facial expressions. The technology has since been extended to Android devices, and users can customize their experience by adjusting which facial expressions trigger actions, gesture size, cursor speed, and other options. To bring Project Gameface to Android, Google used the operating system's accessibility service to create a new on-screen cursor, and leveraged the face landmark detection API of its MediaPipe framework to move the cursor based on the user's head movements. The API can recognize 52 facial gestures, such as raising the eyebrows, opening the mouth, or winking, which can be mapped to a range of functions in applications.
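As a rough illustration of what consuming those gesture scores can look like, here is a minimal Kotlin sketch using MediaPipe's Face Landmarker task in live-stream mode. The model file name, the 0.6 threshold, and the `onGestureDetected` hook are illustrative assumptions, not Gameface's actual code:

```kotlin
import android.content.Context
import com.google.mediapipe.framework.image.MPImage
import com.google.mediapipe.tasks.core.BaseOptions
import com.google.mediapipe.tasks.vision.core.RunningMode
import com.google.mediapipe.tasks.vision.facelandmarker.FaceLandmarker
import com.google.mediapipe.tasks.vision.facelandmarker.FaceLandmarker.FaceLandmarkerOptions
import com.google.mediapipe.tasks.vision.facelandmarker.FaceLandmarkerResult

// Hypothetical downstream hook; Gameface's real mapping logic differs.
fun onGestureDetected(name: String, score: Float) {
    println("gesture=$name score=$score")
}

// Configure the Face Landmarker to stream blendshape scores
// (the 52 facial-gesture values) from live camera frames.
fun createLandmarker(context: Context): FaceLandmarker {
    val options = FaceLandmarkerOptions.builder()
        .setBaseOptions(
            BaseOptions.builder()
                .setModelAssetPath("face_landmarker.task") // bundled model asset
                .build()
        )
        .setRunningMode(RunningMode.LIVE_STREAM)
        .setOutputFaceBlendshapes(true) // enables the 52 gesture scores
        .setResultListener { result: FaceLandmarkerResult, _: MPImage ->
            // Each blendshape is a named category with a 0..1 score,
            // e.g. "browInnerUp", "jawOpen", "eyeBlinkLeft".
            result.faceBlendshapes().ifPresent { faces ->
                faces.firstOrNull()?.forEach { shape ->
                    if (shape.score() > 0.6f) { // assumed trigger threshold
                        onGestureDetected(shape.categoryName(), shape.score())
                    }
                }
            }
        }
        .build()
    return FaceLandmarker.createFromOptions(context, options)
}
```

In live-stream mode, camera frames would then be fed to the landmarker via `detectAsync(image, timestampMs)`, with results arriving asynchronously through the listener above.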

Users can now control games and applications through facial expressions and head movements, which the device's front camera captures and the software analyzes in real time. This control system aims to make Android devices accessible to a wider range of users. The announcement was made at Google I/O, the company's annual developer event, where Google also discussed advancements in artificial intelligence (AI) and updates to the Android operating system.
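To close the loop on the accessibility-service side, the sketch below shows how a service could translate a recognized gesture into a tap at the cursor's current position. The class name, cursor fields, and the `jawOpen`-to-tap mapping are illustrative assumptions about how such a service might be wired up, not Gameface's implementation:

```kotlin
import android.accessibilityservice.AccessibilityService
import android.accessibilityservice.GestureDescription
import android.graphics.Path
import android.view.accessibility.AccessibilityEvent

// Hypothetical service: turns a facial gesture into a synthetic tap.
class FaceCursorService : AccessibilityService() {

    // Current cursor position, driven elsewhere by head-movement tracking.
    var cursorX = 540f
    var cursorY = 960f

    // Called when the landmarker reports a gesture crossing its threshold.
    fun onFaceGesture(name: String) {
        if (name == "jawOpen") { // e.g. open mouth mapped to "tap"
            val path = Path().apply { moveTo(cursorX, cursorY) }
            val tap = GestureDescription.Builder()
                .addStroke(GestureDescription.StrokeDescription(path, 0, 50))
                .build()
            dispatchGesture(tap, null, null) // inject the tap system-wide
        }
    }

    override fun onAccessibilityEvent(event: AccessibilityEvent?) {}
    override fun onInterrupt() {}
}
```

For `dispatchGesture` to work, the service's accessibility configuration must declare the `android:canPerformGestures="true"` capability, and the user has to enable the service in the device's accessibility settings.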