To mark Global Accessibility Awareness Day, Apple recently announced a series of new accessibility features that will arrive on iOS and iPadOS in the coming months. One of the standout features is Eye Tracking, designed to help users with physical disabilities navigate their devices without touching them. Using the front camera of the iPhone or iPad and on-device artificial intelligence, the feature identifies where the user is looking and turns those eye movements into intuitive navigation.
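Apple has not published a developer API for the built-in Eye Tracking feature, but ARKit’s public face-tracking API hints at how the front (TrueDepth) camera can estimate gaze on device. The sketch below is only an illustration of that idea, not Apple’s implementation; the GazeTracker class and its logging are hypothetical placeholders.

```swift
import ARKit
import Foundation

// A minimal sketch of on-device gaze estimation with ARKit face tracking.
// This is not Apple's Eye Tracking accessibility feature; it only shows how
// the front (TrueDepth) camera can yield a rough gaze estimate.
final class GazeTracker: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        // lookAtPoint is ARKit's estimate of the point the eyes converge on,
        // expressed in the face anchor's coordinate space (meters).
        let gaze = face.lookAtPoint
        print("Gaze offset: x=\(gaze.x), y=\(gaze.y)")
        // Mapping this to on-screen coordinates requires projecting through the
        // camera transform and per-device calibration, which is omitted here.
    }
}
```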

While Eye Tracking was created with people with physical disabilities in mind, it has potential applications beyond that group: it could be used to control a device at a distance, or to help people whose hands are busy, such as when driving or doing chores, operate their devices without touch. Apple has also introduced “Music Haptics” for users who are deaf or hard of hearing, which uses the iPhone’s Taptic Engine to play taps, textures, and vibrations matched to the audio of a song.
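The public Core Haptics framework gives a sense of the fine-grained Taptic Engine control that an experience like this builds on. The snippet below is a minimal sketch that plays a short rhythmic tap pattern; it is not how Music Haptics itself works (Apple derives the haptics from the song’s audio in Apple Music), and the class name and timing values are illustrative.

```swift
import CoreHaptics

// A minimal sketch, using the public Core Haptics API, of playing a rhythmic
// tap pattern on the Taptic Engine. This only illustrates the idea behind
// Music Haptics; Apple's feature derives its vibrations from the song audio.
final class RhythmHaptics {
    private var engine: CHHapticEngine?  // kept alive so playback isn't cut short

    func play() throws {
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
        let engine = try CHHapticEngine()
        try engine.start()
        self.engine = engine

        // Four evenly spaced transient taps, roughly one beat apart at 120 BPM.
        let events = (0..<4).map { beat in
            CHHapticEvent(
                eventType: .hapticTransient,
                parameters: [
                    CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                    CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.6)
                ],
                relativeTime: Double(beat) * 0.5
            )
        }
        let pattern = try CHHapticPattern(events: events, parameters: [])
        try engine.makePlayer(with: pattern).start(atTime: CHHapticTimeImmediate)
    }
}
```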

Apple is also making it easier to perform tasks by voice with the “Vocal Shortcuts” feature, which lets users set up a custom utterance that Siri can understand to launch apps, complete tasks, set reminders, and more. And for people who feel nauseated when using a device in a moving vehicle, Apple has introduced “Vehicle Motion Cues”, which helps reduce motion sickness by representing the vehicle’s movement with animated dots at the edges of the screen; the feature can be turned on or off in Control Center.
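Vocal Shortcuts itself is set up in Settings, but the actions a custom phrase ultimately runs are the same kind apps expose through the App Intents framework. The sketch below uses a hypothetical OpenJournalIntent to show how an app might surface an action that Siri and the Shortcuts app can trigger; it is an illustration of that pattern, not Apple’s own code.

```swift
import AppIntents

// A minimal sketch of an App Intent that Siri and the Shortcuts app can run.
// OpenJournalIntent is a hypothetical example action, not an Apple API.
struct OpenJournalIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Journal"
    static var openAppWhenRun: Bool = true

    func perform() async throws -> some IntentResult {
        // In a real app, navigate to the journal screen here.
        return .result()
    }
}

// Registers a spoken phrase for the intent so it appears as an App Shortcut.
struct JournalShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenJournalIntent(),
            phrases: ["Open my journal in \(.applicationName)"],
            shortTitle: "Open Journal",
            systemImageName: "book"
        )
    }
}
```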