Apple’s Voice Control is important for accessibility, and for you, too

Rather than save Mackay a few mouse clicks, the new version of macOS spared him from having to use a switch controlled by his tongue to interact with a machine. That’s the beauty of the update’s Voice Control system: With the right combination of commands, you can control a Mac, iPhone or iPad with the same level of precision as a finger or a mouse cursor. (Just don’t confuse it with Apple’s earlier Voice Control feature, a now-deprecated tool in older versions of iOS that allowed for rudimentary device interactions.)

Even better, there’s no extra software involved — Voice Control is baked directly into Apple’s forthcoming versions of macOS, iOS and iPadOS, and should be functional in the public beta builds the company will release this summer.

Tools like this aren’t uncommon: Windows 10 has its own voice control system, and while it requires more setup than macOS’s approach, it seems to work quite well. We also know that, thanks to its work shrinking machine-learning models for voice recognition, Google will release a version of Android that responds to Google Assistant commands near-instantaneously. More broadly, the rise of smart home gadgetry and virtual assistants has made the idea of talking to machines more palatable. Whether it’s meant to help more people use these products or simply born of a desire for simplicity, controlling your devices with your voice is only becoming more prevalent.

That’s great news for people like Ian who live with motor impairments that make the traditional use of computers and smartphones difficult. “Whether you have motor impairments or simply have your hands full, accessibility features like voice commands have for a long time made life easier for all device users,” said Priyanka Ghosh, Director of External Affairs at the National Organization on Disability. “It’s terrific to see Apple stepping up in this area, and as technology continues to remove barriers to social connection and productivity, it should also remove barriers to employment.”

The way Voice Control works is straightforward enough: If you’re on an iOS device, you’ll see a tiny blue microphone light up when the software is listening. (By default, it’s set to listen for commands all the time unless you enable a feature that stops the device from recording when you’re not looking at the screen.) On Macs, a small window will appear to confirm your computer can hear you, and spell out your commands so you can tell whether it understood you correctly.

Where Voice Control shines is the sheer granularity of it all. Apple says it’s built on much of the same underlying algorithmic intelligence that powers Siri, so it’s more than adequate for actions like launching apps and transcribing your voice into text. It’s also smart enough to recognize menu items and dialog prompts by name — you can say “tap continue” to accept an app’s terms of service, for instance. Beyond that, though, you can tell Voice Control to “show numbers,” at which point it attaches a number to every single element on-screen you can interact with; from there, you can just say that number to select whatever it was you were looking for.
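Apple hasn’t detailed the developer side here, but the “tap continue” example hints at where those names come from: assistive features like this generally read the accessibility metadata an app already exposes. As a rough, illustrative sketch only — the view controller and button below are hypothetical, and the only API assumed is UIKit’s long-standing accessibilityLabel property — labeling a control clearly is what makes it addressable by name:

```swift
import UIKit

// Hypothetical example screen with a single "Continue" button.
// The point of the sketch: a descriptive accessibilityLabel gives
// assistive features (Voice Control, VoiceOver) a name to match
// when the user refers to the control out loud.
final class TermsViewController: UIViewController {
    private let continueButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()

        continueButton.setTitle("Continue", for: .normal)
        continueButton.frame = CGRect(x: 40, y: 200, width: 200, height: 44)

        // The spoken name this control answers to, e.g. "tap continue".
        continueButton.accessibilityLabel = "Continue"

        view.addSubview(continueButton)
    }
}
```

Controls that lack a meaningful label can still be reached the other way described above: “show numbers” tags every interactive element with a numbered overlay, so saying the number stands in for saying the name.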
