Category archive: accessibility


iPhones can now tell blind users where and how far away people are

Apple has packed an interesting new accessibility feature into the latest beta of iOS: a system that detects the presence of and distance to people in the view of the iPhone’s camera, so blind users can social distance effectively, among many other things.
The feature emerged from Apple’s ARKit, for which the company developed “people occlusion,” which detects people’s shapes and lets virtual items pass in front of and behind them. The accessibility team realized that this, combined with the accurate distance measurements provided by the lidar units on the iPhone 12 Pro and Pro Max, could be an extremely useful tool for anyone with a visual impairment.
Of course, during the pandemic, one immediately thinks of keeping six feet away from other people. But knowing where others are and how far away they are is a basic visual task we rely on all the time to plan where we walk, which line to get in at the store, whether to cross the street and so on.

The new feature, which will be part of the Magnifier app, uses the lidar and wide-angle camera of the Pro and Pro Max, giving feedback to the user in a variety of ways.

The lidar in the iPhone 12 Pro shows up in this infrared video. Each dot reports back the precise distance of what it reflects off of.

First, it tells the user whether there are people in view at all. If someone is there, it then says how far away the closest person is in feet or meters, updating regularly as they approach or move farther away. The audio is panned in stereo to match the person's direction in the camera's view.
Second, it allows the user to set tones corresponding to certain distances. For example, if they set the distance at six feet, they’ll hear one tone if a person is more than six feet away, another if they’re inside that range. After all, not everyone wants a constant feed of exact distances if all they care about is staying two paces away.
The third feature, perhaps extra useful for folks who have both visual and hearing impairments, is a haptic pulse that goes faster as a person gets closer.
Last is a visual feature for people who need a little help discerning the world around them, an arrow that points to the detected person on the screen. Blindness is a spectrum, after all, and any number of vision problems could make a person want a bit of help in that regard.
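The logic behind those feedback modes is straightforward to sketch. The following is an illustrative simulation, not Apple's implementation: the type names, the 1.8 m (six-foot) default threshold and the inverse distance-to-pulse-rate mapping are all assumptions made for the example.

```python
from dataclasses import dataclass

# Hypothetical detection result: the lidar and wide-angle camera yield a
# distance to the nearest person plus their horizontal position in frame.
@dataclass
class PersonDetection:
    distance_m: float      # distance to nearest person, in meters
    horizontal_pos: float  # -1.0 (far left) .. 1.0 (far right) of frame

def stereo_pan(d: PersonDetection) -> tuple[float, float]:
    """Left/right channel gains so spoken feedback tracks the person's
    direction in the camera's view (each gain in 0.0-1.0)."""
    right = (d.horizontal_pos + 1.0) / 2.0
    return (1.0 - right, right)

def threshold_tone(d: PersonDetection, threshold_m: float = 1.8) -> str:
    """Second mode: one tone when the person is beyond the user-chosen
    distance, another when they are inside it (~six feet = 1.8 m)."""
    return "inside" if d.distance_m <= threshold_m else "outside"

def haptic_rate_hz(d: PersonDetection, max_hz: float = 10.0) -> float:
    """Third mode: pulses speed up as the person gets closer.
    A simple inverse mapping, capped at max_hz."""
    return min(max_hz, 1.0 / max(d.distance_m, 0.1))
```

A person detected 2.5 m away and to the right would produce the "outside" tone, a louder right channel, and a slow haptic pulse; as they close to within 1.8 m, the tone switches and the pulse quickens.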


The system requires a decent image on the wide-angle camera, so it won’t work in pitch darkness. And while the restriction of the feature to the high end of the iPhone line reduces the reach somewhat, the constantly increasing utility of such a device as a sort of vision prosthetic likely makes the investment in the hardware more palatable to people who need it.
Here’s how it works so far:

Here’s how people detection works in iOS 14.2 beta – the voiceover support is a tiny bit buggy but still super cool https://t.co/vCyX2wYfx3 pic.twitter.com/e8V4zMeC5C
— Matthew Panzarino (@panzer) October 31, 2020

This is far from the first tool of its kind — many phones and dedicated devices offer features for finding objects and people — but it's rare for such a capability to come baked in as a standard feature.
People detection should be available on the iPhone 12 Pro and Pro Max running the iOS 14.2 release candidate that was made available today. Details will presumably appear soon on Apple's dedicated iPhone accessibility site.


Accessibility’s next-gen breakthroughs will be literally in your head

Jim Fruchterman
Contributor


Jim Fruchterman is the founder of Tech Matters and Benetech, nonprofit developers of technology for social good.


Predicting the future of technology for people with visual impairments is easier than you might think. In 2003, I wrote an article entitled “In the Palm of Your Hand” for the Journal of Visual Impairment & Blindness from the American Foundation for the Blind. The arrival of the iPhone was still four years away, but I was able to confidently predict the center of assistive technology shifting from the desktop PC to the smartphone.
“A cell phone costing less than $100,” I wrote, “will be able to see for the person who can’t see, read for the person who can’t read, speak for the person who can’t speak, remember for the person who can’t remember, and guide the person who is lost.” Far-fetched as it might have seemed, the tech trends of the time made that transition inevitable.
We are at a similar point now, which is why I am excited to play a part in Sight Tech Global, a virtual event Dec. 2-3 that is convening top technologists to discuss how AI and related technologies will usher in a new era of remarkable advances for accessibility and assistive tech, in particular for people who are blind or visually impaired.
To get to the future, let me turn to the past. I was walking around the German city of Speyer in the 1990s with pioneering blind assistive tech entrepreneur Joachim Frank. Joachim took me on a flight of fancy about what he really wanted from assistive technology, as opposed to what was then possible. He quickly highlighted three stories of how advanced tech could help him as he was walking down the street with me. 

As I walk down the street, and walk by a supermarket, I do not want it to read all of the signs in the window. However, if one of the signs notes that Kasseler Rippchen (smoked pork chops, his favorite) are on sale, and the price is particularly good, I would like that whispered in my ear.
And then, as a young woman approaches me walking in the opposite direction, I’d like to know if she’s wearing a wedding ring.
Finally, I would like to know that someone has been following me for the last two blocks, that he is a known mugger, and that if I quicken my walking speed, go fifty meters ahead, turn right, and go another seventy meters, I will arrive at a police substation! 

Joachim blew my mind. In one short walk, he outlined a far bolder vision of what tech could do for him, without getting bogged down in the details. He wanted help with saving money, meeting new friends and keeping himself safe. He wanted abilities that not only equaled those of people with normal vision but exceeded them. Above all, he wanted tools that knew him and his desires and needs.
We are nearing the point where we can build Joachim’s dreams. It won’t matter whether the assistant whispers in your ear or uses a direct neural implant to communicate; we will probably see both. But the nexus of tech will move inside your head and become a powerful instrument for equality of access: a new tech stack with perception as a service, countermeasures to outsmart algorithmic discrimination, tech personalization, affordability.
That experience will be built on an ever more application-rich and readily available technology stack in the cloud. As all of that gets cheaper to access, product designers can create and experiment faster than ever. At first it will be expensive, but not for long, as adoption, probably by far more than simply disabled people, drives down the price. I started my career in tech for the blind by introducing a reading machine that was a big deal because it halved the price of that technology, to $5,000. Today, even better OCR is a free app on any smartphone.
We could dive into more details of how we build Joachim’s dreams and meet the needs of millions of other individuals with vision disabilities. But it will be far more interesting to explore, with the world’s top experts at Sight Tech Global on Dec. 2-3, how those tech tools will become enabled, literally in your head!
Registration is free and open to all. 


Google highlights accessible locations with new Maps feature

Google has announced a new, welcome and no doubt long-requested feature for its Maps app: wheelchair accessibility info. Businesses and points of interest with accessible entrances, bathrooms and other facilities will now be prominently marked as such.
Millions, of course, require accommodations such as ramps or automatic doors, from people with limited mobility to people with strollers or other conveyances. Google has been collecting information on locations’ accessibility for a couple of years, and this new setting puts it front and center.
The company showed off the feature in a blog post for Global Accessibility Awareness Day. To turn it on, users can go to the “Settings” section of the Maps app, then “Accessibility settings,” then toggle on “Accessible places.”

This will cause any location searched for or tapped on to display a small wheelchair icon if it has accessible facilities. Drilling down into the details, where you find the address and hours, will show exactly what’s available. Unfortunately, it doesn’t indicate where those resources are located (helpful if someone is trying to figure out where to get dropped off, for instance), but knowing there’s an accessible entrance or restroom at all is a start.
The information isn’t automatically created or sourced from blueprints or anything — like so much on Google, it comes from you, the user. Any registered user can note the presence of accessible facilities the way they’d note things like in-store pickup or quick service. Just go to “About” in a location’s description and hit the “Describe this place” button at the bottom.
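The crowd-sourced flow described above can be sketched in a few lines. This is a minimal illustration of the idea, not Google's actual schema: the field names (`accessibility`, `accessible_entrance`) and the dict-based place record are assumptions made for the example.

```python
# Illustrative data flow: registered users contribute per-feature
# accessibility flags for a place, and the map UI shows a wheelchair
# icon once any accessible facility has been reported.

def merge_contribution(place: dict, contribution: dict) -> dict:
    """Fold one user's report into a place's accessibility attributes,
    returning a new place record (the input is left unmodified)."""
    attrs = dict(place.get("accessibility", {}))
    attrs.update(contribution)
    return {**place, "accessibility": attrs}

def show_wheelchair_icon(place: dict) -> bool:
    """The icon appears when any accessible facility is reported."""
    return any(place.get("accessibility", {}).values())
```

So a cafe with no reports shows no icon; once a user reports an accessible entrance, the icon appears, and the per-feature flags remain available for the detail view.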
