Accessible Street-Crossing App Oko Gets Acquired
It isn’t often I cover M&A news, but today happens to be one of those rare times.
Texas-based transportation safety company Synapse on Tuesday announced its plan to acquire Oko. Oko, an Apple Design Award winner and last year’s App of the Year, is an iPhone app that uses the device’s camera(s) and artificial intelligence to help Blind and low vision people accessibly (and safely!) cross streets. In its announcement, Synapse describes Oko as developed by “a team of accessibility and mobility experts” and as “[harnessing] location data, audio cues, and accessible interface design to help guide users through the most dangerous aspects of crossing intersections.” Oko collaborated with the Blind community to build its app.
Once the deal is officially completed, Oko will become a free app in an effort to “make its services accessible to as many pedestrians as possible,” according to Synapse.
Synapse is the owner of Polara, which specializes in building accessible pedestrian signal (APS) technology. The deal with Oko enables Synapse to integrate Oko’s technologies with the existing APS infrastructure by way of its consumer-oriented PedApp in an effort to make street-crossing more accessible to the Blind and low vision community.
“This acquisition is a natural extension of our mission to increase roadway safety, accessibility, and efficiency with state-of-the-art software,” Josh LittleSun, Synapse’s chief technology officer, said in a statement included in the company’s press release. “The fusion of Oko’s smart navigation technology with Polara’s trusted PedApp moves us closer to a future where pedestrian equity and safety are built into every crosswalk.”
Synapse’s vice president of intersection sales, Matthew Baker, agrees with LittleSun.
“This is the kind of life-changing innovation we’re proud to bring into the Polara family,” Baker said. “By eliminating subscription fees, we’re making Oko accessible to all and speeding up adoption in communities that need it the most.”
In an email to me, Oko founder Willem Van de Mierop said he believes the acquisition “can be seen as one of the most exciting deals for the accessibility space.” He also noted Oko recently hit a milestone, as it has helped people navigate 10 million streets.
Catching Up with CSD Chief Executive Chris Soukup
When I recently caught up with Chris Soukup over email amidst Disability Pride Month and ahead of the Americans with Disabilities Act’s (ADA) 35th birthday later this month, the chief executive officer of Communication Service for the Deaf (CSD) explained to me plainly and succinctly that he believes disability to be “an inherent part of the human condition.” How those in the disability community experience the world around them, he went on to rightfully tell me, “is continuously evolving throughout our lives.”
“Disability Pride Month is a celebration of how these experiences vary, the beauty of our collective intersection, and our commitment to unity and inclusion,” Soukup said. “The ADA and laws similar to it provide our society with important guardrails to ensure that no one is inadvertently or intentionally left behind as progress and innovation propel us into the future.”
As I wrote in my profile of Soukup and CSD last October, CSD, established in 1975, is a self-described “Deaf-led social impact organization” which exists to “create a better world for Deaf and hard-of-hearing people.” In a nutshell, CSD is devoted to the betterment of the Deaf and hard-of-hearing community; in January, the organization announced it was supporting efforts for the Los Angeles-area Deaf community whose lives were upended by the wildfires that utterly ravaged the Southern California region.
Deaf people, Soukup said, still bear the brunt of bias by society writ large. Discrimination in the workplace, as well as unemployment altogether, remain “pervasive issues” for the community, he said. Not only are securing and sustaining a job problematic for Deaf job-seekers, Soukup went on to say, opportunities for career advancement are equally few and far between for Deaf workers. And although technology has unquestionably broken down barriers and subsequently increased access to communication and information, the reality is there still exist “significant challenges,” according to Soukup. Many Deaf people, he said, remain reliant upon intermediaries—people like sign language interpreters, for one—to facilitate proper communication. It’s oftentimes difficult, Soukup said, to ensure “qualified individuals are always available to facilitate between a signing Deaf person and a non-signer.”
Soukup believes a crucial part of gaining greater accessibility is good storytelling.
“[It’s] so important,” he said. “Amplifying and spotlighting people with disabilities that are thriving and achieving their goals is so important. There is not enough attention in the media. People who identify as being disabled make up at least 25% of the population. We need to be seen, and our successes should be celebrated as an opportunity to transform how people perceive disability.”
When asked to elaborate more on the ADA’s significance, Soukup told me the law (and any extending legislation) is of crucial import because it provides “important protection” to all disabled Americans. He reiterated a popular refrain in the disability community, which is that the ADA is seen as the “floor” rather than the ceiling. The ADA, Soukup said, is a “baseline” atop which society can “[build] a world that is designed to be more inclusive from the beginning.” The disability community, he added, would “love” to see lawmakers go further with regulation by “[continuing] to close gaps and eliminate barriers that make it harder for people with disabilities to lead fulfilling lives.”
Soukup’s last sentiment was a good segue into talking about the future. In peering into his proverbial crystal ball, he said it’s his hope society reaches a point at which we “normalize disability.” In addition, he spoke of the domino effect disability has on the lives of everyone, disabled or not, saying “we recognize that when we speak about disability, we are talking about everyone: ourselves, our families, [and] our loved ones.”
“Embracing disability as an ordinary part of what it means to be human expands our thinking and challenges some of the implicit bias that we carry,” Soukup said of his hopes and dreams for disability’s societal future. “A Deaf person can be an incredible physician, airline pilot, college professor, or entrepreneur. Our work is closer to being achieved when the general public internalizes and embraces all of these possibilities.”
Waymo, Autism Society of America Partner on Giving Riders on the Spectrum Greater Accessibility
Following up on my story from last Friday on Waymo introducing teen accounts, I came across another report involving Waymo. This one sees the company working with the Autism Society of America to help people on the spectrum gain fuller independence.
The work was detailed in a story by Phillip Palmer at the ABC7 affiliate in Los Angeles.
“Waymo has partnered with the Autism Society to highlight how a driverless vehicle can actually offer a consistent, safe and predictable way to travel for young adults on the spectrum,” Palmer wrote in describing the pair’s partnership. “By working with the Autism Society of America in the very early stages of development, they can avoid any challenges that might come as the company grows.”
The allure of Waymo to autistic people is pretty much precisely what it is for the Blind and low vision: driverless cars afford those in our respective communities greater agency and autonomy with which to travel. Palmer notes only a third of people on the spectrum have driver’s licenses, according to a study in the Journal of Autism. By contrast, many autistic people hold college degrees and jobs; having access to Waymo means their independence (and self-esteem) is increased. Indeed, the Autism Society’s chief marketing officer, Kristyn Roth, said to Palmer in part “having this autonomy [via Waymo] is something that builds confidence and it uplifts people.”
For its part, Waymo is committed to tackling these kinds of accessibility issues.
“What are the moments where the existing services and transportation office options are not working? Because we take those problems and we actually design product of future solutions around their specific problems, so that we’re not just assuming these are the problems that you faced,” said Orlee Smith, senior product manager at Waymo.
Waymo, and its underlying technologies, are not above reproach. They should both be scrutinized. Yet in a world where a lot of people are suspicious of artificial intelligence—and make no mistake, driverless cars are effectively AI-powered robots on wheels—the genuine good a company like Waymo can do for people gets pushed to the wayside under the guise of prioritizing safety and competence. The reality is, there are a lot of disabled people out there for whom conventional driving is impossible. For those folks, which includes yours truly, the advent of autonomous vehicles is nothing short of revolutionary. As the technical bits and bobs inevitably mature, the next step for mainstreaming driverless cars even further is advocating for individual ownership. That will be an even more monumental task because it entwines law and regulation with the societal view on disability—in this case, challenging the notion that Blind and low vision people can buy and “drive” their own driverless cars. Perhaps I’m overly pessimistic, but such a sea change would make climbing Mount Kilimanjaro look like scaling a molehill.
I’m not certain such a thing will happen in my lifetime, if ever.
Waymo’s Making Getting Around More Accessible to Blind, Low Vision Kids with New Teen Accounts
Jennifer Elias reported for CNBC this week Waymo has begun offering accounts to teenagers ages 14–17 in Phoenix. The decision reflects Waymo’s desire to “increase ridership amid a broader expansion of its ride-hailing service across US cities.”
“The Alphabet-owned company said that, beginning Tuesday, parents in Phoenix can use their Waymo accounts ‘to invite their teen into the program, pairing them together.’ Once their account is activated, teens can hail fully autonomous rides,” Elias wrote in describing Waymo’s ambitions for rider expansion. “Previously, users were required to be at least 18 years old to sign up for a Waymo account, but the age range expansion comes as the company seeks to increase ridership amid a broader expansion of its ride-hailing service across US cities. Alphabet has also been under pressure to monetize AI products amid increased competition and economic headwinds.”
Elias noted Waymo will provide so-called “specially-trained Rider Support agents” for teen riders, adding teens are able to share real-time updates of their trip status with parents. Their parents also receive the ride receipt(s). Another friend of mine, Bloomberg’s Natalie Lung, wrote on X Waymo’s teen accounts are limited to Phoenix for now because “California currently does not allow unaccompanied minors on AVs.”
Uber launched a similar service oriented towards teens in 2023, according to Elias.
Reading this story got me thinking wistfully about how Waymo—and particularly its new teen account feature—would’ve been so great during my high school years. I took a yellow school bus to and from school every day from the time I was a wee kindergartner until I graduated high school in 2000. The vehicles were the shorter, smaller buses generally used to transport disabled kids back and forth from school. I never minded it, as I made some great friends on those trips—my favorite driver, Shirley, is a longtime Facebook friend of mine now—but as I grew into driving age, had Waymo existed then, I would’ve felt like a bird of a feather amongst my flock of friends who had normal vision and thus could drive. With Waymo, I could’ve asked a friend to ride with me after school to visit a Tower Records or The Wherehouse. I could’ve had the very same agency and autonomy I enjoy today two decades earlier as I was traversing my formative years. This is why I chose to cover Elias’ report: as it stands, Blind and low vision kids—in Arizona’s capital city, anyway—have a tremendous opportunity before them to potentially “drive” a car and get around independently. Not only is the autonomous driving tech cool as hell, the heightened feelings of self-esteem and empowerment on the still-in-development brains of today’s teenagers make a helluva difference in their socio-emotional growth—especially if they cope with a disability.
Waymo’s teen accounts come not long after the company announced expansion to New York City, and as Tesla’s own robotaxi competitor seeks to add service here in the Bay Area.
White House Argues ASL Interpreters Unnecessary for Accessibility at Press Briefings, Report Says
Molly Reinmann reported for CNN last week US District Judge Amir Ali, a Biden appointee, “grappled for over an hour” with whether to force the Trump administration to provide American Sign Language (ASL) interpreters at White House press briefings.
A lawsuit was brought by the National Association of the Deaf (NAD). Reinmann writes the suit alleges the White House is violating deaf Americans’ rights under the Rehabilitation Act of 1973 by barring them “from accessing ‘critical information in real time.’” The attorney for the NAD, Ian Hoffman, subsequently argued Deaf and hard-of-hearing Americans are “deprived of their ability to participate in the democratic process.”
Biden’s briefings were staffed with ASL interpreters. The Justice Department ended the practice upon the transfer of power for President Trump’s second term, contending assistive technologies such as captioning and transcripts are sufficient. The NAD pushed back, saying—rightly so—that ASL and English are separate, distinct languages while emphasizing captioning oftentimes can prove “especially inaccessible to the many thousands of deaf persons fluent only in ASL,” according to Reinmann.
Relatedly, the NAD took umbrage over the first Trump administration’s lack of sign language interpretation during critical Covid-19 pressers that took place back in 2020.
Reinmann’s story, while newsworthy on merit alone, is especially appalling given the backdrop of July being Disability Pride Month and the Americans with Disabilities Act turning 35 on the 26th. The cretins representing the Justice Department argued the burden of proof is on the NAD to, as Reinmann said, “show that more thorough ASL translations were necessary,” with the government’s attorney repeating “her previous claim that the type of services provided should be at the discretion of the White House.” The Department of Justice is essentially paternalistically (and patronizingly) dictating accessibility—a move suggesting the able-bodied majority know best how to accommodate people with disabilities.
If Trump’s immigration policies are racist—they are—the inaccessibility is ableist.
Moreover, what rankles me most is the part in Reinmann’s lede when she writes Judge Ali “grappled” with his decision. I don’t blame her, but what anguish is there? You have a segment of the citizenry advocating for accessibility so as to be more informed. Disabled Americans, myself included, are Americans. We tune into White House news conferences. We read and watch CNN. We vote. That Judge Ali wrestled with legal gymnastics in issuing his ruling underscores the deeply entrenched societal ableism that, in so many respects, is a bedrock of how not only the country works, but the world too. Most people treat disability like a disease.
As a CODA, I can empathize with Reinmann’s story on so many levels. My parents watched the local news every night after dinner while I was growing up and, despite the presence of captioning, they would rely on me to translate what was going on in the world and explain its meaning. It was quite the ask of a disabled kid himself going into his pre-pubescent years and beyond, but it’s illustrative of the notion that captions, however vital in their own right, have but limited utility. Captions can go only so far. Likewise, transcripts are good, but have their problems because, again, English typically isn’t a Deaf person’s first language and thus comprehension is compromised.
Karoline Leavitt and team clearly don’t understand that—or, if they do, they don’t care.
Editors, this is why accessibility in tech so richly deserves its own beat in newsrooms.
Xbox Announces AGI Tags Availability, More
Microsoft-owned Xbox put out a blog post this week wherein it announced the Accessible Games Initiative (AGI) tags are available “across all digital Xbox experiences.” Team Xbox wrote the news is “just in time” for Disability Pride Month.
The news comes a few months after Xbox announced plans in March to join the AGI.
“[Xbox] is proud to announce that the new Accessible Games Initiative tags, designed to provide players with clear and consistent information about the accessibility features in video games, are now available across all digital Xbox experiences including console, PC, mobile, and web storefronts,” Xbox wrote in the post’s introduction.
Xbox’s contribution to the AGI, a consortium which includes fellow gaming industry heavyweights Nintendo of America, Electronic Arts, Ubisoft, and others, builds upon its own work, begun in 2021 with the release of the Xbox Game Accessibility Feature tags. Of note is Xbox’s explicit callout in the announcement that any Xbox accessibility tags which don’t have a correlating AGI tag will remain available on the platform, with the company saying the combination “will make it even easier for players with disabilities to learn about available accessibility features and find their next great game.”
Xbox’s post finishes with a Q&A-style interview with Brannon Zahand, senior technical program manager at Xbox, and content creator and gaming accessibility advocate Steve Saylor. The conversation “[discusses] the work towards greater standardization of accessibility in games, what these tags mean for players today, and why this work is important,” according to Xbox. Additionally, Xbox published another blog post featuring an interview with Phil Crabtree from Kaizen Game Works. Microsoft Game Dev contributing editor Miguel Lopez writes his conversation with Crabtree delves into “how integrating Accessible Games Initiative tags has supported their development practices, highlights the community’s enthusiastic reception, and explores how accessibility tools and standards can further transform the gaming industry.”
I posted an interview with Entertainment Software Association senior vice president Aubrey Quinn back in early April. The executive told me all about her organization’s stewardship of the AGI and the need for tags, as well as how the group came to exist.
“Before the Accessible Games Initiative, the burden was on players to interpret existing tags in the marketplace from platform to platform, game to game. We hope to change that,” Quinn said of the driving force behind the AGI’s formation. “This new initiative is meant to help consumers identify specific accessibility features in individual video games, so that players buying games can make better informed purchasing decisions. Having a clear approach to identify accessibility features across different games, with criteria behind each accessibility tag, will provide consumers with information about the accessibility features they can find in games prior to purchasing them.”
How Amazon Is Made ‘Accessible for Everyone’
Amazon last week published a blog post on its European site in which the Seattle-based company detailed myriad ways it makes Amazon “accessible for everyone.” The post, bylined by the About Amazon Team, was shared coincident with the recent enforcement of the European Accessibility Act. Amazon said it is “well prepared” for the legislation.
“Amazon’s vision is to be Earth’s most customer-centric company, which means making our devices and services accessible to everyone. We have pioneered accessibility features across our products and services for over a decade and our commitment to accessibility is deeply rooted in our customer-obsessed culture. For us, this is more than meeting requirements: it’s about staying true to our mission of serving every customer,” Amazon wrote of its North Star in the post’s introduction. “We design with inclusivity at the forefront, making our products and services accessible to all customers. By integrating new technologies, such as AI, we are able to create solutions that enhance the experience for all customers, including those with specific accessibility needs. This shows that when accessibility is treated not as an afterthought but as a core design principle, technology can truly become a force for inclusion.”
The post, the scope of which is familiar, offers a cursory rundown of the accessibility attributes of Amazon-branded products—including Alexa, Fire TV, Kindle, Prime Video, and more. In addition, the company highlights how it has made the general shopping experience more inclusive, which involves “more than just a well-structured webpage.” The company mentions features such as Navigation Assistant and Product Summary, both of which are characterized as helping keyboard-oriented and/or screen reader users “shop more efficiently, showing that accessibility and convenience go hand-in-hand.” Likewise, Amazon Locker locations, where people go to a physical place to retrieve their order(s), feature a Lower Locker Slot selection for wheelchair users, as well as an audio-based UI (with audio jack) for the Blind and low vision.
“More than 100 million people have disabilities in Europe,” Peter Korn, director of accessibility for Amazon devices, said in a statement. “What I love most about this work is how it embodies one of Amazon’s core principles: Customer Obsession. Building products customers love means including them in the process, not only by talking directly with customers with accessibility needs, including the elderly or with disabilities. Our culture of inclusion is also reflected in the many people with disabilities who work throughout Amazon in diverse roles—including our product teams.”
I covered the redesigned Prime Video app last summer; my report included interviews with executives Raf Soltanovich and Kam Keshmiri. And I last spoke with Korn last year about, amongst other things, using artificial intelligence to power search on Fire TV.
For more on the European Accessibility Act, check out my story earlier this week on it.
Gemini App Gets Support for Google Broadcasts
Abner Li reports today for 9to5Google the Gemini app is gaining the ability to broadcast voice messages to Nest speakers and smart displays integrated with Google Home.
Li notes the aptly-named Broadcasts feature was previewed by Google last month and, perhaps most notably, comes ahead of Gemini supplanting Google Assistant on Android devices. The change will happen sometime “later this year,” according to Li.
Broadcasts can be sent to specific devices or rooms, or to the entire household.
While certainly a feature of convenience—nobody wants to run around their house telling the family dinner is ready, Paul Revere-style—the reality is, as ever, Broadcasts can prove beneficial for accessibility too. Especially for those with limited mobility, or someone who’s immobile altogether, trying to share a message (or get assistance) can be arduous if people are spread across other parts of the house. For example, someone might need to alert their caregiver, who may be in the kitchen, that they need their medications brought along with their food. Likewise, someone in a wheelchair may not be able to move about to every part of the house to tell everyone a meal is ready, so Broadcasts makes relaying the message a more accessible task. Google positions Broadcasts as an amenity that makes life easier and nicer when, in truth, the feature has serious applicability as a lifesaving de facto accessibility feature for many people.
Google’s Broadcasts is similar to Apple’s Intercom on HomePods. Apple’s implementation is effectively identical, tailored to HomePod and HomePod mini, but comes with an additional perk: Intercom can transcribe messages. Transcripts show up on people’s iPhone or Apple Watch, for instance—a cool and useful feature for those in mixed households where, say, hearing and Deaf people cohabitate. That Intercom offers transcripts makes the feature more inclusionary when it ostensibly would be exclusionary, much like how—as I’ve proffered often lately—Music Haptics and transcripts break barriers for music and podcast listening in the Apple ecosystem.
Apple, Columbia University Researchers Discuss Accessibility-Minded ‘SceneScout’ Project
Marcus Mendes reports for 9to5Mac on a project from Apple and Columbia University aimed at using artificial intelligence to make navigation and street-crossing more accessible to the Blind and low vision. The prototype, called SceneScout, is described on Apple’s Machine Learning Research blog as “a multimodal large language model (MLLM)-driven AI agent that enables accessible interactions with street view imagery.”
The paper was written by Gaurav Jain, Leah Findlater, and Cole Gleason.
“People who are blind or have low vision (BLV) may hesitate to travel independently in unfamiliar environments due to uncertainty about the physical landscape,” the trio wrote in the paper’s introduction. “While most tools focus on in-situ navigation, those exploring pre-travel assistance typically provide only landmarks and turn-by-turn instructions, lacking detailed visual context. Street view imagery, which contains rich visual information and has the potential to reveal numerous environmental details, remains inaccessible to BLV people.”
SceneScout relies upon Apple Maps APIs alongside the aforementioned LLM to “provide interactive, AI-generated descriptions of street view images,” Mendes wrote.
As the researchers explain, SceneScout supports two modes: Route Preview and Virtual Exploration. The former is intended to “[enable] users to familiarize themselves with visual details along a route” while the latter is meant to “[enable] free movement within street view imagery.” Mendes notes SceneScout also features “a GPT-4o-based agent within real-world map data and panoramic images from Apple Maps.”
According to Mendes, the SceneScout study, comprising 10 Blind and low vision people, found both SceneScout modes highly lauded by participants. The more open-ended Virtual Exploration mode in particular was praised for providing information “they would normally have to ask others about.” The study’s participants all were well-versed in using screen readers and all worked in the tech industry, Mendes wrote.
As ever, however, there were as many shortcomings as there were advances.
“A technical evaluation [of SceneScout] shows that most descriptions are accurate (72%) and describe stable visual elements (95%) even in older imagery, though occasional subtle and plausible errors make them difficult to verify without sight,” the researchers said of the problems that cropped up. “We discuss future opportunities and challenges of using street view imagery to enhance navigation experiences.”
At 30,000 feet, the SceneScout project is encouraging because, dangerous hallucinations aside, it does prove further potential of artificial intelligence as an assistive technology. As SceneScout is iterated and refined on, it’s plausible the technology could be incorporated somewhere else so as to be available to a more “mainstream” contingent of Blind users such as myself. If SceneScout someday is able to enable fuller agency and autonomy in travel for the Blind and low vision community, then the tool will have reached self-actualization in a way that would make Maslow proud. Put another way, SceneScout theoretically could someday be as impactful to the Blind and low vision community for foot travel as Waymo’s autonomous vehicles are today for driving distances. While SceneScout and Waymo diverge in methodology, they undoubtedly converge on a common goal: greater accessibility for disabled people.
It’s also worth mentioning SceneScout’s scope is similar to that of Apple Design Award winner Oko, as well as NYU’s Commute Booster app for navigating New York City’s subway system. Both pieces of software leverage AI to varying degrees in order to make travel (and transit) more accessible to Blind and low vision people. In a nutshell, the Commute Booster app is designed to rectify the myriad issues inherent to the so-called “middle mile”—the oftentimes treacherous part of one’s journey between departure and destination, which can be really tricky for Blind people to navigate successfully.
Overcast Update Adds Double Tap on Apple Watch
Over the weekend, I came across this Mastodon post in which it was announced popular indie podcast client Overcast, version 2025.6, received a “minor update” that includes not only the boilerplate bug fixes and performance improvements, but also one accessibility enhancement of particular note: Double Tap on Apple Watch. The feature, which debuted with Apple Watch Series 9 and Apple Watch Ultra 2 in 2023, is available in the watchOS app to toggle Play/Pause. Overcast is made by Marco Arment.
The update is described by Arment as “not an exciting one, but good nonetheless!”
The update grabbed my attention for a few reasons. For one thing, Double Tap is, like the iPadOS pointer feature which was redesigned in iPadOS 26 such that the pointer is a Mac-like arrow rather than a circle, one feature whose origins trace back to AssistiveTouch. As with the pointer, it’s my understanding from sources the Double Tap accessibility feature—designed to help those with limited mobility, motor-wise—was handed off to the broader software engineering team within Apple Park to massage it into something with more mainstream sensibility. That Overcast users can now use Double Tap makes playback controls more accessible not only if one’s hands are full, but also if tapping the Watch’s screen would otherwise be burdensome or untenable altogether. For another reason, the advent of Double Tap in Overcast’s Watch app is yet another strong show of solidarity with the disability community by Arment. He’s been a staunch ally of disabled people dating back to his days as Instapaper’s keeper, so this latest Overcast update only strengthens his resolve in this regard. I interviewed Arment back in 2018 for a story for MacStories’ coverage of the App Store turning 10. He told me over email in part he considers working on accessibility “a peer to other aspects of my design and structure, such as the colors I choose or how interfaces are laid out.” Lastly, Arment has spoken positively of Apple’s new Speech APIs on ATP, and his enthusiasm portends well for Overcast getting transcripts sometime in the not-too-distant future.
Overcast’s 2025.6 update is available now on the App Store.
Entrust Product Design Chief Mark Opland Talks the European Accessibility Act, More in Interview
A couple of weeks ago, a landmark law spearheaded by the European Commission went into effect. It’s called the European Accessibility Act (EAA), and it mandates digital goods be accessible to people with disabilities. June 28, 2025 was the cutoff date for member states in the European Union to reach compliance, but the law was officially passed back in 2019. The deadline was imposed so as to synchronize digital accessibility requirements across the European Union. Companies whose products shipped to customers prior to June 28 have a grace period of five years to comply with the EAA. Hardware has double the time, with 10 years’ grace.
A cogent primer on the EAA was posted in late January on the AccessibleEU website.
One company that’s been thinking a lot about the EAA is Entrust. A digital identity firm based in the United Kingdom, Entrust offers a suite of products and services which cater to financial institutions like banks. Entrust builds everything from ID verification technology to encryption tech and more. In an interview conducted last week via videoconference, Entrust’s vice president of product design, Mark Opland, encapsulated his company’s scope as “[offering] an enormous amount of products and services to financial institutions, but really centered around identity and security.”
When asked about why accessibility matters to him and his team, Opland explained it has been “a huge part of the way we’ve built products for a long time,” adding accessibility has been personally pertinent for the better part of 15 years. Raising awareness for accessibility, he told me, not only aligns with his value system but also helps Entrust “deliver more successful products and services into the industry.” Accessibility, Opland went on to say, isn’t viewed as a “constraint” for Entrust; rather, the company views it as an opportunity to innovate and thus better build its business. Accessibility can be, and often proves to be, “an enabler of innovation” for Entrust, Opland said.
“If we fundamentally approach design problems and product problems thinking about the largest possible user base in mind, we ultimately build products that are more successful,” he said of Entrust’s philosophy on prioritizing accessibility in its work.
As to the EAA, Opland said the legislation is a directive aimed at “[making] a wide range of products and services more accessible to people with disabilities,” adding the European Union considered the things people used day-to-day in an effort to contribute to the betterment of society and wanted to find a way to “[encourage] greater inclusion and [break] down the barriers across the European Union for all people.” The EAA, he continued, touches myriad industries and, as such, while compliance with the EAA is compulsory, the byproduct of it is what Opland characterized as enabling businesses to “tap into a much larger customer base.” He pointed to a large bank in the United Kingdom that reported its total addressable market increased by more than 10% when it built products with accessibility in mind. “For the European Accessibility Act and the European Union, it’s not only about providing access, but about building their GDP and increasing the [gross domestic product] for all their member states,” Opland said.
For Entrust’s part, Opland made crystal clear his main job as it relates to the EAA is to ensure the company enters into, and then maintains, compliance with the law. Entrust must follow the law’s legal structure and, more pointedly, “we can’t be building and shipping products anymore that are not accessible.” Opland was forthright in telling me he cares not about being the “accessibility police,” going around to people’s offices internally to enforce compliance with the EAA. Instead, he told me the company has spent lots of time leading up to last month’s deadline auditing and doing remediation. Moreover, Entrust has focused its energies on prioritizing advocacy and evangelism with the goal of what Opland said is “building a culture of continuous improvement.”
“Our goal is to make sure every team at Entrust, whether it’s Human Resources or an engineering team, is focused on making sure they’re better this quarter than they were the last quarter and better next quarter than they were this quarter,” Opland said. “That advocacy has us out of the cycle of managing accessibility from audit to audit, and seeing the job is being done when we earn our accessibility accreditation. This focus on continuous improvement means it’s top of mind for everyone in the company and has now become part of our DNA… that’s been the secret to our success over time.”
Opland acknowledged coming into compliance with any sort of law has its challenges, but in the context of the EAA, the economic and social benefits can make the headaches worth it. Especially from a social justice perspective, he said “it’s been fantastic” to work with the law at Entrust because it aligns with both his personal and the company’s institutional values. The main theme that threaded my conversation with Opland is that greater accessibility vis-a-vis the EAA is two-pronged: it benefits people, obviously, but it also benefits businesses. The more people a business markets to, the bigger its bottom line can become. Understanding those principles takes education, and Opland told me it can be challenging unto itself to teach people how to make accessibility happen. The EAA, as a law, and accessibility standards like WCAG aren’t necessarily congruent with one another; Opland said they “aren’t always black and white… in some places they’re gray.” Entrust understands “there’s always a tradeoff between usability and security,” according to Opland, which isn’t always a question with a black-and-white answer.
“What we’ve discovered is the more you have those conversations, the more you dig in and the more you learn, the stronger and more resilient you become,” Opland said of the company’s learnings. “Accessibility is a unique challenge in that there is often quite a lot of subjectivity and just a huge spectrum in human ability. There isn’t just sort of a one-size-fits-all solution that’s going to allow me to wave a magic wand to make everything accessible. I think it’s just a constant cycle of learning and improving.”
Entrust has been working on accessibility for close to a decade (nine years) now. This gave Opland and team a lot of runway in terms of comfort and confidence as the looming EAA deadline approached. The work on compliance, he told me, had been an 18-month effort into understanding EN 301 549—which is linked to the EAA—WCAG, and the EAA itself. Companies like Entrust that are generally concerned with the aforementioned WCAG standards, Opland told me, are “in a really good position to be compliant with the law with the exception of a few slightly more specific directives.”
“If you’ve been focused on WCAG, you set yourself up really well,” Opland said. “We’ve had a pretty big head start and have been positioned pretty well to be compliant.”
Opland is optimistic the EAA will help make accessibility more top of mind and more present in products. The European Union, he told me, has set a standard because the EAA applies to anybody who wants to do business in the Union, so products and websites must meet the new regulation. In the United States, Opland pointed to the Americans with Disabilities Act as greatly improving the quality of life for the disability community, but conceded there is more work yet to be done. He hopes businesses everywhere “continue to invest” in accessibility for the people—and for their business.
As to the future, Opland and Entrust are committed to walking the righteous path.
“Our hope is we are continuing to build products and services that enable more and more people to [interact] with their communities, to [interact] with the businesses around them, [and] to have more opportunities and greater advantages in the lives they lead,” he said of Entrust’s view of its future work. “There’s something really meaningful and deep in doing so. Identity is such a great vehicle to help advance underrepresented folks in all stations of life, and accessibility is one important aspect of that. If you track back to Entrust’s mission, it perfectly aligns with our mission. It perfectly aligns with our growth as a business. We were working on [accessibility] long before the law mandated we do it. We’ll continue to invest in accessibility, whether [or not] the law continues to mandate it, so it just aligns perfectly with our mission, with our values, and with our business.”
How Type to Siri Trumps Courtesy and Convenience
MacRumors contributor Aaron Perris posted on X today Apple has started airing a new Apple Intelligence ad which highlights Type to Siri and ChatGPT. The 15-second spot, which also features iPhone 16 Pro on T-Mobile’s network, takes place in a workplace elevator. As of this writing, the video isn’t (yet?) on Apple’s official YouTube channel.
I wouldn’t typically cover the advent of a new Apple commercial, but this particular one merits an exception. During the Apple Intelligence portion of last year’s WWDC keynote, senior vice president of software engineering Craig Federighi talked up Type to Siri as a feature of convenience: it’s a mode by which people can quietly interact with the virtual assistant so as not to be disruptive of others. The reality is Type to Siri is not an all-new feature; it’s existed as an accessibility feature on iPhone and iPad since iOS 12 in 2018.
This context matters greatly in the grand scheme. It is extremely noteworthy that Apple “graduated” what was once an ostensibly esoteric, niche assistive technology and expanded upon it so as to become more mainstream. Despite Federighi’s message to the masses that Type to Siri is about courtesy and convenience, the truth is the feature’s benefits for accessibility remain bountiful. Yes, courtesy and convenience are important factors, but Type to Siri is a great feature whereby a Deaf or hard-of-hearing person or, in my case, someone with a speech delay can interact with Siri with complete fidelity without the voice component. That isn’t at all trivial or ancillary to Apple’s core messaging. The overarching point is Type to Siri illustrates yet again that accessibility oftentimes is an incubator for innovation—it’s something Apple rarely, if ever, gets lauded for by those who comprise the mainstream technology commentariat.
As I alluded to in the previous paragraph, Type to Siri stands not alone. The pointer feature in iPadOS began life as an AssistiveTouch feature, one which, as Apple’s Sarah Herrlinger told me years ago, “isn’t your typical cursor.” My understanding has long been the company’s Accessibility team handed off the AssistiveTouch feature to the broader iPadOS software group so they could massage it into something meant for more mass adoption. Likewise, the Double Tap feature on Apple Watch germinated as an AssistiveTouch feature in watchOS and was then similarly made over for broader applications. Many popularized modern technologies—audiobooks, speech-to-text, et al.—were invented by disabled people for their unique needs, then adopted by the able-bodied masses for their enjoyment. As Dr. Victor Pineda told me last year, the disability community is chock-full of technologists out of sheer necessity. Technology makes the world more accessible to people like Dr. Pineda (and yours truly). Last December, Apple used its precious holiday ad space to highlight the hearing aid feature on AirPods Pro. My understanding is the ad, called “Heartstrings,” was the first time the company used an accessibility feature in the holiday campaign—and for good reason. It shows the profundity of assistive technologies truly being for everyone, with earbuds everyone uses every day. It’s a rare example of people being able to have their cake and eat it too.
So yeah, Type to Siri is highly significant—especially so, again, in a TV commercial.
Max Says ‘Sinners’ Will Stream in Black American Sign Language in ‘Groundbreaking’ First
In a press release published on Monday, popular video streamer Max announced the Ryan Coogler-helmed movie Sinners will stream in Black American Sign Language (BASL) alongside the original version in the United States on Independence Day this Friday, July 4th. Max touts the release is “groundbreaking” while noting it “marks the first time a streaming platform will exclusively debut a film interpreted in BASL.”
The BASL interpreting of Sinners will be done by Nakia Smith. A trailer is on YouTube.
“The release of Sinners with BASL is a major step forward in accessibility, representation, and visibility in streaming. BASL is a distinct dialect of American Sign Language (ASL) with its own dynamic history and unique grammar, signing space, rhythm, facial expressions, and cultural nuances,” Max said in its announcement. “For the first time, the Black Deaf community will have streaming access to a more immersive experience in their language. Max subscribers, who sign in ASL but are unfamiliar with this dialect, will also be able to follow along with this interpretation.”
Max, like its peers in Apple and Netflix, reaffirmed its support of disabled people.
“Accessibility within streaming is not a one-size-fits-all approach. Our goal at Max is to make these great stories accessible to all audiences in a way that is authentic to the content and the communities we serve,” Naomi Waibel, senior vice president of global product management at Warner Bros. Discovery, said in a statement included in Max’s press release. “Sinners with Black American Sign Language is an example of how culturally nuanced access can enrich the viewing experience for our audiences.”
Today’s news comes after Max announced a similar initiative this past March to stream The Last of Us in ASL. Elsewhere, the National Hockey League, which has a TV deal with Warner Bros. Discovery-owned TNT, has aired “NHL × ASL” broadcasts in partnership with PXP. They’ve proven so successful with fans that the league earned Sports Emmy nominations for its work in furthering accessibility and inclusivity.
Max announced in May it’ll mercifully rebrand itself (back) to HBO Max “this summer.”
Accessibility Amplifies Apple Music’s First Decade
Apple on Monday issued a press release in which the company celebrates Apple Music’s 10th birthday by sharing some big announcements about the service. The headliner is a new three-story campus based in Culver City, which Apple says spans more than 15,000 square feet and houses two radio studios and a 4,000-square-foot soundstage. The new campus is designed to “anchor a global network of creative hubs” in other cities such as Berlin, Nashville, New York, Paris, and Tokyo, according to Apple.
“Apple Music Radio has always been a home for storytelling and artistry, serving as a space for bold conversations and surprising moments,” Rachel Newman, co-head of Apple Music, said in a statement included with today’s announcement. “With this new studio, we are furthering our commitment to creating a space for artists to create, connect, and share their vision.”
Amongst the other news is the advent of a playlist Apple calls Replay All Time, which the company describes as “a special version of the annual Replay experience that allows listeners to see and stream the songs they’ve played the most since joining Apple Music.” Replay All Time can be found in the Home tab in the Music app, Apple said.
As with Apple Podcasts, Apple and music have been constant in my everyday digital life since getting my first-ever Apple product, the original iPhone, in 2007. Until Apple Music debuted in 2015, I spent a lot of money buying songs and albums in iTunes and synced my music via cable to myriad iPods and iPhones of various vintage. Those purchases remain available to me today, of course, along with the streaming content Apple Music provides. From an accessibility standpoint, the “all-you-can-eat” model of streaming Apple Music is great because I no longer need to fiddle with a physical cable to sync data, which involves somewhat tortuous tests of my lackluster hand-eye coordination. It’s easier on my wallet too, since I needn’t budget money on individual albums from my favorite artists. Likewise, that the iPhone subsumed the iPod’s functionality—the Music app in what was then known as iPhone OS 1 was literally named iPod—means I have an “all-in-one” device and needn’t carry a separate iPod along with a cell phone, a setup I contemplated prior to getting the iPhone. It’s more accessible for me to carry one small object than two (admittedly small) objects, especially when you have relatively compromised muscle tone in your hands and less strength overall to accommodate weightiness. Moreover, from a software standpoint, it’s meaningful how the Music app supports accessibility features such as VoiceOver, Dynamic Type, and most recently, Music Haptics. It makes the listening experience more accessible—and thus more enjoyable—that I can follow along to the words in a song in the app’s Lyrics View on the Now Playing screen, for example. And once again, with the nature of streaming and technologies like iCloud sync, I can move from my iPhone to my iMac and beyond and have all my music ready when I’m ready to listen.
I look forward to using Apple Music into 2035 and beyond.
Google Adds Captions to Gemini Live Conversations
Abner Li reported for 9to5Google earlier this week Google has added support for captions to Gemini Live in the company’s eponymously named Gemini app on iOS and Android. The captions began appearing for some users earlier in June, according to Li.
“When you launch Gemini Live on Android or iOS, a rectangular captions button appears in the top-right corner. Tapping will enable a floating box that provides a transcript of Gemini’s responses. (This does not show what you’re saying in real-time, but that remains available in the full text transcript after ending the conversation.),” Li wrote. “It appears near the middle of the fullscreen interface in audio mode, and at the top when video streaming is enabled. These three lines of text cannot be moved or resized. In Gemini > Settings, there’s a new ‘Caption preferences’ item underneath the ‘Interrupt Live responses’ on/off toggle that links to system settings on Android.”
The big takeaway is, obviously, conversations with Gemini will be more accessible.
As I’ve noted before, I have the Gemini app for iOS on my iPhone’s Home Screen, as well as a widget on the Lock Screen. I really enjoy Gemini as my preferred generative AI tool, and have found it has supplanted much, if not most, of my web searches via Safari. I find Gemini to be a way more accessible (and digestible) method to get quick bursts of information collated in one place rather than manage a half-dozen browser tabs. “Trust but verify” goes the axiom, of course, so I’m well aware Gemini will (and does!) hallucinate from time to time, but I’ve been more than satisfied with its performance overall. I have a ChatGPT Plus subscription too, since notably Apple Intelligence integrates with it, but I generally like the Gemini app experience better. Perhaps that’ll change once the Jony Ive-Sam Altman partnership bears more fruit, but for now, I’m a happy Gemini user. Despite the rapidity with which AI seemingly advances nowadays, the reality is the technology is still really early in the proverbial ballgame. That Google—and OpenAI, for that matter—is clearly committing to making its respective tools accessible is a ray of hope for the inclusiveness of AI’s ever-burgeoning capabilities.
Google’s Successor to the Nest × Yale Lock Arrives
Ben Schoon reports for 9to5Google the first-ever “Google Home Preferred” product is here and it’s the successor to the Nest × Yale lock: the $189 Yale Smart Lock with Matter.
The Matter moniker is an important detail, as it means the Yale Smart Lock can be used with Google Home, Alexa—and, pertinent to my preferred ecosystem, Apple’s HomeKit.
“This lock is very much designed to fit into Google’s ecosystem and acts as a successor to the Nest × Yale Lock,” Schoon said of Yale’s newest smart lock. “The design of the lock is meant to match the finish of Google’s Nest Doorbell lineup, with ‘Snow’ and ‘Matte Black’ finishes available today and an ‘Ash’ colorway coming later on. The accents on each are meant to match common door hardware finishes. As a backup to your app or a keycode, there’s a keyhole which was missing on the Nest × Yale Lock.”
I’m writing about this because (a) Curb Cuts is my website; and (b) I’ve been using the legacy Nest × Yale lock for a few years now. It still works with aplomb, but admittedly part of my allegiance to sticking with it is due to the fact I’m simultaneously clutching to the OG Nest app on iOS and iPadOS for dear life. I do have Google Home on my devices too, but the UI, design-wise, is inferior to that of the old Nest app. Someday the Nest app will be put out to pasture and I’ll begrudgingly have to adopt Google Home. But today is not that day, so I’ll be riding with the Nest app until the absolute very end.
Speaking of an end, this week’s news from Schoon on the new Yale Smart Lock means damn near every device in my smart home setup—all devices running through HomeKit via the Starling Home Hub—is, while remaining perfectly serviceable in a functional sense, otherwise “antiquated” and summarily discontinued technologically:
Nest Hello doorbell
Nest E thermostat (with accompanying room sensors)
Nest Protect smoke and carbon monoxide detector
Nest Outdoor Cams
Nest × Yale door lock
The Nest × Yale lock in particular has been a game-changer for me in terms of accessibility. It only controls the deadbolt, however, as my partner still prefers a physical key for the actual doorknob. Nonetheless, not having to fiddle with the key on both locks is far more accessible; my lackluster hand-eye coordination makes it such that it can be hard to find the keyhole, insert the key, and turn. Especially when coming home with, say, a bag of groceries, that I can use my iPhone—or better yet, Siri—to unlock the door transcends sheer convenience. It makes my house more accessible.
As to a potential upgrade, I’m intrigued by another Yale product: the Assure Lock 2 Plus. On its website, the company describes it as “the smart lock made for Apple users” as it supports Apple’s Home Key feature. Released with iOS 15 in 2021, Home Key lets users use their iPhone or Apple Watch as their “house keys” by integrating with the Wallet app on iOS and watchOS. The reason I’m so fascinated by Home Key is, of course, accessibility; instead of tapping a button, I could simply hold my device close to the Assure Lock 2 Plus and the door would unlock. This is exactly why I adore the Auto Unlock feature in Waymo, whereby the car doors unlock as you approach. The Waymo One app does have an Unlock button, but it’s far more accessible to not have to tap it.
For the foreseeable future, though, I’ll be clinging to my OG Nest × Yale lock.
My Bit Part in Apple Podcasts’ Two-Decade Story
Apple on Thursday celebrated a big anniversary: Apple Podcasts has turned 20! To mark the milestone, the company released a staff-selected list of “20 podcasts we love.”
“Since the medium came to iTunes in 2005, our team has dedicated countless hours to helping people discover new shows. To celebrate 20 years, here are 20 favorites that best exemplify how far podcasting has come—and where it can go in the next two decades,” Apple writes in the list’s introduction. “This list is a love letter to the podcasts that left a lasting impact on us and the ones we continue to recommend again and again. They are shows with hosts that feel like friends, and shows that make us press play immediately on the latest episode to hear what happens next. These shows have measurably improved our lives and helped define this medium we know and love.”
Of those on Apple’s list, only The Daily (started in 2017) is one I listen to religiously.
Launched in 2005, Apple Podcasts predates my usage by a couple years; my first-ever Apple product was the original iPhone two years later. Over the last 18 years, however, podcasts have remained a constant in my digital life. I love them for the background noise they provide as I work on stories like this very piece, for the ways newsy shows like the aforementioned The Daily keeps me informed, and for the ways they let me indulge in nerdery on shows from friends of mine in the Apple/tech media communities. Once upon a time, I even had a podcast of my own called Accessible. The show’s website/listing is long gone from the web, but it was a fortnightly program I co-hosted with my close friend Timothy Buck during which we discussed all things accessibility in tech. We had a good run, even interviewing Apple’s accessibility boss in Sarah Herrlinger in person at one San Jose-based WWDC. I’m decidedly not an active podcaster nowadays, but have guested on my share of shows since Accessible unceremoniously ended. I do think about getting back into the game from time to time, but for now, I think I’ll focus my energies into getting Curb Cuts featured in Apple News.
(If you’d like me on your podcast to talk disability inclusion and the like, get in touch.)
From 30,000 feet, Apple has generally been a strong steward of its Podcasts platform. From an accessibility perspective, it’s certainly damn notable how the company has invested time and resources in making podcasts more accessible through transcripts. As with Music Haptics in Apple Music, it’s not at all trivial that, as I’ve espoused many times recently, Apple is taking an ostensibly exclusionary medium to, say, Deaf and hard-of-hearing people and making it eminently more inclusive vis-a-vis transcripts.
Apple’s “20 Podcasts” list follows a “100 Best Albums” list shared on Apple Music.
Thoughts On AirPods, Cameras, and Accessibility
Ryan Christoffel reports for 9to5Mac this week a new feature in iOS 26 is a harbinger of camera-equipped AirPods Pro. Among the enhancements coming to AirPods this year is a Camera Remote feature. The functionality is similar to that on Apple Watch, whereby users can use the Watch as a shutter button; as Christoffel writes, users can “use AirPods to capture photos and video on your iPhone in situations that’s helpful… either by pressing once on the AirPods stem or pressing and holding—your choice.”
Christoffel goes on to say the Camera Remote feature on AirPods is notable because rumors suggest the next generation AirPods Pro are, again, said to include cameras. A release timeframe is unknown outside of Apple Park, but Bloomberg’s Mark Gurman has reported the refreshed earbuds feature “external cameras and artificial intelligence to understand the outside world and provide information to the user.” That iOS 26 does include the Camera Remote feature is evidence Apple is getting its proverbial ducks in a row by ensuring its software can support its forthcoming hardware whenever it comes.
There’s much to extrapolate from Christoffel’s informed speculation, not the least of which is how intrepid observers (and spelunkers) oftentimes will notice Apple setting the stage in advance of grand reveals. The size class APIs introduced at WWDC 2014 presaged the iPhone 6 and 6 Plus announcement a few months later. HealthKit, also new in 2014, was destined for Apple Watch. At WWDC 2015, iPad multitasking by way of Slide Over and Split View were announced prior to the iPad Pro’s unveiling. And in more recent times, the advent of ARKit in 2017 seemingly provided clues Apple was working on something using augmented reality; that hunch proved prescient as the company eventually revealed Vision Pro at WWDC 2023. It could even be persuasively argued the all-new Liquid Glass design language, ostensibly meant for today’s devices, was created with an eye towards still-in-development AR-powered glasses for tomorrow.
In an accessibility context, the Camera Remote feature on AirPods strikes me as fascinating. To wit, it’s entirely plausible someone who wears AirPods also doesn’t have use of their arms/hands, whether wholly or in part. While it’s possible today for said person to use voice to control playback and the like, having Camera Remote on AirPods would give the person I mention another avenue through which to accessibly take pictures. Likewise, that Gurman said Apple reportedly plans to imbue these future AirPods Pro with AI chops such that people could better “understand the outside world” means, for example, navigation could become easier, as might people recognition. Imagine Siri telling a Blind or low vision person, Her-style, that the person approaching them to say hello is their brother or sister or someone else in their digital rolodex. There are other possible applications, of course, but these examples are intriguing because they potentially can make the world literally more accessible to disabled people. It’s cool tech, but also genuinely useful—and empowering to boot.
Such a sentiment is a common refrain with AirPods Pro lately.
Waymo Is Making Its Way to New York City
Andrew Romero reported late last week for 9to5Google that Waymo has teased New York City as the next place it plans to “plant autonomous ride-sharing.” As Romero caveats, however, the proverbial seeds are going to take some time to bear fruit.
According to Romero, the Alphabet-owned Waymo took to X last week to share it has officially initiated the process to get its autonomous vehicles running on the streets of New York City. The company announced it has applied for a permit with the city’s Department of Transportation as a first step, noting a “specialist” from the agency will be sitting behind the wheel of Waymo’s Jaguar SUVs. Waymo also noted it’s working with state lawmakers to amend legislation so as to legalize fully autonomous vehicles.
“We want to serve more people in more places, including New York,” Waymo said.
The NYC news comes after Waymo announced expansion to Washington DC in March.
I’ll have more Waymo news on Curb Cuts in the coming weeks, but for now, any news of expansion is great for accessibility’s sake. Last week, I took part in a panel discussion at the Accessible Futures conference during which I spoke of the immense accessibility gains rideshare services have for Blind and low vision people such as myself. Of course Lyft and Uber have relevance here, but I spoke most enthusiastically of Waymo and the positive effects it has on my life. (Video and a transcript will be posted soon, I’m told.) As Waymo attests, there’s a helluva lot of bureaucratic stuff to tend to first, but any expansion news is heartening because it means (a) Waymo is doing well, business-wise; and (b) arguably more importantly, it means greater agency and autonomy in transit for myself and others like me across the country. The more places Waymo sets up shop, the more places we non-drivers can go—with greater independence, no less.
The Waymo-to-NYC news is joined by Tesla launching its robotaxi service in Austin.
Amazon’s Recent Kindle Software Update Adds More Line, Text Spacing Options for Accessibility
Andrew Liszewski reports for The Verge today on Amazon releasing a Kindle software update which, according to the release notes, includes the boilerplate performance enhancements and bug fixes, but notably also brings upgrades for better visual accessibility. He writes the improvements come by way of “adjusting text and line spacing, improving legibility and accessibility for many users.” The new 5.18.3 update is supported by the Kindle Scribe, Kindle Colorsoft, as well as the 11th- and 12th-generation Kindle and Kindle Paperwhite models, according to Liszewski.
“Amazon is slowly rolling it out through the Kindle’s automatic updates system,” Liszewski said of the company’s recently released upgrade. “[If] you don’t want to wait, you can download the specific update file for your e-reader, copy it over to your device, and perform a manual update using the instructions Amazon has provided.”
I have a Paperwhite from 2018 and it remains a nice piece of kit, although I haven’t used it in quite some time. While I find e-ink displays to be generally accessible and easy on my eyes, I’ve actually come to favor using Apple Books on an iPad for reading books. The brightness and sharpness of the display—especially on that of the OLED screen on the M4 iPad Pro—is far nicer to look at and even more accessible. That said, I’m deeply intrigued about the aforementioned Kindle Colorsoft from Amazon; I’d love to try it out someday and then subsequently write about my experience using a color e-ink screen.
News of the Kindle 5.18.3 update was first reported by The eBook Reader.