Apple Honors Speechify, Art of Fauna for Accessibility at This Year’s Apple Design Awards
At WWDC 2025 this week, Apple recognized this year’s Apple Design Award (ADA) winners. The Cupertino-based company celebrated a dozen apps and games for the honor, with one app and one game winning in each of six categories: Delight and Fun, Innovation, Interaction, Inclusivity, Social Impact, and Visuals and Graphics.
Pertinent to accessibility is, of course, the Inclusivity category. The winners here are Speechify and Art of Fauna. Speechify transforms written text into audio, with support for hundreds of voices and over 50 languages. The app also features robust support for iOS accessibility stalwarts like Dynamic Type and VoiceOver, and is an indispensable tool for anyone who copes with conditions such as ADHD, dyslexia, and/or low vision. As for Art of Fauna, made by Austria-based Klemens Strasser, the game is a conservation-themed puzzle title which incorporates wildlife imagery. It tasks players with rearranging visual elements or pieces of descriptive text, and supports accessibility by way of rich screen reader and haptic feedback support.
Apple announced the ADA finalists and winners earlier this month. The honorees spanned the globe and were chosen as winners because their work demonstrated “excellence in innovation, ingenuity, and technical achievement,” according to Apple.
“Developers continue to push the boundaries of what’s possible, creating apps and games that are not only beautifully designed but also deeply impactful,” Susan Prescott, Apple’s vice president of worldwide developer relations, said in a statement. “We’re excited to celebrate this incredible group of winners and finalists at WWDC and spotlight the innovation and craftsmanship they bring to each experience.”
On a related note, the physical trophy Apple gives to ADA winners is truly something to behold. The hardware is a dense, substantially weighty cube milled from a single piece of aluminum. I bring it up because, at last year’s WWDC, I got to see (and hold!) an ADA trophy for the first time. It happened during a briefing at Apple Park with the makers of Oko, an iPhone app which leverages artificial intelligence to make street-crossing more accessible to Blind and low vision people. The team had the cube proudly displayed on a coffee table during our discussion, after which they asked if I’d be interested in picking up the cube and feeling it. I’m sure there exists a picture of me somewhere.
The salient point? The ADA trophy has every bit the fit and finish of an iPhone or iPad.
Inside Unlimited Play’s Mission to Make Sure ‘No Child Is Left on the Sidelines’ at the Playground
Take a gander at the homepage of Unlimited Play’s website and you’ll see the nonprofit organization makes an unequivocal proclamation: “At our playgrounds, no child is left on the sidelines.” Dig deeper and you’ll notice Unlimited Play’s philosophy that “no matter their physical or cognitive abilities, [children deserve] to feel welcome and experience the joy of play without barriers.” Make no mistake, adaptive playgrounds are shining examples of assistive technologies existing, quite literally, in the real world.
Unlimited Play’s origin story begins with a 3-year-old boy named Zachary Blakemore. He lived with a rare genetic condition known as Pelizaeus-Merzbacher disease, or PMD, which required him to use a wheelchair. Blakemore, like any other kid, relished visiting the playground—the problem, however, was it wasn’t built for his needs. It was inaccessible. These experiences pushed Blakemore’s mom, Natalie Mackey, to start Unlimited Play in 2003. Four years and $750,000 later, the first inclusive space, aptly named Zachary’s Playground, opened in Lake Saint Louis, Missouri, on April 21, 2007.
As Mackey told me recently, she never imagined Unlimited Play becoming a thing.
“We had reporters there that day [of Zachary Playground’s opening] and they asked me if I would continue the work into the future. I said, ‘Not a chance.’ It was four years of fundraising and I was excited to let other people follow what we had done,” she said. “But we’re talking today because families like mine and cities more and more have been calling to ask we help and continue the journey of providing inclusive play for everyone. We have over 100 projects from east coast to west coast, even into Canada. Every day, we’re contacted by families looking for ways to have their children play.”
It’s been over two decades now that Mackey has served as the nonprofit’s founder and chief executive officer. She explained she’s worked with “so many families” while noting she and her team are “continually learning” along the way. Mackey shared an anecdote about a current project involving a boy named Teddy, who has dwarfism. Typical playgrounds, she told me, are inaccessible to him, adding Unlimited Play’s experience of working on his project has proven enlightening. “[It’s] taught us things about that population and how to better design playgrounds for individuals with dwarfism,” Mackey said.
When asked what exactly constitutes an adaptive and inclusive playground, Mackey said there are several elements. A lot of the design work is topographical: using soft turf on the ground, as well as ensuring wheelchair access by way of ramps and sidewalks. Similarly, communication boards are crucial for interaction, and the use of large type and bold colors can be beneficial for those with low vision. It’s quite the challenge, Mackey went on to tell me, to incorporate all these things while also bearing in mind that typically developing children likely want to play along too. Everybody, she added, has their own needs and tolerances regardless of their ability level(s).
Of course, Unlimited Play constructing playgrounds isn’t like Apple constructing Apple Park. Unlike Apple, Mackey and Unlimited Play do not have unlimited funds with which to procure everything needed to build these playgrounds. They need help, which Mackey and team found in the folks at Little Tikes Commercial. The subsidiary, a scaled-up offshoot of the children’s products company, manufactures playground equipment—including inclusive equipment like that at Zachary’s Playground. According to Mackey, Unlimited Play’s partnership with Little Tikes Commercial means her organization gains the ability to train “about 120 representatives” on the importance of inclusive play and, more pointedly, the ABCs of building accessible play-based environments. After said training, Mackey told me, the trainees go out and effectively act as ambassadors for Unlimited Play; this enables the nonprofit to have “many more feet on the ground” doing this evangelism. Mackey’s role involves reporting to the Little Tikes team on the feedback she’s heard from communities, then working together “to create new products,” she said.
Working with a well-known entity like Little Tikes is game-changing for Unlimited Play.
“I never imagined that any big company would care,” Mackey said. “Especially when my son was little, life felt very isolating… like nobody understood what I was going through and what I was desperately wanting to provide for my child. For Little Tikes Commercial to say, ‘We see you, we understand, and we want to be part of this’ was really exciting and meaningful for me. For a small nonprofit, it helps give credibility to our mission to say we have such a successful corporation believe in us and backing Unlimited Play.”
Looking towards the future, Mackey was succinct in sharing she hopes to continue pushing forward in fulfilling her organization’s mission. Ideally, she hopes playground standards are raised on a national level so as to reflect the needs of disabled children and their families. Mackey hopes this work raises enough awareness that it doesn’t always take a small-time nonprofit like Unlimited Play to do all the heavy lifting. More parks and recreation staffers should know kids like Zachary deserve open, welcoming, and accommodating spaces in which to play—the problem is most don’t recognize the barriers present in the majority of neighborhood playgrounds across America today.
Zachary died in September 2021, but his legacy lives on in his mom and her mission.
“I would say it’s important to tell people we must be creating environments where everyone can thrive at their very best,” Mackey said when asked to distill Unlimited Play’s raison d’être. “We grow as communities and grow as a nation. My son had a hard time doing typical things, but what he brought was so much. He brought creating environments where we all feel like we belong [and] we can all be our best… we only, as communities and nations, get better by doing that. Play, I truly believe, is the one language we all speak. It’s the international language we all spoke to begin with and how we learn from each other. We should care about creating environments that make that language possible for us to understand, grow, and develop even better.”
New ‘F1’ Trailer Makes Movies More Accessible
Apple’s Greg Joswiak took to X earlier today to post about the new trailer Apple released for its upcoming Apple TV+ film F1. The executive, the company’s senior vice president of worldwide marketing, describes what Apple calls a “haptic trailer” as “the coolest trailer ever”—and his boastfulness is deserved. I just watched the 2:10 preview, available in the TV app on iPhone, and it is incredibly cool and, dare I say, innovative.
From an accessibility perspective, applying haptic feedback to movie trailers is a genius-level move. It brings a level of access and immersion that transcends sheer coolness, as Joswiak said. To wit, someone who’s Deaf or hard-of-hearing, or Blind or low vision, could literally feel the rumble of the cars as they race around the speedway. Likewise, even for someone with typical hearing and vision, the added sensory input makes for a richer experience because, once again, a person is able to have a tactile approximation of the cars’ horsepower. When I first read about the F1 trailer, my mind immediately recalled the similarly styled Music Haptics accessibility feature Apple introduced last year in iOS 18. I wrote about Music Haptics again recently, and the advent of Apple’s novel “haptic trailer” stands not only as yet another example of haptics’ accessibility potential, but of the company’s vaunted vertical integration as well.
I hope the F1 creative team appreciates how cool this technology is. For people with limited hearing, if any at all, the presence of haptics in the trailer means the cars’ power can be felt even if not heard. This is exactly the value proposition for Music Haptics; a Deaf person, like Troy Kotsur’s Frank Rossi in CODA, can enjoy music because, as he says in the film, it “makes my ass shake.” It isn’t merely a funny, throwaway line: haptic feedback makes ostensibly exclusionary arts like music more inclusive to the Deaf and hard-of-hearing community. I don’t know the nitty-gritty technical details on the F1 trailer, but it’d be great to see Apple someday release a public API for App Store developers to hook into their own app(s). Imagine, for instance, sitting down to watch Star Wars on Disney+ and being able to feel explosions or the rumble of lightsabers. Haptics makes the content not only more immersive but more accessible too.
The Brad Pitt-led F1 opens in theaters on June 27. It’ll stream on TV+ later this year.
It’s Official: Yours Truly Needs a New Mac Soon
Amidst this week’s hubbub regarding Liquid Glass and more, one tidbit of news has me feeling melancholy. The next version of macOS, named Tahoe, is the last edition to support Intel-based Macs. This is important news because, as I suspected, I’ll need to upgrade my Mac pretty soon. My trusty Retina 4K iMac from 2019, the 21.5-inch model, is slated to stay on Sequoia forevermore. You’ve served me with great honor, old friend.
According to Apple, macOS Tahoe is “the last release for Intel-based Mac computers,” adding the machines will get three years of security updates. The news coincides with the company’s plans to sunset its Rosetta 2 translation system, introduced in 2020 to help developers with transitioning their apps from Intel to Apple silicon. “Rosetta was designed to make the transition to Apple silicon easier, and we plan to make it available for the next two major macOS releases—through macOS 27—as a general-purpose tool for Intel apps to help developers complete the migration of their apps,” Apple says. “Beyond this timeframe, we will keep a subset of Rosetta functionality aimed at supporting older unmaintained gaming titles, that rely on Intel-based frameworks.”
As with my beta-testing strategy for this summer, I’m still contemplating which Mac should supplant my 4K iMac. I have lots of options; I love the all-in-one design of the iMac and its anchoring of my workspace, but am also enamored by a more modular setup headlined by something like the big and bright Pro Display XDR. On the mobile side, I’m tempted to turn the 13-inch M4 iPad Pro I got for my birthday last September into my “laptop” paired with the Magic Keyboard. I’m really excited for the forthcoming (and arguably overdue) Mac-like features coming to iPadOS 26. In terms of my main work setting, macOS, the jump to using an Apple silicon Mac full-time is tantalizing because of Apple silicon-exclusive features such as iPhone Mirroring, as well as a larger display and other quality-of-life improvements like faster performance.
My birthday is coming up in September, so perhaps my present will be a new computer.
Apple’s ‘Liquid Glass’ Won’t Make the Sky Fall
I was on the ground yesterday at Apple Park covering this year’s WWDC keynote. The star of the show was Apple’s introduction of its all-new Liquid Glass design language. The company has a great session on Liquid Glass in the Developer app. I highly suggest watching it (and other sessions) on a big-screen television through an Apple TV 4K; I did so last night on my 77-inch LG C3 OLED and it was a blast, if terminally nerdy.
I’m still trying to devise a plan of action for testing iOS 26, et al, throughout the summer. In full transparency, I’m one of those rare birds in tech media circles who run neither the developer betas nor the public betas of the new operating systems. I’m a terrible tester, although I usually do jump to the iOS public beta on my iPhone late in the summer, closer to its official release come September. With the advent of Liquid Glass, however, I feel it’s in my best journalistic interest to prioritize testing at least one beta. As I told some Apple employees following the presentation, Liquid Glass truly is a de-facto accessibility feature unto itself. While it’s undoubtedly true Apple’s stated goals of creating cohesiveness and harmony are important to accessibility, the reality is what really matters is how Liquid Glass performs in a practical sense. Practicality entails legibility, contrast, and motion. For people with low vision—and people with 20/20 vision, for that matter—the proverbial proof in the pudding of Apple’s choices with Liquid Glass lies in usability. All the flowery, romanticized marketing bluster regarding harmony means zippo if Liquid Glass isn’t readable. Personally, I find Liquid Glass damn cool-looking and quite beautiful; nonetheless, I’m predisposed to skepticism as a lifelong disabled person and thus was alarmed by some of what I saw in Apple’s marketing video. Fortunately, my fears were quickly allayed by high confidence in Apple’s track record in accessibility and confirmation that changes will be coming.
I connected with Sarah Herrlinger, Apple’s senior director of global accessibility policy and initiatives, for a few minutes after the keynote ended. It was decidedly not an official, on-the-record interview, but I can confidently report Herrlinger told me her teams worked in tandem with the design team to build Liquid Glass and make it as accessible as possible. To that end, she noted Liquid Glass works with features such as Reduce Transparency, amongst others, to increase legibility. I’m sure I’ll have more to report in the coming weeks and months. For now, I’m willing to take Herrlinger at her word, along with the reporter’s grain of salt, that Liquid Glass is accessible unto itself.
As I write this, it’s been roughly 24 hours since Apple introduced Liquid Glass. In that time, the timelines across my myriad social media services have been insufferable. There are so many insipidly bad takes on Liquid Glass from wannabe Apple designers feeding into social media’s worst impulses. There’s absolutely room for constructive criticism—👋🏼, journalist here—but there’s also room for common sense. Apple released the first beta of its new platforms yesterday. There is a whole summer yet for the company to tweak and refine Liquid Glass. Of course Apple engineers must reach a degree of “doneness” when readying the beta builds, but they’re betas for a reason: they’re essentially unfinished. The software will evolve before being publicly released later this year. Back in 2013, iOS 7 similarly overshot on the usability vector before dialing back to the mean ahead of its final release alongside the iPhone 5C and 5S that fall. There’s no doubt in my mind iOS 26 and Liquid Glass will walk the very same path in 2025, so crying Chicken Little seems utterly pointless.
Finally, a cool little personal postscript to Monday’s announcement. My first WWDC was in 2013, the year iOS was first redesigned. 13 years later, I was literally at Apple Park watching iOS 26 getting its Liquid Glass redesign. As a friend said to me after the presentation ended, 13 plus 13 equals 26. All told, I think it’s nice symmetry all around.
How Ventura Charted a Course to Tahoe and Beyond
Bloomberg’s Mark Gurman published his spoiler-filled report late this week wherein he reveals what Apple intends on unveiling at Monday’s WWDC25 keynote. Gurman has formed a habit of doing so, a practice my friend John Gruber described as “tradition.”
As with movies and TV shows, abstain from Gurman’s report if you dislike spoilers.
What caught my attention reading Gruber’s comments is the hotly-anticipated visual redesign of Apple’s platforms. Gurman first wrote about it back in March, saying Apple planned to make its myriad operating systems “look similar and more consistent” while adding styles differ widely in terms of iconography and more between platforms. He rightly pointed out said differences “can make it jarring to hop from one device to another.” Apple’s primary goal, Gurman said, is to prioritize consistency, design-wise.
This is the part that struck me hardest. As I said on Mastodon yesterday, the same people who are like kids on Christmas morning regarding the aforementioned design refresh are the exact same people who bemoaned the redesigned macOS Settings app when it debuted a few years ago. The irony here is these people haven’t a clue they’re talking out of both sides of their mouth; there can be spirited debate surrounding Mac design idioms, as well as how much iOS should invade such entrenched territory, but macOS Settings blazed a trail. To wit, launch the Settings app on one’s Mac of choice and the inspiration is crystal clear: it looks highly similar to Settings on iOS and iPadOS. Why is that? The cynical view is to say it’s because Apple wants to deepen the so-called “iOS-ification” of the Mac, much to the chagrin of diehards. The more charitable viewpoint, however—and I believe the more correct one—is Apple sought to provide (surprise, surprise!) more consistency and likeness between platforms. What the company reveals come Monday morning at Apple Park will take that prior work on macOS Settings and expand upon it, scaling it up big time.
Apple’s software engineering groups are bifurcated no longer. This isn’t the iPhone’s early era, circa 2007–2010, when the company built only two OSes: Mac OS X and iPhone OS. Since those days, the company has taken the core underpinnings of iOS and spun off four more platforms in watchOS, iPadOS, tvOS, and visionOS—with rumors of yet another on the horizon. It makes complete sense for Apple to strive towards more “unity,” more consistency, across its panoply of platforms because the company makes a helluva lot more computers than it used to. I’ve long banged the drum that, from an accessibility standpoint, Apple taking iOS and pulling the proverbial string to build its progeny was a stroke of genius. Especially for people who cope with intellectual disabilities where cognition is atypical—however unstated, these are exactly the types of users Gurman alluded to this past spring—that iPadOS, watchOS, et al, look and behave so similarly to an iPhone is worth its weight in usability gold. It’s accessible in part because it’s consistent. Consistency should be lauded far more as a feature, not a bug. As I said earlier, the design snobs of the internet like to navel-gaze and gripe about idioms and implementation details. This kind of critique certainly has its place, but particularly in context of the macOS Settings overhaul, the complainers routinely miss the forest for the trees. You’re free to quibble philosophically over Apple’s choices, but I’m here to tell you once more with feeling that accessibility matters. At 30,000 feet, that macOS Settings looks like iOS or whatever is a good thing for a not-insignificant swath of people in the disability community—anyone else’s precious pearl-clutching be damned. Likewise, that “iOS 26” and its compatriots will look and feel of a family is also a very good thing in the aggregate. For accessibility, the family resemblance is of crucial import when it comes to acclimation and comfort.
That macOS Settings looks like iOS is a huge, if imperfect, win for legions of disabled people.
Global Accessibility Awareness Day (GAAD) co-founder Joe Devon was spot-on when he shared an anecdote in a recent interview with me about someone lamenting on social media about the “364 days of global accessibility oblivion.” So much attention is paid to Apple’s annual GAAD announcement on blogs and podcasts, but so much of it smacks of tokenism and patronage. I bring this up because I can’t help but think were accessibility coverage more robust in the Apple sphere, it would be easier to connect the dots between, say, the redesigned macOS Settings app and the updates Apple is readying itself to announce next week. Alas, accessibility is more often than not relegated to 364 days of oblivion because the tech commentariat lack the perspective for it—and, even more frustratingly, the powers-that-be running the tech desks in newsrooms are apathetic towards seeking out the knowledge—with precious few exceptions. What you’re left with are people like myself, perpetually shouting into what feels like an ever-growing black hole with weekend think pieces such as this one.
Anyway, I’ll be in Cupertino on Monday covering all the news from the WWDC keynote.
PlayStation Store Gains Support for Apple Pay
On my way home from a brief reporting trip/vacation in the Pacific Northwest, I came across this X post on Sony’s official PlayStation account sharing news that Apple Pay is now a supported payment method on the PlayStation Store for PS4 and PS5. Sony’s post includes a link to this support document with information on how to use Apple Pay.
Sony emphasizes one must ensure Apple Pay is set up for use on their iPhone or other iOS device, and that valid payment methods are saved to their Apple account.
News of the PlayStation Store adopting support for Apple Pay grabbed my attention, as Apple Pay certainly will make purchasing games more accessible. I’ve extolled the accessibility virtues of Apple Pay innumerable times over the years, and use it every chance I get—whether in a brick-and-mortar store or online. In the case of the PlayStation Store, the irony here is my PlayStation 5 model is the disc version; most times, I tend to prefer disc-based games to their digital likenesses. This mindset is more sentimental than practical, as I also have a Mega Sg console alongside a cavalcade of game cartridges. My gaming heyday coincided with the Nintendo NES and Sega Genesis, coupled with mobile consoles like the Game Boy and Game Gear. What this means is I’m virtually hardwired to insert (and remove) physical media, (in)accessibility be damned. Ditto for DVDs and Blu-rays. Call it nostalgia—or negligence, given my affinity towards, and need for, greater accessibility—but there’s just something about physical game media. More ironic still, my Xbox Series S can only download games from the internet.
Anyway, there lies a schism in my media. Sony’s announcement reminded me of that.
I May Need a New Mac Soon, Report Suggests
If this rumor becomes reality, the end is nigh for my trusty iMac.
I became a bit crestfallen today when I read this report by MacRumors’ Hartley Charlton, who, citing a report from AppleInsider, says the forthcoming update to macOS—macOS 26; more below—drops support for five older Macs. Charlton says the 2017 iMac Pro, 2018 MacBook Air, 2018 Mac mini, 2020 Intel-based MacBook Air, as well as my aforementioned Retina 4K iMac, all are (purportedly) incompatible with the new version.
I’d love to know how many words I’ve churned out on this machine in the nearly six years I’ve had it. This iMac came to me in 2019, which feels quite literally like a lifetime ago now. It was a few months after I had a personal tragedy, and a few months into becoming a pig parent. The pandemic was several months away, unbeknownst to me and the rest of the planet. In 2019, I was covering Apple exclusively; the powers-that-be at Forbes at the time wouldn’t approach me about joining its invite-only contributor network until close to Halloween. At the time, I never would’ve dreamed I’d be on the verge of career opportunities that would eventually push my reporting into the stratosphere. My Forbes column opened those doors for me, and I’ll forever be grateful.
All the while, my iMac has been here to see me through it all.
I don’t mean to wax overly poetic about a computer. I know electronics have only a finite lifespan, and it appears increasingly likely my iMac’s will reach its end—defined by the stoppage of support for the latest software—which makes it befitting that I eulogize the machine in advance. I’ve been thinking about upgrading my desk setup for some time now, especially lusting over the also-new-in-2019 Pro Display XDR with either a docked MacBook or perhaps a Mac mini of some sort. As someone whose work primarily involves videoconferencing and plain text files for writing, my spartan compute needs belie my nerdy desire for beefier hardware like the tricked-out Mac Studio my friend and former collaborator Federico Viticci has been testing lately. I’m still weighing my options for my next move, but suffice it to say I’ve greatly enjoyed the all-in-one lifestyle afforded by my iMac. I like that I have a central, dedicated location for work, and thus I’m (tentatively) inclined to make a lateral move to the latest one. At the very least, I’m enthused by the prospect of using an Apple silicon-based Mac as my daily driver, despite my Intel iMac still being more than capable of doing what I need for my job.
Okay, about “macOS 26.” Bloomberg’s Mark Gurman has a helluva scoop this week in which he reports Apple plans to change the versioning scheme for its platforms to reflect calendar years instead of version numbers. Lots of people on the internet have taken umbrage over the decision, some asininely so, but I think the reaction is a show of humanity’s aversion to change. As I said on Mastodon yesterday, EA Sports has followed the decades-old practice, dating back to the heyday of the Sega Genesis, of naming its games for the upcoming calendar year. For example, the company’s college football title, College Football ’26, is due to come out on July 10, 2025, and no one is batting an eye. Ditto for its pro game, Madden ’26, out August 14. From an accessibility point of view, it should prove easier for those with intellectual disabilities to know their software is current because it’s based in years. Even for the “normal” non-nerds in people’s lives, the change should be easier to grok; not many of my family and friends are inclined to dive into Settings → General → About to see their iPhone is running iOS 18.5.
As for me, I’m inclined to say I’ll have a new Mac for macOS 26 when it drops this fall.
Inside AXS Labs’ Mission to Make the Real World a More Accessible Place for All Disabled People
Two decades have passed since Jason DaSilva was diagnosed with multiple sclerosis (MS) at age 25. A filmmaker known for works such as 2013’s When I Walk, DaSilva explained during an interview last week that he was “able-bodied” back in 2005 when doctors informed him of his MS diagnosis. He was living in New York City and began noticing problems with walking and blurry vision. DaSilva has primary progressive multiple sclerosis, or PPMS. It’s described by the National Multiple Sclerosis Society as “an unpredictable disease of the central nervous system that disrupts the flow of information within the brain and between the brain and body,” adding “if you have PPMS, you will experience gradually worsening neurologic symptoms and an accumulation of disability [and] you will not have relapses early in the disease course. You also will not have remissions.” 10% to 15% of people with MS have DaSilva’s type of the disease.
“I didn’t know what to do,” DaSilva said of the aftermath of getting his diagnosis. “Then I worked on a couple more films, then turned it around and said, ‘Well, what I really should do is continue my career, but in a way that embraces the MS I have now.’”
What he chose to do was launch a nonprofit organization called AXS Labs. On its website, AXS Labs says its mission is devoted to “building tools, telling the stories of accessibility and inclusion through media, journalism, news and technology [and] [serving] people with disabilities through media and technology.” DaSilva told me his organization’s first project is the eponymously named AXS Map. The impetus for AXS Map is, obviously, accessibility; as a disabled person, DaSilva has long lamented the lack of maps which cater to the disability community. There are “all these maps from Google and Yelp,” but none truly dedicated to providing crucial inclusionary information on accommodations such as wheelchair access and more. DaSilva was resolute in his belief such a tool “needs to be done,” so that’s what he and his team did. AXS Map isn’t new, with work beginning in 2010 before launching to the public two years later.
“[AXS Map] has been going ever since,” DaSilva said. “We have a big database now, but it’s been going since 2012.”
DaSilva reiterated the sore need for something like AXS Map to exist for the disability community and its allies. He again lamented how there are apps like the aforementioned Yelp, replete with listings and reviews of businesses near and far, which nonetheless skimp on accessibility information for people like himself—and yours truly, for that matter. Even now, such information remains sparse, but DaSilva said the work is evergreen. AXS Map has grown considerably in 13 years, with the software reaching a point where “we have so many reviews… we keep going and creating new things.”
DaSilva shared an anecdote about living in Manhattan’s East Village and wanting to go to a bar or restaurant. A wheelchair user, DaSilva would venture out for the proverbial night on the town on Friday nights with friends only to sullenly discover a place would have stairs or steps, making it harder to get in, if at all. Many times, it would be downright impossible and the night would end prematurely because of inaccessibility.
“I said, ‘Well, this is obviously a need that needs to be dealt with from a personal perspective,’” DaSilva said. “But I realized it was [also] something that could help a lot of people with whatever they need in terms of accessibility.”
In technical terms, AXS Map’s data is based on the Google Places API. According to DaSilva, “any place on Google is going to be available on [AXS Map] as long as they’re a business that’s registered with Google.” He emphasized that AXS Map stands to “provide another layer of information” which Google may not have, calling accessibility information “critical” for so many like himself. In fact, lots of disabled people have expressed gratitude to DaSilva and team for offering such an invaluable tool; DaSilva said people are excited to learn AXS Map exists and subsequently spread the good word about it to everyone else.
“It’s something that needed to be done, but there was no way for people to actually do it,” DaSilva said of the motivation to build AXS Map. “That’s it. I saw something that I could pull the trigger on and get the word out there.”
He continued: “People really like that [AXS Map] exists. I get a lot of feedback from people who wouldn’t otherwise be able to go to places. They wouldn’t know if they’re accessible or not, so AXS Maps really helps.”
As to the effectualness of AXS Map to people’s everyday lives, DaSilva told me it boils down to two things. One, the software allows people to talk about whether places they know are accessible (or not). And two, it enables people to be explorers by pushing them to venture to new parts of their neighborhood or city. “Even if they go to a new city, there are some places they wouldn’t have otherwise known about,” DaSilva said.
In a broader scope, DaSilva said it’s his experience that an increasing number of businesses have become disability-friendly over the years. In New York City, he noted the bar is “certainly getting higher” for prioritizing accessibility—but caveated that a big barrier is infrastructure. Most buildings there, he told me, are legacy structures and thus pretty old; upgrading them to be “ADA-friendly,” as the colloquialism goes, is a slow (and expensive) process for city leaders and their budgets. But it isn’t an issue solely confined to New York: DaSilva cited other older metros such as Philadelphia and Toronto as slogging through relative inaccessibility largely because they, too, are filled with older buildings.
However problematic buildings are, DaSilva finds people are keen to help him.
“They’re helpful as they can be,” he said. “They’re helpful… they tell me where to go to [and] tell me if it’s an accessible place. They tell me if they have another entryway or whatever the case may be. They do as much as they can do for me.”
Looking towards the future for AXS Labs, DaSilva said AXS Map in particular is more “database” than anything else. He’s scheduled to soon give a presentation to the United Nations on AXS Map: how it works and how best to use it. His talk coincides with the UN’s Convention on the Rights of Persons with Disabilities. Beyond AXS Map, he’s also poised to discuss how artificial intelligence can positively impact the lives of disabled people and how AXS Map fits into the ever-burgeoning era of AI. The technology, he added, has enormous potential not only to map accessible places, but to help people in the community get to those places. What’s more, the “robots,” as DaSilva characterized AI, could go in and verify whether places are accessible or not.
PopSockets Announces Kick-Out Grip and Stand
MacRumors’ Joe Rossignol reports this week PopSockets has released its newest product, the Kick-Out grip and stand for iPhone. The $40 accessory, which supports Apple’s MagSafe technology, is touted by PopSockets as “[rocking] multiple angles.”
“Unlike other PopSockets, the Kick-Out model offers the long-awaited ability to prop up an iPhone in a vertical Portrait Mode position. This added functionality is useful for watching vertical videos in apps like TikTok, Instagram, and YouTube,” Rossignol wrote in describing the new accessory. “You can twist the built-in MagSafe ring, and then pop open the hinged stand to prop up the iPhone horizontally or vertically on a table.”
Seeing this news immediately took my mind to my interview late last year with PopSockets’ chief executive officer Jiayu Lin. Lin, who’s coming up on her 1-year anniversary leading the company, told me in part PopSockets “[sits] at the intersection of fashion and functionality.” My conversation with Lin coincided with the announcement that PopSockets worked with Apple such that the Cupertino-based captain of industry exclusively carries a collection of MagSafe phone grips in both its online store and retail outposts. According to Lin, PopSockets was “really excited” about the opportunity to work closely with Apple to reach “a new generation of customers,” adding it was a big step towards “forming strong relationships with partners and collaborators and [finding] new ways to get the brand into new locations.”
As I wrote in November—editorializing is a key part of reporting on accessibility, in my strong opinion—PopSockets’ origin story indeed lies in accessibility. Back in 2010, company founder David Barnett, who last fall handed the proverbial reins to Lin as the company’s next chief executive, grew frustrated by his EarPods’ cords becoming tangled, so he concocted a DIY remedy by gluing two buttons to the back of his phone and wrapping the cord around them. A Kickstarter project followed in 2012, with Barnett directing the money generated from the successfully funded campaign to humbly start PopSockets from his Boulder, Colorado garage in 2014.
Lin extolled PopSockets’ virtues by highlighting its position at the intersection of fashion and functionality. As ever, it’s about something else too: accessibility. Not only does the grippy nature make a phone easier to hold and prop up on a table—both especially important to those with muscle tone problems—but the MagSafe integration makes applying (or removing) the PopSocket itself more accessible by virtue of the laws of physics. Ergo, a product like the aforementioned Kick-Out may well be immensely appealing to someone who copes with motor disabilities. In fact, as someone who does have motor disabilities, grip and friction are the primary reasons I insist on using a case on my iPhones. A case may keep me from admiring the industrial design, but it’s a price I must pay for usability’s sake. Such is life for a nerd who lives with multiple conditions.
PopSockets’ new Kick-Out grip is available on its website now.
Presley Alexander Talks Being an Autistic Actor, Disability Representation in Hollywood in Interview
When I asked Presley Alexander recently how they got into acting, they told me it happened as a “total coincidence.” Alexander, who identifies as autistic and queer and is based in Los Angeles, told me they initially wanted to work as a YouTuber and figured why not try dabbling in Hollywood while in town. It turned out they “really fell in love with it” as someone who self-describes as “always [being] a person who has a million different hobbies and likes to try different things.” As an actor, Alexander is deeply appreciative of the amount of latitude they’re afforded to “live as many lives as I want.”
“I would never be happy in just one job… unless it was this one where I get to be a bunch of people,” they said.
The impetus for our conversation was last month’s Autism Acceptance Month, with Alexander telling me they have a “big passion for autistic representation.” When asked about the current state of disability representation in Hollywood, Alexander lamented they don’t see “a lot” of it, adding the representation that does exist often isn’t depicted by actual disabled people, which they characterized as “unfortunate.” Alexander attributed a “big factor” in said underrepresentation to many people going about their lives undiagnosed with condition(s). Many people, they added, “just don’t know” what they’re living with day-to-day, with lots of diagnoses not occurring until much later in a person’s life. Alexander expressed frustration at the commonly held practice of disability being portrayed by people who aren’t actually disabled.
“For the most part, a lot of disabled characters—especially physically disabled ones—are played by actors who don’t have that disability,” they said. “There’s a level of understanding that only someone with a physical disability can bring to a physically disabled part. I have some physical conditions, and I still wouldn’t play somebody who had an amputation or something like that… it’s a very different lived experience to look comfortable knowing that’s how your body works and how you interact with the world.”
Alexander emphasized Hollywood is “definitely moving to a better place” in terms of authentic disability representation. They cited their role as Lane in the 2025 Ben Affleck-led thriller The Accountant 2. The creative team, they said, deliberately put out casting calls for autistic actors to play autistic characters after originally hiring actors who weren’t neurodivergent and wanted to right that wrong. “I think it’s a big step in the right direction, and I’m hoping a lot of productions will follow that,” Alexander said.
Alexander described themself as “a lot of things,” saying their autism affects “basically every part” of their life. They added they believe there exist more people out there who are similar to them than most realize; Alexander noted how some in the entertainment press bemoaned how the cast of CODA were effectively playing caricatures of themselves—but Alexander stressed that’s okay. They went on to say it’s important for people to understand an actor’s craft isn’t diminished or downplayed when they play characters true to their real-world persona. In fact, they said it can actually be “a lot harder” since “I have to be very self aware of those kinds of things.”
As an autistic actor, Alexander told me autistic representation in film and television remains “in a little bit of a weird place.” On one end of the spectrum, there are shows like Netflix’s Love on the Spectrum, but on the other end, there are shows which entail “gawking at people with autism and how they view the world differently.” Much of their favorite autistic representation on screen, Alexander said, involves characters unintentionally “coming off as autistic,” because, in recalling the previous point about diagnoses, they’ve gone undiagnosed and thus don’t know about their condition(s).
“The thing about autistic people is we think everybody else is struggling just as much as we are—until we get diagnosed and realize there’s something else going on,” Alexander said. “I’ve seen, as an autistic actor, more casting calls coming towards me specifically asking for disabled actors, which I think is really nice. But I think what we really need to be moving towards, in the industry, is characters who happen to be disabled or characters played by disabled actors [and] have us be equally included—characters played by disabled actors just because, not because the plot focuses on it or it’s a central part of their character. They just happen to have a disability.”
Alexander’s prior experience being a content creator on YouTube helped immensely when acclimating to being in front of a camera. It also gave them “a very realistic idea of what the entertainment industry is,” adding “I was very convinced and very headstrong about doing YouTube… it was 100% coming from me.” By contrast, they got to meet lots of people over time who weren’t creating content for their own sake. Such realizations gave Alexander a lot of self-awareness “about the specific and special place I’m in to be able to handle that kind of thing and to be comfortable with the weird life you have to live when you’re a public figure.” Moreover, doing press (like this very interview) has been helpful to Alexander as well, if somewhat “disorienting” as someone who used to be on the other side of the proverbial table. Nonetheless, Alexander said the experiences have helped them learn a lot about what people want to hear and how to talk properly; most of all, they’ve come to realize “most of us are just kind people.”
“A lot of actors and directors and people who work on movies are just regular guys who happen to work in this industry,” Alexander said.
As to feedback, Alexander said they enjoy strong support from their network of family and friends. Their immediate family in particular is “very, very supportive” and they “couldn’t do this” without that backing. Likewise, most of their friends are “grounded in real life” and knew them before Alexander got into acting. Their friends still think of them as “Presley from class,” which is “really nice” because “it keeps me from going a little bit too insane, and they’re all very supportive of what I do.” Many of Alexander’s friends are autistic too, which means they appreciate the representational angle, having been diagnosed with autism and ADHD themselves. Alexander conceded there is some “pushback” from family who don’t understand the Hollywood way of business, but who do think it’s cool to see them in whatever project(s) they’re working on. Hollywood, Alexander told me, is a “really difficult industry to understand if you don’t live in it and know people in it.”
Looking towards the future, Alexander said “the world is really scary for autistic people right now.” They’re heartened, however, by the strong show of support they’ve received from within the industry, with Alexander saying people have been “very kind to me” and they consider themself “very lucky.” Alexander is optimistic more opportunities will come their way as time wears on, telling me they “don’t plan on being quiet” in terms of their amplification of, and advocacy for, themself and others in their community.
“My main goal is just to continue existing as an openly disabled person and just show that ‘Hey, we’re here too and we’re just as good as anyone else.’ I think my autism makes me better at certain aspects of my job, because I’m very, very analytical when it comes to behavior and pretty good at copying people,” Alexander said of their hopes and dreams for the future. “It’s called parroting for autistics, and I think that helped me in my career. I’m hoping that I can bring—especially at this time where people with autism [are] being seriously targeted—I can be a bit of hope it will be okay. We are so much more than what we’re being told we are, and we deserve to have these flags too.”
Jason Momoa’s New Show Serves as a Reminder Entertainment Journalism Needs Disabled Critics
Last week, I stumbled upon a story from 9to5Mac’s Marcus Mendes about a new drama coming to Apple TV+ this summer, at the beginning of August. The series, called Chief of War, chronicles the unification of the Hawaiian islands. The drama stars Jason Momoa in the lead role as Ka’iana. Notably, Chief of War marks his second time receiving top billing in a guy-kicking-ass Apple TV+ property.
Momoa, of course, played the main character, Baba Voss, in See. The show, about a post-apocalyptic civilization in which nobody has eyesight, was one of the original titles when TV+ launched in November 2019 and ran for three seasons. News of Chief of War grabbed my attention because of Momoa’s involvement with See, which, in turn, reminded me of its reception. Rotten Tomatoes gives it a 63% rating, which admittedly isn’t that great. Likewise, Daniel Fienberg at The Hollywood Reporter said the show lacked “enough depth or vision” in his review, while Brian Tallerico of RogerEbert.com said See “[suffers] from a tone that can’t push through the world-building to give us characters or a story to care about.” The message is clear: pundits, including influential critics in the media, widely panned See.
Nearly three years later, I’m still pissed about the reception to See. And I’m not the only one: Joe Strechay, a Blind film consultant, told me during a 2021 interview he was “disappointed” by the critical response to the show. (Strechay also worked on the acclaimed 2023 Netflix limited series All the Light We Cannot See, which I covered too.)
I acknowledge it’s a nuanced position. As an entertainment vehicle, it’s obvious no one is obligated to like a show’s conceit; it’s perfectly valid to find something—See very much included—boring or, as Tallerico called it, “a slog.” People like what they like. The lingering bitterness over the reaction to See is less about the show as art and more about the show as representation. To wit, See remains close to my heart in large part because of the ways in which it amplifies awareness of the Blind and low vision community, as well as how it smashes through the societal stereotypes of people like me. Blindness is at the core of the show, not taking a backseat in the storyline in a tokenized way. While Momoa himself is sighted, there were many Blind and low vision people who worked on See, both on screen and behind it, and that matters a whole lot in terms of disability representation. In an industry—Hollywood—where disabled people historically have been pitied and portrayed as hapless and helpless, lauded only when we “overcome” our own bodies, that See puts blindness at the forefront is not insignificant and deserves not to be downplayed merely because it’s part of an ostensibly shitty show. The aforementioned critics may be fine journalists, but they lack the perspective necessary to appreciate the substantiality of what Apple pioneered with See. That the company took from its massive war chest to help fund production is very much another manifestation of its commitment to accessibility—applied to the big screen instead of the screens in people’s pockets, on their wrists, or on their desks.
As I said, it’s perfectly fine to not like See for its entertainment value—but it’s not okay to be dismissive of the aforementioned representational gains in the process. My friend Kristen Lopez, who covers Hollywood as an entertainment reporter, has lamented this dynamic in the context of yet another Apple TV+ title, Deaf President Now. As she writes, Deaf or disabled critics are nowhere to be found covering the newly released documentary. Yours truly, of course, was one exception, having covered it earlier this month.
“This is something that happens far too often when the few disabled-centric movies Hollywood deigns to give us come out,” Lopez wrote in her publication The Film Maven, which is hosted on Substack. “Most outlets, especially trade-based ones, don’t have Deaf or disabled writers on staff, and with freelance budgets all but non-existent these days it’s far easier to just task a hearing/abled journalist on staff with it. But what makes everything more upsetting is hearing from Deaf and disabled entertainment journos who have tried to get ahead of the game and actively pitch covering these films.”
It’s cool not to like See—or Deaf President Now—but you should respect the game.
My Six Degrees of Separation to Sam and Sir Jony
My pal Jason Snell posted a link on Six Colors today to this SF Standard piece breaking down the 9-minute video, released earlier this week, announcing the much-ballyhooed collaboration between OpenAI’s Sam Altman and legendary former Apple design boss Jony Ive. The Standard’s story is a fun little read in its entirety; what caught my eye, however, was the section about the end of the video and the credits, at the 8:56 mark.
“Let’s take a look at the ‘special thanks,’ or credits(?),” Sophie Bearman, the Standard’s head of audio, said in the blurb. “Davis Guggenheim, the screenwriter, director, and producer known for ‘Training Day’ (2001), ‘Waiting for Superman’ (2010), ‘An Inconvenient Truth’ (2006) … and ‘Sam & Jony introduce io’ (2025). And you can’t ignore the music: Also thanked is composer Harry Gregson-Williams, who most recently scored ‘Gladiator 2.’ This seems fitting.”
Guggenheim, of course, is also known as the producer and director of the new Apple TV+ documentary Deaf President Now. The film, released last Friday, chronicles what Apple says are the “eight tumultuous days in 1988” during which students at Washington DC’s Gallaudet University, the world’s only collegiate institution for the Deaf and hard-of-hearing, led a protest over the board of trustees’ decision to name a hearing person over two Deaf candidates. The fury was understandable: until I. King Jordan, who took over in the wake of the protest, there had never been a Deaf leader at Gallaudet—again, a place devoted to deaf people—in the school’s 124-year history.
I watched Deaf President Now again last night and loved it even more than I already do. The reason I’m writing about the aforementioned OpenAI video and Davis Guggenheim is because of the six degrees of separation here. Earlier this month, I interviewed him, along with Nyle DiMarco, about making Deaf President Now and what it meant for Guggenheim, who’s hearing, to learn about such an important show of disabled people fighting for their civil rights and their representation, in addition to being exposed to an intimate look at a seminal moment in Deaf history and, more poignantly, Deaf culture.
“I’m embarrassed to say I didn’t know about this [DPN protest] story,” Guggenheim said. “It’s meaningful to me… I’ll never fully understand Deaf culture, so it was a privilege to be invited [to make Deaf President Now] by Nyle to tell this story together.”
He added: “We both realized the story must be understood from both audiences: a Deaf audience and a hearing audience, and that the neglect [of the history] from the hearing side. If you’re Deaf, most people know this story. If you’re hearing, most people don’t know this story. For me, [working on Deaf President Now] was correcting history.”
Anyway, it’s cool to see Guggenheim apparently hobnobbed with Sam and Sir Jony.
Sonos’ New Speech Enhancement Feature Uses AI to Make Dialogue More Intelligible—and Accessible
Earlier this month, Chris Hall reported for Digital Trends about a software update from Sonos which brought with it a new feature for its soundbars that uses artificial intelligence to make dialogue more intelligible—and accessible. According to Hall, Sonos’ Speech Enhancement feature is meant to “ensure that you can hear every word that’s spoken, so the important dialogue isn’t lost within the rest of the soundtrack.”
The crux of the problem Sonos is attempting to fix is, again, intelligibility.
“Clarity has been a growing problem for TV watchers, with increasing emphasis on that pounding bass or immersive soundtrack, sometimes the spoken elements get lost,” Hall wrote about Speech Enhancement’s raison d'être. “That’s a particular frustration for those with any sort of hearing loss, because you might not be able to follow the action at all, instead resorting to subtitles—which are often of varying quality.”
For its part, Sonos is describing its Speech Enhancement functionality as a “breakthrough” thanks to the capabilities of AI. The technology, Hall said, “[allows] the speech to be separated from other audio in the centre channel, so that it can be emphasized.” He adds the overarching goal isn’t so much about “pushing the speech harder” as it is “making it clear while still preserving the rest of the sound experience.” Sonos collaborated with the UK-based Royal National Institute for Deaf People (RNID) to develop Speech Enhancement. Hall notes the feature has four levels—Low, Medium, High, and Max—the topmost one being specifically designed for those with hearing loss. Max, according to Hall, prioritizes dialogue clarity above all else by “[furthering control of] the dynamic range of non-speech elements, placing dialogue firmly at the forefront of the experience.” These settings are configurable in the Sonos app.
Hall says Speech Enhancement isn’t about more volume—it’s about more speech.
“One in three adults in the UK experience hearing loss, and it is reported that just under one in four adults in the USA do too,” Lauren Ward, lead researcher at the RNID, said to Hall in a statement. “This tool has the potential to impact a large number of people.”
Speech Enhancement is available now.
Color Me Skeptical Over the Altman × Ive Merger
The New York Times reported on Wednesday that OpenAI, maker of ChatGPT, is acquiring Jony Ive’s startup, called io, for the astronomical sum of $6.5 billion. As part of the deal, Ive will assume total creative control over design at OpenAI—hardware and software alike. The acquisition was celebrated with a flowery announcement that included a 9-minute video featuring commentary from both Altman and Ive on their grand new partnership.
At a high level, I have three big takeaways from the Altman-Ive collaboration:
1. I think the folks saying this is a harbinger of Apple’s irrelevancy doth protest way too much. Monetarily alone, Apple is nowhere remotely in danger of becoming passé.
2. I think two wealthy, white, abled men waxing romantic about living in San Francisco, let alone building technology to empower people, feels really pompous.
3. I think people generally really like screens—and especially their iPhones—and I don’t foresee a clamor to buy whatever it is whenever it starts shipping.
It, of course, is a reference to the prototype device Altman and Ive discuss in the aforementioned video, with Altman saying in part he believes “it is the coolest piece of technology that the world will have ever seen.” The Wall Street Journal reports Altman told OpenAI workers the forthcoming product is “a third core device” between one’s iPhone and MacBook. What’s more, supply chain whisperer Ming-Chi Kuo posted on X that he believes the device is slated to enter mass production in 2027, with its form factor “as compact and elegant as an iPod Shuffle” and meant to be worn around the neck.
Given Kuo’s information, I have three more takeaways:
1. OpenAI’s device seems like it’ll eschew a screen, adopting a voice-first UI.
2. Design notwithstanding, this feels awfully akin to Humane’s failed AI Pin.
3. This thing better support accessibility features.
No. 3 is obviously the most crucial from my perspective, both as a journalist and as a user. Readers of my old Forbes column may recall I wasn’t kind to Humane co-founders Imran Chaudhri and Bethany Bongiorno about what I characterized as their company’s “lack of clarity” over the accessibility of its AI Pin. I tried in vain over the last couple of years, several times in fact, to get Bongiorno, Humane’s CEO, to speak with me on the record about the product’s accessibility to disabled people like me. I got no response.
You can understand my concern here; I worry Altman and Ive’s fancy new bauble will prove inaccessible too. To its credit, OpenAI has been far more transparent in its support for accessibility, evidenced by its work with Be My Eyes and the work of its software engineering teams to make the mobile app accessible. Nonetheless, accessibility is a master shapeshifter and takes many forms. There are a lot of unanswered questions. If the prototype indeed is voice-centric, how does it accommodate those with non-standard speech or who are nonverbal altogether? If the prototype indeed is neck-worn, how easily does it clasp in terms of fine-motor skills? For those with sensory integration disabilities, how heavy is it? What kind of firmware does the device run? Apple surely isn’t licensing iOS, so is whatever OpenAI’s using under the proverbial hood built with accessibility in mind? These are all mission-critical questions that the social media peanut gallery has thus far (predictably) ignored in its zeal to celebrate, and pontificate, over Altman and Ive announcing their joint venture.
I don’t mean to imply Altman and Ive are unfeeling, although I maintain the aura of the introductory video reeks of pretentiousness and an utter lack of self-awareness of each other’s immense privilege. Maybe my worries are misplaced… maybe OpenAI’s so-called “family of devices” will be accessible to all. But therein lies the rub: nobody knows. This is exactly the reason for the disabled community’s general apprehension towards new technology. I felt this way in 2023 about Apple Vision Pro, albeit buoyed by Apple’s proven track record in the accessibility arena. The disabled community are technologists at heart, as Dr. Victor Pineda said to me, but we also realize we are the minorities’ minority. As such, we’re naturally skeptical the abled powers-that-be will be mindful that building technology to empower ostensibly everyone—as Altman and Ive pledge in their video—should, in actuality, include people with disabilities.
I’ve neither met nor interviewed Altman. The same goes for Ive. I’d love to interview both of them, ideally simultaneously, and pepper them with the very questions I’ve laid out in this piece. Covering technology is unlike covering, say, the president as a member of the White House press corps. My friends such as CNN’s Alayna Treene absolutely are upholding the journalistic value of holding power to account because what an administration does obviously has enormous effect on the everyday lives of the citizenry. The stakes in tech journalism are markedly lower, but the journalistic value remains unchanged. In my case, I like to think my work holds power to account by questioning (and thus reporting on) whether a device like Altman and Ive’s will be accessible to those who need accessibility for usability. In other words, OpenAI ought to be held accountable for ensuring “everyone” is much more practice than platitude.
I’m happy to Waymo myself across town anytime to find out firsthand.
How AI Makes Coding More Accessible
Popular tech YouTuber Quinn Nelson of Snazzy Labs fame recently shared his thoughts on X about artificial intelligence and coding.
Nelson’s sentiments struck a chord because he and I feel similarly about artificial intelligence and coding. As I’ve built out Curb Cuts, refining and tweaking its design, I’ve leaned on some custom CSS code to do things the otherwise robust tools Squarespace provides don’t allow. These are reflected in things like the tagline in the site’s header, as well as the title casing in the archives. I’m no web developer, so I’m not fluent in HTML or CSS; I know the building blocks, but admittedly need help doing anything requiring heavier lifting. This is where Nelson’s comment on AI and coding is relevant, as I used Google Gemini to help me generate the CSS code I wanted.
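For a sense of scale, tweaks like these typically amount to just a few declarations. What follows is a hedged sketch of the kind of CSS a chatbot might produce for such requests; the selectors (`.site-tagline`, `.archive-group-name-link`) are hypothetical stand-ins, since Squarespace’s actual class names vary by template:

```css
/* Illustrative only: the selectors below are hypothetical stand-ins,
   not the actual class names every Squarespace template uses. */

/* Style the tagline shown in the site's header */
.site-tagline {
  font-style: italic;
  font-size: 0.9rem;
}

/* Display archive links in title case; text-transform: capitalize
   uppercases the first letter of each word, approximating title casing */
.archive-group-name-link {
  text-transform: capitalize;
}
```

Pasting a snippet like this into Squarespace’s Custom CSS panel is all the “deployment” required, which is precisely why a chatbot-generated answer saves so much scanning and typing.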
Using Gemini in this way is genius from an accessibility standpoint. For one thing, typing up a quick description of what I need for Gemini is far more accessible than using Google proper to manually search for solutions. In my case, it isn’t so much that I can’t use Google to find a Reddit thread or GitHub repository with what I need; I certainly can, but it comes at a cost: namely, it saps a lot of energy from my eyes and hands from all the scanning and typing. Eye strain and fatigue are more prevalent for me as someone with low vision, since obviously my eyes need to work harder in order to see stuff on my computer—and that’s with accessibility features like Hover Text enabled on my iMac. Likewise, the partial paralysis on the right side of my body, caused by cerebral palsy, makes it so that I’m decidedly not a touch typist. I’m more of a hunt-and-peck typist, which means I naturally must look at the keyboard to find the letter(s) I want to press.
A chatbot like Gemini is, again, worth its weight in gold given this context. All I need to do is cobble together a sentence or two describing what I want to accomplish and send Gemini my prompt. Within a few seconds, it spits out the requested code and, in a nice fit of user interface design, a handy little “Copy Code” button in the top-right corner of the chat window. What’s more, there’s a bonus accessibility win: rather than doing the ⌘-C/V shuffle with my fingers, I instead click the aforementioned “Copy Code” control and easily paste the code into my site’s CMS. No muss, no fuss.

From a cognition perspective, Gemini’s assistance here has the potential to be even more profound for those who are neurodivergent or cope with other intellectual conditions. A person with a cognitive disability, who may not be able to search Google or write code without being overwhelmed by the how, what, and where involved in such tasks, may find tools like Gemini (or ChatGPT or whatever) invaluable to, in this case, building a website or doing research for various projects. This isn’t conjecture on my part; Jenny Lay-Flurrie, vice president and chief accessibility officer at Microsoft, told me in an interview last year about her teenage daughter, who’s neurodivergent, using the ChatGPT-powered Bing to do research for school essays because it’s more accessible. There surely are other examples, but the salient point is, whether for coding or something else, AI chatbots are bona fide assistive technologies for legions of people in the disability community. All the handwringing over chatbots in classrooms, what with concerns over cheating and an existential threat to pedagogy, fails—predictably so—to see that not every student (or teacher) uses these AI tools out of sheer laziness or, more nefariously, a craving to cheat.
So it goes with software development. Whether it's building a blog or developing iOS apps in Xcode, using AI tools to generate code is not merely convenient or expedient; it makes coding downright more accessible too. That's not at all trivial, especially if you're an aspiring developer who copes with a disability that makes writing code difficult.
Speaking of code, while I'm not well-versed in HTML or CSS, I am versed in Markdown. With few exceptions, everything I write for the internet is written in Markdown, including this very article. I wrote about Markdown and accessibility for TidBITS a little over 12 years ago (!) now. What I wrote in June 2013 stands equally strong in May 2025.
“Markdown has changed my life for the better. Not only is it easier to work with than graphical interfaces given the limitations of my vision, but it has caused me to embrace plain text for nearly all of my documents. No longer do I have to work in bloated word processors with toolbars galore, or worry about rich-text formatting. Discovering Markdown has been liberating in the truest sense of the word,” I wrote of the syntax’s (lasting) influence on my writing. “Given Markdown’s nature, I came to the realization that it, however unintentionally, is in fact a wonderful accessibility tool, because it reduces eye strain while writing. The simplicity of Markdown’s syntax makes it possible to not have to look at the screen every time I want to italicize a word or insert a link.”
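For anyone unfamiliar, the syntax really is that sparse. Italicizing a word or inserting a link takes only a few punctuation characters, which is precisely why I don't need to look at the screen (or reach for a toolbar) to do either. A quick illustration, with a placeholder URL of my choosing:

```markdown
Discovering Markdown has been *liberating* in the truest sense of the word.

Adding a [link](https://example.com) is just brackets and parentheses,
with no formatting dialog or mouse work required.
```

Both constructs are plain keystrokes on the home rows of the keyboard, which is exactly the eye-strain and motor win described above.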
It was thrilling, wet behind the ears as I was, to read that I made John Gruber's day.
“Vibe coding” is en vogue right now in the software development space. For me, it isn’t for the reason most assume. In my case, it’s accessibility—which is a vibe all its own.
Microsoft, Xbox Mark GAAD with Updates
In celebrating Global Accessibility Awareness Day last week, Redmond-based Microsoft shared a bunch of updates on the continued work it’s doing to amplify awareness of the disability community. The company’s vice president and chief accessibility officer, Jenny Lay-Flurrie, wrote about this in a blog post. The thrust of her piece is the technology du jour in artificial intelligence and how it impacts accessibility.
“Today we celebrate Global Accessibility Awareness Day (GAAD) and work across the industry to make technology easier for everyone. At Microsoft, our journey with accessibility started in the 90’s, and is a cornerstone to our mission. We are committed to engraining accessibility into our culture, to build for all, and innovate to empower people around the world,” Flurrie said in the post’s introduction. “AI has been a game changer for accessibility. It is accelerating the accessibility journey in exciting ways. Making it easier to do everyday tasks and tackling some of the toughest problems of our times. Launching some new technologies and partnerships today. Let’s dig in!”
Flurrie's first point highlights how disability-centric data "unlocks new opportunities for AI," adding "high-quality and representative data can lead to more reliable outcomes from trustworthy AI systems." She goes on to say Microsoft is "proud" to support two projects that are using disability-focused data to "drive change." One is the Disability Data Hub, run by the World Bank Group, which Flurrie describes as "the first open data initiative to provide disability-disaggregated development data across 63 global economies [which] addresses the need for a single, comprehensive global dashboard to close data gaps that have historically excluded disabled individuals from development agendas." Another is Answer ALS and the ALS Therapy Development Institute, which are working on finding a cure and therapies for ALS, also known as Lou Gehrig's disease.
Elsewhere, Flurrie writes about the importance of authentic disability representation in AI systems. "One of the most pressing challenges is that generated content, such as images, can misrepresent or stereotype disability, leading to harmful inaccuracies or even the exclusion of certain identities," she said. "These gaps in representation data can reinforce bias and erode trust." Flurrie notes Microsoft's Bing Image Creator now is capable of generating "more accurate depictions of disabilities" such as autism and Down syndrome. Microsoft, Flurrie went on to say, "collaborated with individuals with lived experience, trusted external partners, and AI researchers to better understand how disability is portrayed—both accurately and not—within AI models."
Lastly, Flurrie mentions a few software enhancements that make products like Microsoft 365 more accessible to disabled people. For instance, the Accessibility Assistant is available in the Microsoft 365 web apps, as well as in Visio and OneNote.
In other news, Microsoft-owned Xbox last week announced updates which "[welcome] more players by increasing accessibility in games." There are "new and exciting accessibility features" in titles such as DOOM: The Dark Ages, Candy Crush Soda Saga, and World of Warcraft. The company also shared news of its work in building the Accessible Games Initiative (AGI), as well as the availability of an accessibility-minded peripheral, the Xbox Adaptive Joystick. As to the AGI, I covered it back in early April in an interview with Entertainment Software Association SVP Aubrey Quinn.
Slack Gives Shoutout to Simplified Layout Mode
Last Thursday, Global Accessibility Awareness Day, I was alerted to this X post by Slack:
The post links to this page on Slack’s website wherein the Salesforce-owned company details its Simplified Layout mode in its desktop app. As Slack’s post says, the streamlined mode has been built with accessibility in mind—particularly helpful to those who are neurodivergent or cope with intellectual conditions affecting cognition.
“Simplified layout mode for the Slack desktop app helps you focus by showing one section of Slack at a time,” Slack writes about Simplified Layout on its website. “This mode provides simplified layouts and minimizes distractions, which may benefit single-taskers and people using assistive technology.”
In broad strokes, what Slack is doing here is neither novel nor revolutionary. Even for people without disabilities, the Slack user interface, whether on the desktop or on the web, can be inscrutable and incongruous at times. Companies such as Apple, with its Assistive Access feature on iOS, have rightly recognized there exists a subset of users for whom ostensibly "simple" UI paradigms remain complex and out of reach in terms of comprehensibility. Hence, tools like Assistive Access, or, in this case, Slack's Simplified Layout, have cropped up in the last few years as a conscious choice by platform owners to remedy that inaccessibility for a portion of users, however tiny in absolute number, by stripping down their software to make it conceptually simpler.

It's also worth noting this particular nod to inclusivity is a prime example of accessibility's return on investment being immaterial; to wit, companies like Apple and Slack care not about the financial costs incurred in allocating resources to building something like Simplified Layout. The target demographic for the functionality is obviously a fraction of the fraction who use accessibility software, but that doesn't matter. What matters is that something like Simplified Layout (or Assistive Access) is worthwhile because it diversifies the platform even further by providing a service to those who can truly benefit from it. Put another way, tools like Simplified Layout exemplify what GAAD co-founder Joe Devon recently told me about why accessibility awareness is so crucial: it's not only good for users, it's also good for business. The more flexible and richer one's product is, the more users one attracts, and the disability community comprises a lot of potential users to whom companies can cater.
Assistive Access, by the way, is coming “later this year” to Apple’s TV app.
Google Celebrates GAAD With New Enhancements to TalkBack, Expressive Captions, More
Google marked this year’s Global Accessibility Awareness Day late last week by publishing a blog post wherein the Mountain View-based company announced numerous accessibility-oriented updates for its myriad platforms. The post was written by Angana Ghosh, who’s Google’s director of product management for Android.
“Advances in AI continue to make our world more and more accessible,” Ghosh wrote in the post’s introduction. “Today, in honor of Global Accessibility Awareness Day, we’re rolling out new updates to our products across Android and Chrome, as well as adding new resources for developers building speech recognition tools.”
Ghosh’s post begins by discussing “more AI-powered innovation with Android,” with Google’s screen reader, known as TalkBack, getting expanded Gemini integration such that users can ask the chatbot about imagery and get answers. Ghosh cites an example of a Blind user asking about a picture of a friend’s guitar, writing the user can ask for details about the musical instrument such as its color and manufacturer. Likewise, users also are able to query Gemini about product sales in their favorite shopping app(s) so they can be more informed about discounts and their overall buying power.
Google first brought Gemini to TalkBack last year, according to Ghosh.
Elsewhere, Expressive Captions, which uses AI to telegraph not only what people say but how they say it, is being updated such that Deaf and hard-of-hearing people can "understand mooooore of the emotion behind captions." Ghosh notes Google has added a new "duration" feature to Expressive Captions that's useful for moments when, for instance, a sports announcer is excitedly boasting about an "amaaazing shot" during a game. What's more, there are new labels for sounds like whistling or throat-clearing. The updated version is available on devices running Android 15 or higher, with English localization in Australia, Canada, the United Kingdom, and the United States.
In other news, Ghosh writes Google has expanded availability of Project Euphonia, announced in 2019 as a way to make speech recognition more accessible for those who have non-standard speech patterns (like yours truly). Google is making the project's open-source codebase available on GitHub, as well as working with University College London's Centre for Digital Language Inclusion to strengthen speech recognition technology for non-English speakers in Africa. On the educational front, Google announced accessibility improvements to ChromeOS and the Chrome web browser, including more accessible PDF reading and page-zooming functionality.
I interviewed Ghosh back in December about building Expressive Captions.
New Video Shows Magnifier for Mac in Action
Complementing its Music Haptics video, Apple earlier this week posted a video to its YouTube channel which demonstrates the forthcoming Magnifier for Mac app in use. The software is a headliner amongst the slew of accessibility-focused enhancements the company previewed as part of its Global Accessibility Awareness Day celebration.
The Magnifier for Mac video, embedded here, shows a student using it during a lecture.
The quick glimpses of the new-to-macOS Magnifier app reveal the software to be quite robust. At a technical level, it's also abundantly clear Apple took the building blocks of Continuity Camera to assemble Magnifier for Mac. I'm excited to try it out for myself on my M2 MacBook Air, but I do wonder about clipping my iPhone to the laptop's display. Will Apple be selling a first-party mount? My guess is no, considering the company already sells a Belkin-branded mount for Mac notebooks. There's a similar accessory for Apple TV 4K to use for FaceTime calls. Whatever the case, it'll be interesting to see how accessible these mounts are to manipulate, motor-wise. It's important people realize not everyone can so easily attach their phone to the mount, and the mount to a display.
The moral is that Magnifier for Mac has a multi-layered accessibility story, one that goes beyond software alone. The app seems eminently capable, but it's usable only if the mount is too.