You Should Get the PBS App This Weekend
I have a couple loosely-related content recommendations for you, dear reader.
First, before bed last night, I watched the latest American Experience documentary about the history of the Americans with Disabilities Act (ADA). The hour-long film, called Change Not Charity: The Americans with Disabilities Act, offers a detailed look into disability inclusion in this country and the disability community’s (ongoing) fight for our recognition and civil rights. The film includes a bunch of good interviews, including with renowned disability activist Judy Heumann and retired senator Tom Harkin (D-IA), who authored the Americans with Disabilities Act and was its chief sponsor. He delivered parts of his speech introducing the bill on the Senate floor in ASL so that his Deaf brother could understand him. As to Heumann, she passed away in March 2023.
Watching the documentary obviously resonated with me; it was a poignant reminder of how very much spiritually and actively aligned I am with the missions of people like Heumann, Harkin, and more. Disability rights sits at the core of what I’ve done for the past dozen years as a technology reporter. Back in 2013, I carved out a beat, virtually all by my lonesome, covering accessibility and assistive technologies, a beat I knew was risky because it was—and remains—abstract and esoteric and misunderstood by the majority of folks in the press. Believe me, disability decidedly isn’t a hot topic at tech desks in newsrooms; despite the proliferation of equally narrow beats like AI, social media, and media, accessibility sits at the narrowest of margins. With precious few exceptions, like my good friend at The Washington Post in Amanda Morris, disability simply isn’t a priority like race and sexuality when it comes to social justice reporting. I’ve lamented this before, as well as hawked my own wares on LinkedIn, both because I love working in journalism and, more pointedly, because I’m utterly devoted to the representational cause. It’s precisely why, with a reporting résumé that includes interviewing Apple CEO Tim Cook, one of the feathers in the proverbial cap of my career I’m most proud of is interviewing Tony Coelho in 2020. Coelho, the retired Democratic congressman from California, copes with epilepsy (as my mother did) and is perhaps best known for being the ADA’s primary sponsor alongside the aforementioned Harkin.
It’s hard to believe the ADA turns only 35 years old come late July. Harder still, the ADA was signed into law by a Republican president in George H.W. Bush. I cannot fathom today’s ghoulish Republican Party ever passing such a bill in Congress, let alone Trump signing it. As with other marginalized communities, it isn’t a stretch to surmise today’s Republicans give not even the slightest of shits about people like me. I’ve no doubt they would institutionalize us posthaste were it possible and politically expedient to do so.
Which brings me to my second recommendation. Capitol Hill has been in a tizzy this past week over the “Signalgate” scandal involving sensitive military operations and the accidental addition of Atlantic editor-in-chief Jeffrey Goldberg to the group text chat. Goldberg is host and moderator of one of my favorite nerdy shows in Washington Week, which, like American Experience, airs on PBS. I watch it faithfully every Friday evening, usually in the living room over dinner. This week’s episode promised to be must-see TV given (a) the show covers the hottest political stories of the week; and (b) Goldberg himself is a central figure in the story. I thought it would be riveting television, and it turned out I was exactly right. To me, it was just as enthralling and entertaining as the Season 2 finale of Severance that, coincidentally, dropped just last Friday on Apple TV+.
Anyway, if you want good things to watch this weekend, the PBS app is your friend.
What the Lumon Terminal Pro Says About Accessible Computers, Past and Present
In a break from the norm, Apple’s latest Mac doesn’t look to be very accessible.
Earlier this week, the company launched the Lumon Terminal Pro. The Terminal Pro is obviously a fictitious product and thus not actually for sale; it’s the machine Mark S., Helly R., and Dylan G.—and presumably Tim C.—use to do their “mysterious and important” work as part of the Macrodata Refinement, or MDR, group on the cultural phenomenon show Severance on Apple TV+. On the Terminal Pro’s page, Apple links to a spoiler-filled video detailing how Severance is edited on the iMac, MacBook Pro, and Mac mini. It’s a good 12-minute watch, if somewhat meta: while watching it last night, I chuckled at the thought that someone at Apple had to edit a video about editing video.
The other thought I had while watching the making-of video was how starkly the Terminal Pro contrasts with the aforementioned, decidedly more modern Macs. To wit, seeing the Terminal Pro on the show evokes strong memories within me of my years in elementary school using the Apple IIe during computer lab. I played a helluva lot of Oregon Trail and Odell Lake on those things—even when I was supposed to be doing something else decidedly more pedagogical in nature. As a kid with low vision, I found the Apple IIe’s screen hard to see, despite the high contrast of its green text against a black background. The full complexity of accessibility as a concept was intellectually lost on me at that age, although I suppose I had an innate sense of it given how I struggled to see the screen (and the glyphs on the keyboard). Were the Terminal Pro a real shipping product, my spidey sense strongly suggests it would be about as accessible as the Apple IIe of my yesteryear—which is to say, not very.
By the same token, however, it’s remarkable how the technological advances of the last few decades have enabled me to not only enjoy computers in a nerdy sense, but to thrive with them in a productivity sense. Come May, I will have been an independent tech journalist for a dozen years, all of which have been spent from home and on the various Apple computers that have festooned the newsroom known as my desk in the little corner of my dining room. (Yes, I’m very privileged to have a house with a bespoke dining room.) None of my reporting would have been possible without, amongst other technical marvels, an accessible computer. My current machine is likely careening towards the end of its life come June, but it has treated me so well since July 2019. I’d love to know how many millions of words I’ve banged out on this thing over the years. It’s a testament not merely to how accessible the Mac is to me personally, but to Apple’s commitment to software support for machines well past their ostensible prime. Sure, Apple Silicon is the shit, but my iMac is still steamrolling along on Intel and suits my workflow just fine. The 4K panel, in particular, remains glorious to look at every single day. What I’m saying is, the value proposition has been astronomical.
Beyond computers, it’s ironic in a way that, for all the praise I heap onto Apple for prioritizing disability inclusion in the company’s entertainment arm, Severance is one of my all-time favorite television shows despite having absolutely nothing whatsoever to do with disability. I love it so much. I was admittedly late to jumping on the proverbial bandwagon, but I’m firmly riding it now. Although I subscribe to umpteen streaming services, Apple TV+ is unquestionably my favorite not only because of Severance and For All Mankind and The Morning Show and Dickinson and Ted Lasso, but because it’s also home to what I believe is the most compelling catalog of earnest, genuine disability representation found in Hollywood today. As a lifelong marginalized and underrepresented person, I feel as though I can truly have my cake and eat it too.
Terminal Pro’s appearance on Apple’s Mac website is the latest marketing ploy for Severance. Back in mid-January, cast members appeared at a promotional event held at Apple’s Grand Central Station retail outpost ahead of the show’s Season 2 premiere. Parker Ortolani wrote about the event for his blog, replete with splendid photography.
I can’t wait for Season 3 of Severance, praise Kier. The first two are on Apple TV+ now.
Waymo Touts It’s ‘Laying the Groundwork’ for Coming Washington DC Expansion
Waymo announced on Tuesday it’ll be “ready for riders” in Washington DC come 2026.
The Alphabet-owned autonomous vehicle company, which boasts it’s “the world’s leading fully autonomous ride-hailing service” and provides over 200,000 fully autonomous paid trips each week, said in its announcement it plans to “continue introducing ourselves to DC’s communities and emergency responders over the coming months,” as well as working closely with policymakers in an effort to “formalize the regulations needed to operate without a human behind the wheel in the District.”
“Waymo One is making fully autonomous driving a reality for millions of people across the [United States],” Tekedra Mawakana, Waymo’s co-CEO, said in a statement for the company’s announcement. “We’re excited to bring the comfort, consistency, and safety of Waymo One to Washingtonians, those who work and play in the city every day, and the millions of people from around the world who travel to the District every year.”
In reacting to Waymo’s expansion news, the American Council of the Blind (ACB) today posted on LinkedIn it is “excited” by the new development and is “proud” to continue its work with Waymo to “[push] toward a future of greater independence and freedom of mobility by making this technology available to the thousands of people in DC who are Blind or have low vision.” The ACB included in its post a photo of executive director Scott Thornhill and director of advocacy and governmental affairs Claire Stanley standing in front of a Waymo SUV. With Stanley is her guide dog named Tulane.
I’ve covered Waymo both extensively and from extremely close range over the last few years. They have service here in San Francisco, and I’ve been a Waymo One user since before the app went public. The best recommendation I can give Waymo is to say its app has cemented a place on the Home Screen of my iPhone 16 Pro Max. Without even the slightest hyperbole, Waymo has utterly revolutionized how I get around my city. While I continue to walk around my neighborhood for quick jaunts, anything further away and I’m using my phone to summon a Waymo. Likewise, while I continue to be a staunch proponent of public transit for myriad reasons, the reality is, for me, Waymo has proven itself far more accessible. The on-demand nature of it is arguably the best accessibility feature, with the fact I’m traveling alone coming in a close second. For better or worse, Waymo plays to my introverted nature; I needn’t worry about a crowded bus—where, even with my Blind cane in tow as identification, I’m not guaranteed a seat up front, however entitled to one I may be—nor do I have to contend with a chatty Lyft or Uber driver. What’s more, Waymo’s technical attributes—its phone-centric design and self-driving technologies—are both seriously strong plays to my soul as an unabashed tech nerd.
To that point, the ACB’s sentiments about Waymo engendering greater freedom and independence for disabled people and their mobility resonate greatly with me. While I’m in no way proclaiming the company’s service is perfect, I am comfortable in making the proclamation that the anti-robotaxi brigade is myopic in its fervor. Safety and technical competence are obviously important; it’s disingenuous, however, to claim autonomous vehicles, be they Waymo’s or the competition’s, are utterly detrimental. To suggest so is to be willfully obtuse about their potential for genuine good. To wit, that Waymo has been so transformative for me (and others) isn’t the least bit trivial or niche.
Beyond San Francisco (and next year, Washington DC), Waymo offers service in Austin, Los Angeles, and Phoenix. The company plans to add Atlanta and Miami in the future.
Looming AirPods Max Update Reminds Us Accessibility Applies to Hardware Too
Apple on Monday made a surprise product announcement: the Cupertino-based company shared on its Newsroom website that, come iOS 18.4 next month, AirPods Max will gain support for lossless audio and ultra-low latency audio. In a corresponding move, the company also announced the USB-C to 3.5mm audio cable. Apple notes the new $39 (!) accessory is capable of “[connecting] AirPods Max to 3.5 mm audio sources like the audio-out port on an airplane,” as well as connecting iOS and iPadOS devices to 3.5mm audio ports such as car stereos and other types of speakers.
With the forthcoming iOS 18.4 update, Apple boasts AirPods Max “will become the only headphones that enable musicians to both create and mix in Personalized Spatial Audio with head tracking.” The company adds AirPods Max will also provide “the ultimate listening experience and even greater performance for music production.” At a technical level, iOS 18.4 will enable “24-bit, 48 kHz lossless audio,” specifications which Apple says are key to “preserving the integrity of original recordings and allowing listeners to experience music the way the artist created it in the studio.” More than 100 million songs on Apple Music support high-fidelity, lossless audio, according to Apple.
Apple’s SVP of product marketing Greg Joswiak called it “the ultimate audio upgrade.”
As Jason Snell observes, the soon-to-come updates make for welcome news for AirPods Max users—although it’s somewhat perplexing why this stuff couldn’t have been announced months ago. Whatever Apple’s rationale for the delay, the reason I’m covering this ostensibly esoteric and unrelated bit of AirPods Max news is that it dawned on me while reading Apple’s announcement that I’ve never written about the headphones in an accessibility context. I’ve mentioned having used them tangentially, but never have I expounded more fully on their accessibility to me as a disabled person.
For starters, AirPods Max are heavy. John Gruber titled his 2020 review of the original perfectly when he said heavy is the head that wears AirPods Max. He was also spot-on when he wrote AirPods Max are “indisputably nice.” I got a blue pair of the Lightning version as a birthday gift a few years ago, and I wholly concur with my friend Gruber that AirPods Max are both heavy and super nice. I’ve mainly used them at my desk to listen to music and podcasts when my partner is home and I don’t want to disturb her. They’re also useful for putting on instrumental music when I need to hunker down and be productive. (A current favorite for this is Linkin Park’s Papercuts greatest hits anthology.) Beyond that, however, I don’t travel with my AirPods Max; they’re simply too heavy and I’m not fond of the carrying case. By contrast, on Amazon’s Prime Day last summer, I snagged a pair of the Beats Studio Pro (in white) and was immediately impressed. To my ears, audio quality seems on par with AirPods Max. It’s hard to fathom, though, that the same company which designed the AirPods Max “case” owns the brand that made a case for the Studio Pro that’s truly superior in every possible respect.
As a pair of headphones, AirPods Max are terrific. The sound is incredible.
I never bothered to swap my Lightning model for USB-C last fall because, aside from some new colors and a new connector, what I have on my desk right now is essentially the exact same thing. What’s more, from an accessibility standpoint, going from Lightning to USB-C is mostly a lateral move. For my lackluster hand-eye coordination—thanks to low vision and partial paralysis in my hands due to cerebral palsy—I’d merely be trading one inaccessible port for another. Although there is a cogent argument to be made that having one’s devices all using the same connector is more accessible from a cognitive perspective, where that falls off the rails is if you have multiple disabilities as I do. It would be more accessible to have my iPhone, AirPods Pro, and AirPods Max use USB-C—certainly more convenient too. But those benefits extend only so far if, as a practical matter, I struggle to plug the cable into the port. This is an issue I’ve written about innumerable times before: to be crystal clear, I can plug things in… just not without a good degree of friction and a grand degree of expletives while I’m doing it. As I’ve often said, the win is not merely to put USB-C in all the things, as the tech media likes to espouse. That isn’t innovative. What would be true innovation is, in Apple’s case, the company using its engineering prowess to somehow fuse MagSafe with USB-C. That the port is supposed to be an open standard and universal is (mostly) immaterial for accessibility purposes. As someone who’s predominantly an Apple user, I’d rather take the company’s proprietary spin on USB-C technology if it means AirPods Max end up being more usable by me. The salient point is simply that—surprise, surprise!—not everyone has the ability to plug things in so mundanely. A magnetic USB-C port surely would go a long way in shaping an even better user experience for myself (and others).
As ever, it’s the seemingly little things that make the biggest difference for disabled people. It’s why robust reporting on disability and technology matters so very much.
Microsoft Accessibility Exec Jessica Rafuse Talks Ability Summit 2025, More in Interview
March has been a big month for Microsoft in the accessibility arena.
Besides launching the Xbox Adaptive Joystick, the Redmond-based tech titan last week held its 15th annual Ability Summit. The event, which I’ve covered with regularity for the last several years for my old Forbes column, was hyped up in a blog post written by Microsoft’s chief accessibility officer in Jenny Lay-Flurrie. She noted this year’s edition of the Ability Summit featured a whopping 20,000 attendees, hailing from 164 countries, gathered to discuss what she described as “the future of AI and accessibility.”
Microsoft, Flurrie wrote in the lede, is “innovating faster than ever before.” Disabled people, she added, are standing right there at the forefront, helping to lead the charge.
While I’ve sat down virtually with Flurrie numerous times for candid, on-the-record conversations about Ability Summit and much more, this year I spoke with Jessica Rafuse about the event. Rafuse, who’s been with the company almost 9 years, is Microsoft’s director of accessibility strategic partnerships and policy and, like Flurrie, a disabled person herself. Rafuse copes with muscular dystrophy, uses a wheelchair, and is a mother of children with disabilities. She characterized her role as helping lead the company’s work with entities outside the organization in its efforts to “accelerate accessibility.” The work, she added, is important because it gives Microsoft an opportunity to “gain feedback and insights” on how best to support its customers with disabilities. The company does this in three ways, Rafuse said: through building trust, spurring adoption, and leveraging AI. Many NGOs, or non-governmental organizations, are unaware Microsoft ships products—such as the Xbox Adaptive Joystick—designed to help disabled people. In terms of AI, Rafuse said much of the work in building awareness is educating people on how AI can be a profound assistive technology.
As to Ability Summit, Rafuse called it an “all day event” for Microsoft, streamed online but also featuring an in-person component at the company’s headquarters. Company executives and community leaders, she added, came together to bask in the excitement of “[sharing] the latest greatest in accessibility and AI.” According to Rafuse, Ability Summit is “one of the few moments we all get together physically here in Redmond,” adding “we just celebrate a little of the work that our accessibility professionals are doing.” Put another way, Ability Summit is one forum in which Microsoft can deservedly put accessibility (and disabled people) in the spotlight.
“[Ability Summit] is a time for us to celebrate the work that is sometimes hard but always really rewarding,” Rafuse said of Microsoft’s chance to amplify accessibility.
When asked about the current state of accessibility in the industry and in society writ large, Rafuse said Ability Summit exists in large part to answer that very question. More pointedly, the sheer fact tens of thousands of people attended this year’s event—“more folks than we’ve ever had,” Rafuse told me—is a reflection of two bigger trends. First, she said, is the sociopolitical climate in the United States at the moment. There’s a lot that’s unknown right now; many people are looking to captains of industry like Microsoft to guide them through what Rafuse called the “regulatory unknown.” To that end, Rafuse said this year’s Summit saw an increase in European attendees, which she said was unsurprising given the looming European Accessibility Act scheduled to take effect come June. On its website, the European Commission describes the legislation as “a directive that aims to improve the functioning of the internal market for accessible products and services by removing barriers created by divergent rules in Member States.” There’s “an appetite” from Europeans, Rafuse explained, to glean Microsoft’s feelings on the coming directive. Beyond the European Accessibility Act, Rafuse said people are keenly interested in how artificial intelligence can influence and impact accessibility; attendees, she went on to say, are interested in how Microsoft is leveraging artificial intelligence in this realm, particularly in responsible ways.
“There’s a lot of hope [around accessibility] right now,” Rafuse said.
Rafuse reemphasized the cruciality of community in furthering Microsoft’s accessibility efforts, telling me the company has a strong contingent of disabled employees internally. Again, she noted soliciting feedback is critical for the company, so a big part of the strategy is to partner with third parties externally. Microsoft engineers, Rafuse said, talk endlessly about building accessibility into design so as to be part and parcel of the developmental process. That necessitates “co-creation” with members of the disability community, she added, and Microsoft actively seeks out, and summarily cherishes, the feedback loop with the broader community. To wit, working on accessibility is decidedly not a solo endeavor—as the saying goes, it takes a village.
Overall, Rafuse said Microsoft loves Ability Summit because it loves accessibility. The event gives the company a platform atop which to raise awareness—to educate and evangelize. Accessibility, whether in products or events, is “an ongoing process” for the company. Ability Summit is one thing, but then the proverbial page turns to the next big date on the accessibility calendar: Global Accessibility Awareness Day in mid-May. Put simply, accessibility is always on the minds of Rafuse, Flurrie, and scores of other folks who work on it within Redmond’s walls. Rafuse told me this year’s Ability Summit was special because it was the first one held in person post-pandemic; the last in-person edition was 6 years ago, in 2019, at Microsoft’s on-campus conference center.
“It was nice to see and meet with customers at Ability Summit in person,” Rafuse said.
Beyond introducing this year’s Ability Summit, Flurrie’s blog post included highlights of Microsoft’s recent work in accessibility. Amongst her callouts: Microsoft hardware shipping in accessible packaging, Copilot helping those in the neurodivergent community, speech recognition seeing a 60% improvement, and more.
Instacart Makes Grocery Shopping More Personal, Accessible with New AI Features
San Francisco-based grocery delivery company Instacart issued a press release earlier this week in which the company announced its so-called “Smart Shop” feature. The functionality, which Instacart says is powered by artificial intelligence, is touted to “[offer] more personalized recommendations based on preferences and provides clear nutrition and dietary information at consumers’ fingertips.” With Smart Shop, Instacart said, the company is “making online grocery shopping more intuitive by analyzing customer habits and dietary preferences to surface the most relevant products faster.”
Besides Smart Shop, Instacart introduced something it calls Health Tags, the product of a collaboration with the American Diabetes Association. Instacart describes the feature as “[providing] detailed and transparent nutritional information across the catalog,” alongside Inspiration Pages, “curated destinations within the Instacart experience featuring expert-backed health recommendations and shoppable recipes.” The company noted Health Tags, which it says use evidence-based guidance and recommendations, are being released coincident with National Nutrition Month. According to Instacart, Health Tags are purposely designed to assist customers with “[discovering] relevant products based on their unique health and lifestyle preferences on Instacart.”
“At Instacart, we want to turn the ordinary task of grocery shopping into a delightful, personalized shopping experience that takes the mental load out of finding the exact items that meet your preferences,” Daniel Danker, Instacart’s chief product officer, said in a statement for the company’s announcement. “By combining our new Smart Shop technology, Health Tags and Inspiration Pages, we’re not just improving online grocery shopping—we’re reimagining it, making it seamless to go from intention to action. By customizing your shopping journey to match your personal health goals or fit your dietary restrictions, we can unlock possibilities that weren’t even on the table before.”
Instacart says Smart Shop and Health Tags “enable consumers to shop according to their unique dietary and household preferences, [which simplify] the process of finding relevant products and making more informed grocery choices.” As to Smart Shop specifically, Instacart says it’s powered by an “industry-leading catalog” of 17 million unique items, alongside a “proprietary dataset” comprising “millions of grocery shopping journeys.”
Over 70% of consumers have at least one dietary preference, according to a recent Instacart survey.
This week’s news of Smart Shop serves as a good reminder of how Instacart is beneficial to accessibility in multiple ways. I’ve covered the company in the past, and the most obvious benefit to using the service is the fact someone does the shopping for you before delivering the goods to your doorstep. Although certainly convenient, the reality is Instacart can be a godsend to those in the disability community who are immobile—whether physically, logistically, or some combination thereof—and thus literally cannot (or should not) venture out to do their own shopping. This use case extends far beyond sheer amenity; Instacart’s existence means a disabled person has access to the things they need for day-to-day sustenance and survival. By the same token, Instacart’s guidance and recommendation engine is yet another example of AI’s profound potential to do genuine good for people. It’s entirely plausible these new features help a disabled person make better purchasing decisions so as to be in better alignment with whatever condition(s) they cope with on the daily. Moreover, from a cognition standpoint, the personalized information can work wonders in alleviating the mental load of not only keeping a shopping list, but more crucially, finding the items within the store’s digital aisles. In-person grocery shopping isn’t a task for the weary.
Of course, the overarching reason for Instacart’s potential for greater accessibility lies in technology—specifically the smartphone. Disabled people use smartphones too, and Instacart’s app—available on iOS and Android—is one more tool with which to make life easier and more accessible. In very much the same ways Facebook makes socialization with far-flung family and friends more accessible, or how on-demand Uber and Waymo rides make getting around town more accessible, Instacart can make getting groceries less burdensome thanks to the modern smartphone’s superpowers.
If Apple Gets Apple Intelligence Right, the Biggest Beneficiary Will Be Accessibility
Although I’ve covered it before, I’ve admittedly heretofore been reticent to write more about Apple Intelligence because I’m skeptical my viewpoints will be heard. The seemingly cacophonous opinion, industry-wide, is Apple Intelligence—and Siri in particular—is shit and irredeemable. Accessibility isn’t exactly juicy headline fodder.
But then, compulsion. Bloomberg’s Mark Gurman put out a blockbuster scoop today in which he reports Apple chief executive Tim Cook has tapped Vision Pro boss Mike Rockwell to take over Siri from ostensible AI leader John Giannandrea. Gurman notes Cook purportedly has “lost confidence” in Giannandrea’s ability to come through in product development, adding Rockwell will report to the company’s software boss in senior vice president of software engineering Craig Federighi. Gurman writes Apple’s so-called “Top 100” leaders met in a secretive offsite retreat to discuss, amongst other things, the existential crisis regarding Apple’s languishing place in the AI standings.
At a macro level, the utter contempt for Apple Intelligence (and, by extension, Siri) at this point is so thick it’s hard to find the proverbial silver lining within this darkest of clouds. But I say it is doable—the good within Apple Intelligence is there, whether tech journalists and armchair analysts want to acknowledge it or not. In my opinion, Apple Intelligence is one of those products which exemplify why more robust reporting on disability inclusion vis-à-vis accessibility is so sorely needed at tech desks in media organizations everywhere. By my estimation, there has been a pittance of focus on Apple Intelligence and accessibility; by contrast, the lion’s share of the coverage, while constructive in spots, has been overwhelmingly negative in tone.
Take Image Playgrounds, for instance. Most observers in the Apple community loathe the feature for being goofy and generally useless, pointing as well to a general distaste for how robots can now create. What this perspective lacks is, of course, empathy for disabled people. Whatever you, able-bodied reader, may think of artificial intelligence and tools like Midjourney, for example, the reality is it’s extremely plausible the advent of Image Playgrounds gives an aspiring artist with disabilities—someone who may not be able to use an Apple Pencil on iPad Pro—a conduit through which to unleash their creativity and self-expression. This is not at all trivial, regardless of one’s philosophical views on art or their views on the quality of Image Playgrounds’ output. It’s perfectly okay for Image Playgrounds to not be your jam, but to sneer at it wholesale reeks of elitism and dishonesty. Image Playgrounds very well could be a disabled person’s jam, empowering them with an accessible way to build things.
The same argument applies to Writing Tools. Perhaps Stephen King needn’t use it; I know I don’t need to use it. Still, the fact the feature exists at all is a net positive. To wit, someone who has certain cognitive conditions, or fine-motor limitations which hinder their typing ability (or some combination thereof), may find Writing Tools eminently useful for making prose creation more accessible—and more cogently understood. Like Image Playgrounds, it’s absolutely fair to critique how performant Writing Tools is, but it’s critical to bear in mind other use cases beyond one’s own. The problem is, obviously, most people focus on the most people—those who decidedly aren’t disabled people.
Beyond Image Playgrounds and Writing Tools, there’s more to be appreciative of in an accessibility context. The ability to double-tap the bottom edge of one’s iPhone to type to Siri is a huge deal. That functionality was born out of the longstanding Type to Siri accessibility feature; Apple positions this new version as a way to use Siri stealthily so as to not cause disturbances. The truth is it makes Siri more accessible for people like me who stutter and for those in the Deaf and hard-of-hearing community. More crucially, that Apple’s software engineering groups expanded an accessibility feature for the mainstream is a shining example of how accessibility oftentimes is an incubator for innovation. The pointer in iPadOS? It originated in AssistiveTouch. My years-long understanding from sources has been it was handed off internally to the wider iPadOS software teams so they could massage it into a feature for the masses. Likewise with Double Tap on Apple Watch, as it too began life as part of the suite of accessibility options in watchOS. The salient point is simple: if one wants evidence of innovation at Apple, look no further than in accessibility. The examples I’ve illustrated here give the utmost credence to the company’s mantra that accessibility truly can be for everyone.
From an accessibility angle, what makes Siri so frustrating goes beyond aptitude—it’s functional. Just this week, I ran into an issue when, after coming home from a walk around the neighborhood, Siri insisted it couldn’t unlock my front door because my accessories don’t support it. They do! I can enter the passcode on my Nest × Yale lock just fine, but using Siri was more accessible because my hands were full. I’ve written at length about how voice-first computing can do so much for the disability community that goes further than sheer world knowledge and trivial bits like who won Super Bowls.
All of this goes without mentioning that Apple Intelligence integrates with system accessibility stalwarts such as VoiceOver. It’s an exclusive advantage for the company (and its users!) in much the same way Apple silicon hardware can be for running large language models. These advantages are, again, non-trivial—especially in accessibility.
I wholeheartedly believe my friend and peer John Gruber when he said last week that something is rotten down 280 in Cupertino. There’s surely some amalgamation of hubris, lack of preparedness, and incompetence at play in Apple’s floundering with Apple Intelligence as it tries to catch up with the rest of the industry. Zac Hall at 9to5Mac, another friend of mine, described the company’s problems in a recent op-ed, writing quite eloquently, in part, that “the floor for what’s expected of a system like Siri is quickly rising [while] Siri is waiting for someone to decide if maintenance can feasibly repair the elevator while we all take the stairs to the top of the world’s tallest building.”
To be clear, I don’t discount the idea that Apple Park is figuratively on fire right now given the chilly reception to Apple Intelligence since it debuted back in October. Apple is neither beyond reproach nor above criticism. By the same token, the sheer existence of this very piece pointing out the positives of Apple Intelligence is equally valid and important—not to soothe Apple’s pain, but to boost representation of people like me who use Apple products on the margins. This moment is a perfect opportunity to remind people why earnest disability coverage in tech journalism matters. Accessibility matters—and not in the “gee whiz, that’s great for folks” ways that, in all honesty, I personally find patronizing. The fact is Apple Intelligence is chock-full of de facto accessibility features that, like Apple Pay, aren’t designed expressly for accessibility’s sake but nevertheless enable people with disabilities to do stuff.
Unlike Gruber, I didn’t get the statement from Apple that it’s delaying Siri features. I’m not mad about it in the slightest; it’s just a little beyond my purview. That said, I do use Apple Intelligence on a daily basis and have been attentive to what’s going on. I don’t have rose-colored glasses on. But with this morning’s news from Gurman comes (more) optimism that, should Rockwell and Federighi and troops right the ship, Apple Intelligence won’t merely improve Apple’s play and catapult it in the standings—it’ll make the company’s plethora of platforms that much more accessible for everyone.
New York City’s 504 Democratic Club Endorses Comptroller Brad Lander for Mayor
In a press release issued on Thursday, New York City’s 504 Democratic Club officially endorsed the city’s comptroller, Brad Lander, to be the next mayor of New York City. The 504 Democrats, as the group is colloquially known, boasts it’s “the first political club in the country focusing on the issues of concern to the community of people with disabilities.”
The 504 Democrats’ support for Lander bolsters endorsements from fellow local organizations such as the New York Progressive Action Network and the NYC Organization of Public Service Retirees, as well as from public officials including the city’s public advocate, Jumaane Williams, and Brooklyn borough president Antonio Reynoso. The announcement also notes Lander’s total campaign fund is nearing $7 million, sitting at over $6.71 million.
“From his office’s MTA bus audits to their regular disability justice roundtables, Brad Lander has been the fiercest advocate for New Yorkers with disabilities, which is why we’re proud to endorse him as the next Mayor of this City,” Mike Schweinsburg, president of 504 Democrats, said in a statement included in the organization’s press release. “We need a mayor not only with strong management experience and the brains for the job, but also one with integrity and decency that New Yorkers can feel proud to get behind. Brad Lander is that candidate—for New Yorkers of all abilities.”
For his part, Lander said in his own statement he’s “deeply honored” for the backing.
“I’m deeply honored to have earned the endorsement of 504 Democrats, who have long championed the inclusion of people with disabilities in the political and social fabric of New York,” he said. “My entire career in non-profits and public service has been driven by a central mission to make our City work for every New Yorker. When I’m mayor, I’ll build on the amazing work that I’ve done as City Councilmember and Comptroller with 504 Democrats and deliver a safer, more affordable, better run New York City for all.”
Readers of my Forbes column may recognize Lander’s name. I interviewed him last July about New York City’s then-new disability employment report. In my story, I described the report, which Lander said was driven by an advisory board comprising 25 to 30 people, as a skunkworks project of sorts; it was put together with “neither assistance nor feedback from mayor Eric Adams’ office or anyone else in City Hall.” At a high level, the report—which, incidentally, was released during Disability Pride Month—showed 1 in 6 New Yorkers identify as having some sort of disability. That number is described as “[comprising] a significant proportion of New York City’s population and labor force,” with 1 in 13 New Yorkers ages 25 to 55 identifying as having some disability.
Lander explained to me his office’s employment report was enlightening to him and his team because “it’ll lay the groundwork for us to move forward to do additional work to look at the city’s programs and think about the impact they’re having and how they could be more effective.” Moreover, he shared sentiments echoed by today’s news, telling me in part he’s hopeful accessibility and disability justice will be viewed through the same lens as language access and racial disparities. He expressed his desire that equality and inclusivity vis-à-vis accessibility will, with time, become “one of the important accountability lenses we bring to New York City’s budget and agencies.”
“That’s how I’ll feel good about what we did for people,” Lander said.
The 504 Democrats describes itself on its website as a believer in the notion that “the full integration can only be accomplished by demanding ADA compliance and [working] with our colleagues and representatives in public office to effect positive change,” adding “all must come to understand that anything less than an equal place at the table for disabled people is unacceptable” in politics and in society writ large. The organization, founded in 1983 and taking its name from Section 504 of the Rehabilitation Act of 1973, further notes it’s “the first political club in the country focusing on the issues of concern to the community of people with disabilities [and remains] the only citywide political club dedicated to the civil rights of people with disabilities.” Furthermore, the 504 Democrats plainly states its mission as “[educating and informing] political representatives about disability rights and [identifying] candidates who align with the issues of concern to the community of people with disabilities; all disabilities.”
As the 504 Democrats note, Section 504 of the Rehabilitation Act of 1973 was the forebear of the Americans with Disabilities Act of 1990. I interviewed the ADA’s pioneer, retired congressman Tony Coelho (D-CA), for my Forbes column back in 2020.
Xbox Adaptive Joystick Available to Buy Now
Microsoft this week put its Xbox Adaptive Joystick on sale. A Microsoft Store exclusive, the $30 peripheral is positioned as a “companion for Xbox controllers” which Microsoft says can be plugged into an Xbox or PC, and is configurable with custom button remapping. The company has also posted a support document on the new device.
The Xbox Adaptive Joystick, a complementary product to Microsoft’s critically acclaimed Xbox Adaptive Controller, is designed for people with limited mobility—particularly in terms of fine-motor skills. On the product’s webpage, Microsoft says the Xbox Adaptive Joystick “helps make gaming more accessible for however you play.” Moreover, the company notes the accessory exists as a companion to the aforementioned Xbox Adaptive Controller and the bog-standard Xbox controller.
Microsoft’s accessories for Xbox are analogous to those from Sony. I covered the Access Controller for PlayStation 5 extensively over the last couple years for my Forbes column; it’s good to see gaming heavyweights in Microsoft and Sony level up their play in the accessibility arena. In an exclusive profile of disability-in-gaming nonprofit organization AbleGamers posted last May, then-chief operations officer and community outreach director Steve Spohn told me in an interview it’s a “strange time” seeing ostensible rivals in Microsoft and Sony banding together in an effort to “try to push the world forward on gaming accessibility.” Spohn, along with AbleGamers’ founder and executive director Mark Barlet, lauded the massive increases in inclusivity in the video game industry over the last several years. Nevertheless, Barlet said there remain “dark spots,” but overall the confluence of progressively minded development studios and the prominence of social media has enabled people with disabilities to “advocate for themselves in a way we haven’t seen before,” adding technology’s ever-burgeoning capabilities have enabled game makers to “really lean into” creating more accessible and equitable user experiences for members of the disability community.
“We’re seeing new companies that haven’t even released their first game investing in making sure the experience is accessible,” Barlet said about the rise of accessibility in the video game industry. “Then on the flip side, we have studios that aren’t doing much at all. It’s getting better. It’s better than it’s ever been, for sure. But it’s not perfect.”
HBO Announces ‘The Last of Us’ Will Soon Stream in American Sign Language on Max
HBO on Wednesday announced its original series The Last of Us will soon be available to stream on its Max streaming service in American Sign Language (ASL). The special version will debut with next month’s Season 2 premiere, which drops on Sunday, April 13. HBO boasts Max is “the first streaming platform to offer [an] ASL version alongside premiere episodes of a major series.” Season 1 in ASL will be available on March 31.
The ASL translation of The Last of Us is performed by Daniel Durant.
According to HBO, the expanded availability of ASL programming “continues to build on Max’s commitment to create a premium and accessible streaming experience for all subscribers.” The Last of Us in ASL follows ASL versions of Warner Bros. films such as Barbie and Beetlejuice Beetlejuice. HBO notes its so-called “With ASL” titles are featured in the Max app alongside key art with the sign language symbol. Beyond ASL, HBO says Max supports accessibility features such as audio descriptions, closed captions, screen reader support, and much more.
“We are thrilled to expand our ASL program and debut our first HBO Original series in ASL with The Last of Us,” Naomi Waibel, Warner Bros. Discovery’s senior vice president of global product management, said in a statement. “This debut brings the show to life in an authentic and fully accessible way for Deaf audiences and is another meaningful step towards our goal of offering an inclusive streaming experience.”
That The Last of Us is getting the ASL glow-up, while certainly notable, is not exactly new or novel. The National Hockey League, or NHL, has worked with Deaf advocacy company PXP to build ASL-centric broadcasts of the league’s games for Deaf fans. Last year, I posted an interview with PXP founder and CEO Brice Christianson about the partnership, which includes the first ASL version of the annual Winter Classic. A fellow CODA, Christianson called the NHL “pioneers” for its trailblazing work on the “NHL × ASL” series. This year’s Winter Classic, played on New Year’s Eve, saw the Chicago Blackhawks host the St. Louis Blues at Wrigley Field. The Blues won, 6–2.
The Last of Us series is an adaptation of the popular video game franchise of the same name (for PlayStation and Windows), which is developed by Naughty Dog and published by Sony Interactive Entertainment. A third season of the show is currently in development.
How ‘Wonder Pets: In the City’ Carries the Torch for Empathy, Inclusion at Apple TV+
Back in mid-December, Apple TV+ put out a press release announcing the then-new animated children’s series Wonder Pets: In the City. The show, produced by Nickelodeon Animation and developed by Emmy-winning author, illustrator, and director Jennifer Oxley, is described as “adorable” and meant to encourage children and their families to “come together to meet charming new characters and go on exciting adventures that spark curiosity and celebrate our unique differences.” Wonder Pets: In the City chronicles the adventures of heroic characters Izzy the Guinea Pig, Tate the Snake, and Zuri the Bunny. By day, the creatures live in a New York City kindergarten class, but by night, they travel the globe in their so-called “Jetcar” to rescue fellow animals in musical, operatic adventures. The show is a spinoff series of Wonder Pets.
I sat down with Oxley via videoconference late last year, not long before Forbes let me loose and my column ended, to discuss making Wonder Pets: In the City. She explained she likes to believe the series should be enjoyable to “everybody,” but said its target demographic is preschoolers. The premise of the show, she told me, is that the class pets spring to life once the students and teachers leave school for the day. A craft project, a telephone made from a juice box, starts ringing with a distress call from an animal in trouble. Oxley characterized Wonder Pets: In the City as having a “music-forward [and] mini-operetta” format, adding dialogue weaves between spoken lines and music. As the heroes are saving animals, they’re singing all the while, according to Oxley.
Notably, Wonder Pets: In the City includes characters created with disability in mind. There’s a snake who’s disabled, as well as an elephant who’s visually impaired. Oxley explained these classroom pets aren’t superheroes in the classical sense—they have no superpowers—but nevertheless “they bring their unique differences and points of view… they are powerful and can do anything,” she said. Teamwork and collaboration, Oxley added, is the trio’s superpower. In an effort to challenge herself and the rest of her creative team, Oxley told me she wanted to “push the storytelling forward” by trying to create stories with more socio-emotional tugs and inclusivity at their heart. The aforementioned snake embodies the push for greater inclusiveness. As Oxley said, he slithers rather than walks. He has no arms. His appearance, she told me, is a “great vehicle for telling stories.” She shared an anecdote about an episode in which a mother chicken has a problem: a runaway egg. When the heroes get to the farm to try to help locate the egg on the lam, the mother chicken wants nothing to do with Tate the Snake because, as Oxley said, he’s “slimy and sneaky.” Tate has to sit out the adventure due to her reticence, but the mother chicken eventually comes to realize “he’s not what she expected at all [and] she was wrong to judge him without getting to know him.”
The axiom to not judge a book by its cover absolutely applies to disabled people.
“Ultimately, we’ve got these three pets who have very different personalities and they’re different types of animals—yet they’re best friends and they come together as one,” Oxley said of her protagonists. “They can work together and find a way to bring all of their strengths to save the day. I’m hoping [the] audience will feel that sort of love and heart and joy of helping others. I’m hoping that will be a takeaway message for them.”
However ostensibly tardy this piece is in being published—I spoke with Oxley before the holidays, and running a one-man newsroom ain’t easy—the fact the story is running this week is fortuitous. Apple TV+ is widely known, and critically acclaimed, for cultural phenomena such as Severance and Ted Lasso. CODA became the first streaming film to win the Best Picture Oscar in 2022. Severance airs its hotly anticipated Season 2 finale on Friday, and Ted Lasso is coming back for a fourth season; I love both shows myself. Where Apple TV+ gets far less adulation lies in what Oxley and Wonder Pets: In the City have embraced: inclusivity. To wit, it’s extremely meaningful that Apple TV+ is home to a slew of shows featuring disability prominently and matter-of-factly. From See to Little Voice to Best Foot Forward to El Deafo—all of which, incidentally, I’ve covered in the past—Apple TV+ has an impressive roster of shows that put disabled people in the spotlight. None of them are active in terms of new episodes, and it’s fair to not like them as entertainment, but all are worth a watch if only to see characters a lot like Tate the Snake: people who look different simply being normal and, in the case of See, blind people seriously kicking ass while paying homage to the norms of the Blind and low vision community.
In a society where disability is commonly seen as a fate worse than death, and disabled people are portrayed as moribund and generally hapless, that Apple TV+ is home to so many shows depicting the polar opposite is significant and downright triumphant. Although Severance gets the glitz and glam, and deservedly so, it means something, as a lifelong disabled person, to see others who look like me loom large in big budget Hollywood productions. Apple’s TV+ leaders, Jamie Erlicht and Zack Van Amburg, both deserve the utmost credit for, amongst other things, taking the company’s ethos on accessibility with its consumer tech products and applying it to the myriad projects which emanate from Cupertino’s ever-burgeoning entertainment division.
For her part, Oxley credited pre-existing relationships with Apple TV+ executives such as Tara Sorensen, who leads children’s programming for the streaming service, as one reason Wonder Pets: In the City came to be. Apple, Oxley told me, has its own sensibility; she called it a “fun challenge and collaboration” to work with them in bringing Wonder Pets: In the City to life. Oxley wanted to stay true to the work, but make it pair well with Apple’s vision for shepherding its nearly 6-year-old entertainment arm.
When asked about the future, Oxley expressed enthusiasm and optimism. She confessed to not being the type who’s terminally online, toiling over umpteen Reddit threads, but she’s aware there will always be feedback, both good and bad. It’s her hope Wonder Pets: In the City will be as well-received as the original, as she knows it’s hard for reboots to pass muster with diehard fans. She’s confident, however, the new series stays true to the old and thinks audiences will pick up on that. Oxley hopes audiences will appreciate how “a lot of the original DNA is still in this new series.” She’s also hopeful Wonder Pets: In the City will attract new viewers, telling me the music in particular, what with a live orchestra recorded for each episode, should resonate pretty deeply with veterans and rookies of the Wonder Pets canon alike.
All 13 episodes of Wonder Pets: In the City are available on Apple TV+ now.
The New M4 MacBook Air’s Killer Feature Isn’t Apple Silicon—It’s Accessibility
Following yet another teaser tweet from CEO Tim Cook, Apple earlier this week announced refreshed iPad Air and MacBook Air models. The laptop, which Apple touts as “the world’s most popular laptop,” is powered by the M4 chip, sports an all-new—and very pretty—sky blue finish, and starts at $999. Technologically speaking, there’s absolutely nothing wrong with my M2 MacBook Air—but I really love that blue (my favorite color!) and can’t wait to get my hands on one in person at some point.
The M4 MacBook Air (and the refreshed Mac Studio) is available to pre-order now. It goes on sale beginning Wednesday, March 12, according to Apple’s announcement.
A blue laptop notwithstanding, color isn’t the most interesting aspect of the new Air.
From an accessibility perspective, what’s most interesting about Apple’s latest and greatest MacBook is mentioned later in the company’s press release. By way of April’s public release of macOS Sequoia 15.4, Apple says Mac users will gain the ability to set up their shiny blue laptop with only their iPhone. According to 9to5Mac’s Jeff Benjamin, the so-called “proximity setup” functionality is present in the iOS 18.4 beta. He reports it works “just like” setting up a new iPhone or iPad: bringing one’s phone near a Mac displays the setup card, akin to setting up AirPods or even a new Apple TV box. The masses will claim this feature is convenient, and it is, but it arguably matters more in terms of accessibility.
As someone who regularly gets Apple review units—some embargoed, others not—I can attest to the “problem” of configuring so many new devices. While I heartily acknowledge my position of privilege in the tech media and know it’s the quintessential first-world problem, there are nevertheless practical concerns. Namely, it would be extremely annoying (and inaccessible) to put all this review hardware through its proverbial paces without help from Apple’s proximity setup feature. It makes things so much easier: I needn’t sign in to my iCloud account or enter my Wi-Fi credentials; my iPhone does all the requisite heavy lifting for me. Why this matters from a disability point of view is obvious: it takes a relatively considerable amount of cognitive load and visual/motor skill to remember, say, one’s iCloud or Wi-Fi information. Then a person must type it all in, which can be taxing both visually and motorically depending on one’s needs and tolerances. Granted, the system does prompt users to enable accessibility features during the maiden voyage of sorts, but the point remains valid. As with other aspects of the Apple experience, proximity setup is as much a de facto accessibility feature as Apple Pay or Apple using iCloud to propagate AirPods pairing across people’s constellation of devices. It’s these ostensibly mundane implementation details that make Apple devices beloved by so many in the disability community. As Boyz II Men once sang, the little things mean a lot.
Given my intimate familiarity with the iOS/iPadOS setup process, I imagine what’s coming in the aforementioned software updates will be just as accessible and useful in setting up new Macs. I, for one, am thrilled to see Apple bring the functionality to macOS, not merely as a gadget reviewer—but especially as a lifelong disabled person.
Inside Pittsburgh International Airport’s Efforts to Make Air Travel Accessible to All
Take a glance at my Flighty statistics from 2024 and you’ll notice I flew a lot last year. According to the app on my iPhone, I took 15 flights—two of them cross-country trips—spanning nearly 17,000 miles, 9 airports, and 4 airlines. It was the most I’ve ever flown in my life, which is significant because I didn’t start flying with regularity until 2014. However frequently I flew in the past year, one thing is assured: I have utter disdain for airports. The actual flying I have no problem with; this aligns with my similar experience riding in autonomous vehicles like Waymo. My problem is the rigamarole of traversing the airport as a lifelong disabled person. Especially with security, it’s been my experience that, of the many airports I’ve been through in the last decade or so of getting on airplanes, none are particularly accommodating of or empathetic toward the disability community. As someone whose anxiety and depression already run sky-high on the ground, the stress meter routinely runneth over each and every time I leave my house for my closest airport, San Francisco International.
It’s these personal experiences which attracted me to telling the story of Jason Rudge and his family. Rudge, a heavy equipment operator at Pittsburgh International Airport, has a son, Presley, who is disabled. In an interview with me conducted late last year via videoconference, Rudge explained Presley was put into a preschool readiness class when he was 2 years old. The class was designed for children with disabilities, and Presley had a hard time being there at first; he would tolerate it for only 15 minutes before having a meltdown. Rudge and his wife were prepared to leave with Presley before the teacher encouraged them to stay. There was a room, the teacher said, where Presley could go to calm down and readjust himself. Rudge likened the room to essentially being a “big closet” with amenities like bean bags, string lights, and a disco ball. The sensory room, as it’s known, proved revelatory for Presley, with his dad saying he “loved it in there” and eventually got to a place where he could be back in the classroom, ready and willing to engage with his peers.
Working at Pittsburgh International is more than a 9-to-5 job for Rudge, as the airport has a sensory room of its own called Presley’s Place—not to be confused with Pesky’s Pole in Boston. On its website, Pittsburgh International describes Presley’s Place as a “calming respite for travelers with sensory sensitivities and their families to de-escalate prior to getting on a plane or even after landing.” The airport has a video on its YouTube channel. According to Rudge, Presley’s Place is situated next to an accessible restroom, replete with sinks that can move lower or higher to accommodate wheelchair users.
Crucially, Rudge emphasized Presley’s Place isn’t solely for children or the disabled.
“It’s for everybody,” he said of his son’s namesake room. “[It’s for] first-time flyers [or] military with PTSD who doesn’t like to be in crowds. We’re trying to let everybody know this isn’t just for children. It’s not just for people with disabilities. It’s for everybody that really needs it… for people scared of flying or who never flown before and is nervous. You can go in there and calm down and get away from everything for your flight.”
Christina Cassotis, who’s chief executive of Pittsburgh International, explained to me Rudge came to airport leaders with the idea the place could benefit from having a special room similar to the one his son thrived in at school. She met with Rudge in person to discuss the concept, coming away so impressed by his thoroughness she told him “we were doing this” right then and there. What eventually would become Presley’s Place was a natural extension of what Cassotis and team were doing to further inclusivity, as she said accessibility and the notion of “travel for all” already was an area of intense focus at Pittsburgh International. Presley’s Place, Cassotis said, “really put us on the map nationally” when it came to accessibility and inclusivity.
“We believe very strongly in the idea of travel for all,” Cassotis said. “Pittsburgh International Airport is focused on improving the passenger experience, particularly for communities that haven’t always been at the forefront of the industry’s mind.”
Cassotis underscored Rudge’s sentiments that Presley’s Place is welcoming to literally anyone who needs to be there, telling me the room is “not limited to any single group.” She did concede, however, Presley’s Place is “geared towards” individuals with sensory sensitivities such as those coping with autism, as well as others in the neurodivergent community. What’s more, there’s even a cabin installation, complete with jetway, so that nervous passengers are able to “understand what a flight is like.”
“Sensory rooms like [Presley’s Place] mean the difference between an individual or a family being able to travel at all,” Cassotis said.
Presley’s Place celebrated its 5-year anniversary not long ago, with Rudge and Cassotis both marveling at the room’s success. Rudge said it’s a “great thing” to consider how far Presley’s Place has come in the last few years, telling me the airports in Grand Rapids, Michigan, and San Francisco have incorporated similarly modeled sensory rooms. Cassotis reiterated that Pittsburgh International is a “proud national leader” in accessibility and inclusivity, telling me the airport has received numerous inquiries from other airports on how Presley’s Place was designed and developed.
“Every aspect of the design was considered because we spoke to affected individuals directly and got their input—adults, children and families,” Cassotis said.
She added: “Presley’s Place is industry-leading. [It’s] been recognized across the world as the most comprehensive sensory-friendly space anywhere in the travel industry.”
Rudge firmly believes Pittsburgh International “hit the nail on the head” with building Presley’s Place. It’s the best room of its kind he’s seen anywhere; he told me he’s heartened to know other airports are following Pittsburgh’s innovative lead in this realm. He hopes every airport in the world can someday have its own Presley’s Place.
“The feedback on [Presley’s Place] sensory room in particular has been fantastic,” Cassotis said of the room’s reception. “We hear from travelers all the time thanking us for the room and our role in helping make someone’s trip better. Travel for all is really at the heart of what we do as part of passenger experience.”
As to the future, Cassotis said Pittsburgh International will “continue to be a leader in accessibility,” adding she’s “so proud” of the staff who worked so diligently on Presley’s Place. She called Rudge’s brainstorm “truly grassroots” and lauded the airport’s work with autism awareness groups to carefully select appropriate fixtures such as furniture, lighting, and more. Funds for the project came from “sizable donations” from local groups, with items like furniture happily donated for the room.
Hundreds of passengers visit Presley’s Place annually, according to Cassotis. She added airport leaders are “constantly [hearing] from them how key” Presley’s Place is to shaping a positive experience while passing through Pittsburgh International.
How Asisat Oshoala and the GSMA Are Making Technology More Accessible to Everyone
I like to believe 2024 was a seminal year for my lifelong sports fanaticism, as last year saw its aperture widen to focus strongly on women’s sports. In March, I traveled to Las Vegas for my first in-person women’s event in the NCAA basketball tournament; I got to see Cameron Brink play for Stanford not long before the Los Angeles Sparks picked her second overall in the WNBA draft. Then in May, I went to see the NWSL’s newest franchise, expansion club Bay FC, play a home game at San Jose’s PayPal Park. And just last month, I excitedly bought a JuJu Watkins t-shirt because I’ve become such a huge fan of Watkins’ play for USC.
Given this, you’d understand my excitement over recently interviewing Asisat Oshoala.
I sat down virtually with the Nigeria-born Oshoala, who plays for the aforementioned Bay FC, earlier this month to discuss her career and her ambitions off the pitch. The latter is anchored by the eponymous Asisat Oshoala Foundation, which Oshoala described establishing “a couple years ago” as a way to “create [soccer] playing opportunities for young girls in Nigeria.” Her organization, Oshoala told me, gives young girls “[an] opportunity [and] give them a platform to showcase their talent.” To that end, Oshoala invites high-profile players, such as those on the Nigerian national team, to play against the girls in matches; these opportunities give the girls in Nigeria “hope for the future and have the confidence and know that they can also reach greater heights.”
Another philanthropic effort from Oshoala involves her association with GSMA. The nonprofit organization describes its mission on its website as “unifying the mobile ecosystem to discover, develop and deliver innovation foundational to positive business environments and societal change,” adding its goal is to “unlock the full power of connectivity so that people, industry and society thrive.” One of the areas in which the GSMA wishes to make the world a better place—and which is dear to Oshoala—is closing what’s called the Usage Gap. According to the GSMA, the Usage Gap “prevents individuals from being able to access critical digital services such as healthcare, education, ecommerce, financial services, and income-generating opportunities.” For her part, the GSMA tapped Oshoala as the organization’s spokesperson for the Breaking Barriers campaign aimed at bridging the Usage Gap.
As the GSMA soberingly notes on its website, 3.1 billion people, or 39% of the world’s population, reside in areas serviced by mobile broadband—but do not use mobile internet service. Likewise, almost 90% of unconnected people do have access to mobile broadband, yet face other barriers which the GSMA says “[prevents] them from using digital services.” This, the organization notes, represents the Usage Gap.
Oshoala explained the girls she serves through her foundation receive more than just soccer (or, football) training. As to technology and the GSMA’s mission, she told me the girls also receive computer training and other work-related skills because, as Oshoala said, “we like to teach them other things as well.” Conversations between Oshoala’s team and the GSMA centered on how technology can better the lives of girls in Africa, with Oshoala telling me the tech space is “a great one” for the girls and her team felt strongly the collaboration with GSMA made perfect sense in terms of shared values.
“[The work with GSMA] is going to be helpful in society, especially in Africa,” Oshoala said. “A lot of people don’t have access to the internet… this is going to be a great step.”
When asked about access to technology in Africa, Oshoala told me it’s gotten way better than it used to be, but there’s still a ways to go. A lot of people on the continent do have access to the internet, but many do not—and those are the people Oshoala and GSMA are trying to reach. There are other limiting factors, including infrastructure and product costs. Financially, Oshoala said, cell phones in Africa are still pretty expensive; it’s also true internet access in itself is too pricey for many. A byproduct of this lack of access in Africa is a commensurate lack of interest amongst many folks.
“We have millions of people who have access to the internet, but we still have millions who do not have access to technology,” Oshoala said.
At a high level, what Oshoala and GSMA are endeavoring to do is all about accessibility. Oshoala and her partners are addressing accessibility in the literal sense, working to get technology accessible to people in Africa. Of course, I’d be remiss not to point out the obvious by saying accessibility matters in the disability sense as well. To wit, surely there are disabled people in Africa who want to use technology and the internet. It’s worth it even for social media alone, as it’s highly plausible social media is a primary way they socialize with others. Likewise, technology may be necessary in order to connect with one’s medical team. The salient point is simple: technology is truly like electricity insofar as it’s an essential good for sustenance beyond even nerdier facets such as learning to code, for example. This is deeply resonant with Oshoala, as technology has helped her become the renowned professional athlete she is today.
Cliche as it sounds, Oshoala truly is giving back to the people. She ultimately wants to empower young girls to pursue their dreams and provide better lives for their families.
“I know there are lots of people who could have been in a better position as well if they had access to the internet,” she said. “It’s about connection. When there is lack of connection, that [limits] opportunities. I just look at my journey, and I feel like there could have been a million and one Africans on the Internet in the world if the connection was easier [and] accessible to everybody and everyone has access to the internet. It makes me feel great that I have this opportunity to work with a company like GSMA to help me achieve my dream of helping more people in society get internet access.”
Oshoala said her family in Africa is “happy to support” her in all her pursuits.
Looking towards the future, Oshoala expressed optimism. She told me she wants to continue her work and believes 2025 will be an exciting year. The collaboration with GSMA will enable her to push even harder on helping people in Africa, as she wants to equip people with the tools they need to go into the future ready for success. Technology is integral to achieving such success, and Oshoala plans to do things like establish free Wi-Fi networks in Africa and give away free cell phones for people to use.
“We’re going to do a lot and make [people] see the world differently,” Oshoala said.
Apple Announces AirPods Pro Hearing Aid Functionality Expands to United Kingdom
Apple on Monday put out a press release in which the company announced its hearing aid feature for AirPods Pro 2 is now available in the United Kingdom. News of the functionality’s expansion comes a few months after Apple launched the hearing aid feature in the United States; it was part of the iOS 18.1 update—the very same that unleashed Apple Intelligence unto the world—delivered to users at the end of October.
Apple boasts the hearing aid feature is “clinical-grade,” but simultaneously stresses the software is intended only for people who cope with mild-to-moderate hearing loss.
“At Apple, we believe that technology can help people live healthier lives, and we’re delighted to bring the Hearing Aid feature to the [United Kingdom], offering our users an end-to-end hearing health experience with AirPods Pro 2,” Dr. Sumbul Desai, Apple’s vice president of health, said in a statement included in the company’s announcement.
As I said when commenting on the company’s 2024, it’s my firm belief that the advent of the hearing aid feature exemplifies Apple’s oft-stated ambition to make products designed to facilitate the betterment of the world. Nowhere is this illustrated more clearly (and poignantly) than in Apple’s Heartstrings holiday ad which ran towards the end of last year. The spot depicts a family gathering on Christmas morning to open their gifts from Santa Claus. The patriarch of the clan, after hearing the muffled sounds of everyone’s excited conversation and the ruffling of wrapping paper, enables the hearing aid function on his AirPods Pro so he can listen to his young adult daughter serenade him.
As I wrote back in November, it’s significant that an accessibility feature be thrust into the spotlight for a prominent campaign such as Apple’s annual holiday commercial. Of course there are marketing and consumerism angles—Apple shrewdly relies on emotional appeal to goad people into buying AirPods partly because they have the potential to better lives—but the arguably more salient point is how disability is put at the forefront. In this sense, Heartstrings is of a piece, conceptually speaking, with others Apple has launched in recent years. To wit, last year’s The Relay short film comes to mind, as do others like The Greatest and The Lost Voice.
I interviewed Apple’s Sarah Herrlinger about AirPods and hearing aids in December.
Apple’s Newest Budget Phone Brings Accessibility in More Ways Than One
Apple this week announced the iPhone 16e. The company put out a press release on it, but it also posted a video to YouTube. I chose to indulge in the latter medium for entertainment value. Wednesday’s news came after CEO Tim Cook took to X to tease what he coyly characterized as introducing the “newest member” of the Apple family.
The entry-level iPhone, which Apple says costs $599 and ships on February 28, replaces the iPhone SE as the “low end” of the iPhone product line. There’s a certain fortuitousness to the launch of the 16e, as it comes soon after Forbes let me loose late last month after a four-year stint as part of its contributor network. My first story for my dearly departed column, posted in April 2020, was about the then-new iPhone SE. What’s more, the SE model the 16e is supplanting, released in 2022, was the subject of my first-ever embargoed iPhone review—also published to my column.
After a day or so of digesting the iPhone 16e news, I believe it largely compels from an accessibility angle. iPhones, regardless of their place in Apple’s pecking order, are unabashedly premium, top-tier smartphones. They’re unquestionably expensive. By and large, lots of disabled people cannot afford anything but the budget-conscious iPhone. Whatever niggles the nerds have about the 16e’s feature set—more on that below—the bird’s eye take in an accessibility context is the 16e holds a tremendous value proposition. To wit, not only can a disabled person get an iPhone and all the prestige and capability that comes with it, they gain access to what’s arguably the industry’s best all-around suite of accessibility features. This isn’t a trivial matter; I, what with my highest-end iPhone 16 Pro Max, live and breathe through my phone. It is, without question, my most important and oft-used computer. My life is on there.
The bottom line: if all you have to spend is $600 and you want an iPhone, the 16e is it.
Now for the particulars, beginning with the most important metric of all: price. At $599, the 16e is significantly costlier than the aforementioned $429 iPhone SE that was put out to pasture. That $170 difference is a lot of money to many folks. It’s highly plausible a disabled person eyeing an iPhone may need to look elsewhere—perhaps the refurbished market or somewhere like Amazon or Best Buy. There’s no shame in shopping the secondary market—I’ve seen reports of iPhone 15 Pro devices on Amazon for roughly the same cost as a brand-new 16e—but many people find it comforting to buy new from the vendor for the same reason Linus loves carrying around his blanket everywhere. Especially from a quality assurance perspective, it can be a crapshoot at times in knowing the actual working condition of secondhand electronics.
Now onto the technical attributes. Most curious to me amongst Apple’s choices in building the iPhone 16e is the decision to leave the Dynamic Island and MagSafe on the proverbial cutting room floor. To the former, the company just a few short months ago crowed about the base iPhone 16 getting the Dynamic Island. In the almost three years of its existence, I’ve found the Dynamic Island to be a greatly accessible way to keep tabs on information like kitchen timers and, catering to my avowed fanaticism of all things sports, game updates from Apple Sports. Component costs notwithstanding, I’d imagine a person with low vision contemplating the 16e could be disappointed by its lack of the Dynamic Island—causing them to look elsewhere for a similarly priced iPhone with the feature. I remember speaking with Alan Dye, who helps lead Apple’s industrial design group, following the iPhone 14 media event in September 2022. He was enamored with my brief comments on how the Dynamic Island could impact accessibility for the disability community, telling me it was a priority for his team.
As for the latter, the absence of MagSafe in the 16e is conspicuous as all hell. I’ve gone on the record innumerable times over the years to extol the virtues of magnets in tech like MagSafe and the iPad’s Smart Covers. While the 16e does support Qi charging, MagSafe is better because, by virtue of physics, the magnetic attraction makes it such that a person needn’t fiddle with alignment to get the phone in the right spot to begin charging. Qi charging does have accessibility merit of its own—namely, a disabled person with questionable hand-eye coordination is saved from plugging a cable into the iPhone’s USB-C port—but MagSafe takes the general concept a step further by using magnets to expedite the process. Like I said about the Dynamic Island, the omission of MagSafe in the 16e is yet another addition to the con column that could cause a disabled person to look elsewhere—and for a perfectly legitimate reason.
Lastly, some (more) thoughts on ProMotion. The 16e has an ostensibly lowly 60Hz display compared to the standard 120Hz on its more beefed-up brethren. The nerd set and the mainstream tech media like to say it’s bad Apple continues to sell phones with 60Hz displays, but I say the consternation is somewhat overwrought. This is a hill I’ve died on many times over the last several years; I wholly support technological progress and thus bumping the baseline to 90Hz. My problem with the community’s collective stance on high refresh rate screens is they make it sound like 120Hz is table stakes for computers. In other words, that having 60Hz on, for instance, the 16e will make for a demonstrably worse user experience. I just don’t buy it. What the press and YouTubers fail to realize is not everyone has the ability to really appreciate smoother scrolling and more fluid animations. It isn’t a matter of taking away ProMotion from people who like it; it’s a matter of understanding literally not everyone benefits. It’s disingenuous (and dare I say, privileged) to bemoan that “there’s still 60Hz in 2025” when the reality is it isn’t a big deal in the grand scheme of things. Let me be crystal clear in saying, sure, Apple should boost the baseline to 90Hz in the name of progress in the same way they finally moved to 16GB of RAM in MacBooks. At the end of the day, however, ProMotion is not a make-or-break feature; tech enthusiasts would do well to temper their expectations. While I intellectually acknowledge ProMotion is on my 16 Pro Max, my low vision is so bad that ProMotion effectively doesn’t exist in practice. It may as well be the basic 60Hz. Is my usage made worse by this? Not one iota, I promise you.
Anyway, the iPhone 16e certainly looks the part of an eminently capable, modern budget iPhone. It’s perplexing in places, but remains an iPhone. That counts for a lot.
Inside Nvidia’s New ‘Signs’ Platform and the Meeting of ASL and Artificial Intelligence
I’ve been fortunate to cover a bevy of tech companies in my career as a journalist—Intel, OpenAI, Salesforce, amongst countless others—that, were you to play a quick game of word association, accessibility likely wouldn’t be a word you’d blurt out. As it turns out, however, all three Bay Area-based companies indeed do care an awful lot about making technology usable by disabled people. In fact, to discover that a chipmaker like Intel and an enterprise software company in Salesforce—both areas which ostensibly have absolutely zero pertinence to accessibility—work so hard to make their wares inclusive to the disability community is simultaneously enlightening and heartening.
So it goes for Nvidia.
In a blog post published on Thursday, the Santa Clara-based company announced its new Signs platform. The software, which Nvidia describes as a “validated dataset for sign language learners and developers of ASL-based AI applications,” was conceived and developed in collaboration with the American Society for Deaf Children and creative agency Hello Monday, in an effort to increase representation of American Sign Language in AI-powered datasets. Nvidia notes ASL ranks third in the United States in terms of prevalence, behind only English and Spanish, yet there exist “vastly fewer AI tools developed with ASL data” compared to the aforementioned top two languages.
Nvidia has posted a video demonstrating Signs on its YouTube channel.
“Sign language learners can access the platform’s validated library of ASL signs to expand their vocabulary with the help of a 3D avatar that demonstrates signs—and use an AI tool that analyzes webcam footage to receive real-time feedback on their signing. Signers of any skill level can contribute by signing specific words to help build an open-source video dataset for ASL,” Nvidia wrote in part in its announcement. “The dataset—which NVIDIA aims to grow to 400,000 video clips representing 1,000 signed words—is being validated by fluent ASL users and interpreters to ensure the accuracy of each sign, resulting in a high-quality visual dictionary and teaching tool.”
In a brief interview conducted earlier this week ahead of today’s news, Nvidia’s manager of trustworthy AI product Michael Boone—coincidentally, he’s credited with the byline for the company’s blog post—explained to me Nvidia decided to work on the Signs project because they saw a need for it. A “large majority” of parents who have deaf children, Boone said, don’t know ASL and aren’t learning it; children are developing their signing skills outside of the home, he added, but Nvidia seized on an opportunity to help bridge the proverbial gap in terms of communicating with one’s nuclear family.
“We want [Signs] to help bridge the communication gap,” Boone said. “Looking at what had been done for [signing] individual letters, we figured it would be helpful to take it to the next step and create a database and user dictionary for words and short phrases.”
When asked about the technical aspects of Signs, Boone told me the state of the art for learning ASL is to watch a bunch of YouTube videos and maybe work with a live interpreter. What makes Nvidia’s work so novel, and so interesting, he said to me, is there heretofore hasn’t existed a way for ASL learners to garner real-time feedback on their signing for free. According to Boone, Signs is an example of computer vision: using artificial intelligence, Signs has the ability not only to detect where the different parts of the body are, but also to understand how a user is placing their hand as well as the “sweeping movements” of signs. All told, Boone said Nvidia’s overarching goal, technologically speaking, is increasing fluidity and ensuring the software is properly instructing the user to become proficient in sign language.
At a macro level, Boone said the primary goal with Signs is twofold. The first, of course, is pedagogical. Nvidia (and its partner organizations) wants to teach people ASL. Secondly, the platform also exists as a conduit through which Boone and his team can “curate a data set that can then be used to enable more accessible AI technology.” Crucially, Boone said Signs has been purposefully built by and for the ASL community, telling me the platform is expressly designed to involve and engage the community.
As I wrote at the outset, the reality is Nvidia is not an institution necessarily revered for assistive technologies. For his part, Boone acknowledged the company’s relatively nascent reputation in this realm and said Signs indeed does align with Nvidia’s “founding principles” that “enable everyone.” From ideation to delivery, he told me, Nvidia’s North Star always is to build technology which “enables as many groups as possible to benefit.” Specific to Signs, Boone said it’s something that’s “high quality [and] has robust data and [involves] all stakeholders within the community, from research to product and our future partners who will benefit from using this data.”
Besides Boone, I also had the chance to connect with Cheri Dowling. Dowling serves as executive director of the American Society for Deaf Children. In a short interview with me, she called Signs “a great tool” and said when she first learned of the project, she immediately knew her organization should back it. Dowling expounded further, saying Signs is a tool with which ASL learners can hone their fluency through what she characterized as a “fun website.” She emphasized how Signs gives users grace if they forget a sign, as they’re allowed to look it up and practice while taking solace they’re getting the mechanics right. As to future versions, Dowling said she’d like Signs to grow from single words to phrases and someday even full-on sentences. “With technology the way it is these days, the possibilities [for improvement] are endless,” she said.
When asked about feedback on the new software, Boone told me Signs has undergone extensive testing both internally and externally “for several months” now. He’s been eagerly anticipating today’s public launch, telling me Signs marks the beginning of a journey that sees great potential to “create additional, accessible technologies.” Boone further noted Nvidia has partnered with the Rochester Institute of Technology for help with Signs’ user interface and user experience. The big idea here, he said, was to ensure the creation of a “transparent, safe, and explainable solution” for learning ASL. “I’m excited for what this dataset will be able to be used [for]—not just for the user dictionary, but also [with] an eye towards [making] future technologies as well,” Boone said.
For Dowling’s part, she said her team is “really excited” to see Signs set forth unto the world. Her organization offers online ASL classes and she noted Signs will be heartily recommended henceforth as a way to bolster and supplement the classwork. Dowling said the Nvidia team has been “great” to work alongside on making Signs a reality.
“It’s so important for families who have children using ASL to learn the language,” she said. “This is a fun way to do it together.”
Looking towards the future, Boone told me he sees one “full of possibilities.” He reiterated Nvidia’s raison d’être of building technologies for everyone, saying Nvidia was founded to “solve the world’s greatest challenges.” He also expressed appreciation of, and enthusiasm for, artificial intelligence’s capacity for doing genuine good. The advent of Signs, Boone told me, is an exciting development for Nvidia that “not only teaches but is also helping to enable other members of the ecosystem.”
Apple and Netflix Must Do Right and Finally Make TV App Peace for Accessibility’s Sake
The internet was abuzz on Friday when it appeared Netflix and Apple had buried the proverbial hatchet with regards to integrating with the TV app, as it was reported—many times over, at that—Netflix material was miraculously appearing in the TV app’s Watchlist section. This part of the software is where Apple allows users to see which TV shows and movies are currently in progress; clicking on something will immediately take you to whatever app—Prime Video, for example—you used to play the content. Companies such as Amazon and ESPN, to name but two participants, must elect to support said Watchlist integration. The main thrust of today’s jubilance over Netflix’s ostensible acquiescence is because the Los Gatos-based streaming giant is the elephant-sized sore thumb on the many hands of TV app supporters on tvOS.
Alas, The Verge’s Chris Welch dutifully reports the integration was “a mistake.”
“Netflix spokesperson MoMo Zhou has told The Verge that this morning’s window where Netflix appeared as a ‘participating’ service in Apple TV—including temporary support for the watchlist and ‘continue watching’ features—was an error and has now been rolled back,” he wrote earlier today in a followup story. “That’s [a] shame. The jubilation in our comments on the original story was palpable.”
The point of the Watchlist integration is, of course, about greater convenience. It’s more convenient to have one’s still-in-progress watches, across an expanse of services, collated into one neat and tidy location. As I’ve argued innumerable times over the years, however, convenience and accessibility aren’t one and the same. Beyond sheer convenience, the reality is the TV app’s Watchlist feature is a de-facto accessibility feature; instead of app-hopping, trying to figure out which movie or television show streams where, it’s more accessible to launch the TV app to find it there. This is especially beneficial for someone with cognitive disabilities, as the mental load associated with trying to remember where—and how—to find a particular piece of content is the furthest thing from trivial. Do it enough and it sullies the overall user experience. It makes watching Netflix inaccessible if only because it’s highly plausible a person has trouble remembering that, for instance, A Man On The Inside is on Netflix. (The series, I should add, is set in San Francisco and well worth watching.) Personally, I have no such trouble usually remembering what streams where, but nonetheless appreciate the TV app integration for accessibility’s sake. I was as jubilant as everyone else at today’s news for that very reason, and it’s a major letdown to learn from Welch (and Bloomberg’s Mark Gurman) that the integration with the TV app indeed was a bug.
This news serves as yet another reminder that Apple and Netflix need to make up posthaste. Apple should pay Netflix whatever it takes to make the integration happen for its customers, myself included. Moreover, the news is a reminder of how much work Apple should do in improving tvOS. As I said in the Six Colors Report Card, I’m still waiting for the company to give tvOS its iOS 7 moment. Which is to say, rethink and redesign it all to make it more than a static grid of app icons. There is so much potential for greater information density on screens as big as a television’s that Apple is seemingly reticent to tap. Of course, the grid of icons has its benefits too, but I’ve long maintained the tvOS interface is backwards. The stuff within the TV app—most notably the Watchlist—shouldn’t live in a silo. It should be akin to Google TV, easily discoverable and flanked by a bunch of content recommendations. Google does this extremely well; I’m a happy subscriber to both YouTube Premium and YouTube TV and am consistently pleased with the suggestions I get on what I should be watching. With the advent of Apple Intelligence a few months ago, there lies a huge opportunity for Apple to flex its AI muscle in this realm. In fact, spend a few minutes scrolling the TV app and you’ll notice there are a bunch of recommendations—in my case, most of them quite good. The problem is, again, this stuff is buried and siloed when it should be exposed and out in the open. Most people will chalk it up to convenience, but the truth of the matter is it’s greater accessibility too. This is not an insignificant point for legions of people.
If only Eddy Cue would make the drive to Netflix headquarters with fat check in tow.
Amazon Updates Prime Video on Apple TV with New Accessibility Features, More
According to The Verge’s Emma Roth, Amazon this week released a substantial update to its Prime Video app on tvOS. The enhancements include higher resolution poster art, integration with the touchpad on the Apple TV’s Siri Remote, and most interestingly for me, support for accessibility features such as VoiceOver, Hover Text, and Bold Text.
The update runs on all Apple TV 4K models, as well as the discontinued Apple TV HD.
After reading Roth’s story late yesterday, I installed the update on the Apple TV in the living room and played around in the user interface before watching the first few minutes of Christopher Nolan’s Oppenheimer. My first impression is Amazon did a really nice job with this upgrade; the artwork looks spectacular on my brand-new 77” LG C3 OLED television and the trackpad gestures work swimmingly. The only knock I noticed was that, best as I can tell, the new Prime Video app seemingly doesn’t support Amazon’s X-Ray feature any longer. Maybe it does have it, but I missed it.
The Prime Video news came as a surprise to me, as I’ve been content with the version Amazon rolled out several months ago. Last July, I published a piece on the new app after going down to Culver City on a reporting trip and visiting Amazon MGM Studios for a small-ish media event around Prime Video. My reporting of the event featured in-person (!) interviews with Prime Video executives Raf Soltanovich and Kam Keshmiri. Both men talked with me about how the then-unreleased new Prime Video app had been built with accessibility top of mind, as accessibility is important to Amazon.
Jason Snell put it well when linking to Roth’s report: “There was a time when Prime Video was one of the worst major streamer apps on tvOS, but those days are over.”
Pondering the New Powerbeats Pro’s Potential for Hearing Accessibility
Apple-owned Beats released refreshed Powerbeats Pro this week. The earbuds, priced at $250 and described by Beats as “built for athletes,” are ideal for fitness and working out. Megastar athletes such as LeBron James, Shohei Ohtani, and Lionel Messi have been seen wearing them. The new Powerbeats Pro feature an H2 chip, a heart rate sensor, and come with a case Beats says is a third smaller than the previous version.
Chance Miller at 9to5 Mac posted a review worth your eyeballs.
The Powerbeats Pro’s most distinctive attribute, however, is its ear hooks. Whereas something like the Beats Studio Buds+—which I snagged at a discount on Amazon during last year’s Prime Day—fit more into one’s ears, the Powerbeats Pro use an ear hook that fits outside the ear for security. As I wrote on Mastodon, it occurred to me the aforementioned ear hooks mean Powerbeats Pro bear a strong resemblance to many prescription hearing aids available from audiologists. Stylistically speaking, the Powerbeats Pro, what with colors like “hyper purple” and “electric orange,” are far more pleasing compared to the drab, staid appearance of prescription hearing aids.
It got me thinking about the hearing health features in AirPods Pro 2. The hearing aid feature, released as part of iOS 18.1 last October, is currently exclusive to AirPods Pro; obviously subsequent models will support it, but what of Powerbeats Pro? I know nothing about the underlying technical requirements specific to AirPods Pro such that the hearing test/hearing aid is exclusive to it, but it’s worth pondering whether Apple could—or, perhaps better put, arguably should—propagate it to its other earbuds. After all, the new Powerbeats Pro are powered by the H2 chip. This is where the ear hooks have more relevance. To wit, not only do the hooks make the earbuds reminiscent of conventional hearing aids, they act as an accessibility aid unto themselves. It’s entirely plausible a disabled person—or anyone else, for that matter—may choose Powerbeats Pro over their cousin in AirPods due to them being a literal better fit, despite the fact they may not be fitness-inclined. More pointedly, it’s also entirely plausible the ear hooks on Powerbeats Pro make it more accessible for someone to get the earbuds in and out of their ears. What I’m saying is, there’s an argument to be made that someone might want AirPods Pro for the hearing aid feature—but can’t buy them because they don’t fit or, in a nod to sensory conditions, they’re uncomfortable to wear. The Powerbeats Pro seemingly would be an attractive alternative, given Beats is an Apple subsidiary and the Powerbeats are virtually identical to AirPods Pro in terms of their general function.
If anything, expanding the hearing aid feature would give Apple another feather to put in its cap when it comes to transforming the over-the-counter hearing aid market and, by extension, shattering the stigmas associated with hearing loss and wearing hearing aids. This is exactly the kind of thing that aligns with the company’s purported purpose to use its technologies as agents of change in making the world a better place. It’s also not insignificant that AirPods, as well as Beats, are entrenched in the cultural zeitgeist in ways hearing aids decidedly aren’t. Expanding the hearing aid feature’s literal accessibility (in the access sense of the word) would serve only to reinforce that notion.