Inside Nvidia’s New ‘Signs’ Platform and the Meeting of ASL And Artificial Intelligence
I’ve been fortunate to cover a bevy of tech companies in my career as a journalist—Intel, OpenAI, Salesforce, amongst countless others—that, were you to play a quick game of word association, accessibility likely wouldn’t be a word you’d blurt out. As it turns out, however, all three Bay Area-based companies indeed do care an awful lot about making technology usable by disabled people. In fact, to discover that a chipmaker like Intel and an enterprise software company like Salesforce—both in areas which ostensibly have zero pertinence to accessibility—work so hard to make their wares inclusive to the disability community is simultaneously enlightening and heartening.
So it goes for Nvidia.
In a blog post published on Thursday, the Santa Clara-based company announced its new Signs platform. The software, which Nvidia describes as a “validated dataset for sign language learners and developers of ASL-based AI applications,” was conceived and developed in collaboration with the American Society for Deaf Children and creative agency Hello Monday, in an effort to increase representation of American Sign Language in AI-powered datasets. Nvidia notes ASL ranks third in the United States in terms of prevalence, behind only English and Spanish, yet there exist “vastly fewer AI tools developed with ASL data” compared to the aforementioned top two languages.
Nvidia has posted a video demonstrating Signs on its YouTube channel.
“Sign language learners can access the platform’s validated library of ASL signs to expand their vocabulary with the help of a 3D avatar that demonstrates signs—and use an AI tool that analyzes webcam footage to receive real-time feedback on their signing. Signers of any skill level can contribute by signing specific words to help build an open-source video dataset for ASL,” Nvidia wrote in part in its announcement. “The dataset—which NVIDIA aims to grow to 400,000 video clips representing 1,000 signed words—is being validated by fluent ASL users and interpreters to ensure the accuracy of each sign, resulting in a high-quality visual dictionary and teaching tool.”
In a brief interview conducted earlier this week ahead of today’s news, Nvidia’s manager of trustworthy AI product Michael Boone—coincidentally, he’s credited with the byline for the company’s blog post—explained to me Nvidia decided to work on the Signs project because they saw a need for it. A “large majority” of parents who have deaf children, Boone said, don’t know ASL and aren’t learning it; children are developing their signing skills outside of the home, he added, but Nvidia seized on an opportunity to help bridge the proverbial gap in terms of communicating with one’s nuclear family.
“We want [Signs] to help bridge the communication gap,” Boone said. “Looking at what had been done for [signing] individual letters, we figured it would be helpful to take it to the next step and create a database and user dictionary for words and short phrases.”
When asked about the technical aspects of Signs, Boone told me the state of the art for learning ASL is to watch a bunch of YouTube videos and maybe work with a live interpreter. What makes Nvidia’s work so novel, and so interesting, he said to me, is there heretofore hasn’t existed a way for ASL learners to garner real-time feedback on their signing for free. According to Boone, Signs is an example of computer vision: using artificial intelligence, Signs has the ability not only to detect where the different parts of the body are, but also to understand how a user is placing their hand as well as the “sweeping movements” of signs. All told, Boone said Nvidia’s overarching goal, technologically speaking, is increasing fluidity and ensuring the software properly instructs the user on the way to becoming a proficient signer.
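Nvidia hasn’t published Signs’ internals, but the broad recipe Boone describes—detect landmarks, compare against a validated reference, surface feedback—can be sketched in miniature. The snippet below is purely illustrative: it assumes hand landmarks have already been extracted from webcam frames (in practice, by a pose-estimation model) as (x, y) tuples, and the function names and scoring threshold are my own inventions, not Nvidia’s.

```python
import math

def normalize(landmarks):
    # Translate so the wrist (landmark 0) sits at the origin, then
    # scale so the hand's overall span is 1; this makes the comparison
    # robust to where the hand sits in the frame and how far the
    # signer is from the webcam.
    wx, wy = landmarks[0]
    shifted = [(x - wx, y - wy) for x, y in landmarks]
    scale = max(math.hypot(x, y) for x, y in shifted) or 1.0
    return [(x / scale, y / scale) for x, y in shifted]

def sign_score(detected, reference):
    # Mean distance between corresponding normalized landmarks:
    # 0.0 is a perfect match; larger values mean a poorer one.
    a, b = normalize(detected), normalize(reference)
    return sum(math.hypot(ax - bx, ay - by)
               for (ax, ay), (bx, by) in zip(a, b)) / len(a)

def feedback(detected, reference, threshold=0.25):
    # The kind of real-time nudge a learner might see on screen.
    if sign_score(detected, reference) <= threshold:
        return "Nice sign!"
    return "Close: check your hand shape and try again."
```

A real system would track sequences of frames to capture those “sweeping movements,” but even this toy version shows why normalization matters: the same hand shape should score identically whether it fills the frame or sits in a corner of it.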
At a macro level, Boone said the primary goal with Signs is twofold. The first, of course, is pedagogical: Nvidia (and its partner organizations) wants to teach people ASL. Secondly, the platform also exists as a conduit through which Boone and his team can “curate a data set that can then be used to enable more accessible AI technology.” Crucially, Boone said Signs has been purposefully built by and for the ASL community, telling me the platform is expressly designed to involve and engage the community.
As I wrote at the outset, the reality is Nvidia is not an institution necessarily revered for assistive technologies. For his part, Boone acknowledged the company’s relatively nascent reputation in this realm and said Signs indeed does align with Nvidia’s “founding principles” that “enable everyone.” From ideation to delivery, he told me, Nvidia’s North Star always is to build technology which “enables as many groups as possible to benefit.” Specific to Signs, Boone said it’s something that’s “high quality [and] has robust data and [involves] all stakeholders within the community, from research to product and our future partners who will benefit from using this data.”
Besides Boone, I also had the chance to connect with Cheri Dowling. Dowling serves as executive director of the American Society for Deaf Children. In a short interview with me, she called Signs “a great tool” and said when she first learned of the project, she immediately knew her organization should back it. Dowling expounded further, saying Signs is a tool with which ASL learners can hone their fluency through what she characterized as a “fun website.” She emphasized how Signs gives users grace if they forget a sign, as they’re allowed to look it up and practice while taking solace they’re getting the mechanics right. As to future versions, Dowling said she’d like Signs to grow from single words to phrases and someday even full-on sentences. “With technology the way it is these days, the possibilities [for improvement] are endless,” she said.
When asked about feedback on the new software, Boone told me Signs has undergone extensive testing both internally and externally “for several months” now. He’s been eagerly anticipating today’s public launch, telling me Signs marks the beginning of a journey that sees great potential to “create additional, accessible technologies.” Boone further noted Nvidia has partnered with the Rochester Institute of Technology for help with Signs’ user interface and user experience. The big idea here, he said, was to ensure the creation of a “transparent, safe, and explainable solution” for learning ASL. “I’m excited for what this dataset will be able to be used—not just for the user dictionary, but also an eye towards [making] future technologies as well,” Boone said.
For Dowling’s part, she said her team is “really excited” to see Signs set forth unto the world. Her organization offers online ASL classes and she noted Signs will be heartily recommended henceforth as a way to bolster and supplement the classwork. Dowling said the Nvidia team has been “great” to work alongside on making Signs a reality.
“It’s so important for families who have children using ASL to learn the language,” she said. “This is a fun way to do it together.”
Looking towards the future, Boone told me he believes it to be “full of possibilities.” He reiterated Nvidia’s raison d’être of building technologies for everyone, saying Nvidia was founded to “solve the world’s greatest challenges.” He also expressed appreciation of, and enthusiasm for, artificial intelligence’s capacity for doing genuine good. The advent of Signs, Boone told me, is an exciting development for Nvidia that “not only teaches but is also helping to enable other members of the ecosystem.”
Apple and Netflix Must Do Right and Finally Make TV App Peace for Accessibility’s Sake
The internet was abuzz on Friday when it appeared Netflix and Apple had buried the proverbial hatchet with regards to integrating with the TV app, as it was reported—many times over, at that—that Netflix material was miraculously appearing in the TV app’s Watchlist section. This part of the software is where Apple allows users to see which TV shows and movies are currently in progress; clicking on something will immediately take you to whatever app—Prime Video, for example—you used to play the content. Companies such as Amazon and ESPN, to name but two participants, must elect to support said Watchlist integration. The main thrust of today’s jubilance over Netflix’s ostensible acquiescence is that the Los Gatos-based streaming giant has long been the most conspicuous holdout among TV app supporters on tvOS.
Alas, The Verge’s Chris Welch dutifully reports the integration was “a mistake.”
“Netflix spokesperson MoMo Zhou has told The Verge that this morning’s window where Netflix appeared as a ‘participating’ service in Apple TV—including temporary support for the watchlist and ‘continue watching’ features—was an error and has now been rolled back,” he wrote earlier today in a followup story. “That’s a shame. The jubilation in our comments on the original story was palpable.”
The point of the Watchlist integration is, of course, greater convenience. It’s more convenient to have one’s still-in-progress watches, across an expanse of services, collated into one neat and tidy location. As I’ve argued innumerable times over the years, however, convenience and accessibility aren’t one and the same. Beyond sheer convenience, the reality is the TV app’s Watchlist feature is a de facto accessibility feature; instead of app-hopping, trying to figure out which movie or television show streams where, it’s more accessible to launch the TV app to find it there. This is especially beneficial for someone with cognitive disabilities, as the mental load associated with trying to remember where—and how—to find a particular piece of content is the furthest thing from trivial. Do it enough and it sullies the overall user experience. It makes watching Netflix inaccessible if only because it’s highly plausible a person has trouble remembering that, for instance, A Man On The Inside is on Netflix. (The series, I should add, is set in San Francisco and well worth watching.) Personally, I have no such trouble usually remembering what streams where, but nonetheless appreciate the TV app integration for accessibility’s sake. I was as jubilant as everyone else at today’s news for that very reason, and it’s a major letdown to learn from Welch (and Bloomberg’s Mark Gurman) that the integration with the TV app indeed was a bug.
This news serves as yet another reminder that Apple and Netflix need to make up posthaste. Apple should pay Netflix whatever it takes to make the integration happen for its customers, myself included. Moreover, the news also is a reminder of how much work Apple should do in improving tvOS. As I said in the Six Colors Report Card, I’m still waiting for the company to give tvOS its iOS 7 moment. Which is to say, rethink and redesign it all to make it more than a static grid of app icons. There is so much potential for greater information density on screens as big as a television’s that Apple seems reticent to tap. Of course, the grid of icons has its benefits too, but I’ve long maintained the tvOS interface is backwards. The stuff within the TV app—most notably the Watchlist—shouldn’t live in a silo. It should be akin to Google TV, easily discoverable and flanked by a bunch of content recommendations. Google does this extremely well; I’m a subscriber to both YouTube Premium and YouTube TV and am really pleased with the suggestions I get on what I should be watching.

With the advent of Apple Intelligence a few months ago, there lies a huge opportunity for Apple to flex its AI muscle in this realm. In fact, spend a few minutes scrolling the TV app and you’ll notice there are a bunch of recommendations—in my case, most of them quite good. The problem is, again, this stuff is buried and siloed when it should be exposed and out in the open. Most people will chalk it up to convenience, but the truth of the matter is it’s greater accessibility too. This is not an insignificant point for legions of people.
If only Eddy Cue would make the drive to Netflix headquarters with fat check in tow.
Amazon Updates Prime Video on Apple TV with New Accessibility Features, More
According to The Verge’s Emma Roth, Amazon this week released a substantial update to its Prime Video app on tvOS. The enhancements include higher resolution poster art, integration with the touchpad on the Apple TV’s Siri Remote, and most interestingly for me, support for accessibility features such as VoiceOver, Hover Text, and Bold Text.
The update runs on all Apple TV 4K models, as well as the discontinued Apple TV HD.
After reading Roth’s story late yesterday, I installed the update on the Apple TV in the living room and played around in the user interface before watching the first few minutes of Christopher Nolan’s Oppenheimer. My first impression is Amazon did a really nice job with this upgrade; the artwork looks spectacular on my brand-new 77” LG C3 OLED television and the trackpad gestures work swimmingly. The only knock I noticed is that, to the best of my knowledge, the new Prime Video app seemingly no longer supports Amazon’s X-Ray feature. Maybe it does have it, but I missed it.
The Prime Video news came as a surprise to me, as I’ve been content with the version Amazon rolled out several months ago. Last July, I published a piece on the new app after going down to Culver City on a reporting trip and visiting Amazon MGM Studios for a small-ish media event around Prime Video. My reporting of the event featured in-person (!) interviews with Prime Video executives Raf Soltanovich and Kam Keshmiri. Both men talked with me about how the then-unreleased new Prime Video app had been built with accessibility top of mind, as accessibility is important to Amazon.
Jason Snell put it well when linking to Roth’s report: “There was a time when Prime Video was one of the worst major streamer apps on tvOS, but those days are over.”
Pondering the New Powerbeats Pro’s Potential for Hearing Accessibility
Apple-owned Beats released refreshed Powerbeats Pro this week. The earbuds, priced at $250 and described by Beats as “built for athletes,” are ideal for fitness and working out. Megastar athletes such as LeBron James, Shohei Ohtani, and Lionel Messi have been seen wearing them. The new Powerbeats Pro feature an H2 chip, a heart rate sensor, and come with a case Beats says is a third smaller than the previous version.
Chance Miller at 9to5Mac posted a review worth your eyeballs.
The Powerbeats Pro’s most distinctive attribute, however, is its ear hooks. Whereas something like the Beats Studio Buds+—which I snagged at a discount on Amazon during last year’s Prime Day—fits more into one’s ears, the Powerbeats Pro use ear hooks that wrap outside the ear for security. As I wrote on Mastodon, it occurred to me the aforementioned ear hooks mean Powerbeats Pro bear a strong resemblance to many prescription hearing aids available from audiologists. Stylistically speaking, the Powerbeats Pro, what with colors like “hyper purple” and “electric orange,” are far more pleasing compared to the drab, staid appearance of prescription hearing aids.
It got me thinking about the hearing health features in AirPods Pro 2. The hearing aid feature, released as part of iOS 18.1 last October, is currently exclusive to AirPods Pro; obviously subsequent models will support it, but what of Powerbeats Pro? I know nothing about the underlying technical requirements that make the hearing test and hearing aid exclusive to AirPods Pro, but it’s worth pondering whether Apple could—or, perhaps better put, arguably should—propagate the feature to its other earbuds. After all, the new Powerbeats Pro are powered by the same H2 chip. This is where the ear hooks take on more relevance. To wit, not only do the hooks make the earbuds reminiscent of conventional hearing aids, they act as an accessibility aid unto themselves. It’s entirely plausible a disabled person—or anyone else, for that matter—may choose Powerbeats Pro over their AirPods cousins because they’re a literal better fit, despite not being fitness-inclined. More pointedly, it’s also entirely plausible the ear hooks make it easier for someone to get the earbuds in and out of their ears. What I’m saying is, there’s an argument to be made that someone might want AirPods Pro for the hearing aid feature—but can’t buy them because they don’t fit or, in a nod to sensory conditions, they’re uncomfortable to wear. The Powerbeats Pro would seemingly be an attractive alternative, given Beats is an Apple subsidiary and the earbuds are virtually identical to AirPods Pro in terms of general function.
If anything, expanding the hearing aid feature would give Apple another feather to put in its cap when it comes to transforming the over-the-counter hearing aid market and, by extension, shattering the stigmas associated with hearing loss and wearing hearing aids. This is exactly the kind of thing that aligns with the company’s purported purpose to use its technologies as agents of change in making the world a better place. It’s also not insignificant that AirPods, as well as Beats, are entrenched in the cultural zeitgeist in ways hearing aids decidedly aren’t. Expanding the hearing aid feature’s literal accessibility (in the access sense of the word) would serve only to reinforce that notion.
Be My Eyes Touts ‘Significant New Capabilities’ in Winter 2025 App Update
Be My Eyes has announced its Winter 2025 update. In a blog post published last week, the San Francisco-based company said the upgrades bring with them “performance enhancements and flexible data choices for users.” The announcement is also notable for its timing: it coincides with the tenth anniversary of the original app’s launch in 2015.
On the user experience side, Be My Eyes has added support for video calls in full 1080p resolution by default. The higher resolution, the company said, will enable volunteers and service agents to view “clearer images and finer details when zooming-in” as they ready to provide descriptions to the user. Furthermore, Be My Eyes responded to a popular request by adding the ability for Be My AI to read descriptive texts aloud, even when a screen reader is inactive on the device. According to Be My Eyes, the functionality “[makes] for a more elegant and easy way to access descriptions without the need for additional software or settings.” Lastly, Be My Eyes has added the ability to save and share images, which includes alt-text, as well as support for the Shortcuts app on iOS.
As to data management, Be My Eyes has instituted a policy whereby images are deleted after 30 days. Users of course have discretion over which images they want to keep, but the default action is to purge images after a month’s time. In addition, the company notes users have the ability to opt out of having their data used to train AI models if desired. The toggle is available in an “updated and simplified” menu in Settings.
For Jatin Nayyar, HireVue Has Lived Up to Its Promise to Provide ‘More Equitable Opportunities’ to Disabled Workers
When I connected late last year with Jatin Nayyar over email for a brief interview, the 22-year-old from Marlboro, New Jersey and recent George Washington University graduate explained to me he wanted to work at employment technology company HireVue because he felt strongly he “[deeply aligns] with its mission to harness technology to unlock human potential and provide more equitable opportunities.”
On its website, HireVue boasts of “marking a new era in hiring that offers unparalleled insight into skills and potential, so we can put the human back in human resources.” For the unfamiliar, HireVue was a repeat subject of coverage at my recently-shuttered Forbes column. In October 2023, I sat down with company executives Anthony Reynolds and Patrick Morrissey. Reynolds is HireVue’s CEO while Morrissey serves as the company’s chief customer officer.
“As someone with Tourette Syndrome, I’ve always been passionate about promoting inclusivity and leveraging diverse perspectives to improve outcomes,” Nayyar said. “HireVue’s commitment to using AI to reduce bias and create fairer hiring practices resonates with my personal belief that intelligence and potential come in many forms and that everyone deserves a chance to shine, regardless of how they express themselves. I’m excited to contribute to a company that values and elevates human potential.”
According to Nayyar, HireVue provided him with several accommodations during the hiring process. He went through “an array of their solutions,” one being live and on-demand video interviewing. Tourette Syndrome, he explained, is exacerbated by stress; its symptoms can manifest “quite prominently” during job interviews. Nayyar was able to disclose his tics beforehand, on his terms, without meeting the interviewer. It was something, he told me, he appreciated very much. It made him comfortable.
“[HireVue] gave me a chance to share my voice and show what I can bring to the table,” Nayyar said.
Nayyar called the support he’s received from co-workers “great.” Every one of them he’s been in contact with has been extremely empathetic and helpful to his needs, with Nayyar noting he considers it mentorship. While Nayyar acknowledged being the youngest member of his team—not to mention being in a wholly new industry—was difficult at first, he’s nonetheless very proud to have been able to “make strides” with the steadfast love and support of his colleagues.
Looking towards the future, Nayyar hopes to continue being an advocate for mental health awareness and disability inclusion. He also looks forward to continuing as a competitive boxer and hopes to get back into the sports industry at some point down the proverbial line.
Yours Truly Is Featured In the 2024 Six Colors Apple Report Card, Talking Accessibility
Jason Snell at Six Colors posted his annual Apple Report Card earlier this week. It’s a big deal to me because not only have I written for Six Colors, I’ve also been a longtime panelist tasked with grading Apple. This year’s edition is a milestone, as it’s the series’ 10th anniversary, and I’m honored to have been part of it for much of its existence. For the uninitiated, Snell describes the Report Card as a way in which to “get a broad sense of sentiment—the ‘vibe in the room’—regarding the past year” of Apple from what he called a “hand-selected group” of people who follow the company extremely closely. There’s even a nerdy but cool data visualization of the results from this year’s survey.
Of course, my contribution to the Report Card is Apple’s performance on accessibility. What follows are my verbatim responses on the company’s progress spanning a slew of categories.
On the Mac. “I feel like the Mac is firing on all cylinders right now, thanks in large part to Apple silicon. While my daily driver machine is a 2019 Retina 4K iMac on Intel—I do have an M2 MacBook Air as well—the value proposition is off the charts, seeing it still going strong nearly 6 years after I bought it.”
On the iPhone. “The iPhone impresses every year in my eyes. I upgrade it every year not merely for journalism’s sake but because it’s my most important and oft-used computer.”
On the iPad. “I was gifted a 13″ M4 iPad Pro (with Magic Keyboard) for my birthday in September. The hardware, most especially the OLED display, is stunning. As to software, although iPadOS does exactly what I need it to do, there’s no question the pace of improvement needs to pick up.”
On Wearables. “Apple Watch, AirPods, and Vision Pro all are winners in general. The new Apple Watch’s bigger screen in a lightweight package is lovely, especially after wearing an Ultra for 2 years solely for its big screen. Vision Pro is the most accessible content consumption device I’ve ever used, but it is heavy, and the app story for streaming TV and movies feels thin other than TV+ and Disney+.”
On Home. “As an Apple user, I love the idea of HomeKit. The smart home makes tasks like turning lights on and off more accessible. What frustrates me, however, is its unreliability. I see ‘No Response’ statuses way more often than I’d like, and it sullies the overall experience.”
On Apple TV. “The Apple TV box is laughably over-engineered for what it mainly does: stream content. I appreciate the horsepower, to be sure, but the point stands. Software-wise, I’m still holding my breath for tvOS to have its iOS 7 moment. The grid of icons UI is good in some ways, but on a screen as large as a television’s, there is a lot of untapped potential. Widgets, etc.”
On Services. “I wish Apple would revisit the Apple TV branding to make it clearer. There’s the box and the streaming service, but I’d bet most people associate ‘Apple TV’ with TV+ instead of the box. The TV app adds more complexity and confusion in terms of naming.”
On World Impact. “From an accessibility standpoint, the AirPods Pro hearing aid feature has to exemplify Apple’s ethos of making the world a better place. It isn’t perfect, but the salient point is it’s tremendously significant the functionality exists at all. And it doesn’t take into account the continued work Apple did in 2024 to make its devices more accessible to the disability community. As I always say, Apple deserves more laudation in this realm than ‘gee whiz.’”
How Genomics Uses DNA And Tech To Make Healthcare Accessible To All
“[We] are united by a single vision: help people live longer, healthier lives, through the power of genomics.”
That’s what Genomics co-founder and CEO Sir Peter Donnelly told me about his company in a recent interview. He further described Genomics as a company, driven by science, which “uses large-scale genetic information to develop innovative precision healthcare tools and to bring new understanding to drug discovery for partners across the healthcare, insurance, employer, and life sciences industries.” The company, founded in 2014 with offices in Oxford, Cambridge, and London in the UK, as well as Cambridge, Massachusetts, collaborates with what Sir Peter characterized as “some of the world’s leading organizations” in an effort to help them “predict, prevent, treat, and cure disease using our proprietary algorithms and databases.” The work Genomics does, he added, goes a long way towards “reducing the human and financial cost of critical diseases like cancer, heart disease, and diabetes.”
“We [at Genomics] believe genomics should be at the heart of every healthcare service around the world,” Sir Peter said of his company’s raison d'être. “[We believe] it will be the key to personalized healthcare of the future. We are a rapidly growing and expanding organization.”
Sir Peter expounded on these sentiments, telling me he believes the potential impact of Genomics’ science and technology to be profound and “enormous.” A goal of his was to share his company’s innovations with partners willing to join Genomics on its mission to “help people live longer healthier lives, as soon as possible.” Genomics, Sir Peter said, works with companies across industries such as healthcare and insurance to provide better care. Notably, Genomics works with the National Health Service in the United Kingdom.
“We’re helping drive new possibilities in health, healthcare, and drug discovery, transforming people’s health and lives with the power of genomic prediction,” Sir Peter said. “Our mission is to help everyone make smart, proactive decisions to live healthier lives—making individuals co-pilots instead of passengers on their individual health journey.”
This column has seen a fair amount of coverage over time at the intersection of disability, technology, and healthcare. The reason for this is obvious and simple: more than any other group, disabled people oftentimes are in need of the most healthcare to sustain themselves. Just as I have written about the modern miracle known as the internet being the basis for obtaining medication more accessibly through conduits such as Amazon Pharmacy, Sir Peter and Genomics are coming at accessibility from another angle by, as he told me in citing one example, working in clinical trials with the NHS to “match the right people to the right treatment, for better patient outcomes.” Moreover, Sir Peter mentioned Genomics’ Insights test. He described it as providing “genetic risk testing as a secure, end-to-end solution based on a simple saliva sample collection that can be taken at home” which tests for diseases such as cardiovascular disease, Type 2 diabetes, breast or prostate cancer, and more. Based on the tester’s risk profile, he said, the person is given “actionable information about what to discuss with their doctor and give them helpful advice on setting diet, exercise, and other lifestyle goals.” Especially for a disabled person, perhaps someone with certain cognitive disabilities, this information is invaluable in helping them know what to discuss with their doctor(s). Likewise, that the Insights test uses one’s smartphone means, for instance, results can be read using the system’s accessibility software.
“Because our genetic risk scores combine information from millions of places in our DNA, they are largely uncorrelated with a family history of disease and with most clinical risk factors. Over 70% of people are at high risk for at least one of the conditions we assess,” Sir Peter said. “In most cases, they will have no idea about this. Our test gives someone a chance to learn their particular risks, and for them and their healthcare professionals to take action to prevent that disease well before any symptoms become apparent.”
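The genetic risk scores described here are polygenic risk scores, the “PRS technology” name-checked later in the interview. In its simplest textbook form, a PRS is just a weighted sum: for each genetic variant, multiply the number of risk alleles a person carries by that variant’s estimated effect size, then add everything up. The sketch below is illustrative only; the variant IDs and effect sizes are made up for the example, and real scores draw on millions of variants plus further statistical adjustments.

```python
def polygenic_risk_score(dosages, weights):
    # Textbook-style PRS: for each variant, multiply the count of risk
    # alleles carried (0, 1, or 2) by that variant's effect size as
    # estimated from population studies, then sum across variants.
    missing = set(dosages) - set(weights)
    if missing:
        raise ValueError(f"no effect size for variants: {missing}")
    return sum(weights[variant] * count for variant, count in dosages.items())

# Hypothetical example: two made-up variant IDs and effect sizes.
WEIGHTS = {"rs0000001": 0.20, "rs0000002": -0.10}
person = {"rs0000001": 2, "rs0000002": 1}  # allele counts from a genotype
```

For the made-up person above, the score works out to 0.20 × 2 plus (−0.10) × 1. In practice that raw number is compared against a reference population to place someone in a risk percentile, which is the kind of “actionable information” the Insights test surfaces.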
When asked about feedback from Genomics’ users, Sir Peter told me the company has been “lucky” to have received “great accolades” from partners and individuals. He pointed to a survey of MassMutual policyholders who received the aforementioned Insights test, saying 76% found the information gleaned from the test valuable while 61% reported an “improved impression” of their insurance company.
As to the future, Sir Peter believes the next 5–10 years will prove prosperous for genomics, helping it become the “gold-standard across the insurance, life sciences, and health services industries.” His company, he believes, will stand “at the forefront” of the revolution.
“By giving individuals much more precise information about the particular health risks they face, they can take the right actions, at the right time, to help prevent disease entirely, or to catch ill health early—when outcomes are much better,” Sir Peter said. “We find people are frequently under the impression that they know what diseases they are at risk of, based on hereditary factors. Through advanced genomics and PRS technology, we can know so much more about an individual’s DNA, and accurately make predictions about the potential future of their health. This puts patients in the driver’s seat of their health journey.”
How One Influencer Uses Pilates, Tech, And Empathy To ‘Empower Everyone’
Carrie Minter Ebers describes herself on her Instagram as a “glamour girl extraordinaire.” Ebers, who boasts over 400,000 followers on said Instagram, described herself in a recent interview with me as a model, Pilates instructor, and entrepreneur. In 2013, she founded Carrie’s Pilates. On its website, the company states it believes Pilates should be “fun, challenging, and empowering for everyone.” Ebers’ business has locations in California, Texas, and Canada. She said the company’s aim is to “empower individuals of all shapes and sizes to achieve their fitness goals and feel great about themselves both physically and mentally.”
In other words, Pilates ought to be accessible to everyone.
Ebers explained Pilates appeals to her because it provides what she called a “comprehensive approach” to fitness by enhancing strength, flexibility, and mental clarity. In a nod to accessibility, she added Pilates’ flexibility allows anybody, regardless of their background or ability level, to pick it up. She reiterated the inclusivity focus of her business, saying “our goal is to empower individuals to achieve their fitness aspirations and feel great about themselves.” Moreover, Ebers was candid in sharing with me that getting into fitness and wellness vis-à-vis Pilates “changed my life” as someone who has struggled with anxiety, depression, and addiction.
Accessibility indeed holds a special place in Ebers’ heart. She has a sibling who is disabled, so she told me disability inclusion is “very important” to her. As a show of her staunch support of the disability community, Ebers serves on the board of directors of Camp Summit, a residential camp for people with disabilities. According to its website, Camp Summit opened its first camp in June 2010 in the Marin Headlands, north of San Francisco over the Golden Gate Bridge, and touts offering campers “stimulating and challenging activities” led by devoted staff members who are “uniquely qualified to nurture such a community located in beautiful, picturesque settings.”
“There isn’t a lot of support for adults with special needs,” Ebers said in lamenting the lack of empathy for disabled adults. “They can only go to school for so long, and many are not functional enough to have a job.”
When asked about how technology plays a role in her work, Ebers told me it’s “integral” to the day-to-day operations of her business. Technology also is central to the design and development of the company’s proprietary Transformer 2.0 machine, not to mention its mobile-friendly website and smartphone app. Both enable people to easily book classes, check schedules, and purchase sessions. These pieces of technology, along with the proliferation of social media and other tools, help Ebers and team “support our studio owners and enhance the client experience and support our mission to make fitness accessible.”
As to feedback, Ebers called the reception to Carrie’s Pilates “overwhelmingly positive” and said the experience has a “cult-like following” amongst clients. She said clients are appreciative of the company’s high-energy trainers and the “unique combination” of high-intensity interval training, strength training, cardio, and Pilates. People have reported “significant improvements” to their mental and physical health, Ebers told me, noting the company’s retail expansion is reflective of “the growing demand and satisfaction with our programs.”
Looking towards the future, Ebers said the company hopes to expand further, with a goal of growing to more than 50 studios within the next 2 to 3 years. She added Carrie’s Pilates is committed to “continuous innovation in our workout programs and equipment to better serve our clients’ evolving needs.” As to her personal life, Ebers said she intends to continue her advocacy on both the mental health and disability awareness fronts. She’s also slated to continue working with Camp Summit, and hopes to launch a nonprofit organization which expands housing opportunities for disabled adults. The organization will not only give people with disabilities increased agency and autonomy in their own lives, but will provide much-needed respite for caretakers. This is key, Ebers told me, especially for aging adults who, frankly, need breaks.
The 2023 Apple report card
Since 2015, Jason Snell of Six Colors has commissioned a hand-selected group of Apple watchers, including yours truly, to grade the company’s performance across a slew of categories in the last year. I added commentary on software reliability, Apple TV, and, of course, accessibility.
App Store Lock-In Can Be An Advantage for Accessibility
Amidst the hoopla around the latest round of curated Vision Pro demos with press ahead of launch—more on that in a later piece—Apple has tweaked its guidelines for developers, which now allow app makers to “include a link to the developer’s website that informs users of other ways to purchase digital goods or services” using the StoreKit API’s External Purchase Link Entitlement. The change comes after the Supreme Court declined to hear Apple’s appeal in the Epic Games case, leaving intact a lower-court ruling that the company’s anti-steering rules, which barred developers from alerting users to payment methods available outside the App Store, were unlawful.
At first blush, Apple’s capitulation on the anti-steering ruling seems wholly irrelevant in an accessibility context. This inclination is mostly true—for the record, I do believe it’s not unreasonable for people to know of alternative payment methods, and I’ve long thought Apple’s rules silly and needlessly punitive—but it’s worth examining why the App Store is a great first option for a segment of users. It’s here where the accessibility-oriented argument comes into play.
As ever, the argument transcends sheer convenience, which goes hand-in-hand with ease of use.
In a nutshell, the argument is the App Store is the most accessible way to make purchases for certain members of the disability community. Payment information is stored within a person’s Apple ID profile and purchases can be authenticated using biometrics like Face ID or Touch ID. Privacy and security notwithstanding, the reality is using the App Store for purchases—which includes in-app purchases—can be far more accessible than being jettisoned off to Safari to do so on a third-party website. This is mainly for cognitive reasons; for a certain type of person, even choosing between the App Store and an outside option may prove overwhelming. It seems like a trivial thing but it isn’t: it’s highly plausible that it can be jarring (and thus worsen the user experience) for someone to start in the App Store, then need to go elsewhere to complete a purchase for an app they downloaded in the App Store. Moreover, once someone leaves the App Store, however momentarily, the question then becomes determining the user flow for checkout. Does the site accept Apple Pay? Is personal information needed? Is the website itself accessible? These considerations are not at all trivial, and all have major influence on shaping the user experience. What’s more, an external payment system may add visual and motor friction, with lots of typing and tapping, that a few taps in the App Store eliminate.
Any retort that these issues are immaterial because users aren’t forced to venture outside of the App Store misses the point. The anti-steering conversation is an entirely separate matter. The salient point is simply that, as with everything in life whether digital or physical, accessibility touches the App Store in more ways than one. In this case, the ostensible disadvantage of being “confined” to the App Store isn’t necessarily unilaterally bad for every single user. Maybe as a business owner it’s bad, but not absolutely so for a customer. To cite one personal anecdote on macOS, I’m nerdy enough and technologically savvy enough to download software from the web. I used to do it during my Windows days all the time. Given a choice, however, I generally prefer the Mac App Store because I find the process more streamlined and, pointedly here, more accessible. The same is true for my iPhone and iPad. I know the App Store integrates with accessibility features I use every day—Hover Text on the Mac, for one—whereas I’m uncertain how external storefronts would fare.
The accessibility case for the App Store (and using Apple’s StoreKit framework as a developer) is admittedly esoteric and nuanced in the context of the legal issues. Still, that doesn’t make accessibility any less noteworthy or valid. Reasonable minds can debate Apple’s stewardship of its storefront. It’s also reasonable to contend the App Store’s lock-in is not as terrible as regulators say it is. From a usability perspective, it very well can be advantageous depending on one’s needs and tolerances. Disabled people use the App Store too, but I doubt the EU thinks about them.
I don’t presume to speak for every disabled person on the planet who uses an iPhone. Likewise, I don’t have an app on the App Store. It’s just really important to note that, like with accessibility writ large, the App Store is the path of least resistance. That matters a lot for access and inclusion.
‘Echo’ reverberates as another winner for disability representation in Hollywood
Last night, I watched the first episode of Echo, the just-released Marvel miniseries airing on Disney+, and came away enthralled. I admit to not knowing much of the mythos behind the Maya Lopez character, but decided to check it out mainly because Lopez is a Deaf protagonist. It isn’t every day you see something from a major Hollywood studio feature a disabled person as the lead. It should be noted that besides being Deaf, Lopez has limb differences and uses a prosthetic leg.
Marvel has a trailer for the series posted on its YouTube channel.
I can’t wait to finish the series (it’s only five episodes). I enjoy Marvel material and superheroes—although I’m far from encyclopedic on the MCU—but what was so enthralling about Echo to me was seeing someone who looks like me feature so prominently on screen. I’ve felt this way before, when I watched CODA on Apple TV+ and came away gleeful at having watched a piece of entertainment that more or less accurately depicted my own lived experiences growing up as a CODA. It gives me immense, almost unbridled, joy to be able to watch Lopez, played by Alaqua Cox, who’s Deaf, and understand what she’s saying in American Sign Language without needing to rely on the subtitles. What’s more, it’s heartening to see the supporting cast use ASL to communicate with Lopez. All told, whatever plagues the storyline—I, for one, remain perplexed because I don’t know what exactly Echo’s powers are since they weren’t addressed off the bat—the authenticity and earnestness with which the Deaf community is portrayed is damn impressive.
If you’re a reader of my column at Forbes, you’ll know I’ve posted a lot of coverage over the years that sits at the intersection of disability, technology, and Hollywood. That’s what happens when the world’s biggest tech companies decide to leverage their massive war chests to roll their own streaming video services. I’ve long beaten the drum that disability coverage in the news media is piss-poor, what with the penchant to highlight inspiration porn about disabled people “overcoming” our own bodies. Yet on film and television, the tide is slowly but surely turning for the better. Netflix has titles such as Deaf U, Mech Cadets, and All The Light We Cannot See. That CODA won the Best Picture Oscar in 2022, along with Troy Kotsur winning Best Supporting Actor, truly was a watershed moment for disability representation. That Apple carries a host of disability-centric content on its TV+ roster—think the aforementioned CODA and See and the Michael J. Fox documentary Still, amongst others—is a direct reflection of the company’s institutional ethos to make its products accessible to and inclusive of members of the disability community. Whatever one thinks of See as a piece of art and its entertainment value, the fact it so strongly puts blindness at the forefront of the story is not at all insignificant or trivial. It’s something to applaud and to notice, even if the show itself bores you to tears or has glaringly obvious plot holes.
Echo is deserving of being treated much in the same way. It isn’t beyond reproach as a television show, but the representational gains are immensely important. It matters for disabled people.
I’d subscribe to Disney+ just to watch Echo. I highly recommend it.
Brave new world
Come May, I will mark my 11th year in the media industry. It’s a pretty surreal notion.
Prior to becoming a tech journalist, I spent more than a decade working for my then-local school district in special education classrooms of various incarnations. One of the ways I would unwind after long days with my students was coming home and writing about the latest Apple news on my WordPress blog. I used to spend my breaks during the day scouring Twitter for the latest news and analysis, then make notes to myself in my phone as to what I wanted to write about later. I remember watching Steve Jobs introduce the original iPad and the iPhone 4 during media events, and dreaming about someday being in the audience for those keynotes. All this daydreaming happened in the early years of the iPhone, circa 2008 to 2010, and I regularly published to my site—nearly every day—during that period. I kept it up during the nascent days of my reporting career in early 2013, but eventually stopped altogether as my career grew by leaps and bounds and my work was more “professional” and showing up in real, newsy publications.
All these years later, I never have forgotten where I came from. The original blog has long been gone from the web, but I still own the domain and faithfully renew it yearly. My work email is proof!
Follow my work closely enough and it’s obvious the overwhelming majority of my work has appeared on my Forbes column in the last few years. I make it unambiguous that I’m technically a freelancer and merely part of their invite-only contributor network. Because I’ve pushed so much content there, it’s gotten to the point where PR folks (and others) immediately assume I’m a Forbes staffer; the reality is, I’m very much indie. In April, my column will reach its 4th birthday. To say having my name attached to such a prestigious brand has transformed my career is the furthest thing from an overstatement. Believe it or not, I spent the first 6 years of my career covering Apple exclusively; the time doing my column has seen me widen my aperture to an exponential degree. It’s empowered me to cover a veritable all-star team of the tech industry, spanning the world’s biggest companies to the scrappiest startups and more. I’ve written about things and interviewed people—like Tony Coelho, the ex-congressman credited with pioneering the Americans with Disabilities Act, as well as Apple CEO Tim Cook—I never dreamed were remotely possible.
However grateful I am for Forbes’ role in sending my journalistic career to another stratosphere, I have felt a pull in the last several months to take firmer control of my own destiny. As a freelancer, I’ve spent literally more than a decade as my own boss, but the salient point is I increasingly crave fuller editorial independence. Forbes gives contributors a lot of latitude to act autonomously, but it has limitations. This new website, sparsely designed as it is, is an opportunity to regain full control of my writing while being yet another outlet to share my work. It feels like a rare win-win situation.
This new website I’m christening today? It’s called Curb Cuts.
In essence, Curb Cuts can best be viewed as a complementary component to my longstanding Forbes column. A sister network, if you will. There are a lot of stories I’d like to write but don’t because (a) I’m managing a one-man newsroom here; and (b) I can’t possibly cover every accessibility story on the planet. Besides, there are numerous topics adjacent to accessibility and assistive tech that, for myriad, oftentimes stupid, reasons, don’t perfectly fit the scope of my column. These tensions add lots of stress to my life, and it’s already riddled with sky-high anxiety and depression. Curb Cuts exists in part to alleviate these stressors. Now the stress becomes deciding which stories go where—but that isn’t necessarily a bad problem for me to have in 2024.
Going back to the lede, I’ve been doing this for a long time now and quite literally built my miniature empire all by myself, with my own two hands and the iMac on my desk. It hasn’t always been easy, but I’ve done it, and I feel like I’ve gained the respect and trust of a lot of people, enough to garner more eyeballs towards my newfound little experiment here. As I said, I’ll still be putting my byline towards my Forbes column—I just won’t be killing myself putting all my eggs in that basket.
Accessibility and assistive technology aren’t something to cover for the glory. My reporting will never be A1 material. It will never win me a Pulitzer. Disability coverage in mainstream news is woefully inadequate, and I like to think my work is my dent in the universe. Technology makes the world go ‘round, and it’s important to me to show the able-bodied amongst us that not everyone uses their iPhones and iPads and MacBooks in the same way. I want to show people that there’s more to critiquing products than Geekbench scores, camera comparisons, and “average users.”
As they say, hold onto your butts. I hope you’ll join me on this next chapter.