With iOS 19 and iPadOS 19 arriving this fall, Apple is preparing to introduce support for brain-computer interfaces (BCIs), a major step toward enabling mind-controlled interaction with its devices. Though the functionality is designed primarily for users with severe mobility impairments, it marks a significant shift in how consumer technology and neural interfaces might converge in the years ahead.
BCI integration will allow users with compatible brain implants to control their iPhone, iPad, or Apple Vision Pro using only their thoughts. Apple's update extends the existing Switch Control accessibility protocol to accept BCI inputs, so direct neural signals can stand in for physical switches or voice commands. According to Apple's announcement, the expansion is part of a broader effort to make its ecosystem more accessible to people with profound physical disabilities.
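Apple has not published a public API for this, but conceptually the mapping is straightforward: a decoded neural event stands in for a switch press. The Swift sketch below is purely illustrative; SwitchAction, NeuralIntent, and handle are hypothetical names, not part of any Apple framework.

```swift
import Foundation

// Hypothetical sketch: how a decoded BCI "motor intent" might be mapped
// onto Switch Control-style actions. All types here are illustrative only.

/// The small set of discrete actions a scanning interface needs.
enum SwitchAction {
    case select          // activate the highlighted item
    case moveNext        // advance the scanning cursor
    case movePrevious    // move the scanning cursor back
}

/// A decoded neural event, as a BCI driver might deliver it.
struct NeuralIntent {
    let kind: SwitchAction
    let confidence: Double   // decoder certainty, 0.0...1.0
}

/// Forward sufficiently confident intents to the scanner; drop the rest.
func handle(_ intent: NeuralIntent, threshold: Double = 0.8,
            dispatch: (SwitchAction) -> Void) {
    guard intent.confidence >= threshold else { return }  // reject noisy decodes
    dispatch(intent.kind)
}

// Example: a high-confidence "select" passes through, a weak one is ignored.
handle(NeuralIntent(kind: .select, confidence: 0.93)) { action in
    print("Dispatching \(action)")
}
handle(NeuralIntent(kind: .moveNext, confidence: 0.41)) { action in
    print("Dispatching \(action)")   // never reached
}
```

The confidence threshold reflects why this fits Switch Control so well: the protocol only needs a handful of discrete, unambiguous events, which is a forgiving target for a noisy neural decoder.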
The primary partner in this rollout is Synchron, a neurotechnology company that developed the Stentrode, a stent-like brain implant that converts neural activity into digital signals. Unlike devices that require open-brain surgery, the Stentrode is delivered through the jugular vein and rests inside a blood vessel adjacent to the motor cortex. Its array of 16 electrodes detects motor intent, which is then translated into UI commands such as selecting icons or navigating interfaces.
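Synchron's actual decoding pipeline is proprietary, but the basic idea, reducing multi-channel electrode readings to a discrete intent decision, can be sketched in a few lines. Everything below is a toy model under assumed data shapes; real decoders use far more sophisticated signal processing than a simple power threshold.

```swift
import Foundation

// Toy illustration of the decoding step: 16 electrode channels are
// reduced to a single motor-intent decision by thresholding signal power.

let channelCount = 16

/// One sampling window of raw voltages, one array per electrode.
typealias ElectrodeWindow = [[Double]]

/// Mean signal power across all channels in the window.
func meanPower(_ window: ElectrodeWindow) -> Double {
    let channelPowers = window.map { samples in
        samples.reduce(0) { $0 + $1 * $1 } / Double(samples.count)
    }
    return channelPowers.reduce(0, +) / Double(channelPowers.count)
}

/// Decide whether the window contains an attempted movement ("motor intent").
func detectsIntent(_ window: ElectrodeWindow, threshold: Double = 1.0) -> Bool {
    meanPower(window) > threshold
}

// Example: a synthetic quiet window vs. one with strong activity.
let quiet: ElectrodeWindow = Array(repeating: Array(repeating: 0.1, count: 250),
                                   count: channelCount)
let active: ElectrodeWindow = Array(repeating: Array(repeating: 1.5, count: 250),
                                    count: channelCount)
print(detectsIntent(quiet))   // false
print(detectsIntent(active))  // true
```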
Synchron has already tested the device in a small group of patients under an FDA investigational device exemption. While the Stentrode has not yet received full regulatory approval in the U.S., the company says it has been implanted in ten individuals since 2019. These early trials show promise for people with conditions such as ALS (amyotrophic lateral sclerosis) who cannot rely on conventional input methods.
Apple's collaboration makes Synchron the first neurotechnology company to achieve native integration with Apple's BCI-compatible Human Interface Device (HID) profile. The system also supports bidirectional communication: Apple devices can send contextual UI data back to the implant system to improve decoding accuracy and the user experience. This closed-loop approach could make interactions more efficient and personalized than traditional assistive devices, which merely emulate basic input hardware.
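The BCI HID profile itself is not public, so the following Swift sketch only illustrates the shape of such a closed loop under assumed message types (UIContext, BCICommand, and ContextAwareDecoder are all hypothetical): the OS pushes UI context to the decoder, and the decoder uses that context to suppress commands that would make no sense in the current interface state.

```swift
import Foundation

// Sketch of a closed loop between device and implant system, with
// invented message types standing in for the unpublished BCI HID profile.

/// Context the OS could feed back to the implant system.
struct UIContext {
    let scannableTargets: Int     // how many items the scanner can land on
    let scanningActive: Bool      // whether a scan is currently in progress
}

/// Commands the decoder may emit.
enum BCICommand { case select, advance, idle }

/// A decoder that adapts its decision rule to the current UI context.
struct ContextAwareDecoder {
    var context = UIContext(scannableTargets: 0, scanningActive: false)

    /// Device -> implant direction of the loop.
    mutating func update(context newContext: UIContext) {
        context = newContext
    }

    /// Implant -> device direction: decode, but only emit commands that
    /// make sense given what the UI reported.
    func decode(intentScore: Double) -> BCICommand {
        guard context.scanningActive, context.scannableTargets > 0 else {
            return .idle    // nothing to act on; suppress spurious output
        }
        return intentScore > 0.8 ? .select : .advance
    }
}

// Example round trip: context arrives, then a confident intent is decoded.
var decoder = ContextAwareDecoder()
decoder.update(context: UIContext(scannableTargets: 6, scanningActive: true))
print(decoder.decode(intentScore: 0.9))  // select
```

The design point is the guard clause: feeding UI state back to the decoder lets it discard outputs that a context-blind HID emulator would have passed through, which is one plausible way the described closed loop could improve accuracy.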
While BCI integration may sound futuristic, Apple is positioning the feature as a practical accessibility solution rather than a novelty. For now, the technology will remain limited to clinical settings. Broader consumer applications, such as controlling a device by thought alone without a medical need, are still years away, constrained by regulatory hurdles, implant safety, and ethical considerations.
Still, the groundwork is being laid. With Apple integrating BCI at the OS level and Synchron demonstrating viable hardware and real-world use cases, this development may mark the beginning of a new accessibility paradigm in which thought-driven interfaces become part of mainstream device interaction, especially for those who most need alternatives to physical input.
