Theoretically, it should be possible to migrate mobile apps to Apple Vision Pro’s operating system, visionOS.
After all, an estimated 80% to 90% of the code for visionOS, and possibly more, is the same as for iOS, Michał Pierzchała estimated during a recent React Native podcast. Pierzchała is the head of technology at the software engineering consultancy Callstack. The foundation for both operating systems is UIKit, explained Oskar Kwaśniewski, a Callstack software engineer who works on the project.
That said, when Callstack partnered with Matt Hargett, founder of Rebecker Specialties, to create a React Native app for Vision Pro, they discovered there wasn’t a way to use the framework on visionOS. So Callstack decided to create a fork of React Native that could. Out of that fork, Callstack built a new open source, out-of-tree platform for visionOS that lets developers retarget applications to Vision Pro with full support for the platform SDK.
“The most notable feature of Apple Vision Pro is, of course, this immersive space where you have multiple apps being next to each other,” Kwaśniewski, who worked on the fork, told The New Stack. “This is the vision that we will bring to React Native so that users can easily mirror apps, and take the full space of Apple Vision Pro users.”
Quirks of Developing for visionOS
It helps to think of visionOS as having three types of spatial content: windows, 3D volumes and spaces. Windows are rectangular boxes that hold traditional views and controls and can include some 3D content. Volumes are more like big cubes that can show 3D experiences viewable from any angle. Spaces are, well, everything else. By default, apps launch into the Shared Space, where they sit side by side, much like multiple apps on a desktop.
A window and a volume, from Apple’s visionOS page.
“Apps can use windows and volumes to show content, and the user can reposition these elements wherever they like,” the Apple visionOS site explains. “For a more immersive experience, an app can open a dedicated Full Space where only that app’s content will appear. Inside a Full Space, an app can use windows and volumes, create unbounded 3D content, open a portal to a different world, or even fully immerse people in an environment.”
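The three content types above can be sketched as a small data model. This is purely illustrative: the type names below are hypothetical and are not Apple’s or the fork’s actual API, though the "mixed," "progressive" and "full" immersion styles mirror the options visionOS offers for a Full Space.

```typescript
// Illustrative model of visionOS's three spatial content types.
// These names are hypothetical, not a real API.
type SpatialContent =
  | { kind: "window"; widthPt: number; heightPt: number } // 2D views and controls
  | { kind: "volume"; sizeM: [number, number, number] } // 3D, viewable from any angle
  | { kind: "fullSpace"; immersion: "mixed" | "progressive" | "full" }; // dedicated space

function describeContent(content: SpatialContent): string {
  switch (content.kind) {
    case "window":
      return `bounded window, ${content.widthPt}x${content.heightPt} pt`;
    case "volume":
      return `volume, ${content.sizeM.join("x")} m`;
    case "fullSpace":
      return `Full Space with ${content.immersion} immersion`;
  }
}

console.log(describeContent({ kind: "window", widthPt: 1280, heightPt: 720 }));
console.log(describeContent({ kind: "fullSpace", immersion: "full" }));
```

The discriminated union makes the trade-off concrete: a window is bounded and 2D-first, a volume is bounded but fully 3D, and a Full Space takes over the user’s surroundings at a chosen level of immersion.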
Apple Vision Pro is a mixed reality or extended reality headset, as opposed to virtual reality (which offers a fully immersive experience). Mixed reality is more like a virtual overlay, allowing the user to interact with their surroundings. The headset allows a user to use hands, voice and eyes to navigate, all of which is handled by the system layer and does not need to be programmed by the developer, Kwaśniewski and Pierzchała shared during the podcast.
Vision Pros are still hard to come by; Callstack was still on a waitlist as of the January podcast, so Kwaśniewski has developed only on a simulator. He said the OS highlights the element a user looks at and selects it, and the element can then be activated with a finger motion, much like clicking on a trackpad.
Alternative Options
The team did look at Flutter as a possible option, but chose React Native because it’s a more established framework with a bigger share of the apps shipped in the Apple App Store and Google Play, Kwaśniewski said. The React Native platform will let those developers extend their mobile apps to visionOS, he added.
NativeScript is another option for creating Vision Pro apps. It offers visionOS support for multiple JavaScript frameworks, including React, Angular and Svelte, allowing developers to use JavaScript with Apple’s SwiftUI. However, React Native is easier and faster by comparison, Pierzchała said.
Building the Out-of-Tree Platform
Kwaśniewski compared forking React Native for visionOS to Microsoft’s fork of React Native for Windows.
“It allows you to take the same code base that you’ve had, that you’ve already written in well-known JavaScript, and you run it on macOS and Windows, and we also allow you to run it on visionOS,” he said.
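As with the Windows and macOS forks, an out-of-tree platform typically ships as its own package that stands in for the core react-native dependency. A plausible package.json fragment is shown below; the package name and version here are assumptions based on the fork being published under Callstack’s npm scope, so check the project’s README for the exact name and supported React Native versions.

```json
{
  "dependencies": {
    "react": "18.2.0",
    "react-native": "npm:@callstack/react-native-visionos@^0.73.0"
  }
}
```

The `npm:` alias keeps the import path `react-native` unchanged in application code, which is what lets the same JavaScript code base retarget to a new platform without rewrites.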
In building the React Native visionOS framework, the team leveraged another Apple framework, SwiftUI, to bridge the gap between React Native and the visual aspects of Vision Pro, he added.
The team found it wasn’t possible or even necessary to migrate all the React Native code. For instance, there are APIs that just don’t make sense on the Vision Pro.
“The API to retrieve information about the current user screen doesn’t make sense for this platform, as the screen is one centimeter from our eyes and we can’t access the screen that user is looking through; but for visuals, we use windows to display stuff,” he said. “That’s why there are some common issues that mostly library developers and maintainers need to solve in order to get their library working for this platform.”
Only libraries using native code need migration; JavaScript-only libraries work out of the box, according to Kwaśniewski’s January post detailing how to migrate libraries. The framework is designed to address some of these challenges, of course, and Kwaśniewski volunteered that any developer converting a library can reach out to him for help.
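For library authors, the migration described above often amounts to branching on the platform and disabling APIs that assume a physical screen. The sketch below is a self-contained illustration: the inline `Platform` object stands in for react-native’s real `Platform` module, and the `"visionos"` string is an assumption, so verify the exact platform value against the fork’s documentation.

```typescript
// Self-contained stand-in for react-native's Platform module.
// The "visionos" value is an assumption for illustration.
const Platform = { OS: "visionos" as "ios" | "android" | "visionos" };

// Example: a screen-metrics helper that has no meaning on a headset,
// where content lives in windows rather than on a physical screen.
function getScreenScale(): number | null {
  if (Platform.OS === "visionos") {
    return null; // no physical screen to query; callers fall back to window metrics
  }
  return 2; // stand-in for a native pixel-ratio lookup on iOS/Android
}

console.log(getScreenScale()); // null while Platform.OS is "visionos"
```

Returning a sentinel like `null` (rather than throwing) lets existing call sites degrade gracefully, which is usually what a cross-platform library wants when one platform simply lacks the concept.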
Correction: Story updated Jan. 28, 2024 to correct Pierzchała’s last name.
Loraine Lawson is a veteran technology reporter who has covered technology issues from data integration to security for 25 years. Before joining The New Stack, she served as the editor of the banking technology site Bank Automation News. She has...