Apple's Vision Pro went on sale last week, and Adobe has introduced two new native apps for the AR headset – Lightroom and Firefly AI.
In a press release, Adobe announced it has developed a new Firefly experience designed specifically for the headset's visionOS. The new app lets users generate images and place them onto real-world surfaces such as walls and desks.
The Firefly visionOS app's interface will feel familiar to anyone who has already used the web-based version of the tool.
To generate images, users can simply enter a text description in the prompt box at the bottom and click on the "generate" button. The app will then display four different images, which can be dragged out of the main app window and placed around the home like virtual posters or prints.
The Firefly AI model used in the new visionOS app is designed to be commercially safe. It applies a Content Credentials "nutrition label" to generated images, embedding metadata that transparently marks them as AI-generated. Adobe also said it plans to add features that will let users generate "wrap-around panoramas, 360-degree environments, and more" for the visionOS Firefly experience.
When the Apple Vision Pro was announced last June, Apple highlighted a native Adobe Lightroom photo editing app. The visionOS Lightroom experience is similar to the iPad version but with a simpler, cleaner interface, which should make it easier to navigate with hand gestures than the more complex desktop software.
The Vision Pro app lets users view wide-frame panoramic photos and videos and offers a new way to edit and scale images, with more features to come in the future.