markwmsn wrote:
The recognition of flowers, animals, landmarks, and such has been in Photos since before the introduction of Apple Intelligence.
Exactly. Since macOS Sierra (version 10.12, released in 2016), Photos has been scanning our photos automatically to recognize objects and classify them into categories like concert, sporting event, etc. Face recognition was already available in Aperture and iPhoto. What is new is the introduction of AI in the editing tools, like Clean Up, and in image generation, like Image Playground. These tools require a lot of processing power and work better on the newer models with dedicated hardware.
I am not surprised that Photos is making heavy use of AI tools. Photos is available on nearly all Apple platforms (iPhone, iPad, iPod Touch, Mac, Vision Pro, and on iCloud.com), and one design principle for Photos has been the desire to make the user interface similar on all platforms for a unified experience. The second design principle is code sharing: using the same frameworks on all platforms whenever possible, for easier maintenance and better compatibility.
This is a very hard requirement, given the different display sizes and methods of interaction. You may have noticed that Apple has been very frugal with menu commands. Apple is encouraging us to use Siri instead of the graphical user interface. The most common tasks in structuring our Photos libraries are now done automatically by Photos using AI, and there are many predefined collections, so we do not need to create them ourselves; on most platforms there are not even tools to do that on our own. On a Mac we still have smart albums and keywords in Photos, but they have never come to iPhone, iPad, or Vision Pro. There we are limited to the predefined collections and the search based on the AI results. I am using them as a starting point and building my own view of the library on top of them, with folders, albums, smart albums, and keywords, and syncing the structure from my Mac to the other devices with iCloud Photos.
Even the layout is based on the AI results (since macOS 10.15 Catalina). If we do not disable it, we see only a curated selection of the photos, cropped and tiled, and cannot even tell from the cropped square whether a photo is a panoramic shot or a portrait photo. The tiled "Days" view is currently pretty useless for me, as none of the photos is shown the way I cropped it for the best composition. I wish the developers and designers would show more respect for the artistic taste of their customers.