One of the recurring tasks for pet owners is keeping an eye on their animals at home while they are away.
At WWDC 2023, Apple introduced a new API and framework that can be used to track an animal’s movements around the house with an iPhone mounted on a motorized stand.
This is made possible by the DockKit framework, which allows developers to “create incredible photo and video experiences with motorized supports.” Apps built on DockKit can “automatically track live video subjects across a 360-degree field of view, take direct control of the mount to customize framing, directly control motors, and provide their own inference model for tracking other objects.”
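As a rough sketch of what taking direct control looks like, an app can turn off the stand’s built-in subject tracking and react as accessories dock. The API names below follow Apple’s WWDC 2023 DockKit session as best recalled and should be treated as assumptions, not a definitive implementation:

```swift
import DockKit

// Sketch: switch from the system's automatic subject tracking to
// app-driven control, and observe motorized stands as they connect.
// Names are based on the WWDC 2023 DockKit session; verify against the SDK.
func takeControlOfDock() async throws {
    // Disable built-in tracking so the app can frame shots itself.
    try await DockAccessoryManager.shared.setSystemTrackingEnabled(false)

    // React to docking and undocking events.
    for await stateEvent in try DockAccessoryManager.shared.accessoryStateChanges {
        if let accessory = stateEvent.accessory, stateEvent.state == .docked {
            print("Docked to \(accessory)")
            // From here the app can customize framing or drive the motors directly.
        }
    }
}
```

This requires a physical DockKit-compatible accessory, so it only runs meaningfully on an iPhone mounted on such a stand.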
When used in combination with what Apple calls Animal Body Pose API, you can virtually turn an iPhone into a standalone pet tracking camera.
The Animal Body Pose API is designed to recognize animals and identify their poses. It builds on Apple’s previous work on human body pose detection and currently works only for cats and dogs.
In a video from the session, Apple engineer Nadia Zouba explains that the Animal Body Pose API can detect which animal is present in a given image or video and the pose it adopts. To do this, the API identifies 25 reference points on the animal’s body and uses them to draw a digital skeleton.
In other words, the API can recognize different animal postures, such as stretching, standing up to beg for a treat, or running away from a threat. Its potential applications could also include things like a dog food dispenser that activates “by recognizing the animal and sensing its pose.”
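The detection flow described above maps onto a new request type in Apple’s Vision framework. A minimal sketch is shown below; the image URL is a placeholder and the 0.5 confidence threshold is an illustrative choice, not an Apple recommendation:

```swift
import Vision

// Sketch: run animal body pose detection on a still image.
// `imageURL` is a placeholder supplied by the caller.
func detectAnimalPose(in imageURL: URL) throws {
    let request = VNDetectAnimalBodyPoseRequest()
    let handler = VNImageRequestHandler(url: imageURL)
    try handler.perform([request])

    // Each observation carries up to 25 recognized joints for a cat or dog.
    for observation in request.results ?? [] {
        let joints = try observation.recognizedPoints(.all)
        for (name, point) in joints where point.confidence > 0.5 {
            // Locations are normalized (0...1), origin at the bottom-left.
            print(name.rawValue, point.location)
        }
    }
}
```

An app could feed the resulting joint positions into its own logic, for example triggering a treat dispenser when the skeleton matches a sitting pose.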
The API is supported on iOS 17, iPadOS 17, tvOS 17, and macOS Sonoma. It is not known whether Apple will release a pet tracking app of its own, but developers can use the available tools to build their own pet camera.
This comes in addition to the other pet-related features Apple has announced this year. The People album in the iOS 17 Photos app is now called People & Pets because it also recognizes your pets.