Apple made a big splash at WWDC 2023 as it introduced its first major new product since the Apple Watch: the Vision Pro spatial computing headset. But of course, we also got software announcements for iOS 17, iPadOS 17, watchOS 10, and macOS 14 Sonoma.
Though I feel that iOS 17 is an overall underwhelming update compared to recent releases like iOS 14 and iOS 16, there’s still a lot of cool stuff coming. The developer beta is out now, and people have been diving into all that iOS 17 has to offer so far. And you know what? There’s plenty of great stuff — including a few things Apple didn’t even mention during the keynote.
Here’s what I’m really looking forward to trying in iOS 17 when it comes to my iPhone this fall.
Changes to ‘ducking’ autocorrect
OK, this one was mentioned in the keynote, and it’s pretty significant because I’m sure that everyone’s come across autocorrect quirks at some point (if not multiple times daily).
Apple has made significant improvements to the language model behind the keyboard to make autocorrect more accurate than ever, and it will temporarily underline a corrected word so you know it was changed. If the correction was a mistake, you can quickly tap the underlined word to revert it.
Hopefully, this alleviates everyone’s qualms with “ducking” autocorrect.
AutoFill verification codes from Mail
One of the best iOS features in recent years has been the ability to automatically fill in one-time verification codes sent over SMS with a single tap in Safari. It’s painful when whatever service I’m trying to log into sends the code through email, however, because that means I have to go into my email, copy the code, then go back to Safari and paste it in (or type it in manually if paste isn’t supported for some reason).
Now, iOS 17 is finally adding the ability to AutoFill one-time verification codes that arrive by email in the Mail app. I’m really looking forward to not having to open up the Mail app for those codes anymore.
Speaking of one-time verification codes, I’m pretty sure your Messages app is full of threads that are just a bunch of verification codes, right? I know mine is. And with iOS 17, now that AutoFill supports codes from Mail as well, you may forget about the ones piling up in your email inbox, too.
Thankfully, iOS 17 will also add an auto-delete for any codes that end up in your Messages or Mail apps after you insert them with AutoFill. That’s one less thing to worry about and less digital clutter. What’s not to like?
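On the developer side, apps opt a login field into this system AutoFill by tagging its content type. A minimal SwiftUI sketch of how that looks — the view and field names here are my own illustration, not Apple sample code:

```swift
import SwiftUI

// Hypothetical code-entry screen: marking the field as .oneTimeCode is what
// lets iOS offer a verification code from Messages (and, with iOS 17, Mail)
// as a one-tap QuickType suggestion above the keyboard.
struct CodeEntryView: View {
    @State private var code = ""

    var body: some View {
        TextField("Verification code", text: $code)
            .textContentType(.oneTimeCode) // opts into system code AutoFill
            .keyboardType(.numberPad)
    }
}
```

The auto-delete behavior itself is handled by the system; apps don’t need to do anything beyond tagging the field.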
Interactive widgets
When Apple let you add widgets to the home screen in iOS 14, I was excited … until I realized they’re really nothing more than glorified app icons. Sure, they display information, but you can’t interact with widgets in any way, shape, or form other than launching the associated app.
Now, iOS 17 (and iPadOS 17) changes that, as Apple is taking a page out of Android’s playbook and finally giving us the interactive widgets we’ve been dreaming of. Check off a task as complete in your to-do app, control audio playback, and more — all without leaving the iPhone home or lock screen.
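Under the hood, the interactivity comes from the App Intents framework: a widget embeds a button that fires an app intent instead of deep-linking into the app. A rough sketch of the to-do example, with `TaskStore` standing in as a hypothetical shared data store:

```swift
import AppIntents
import SwiftUI

// Hypothetical shared store; a real app might back this with App Group storage.
final class TaskStore {
    static let shared = TaskStore()
    func complete(id: String) { /* mark the task as done */ }
}

// The intent runs when the widget button is tapped — without opening the app.
struct CompleteTaskIntent: AppIntent {
    static var title: LocalizedStringResource = "Complete Task"

    @Parameter(title: "Task ID")
    var taskID: String

    init() {}
    init(taskID: String) { self.taskID = taskID }

    func perform() async throws -> some IntentResult {
        TaskStore.shared.complete(id: taskID)
        return .result()
    }
}

// In the widget's SwiftUI view, Button(intent:) is what makes the tap work.
struct TaskRowView: View {
    var body: some View {
        Button(intent: CompleteTaskIntent(taskID: "groceries")) {
            Label("Mark done", systemImage: "checkmark.circle")
        }
    }
}
```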
NameDrop and improved AirDrop
AirDrop is a feature that I use daily for both work and my personal life. Honestly, it’s one of the reasons why I continue to primarily use Apple products — I wish getting photos from my Android devices to my Mac were as easy (yes, I use Google Photos for that, but AirDrop is still easier).
Though I’m not sure how often I need to share my contact information with someone else in person, NameDrop looks fun and cool. It’s a modernized version of Bump! — an iOS and Android app from 2009 that let users share contact info, photos, and files between devices just by bringing them close to each other. AirDrop as a whole is also being streamlined to work the same way as NameDrop, and transfers will continue over Wi-Fi to finish a larger download when you step out of AirDrop range. Yes, please.
Personal Voice
I personally hate my voice and how it sounds. But I’m still excited about Personal Voice, an accessibility feature that Apple previewed earlier this year and will be launching with iOS 17.
With Personal Voice, you can create a personalized digital replica of your voice using AI and machine learning. This is done by recording about 15 minutes of audio of your voice through a series of random prompts. Personal Voice is used with the Live Speech feature, letting users communicate with others in FaceTime, the Phone app, and other apps through type-to-speak.
It’s an accessibility feature that’s designed to help those who are at risk of losing their ability to speak, such as people who have been diagnosed with amyotrophic lateral sclerosis (ALS) and other conditions. This is also the kind of technology that would be incredible to have for being able to hear the voice of a loved one who’s no longer with us.
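For developers, Apple is exposing this through AVFoundation’s speech APIs: an app can ask permission to use the person’s Personal Voice and, if granted, speak with it like any other synthesis voice. A hedged sketch based on the new iOS 17 additions:

```swift
import AVFoundation

// Keep the synthesizer alive so speech isn't cut off when it goes out of scope.
let synthesizer = AVSpeechSynthesizer()

// Ask for permission to use the user's Personal Voice, then speak with it.
func speakWithPersonalVoice(_ text: String) {
    AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
        guard status == .authorized else { return }

        // Personal voices show up alongside the system voices,
        // flagged with the .isPersonalVoice trait.
        let personalVoice = AVSpeechSynthesisVoice.speechVoices()
            .first { $0.voiceTraits.contains(.isPersonalVoice) }

        guard let voice = personalVoice else { return }
        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = voice
        synthesizer.speak(utterance)
    }
}
```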
Offline maps in Apple Maps
These days, it’s hard for me to find time to do a long road trip (ah, that parenting life). But in the past, when I took a day to drive out to San Francisco or Las Vegas, I relied heavily on Apple Maps to get me where I needed to be. But of course, when you’re traveling, you may not always have a good cellular signal the entire time.
Google Maps has had offline maps for what seems like forever. Now, Apple Maps will finally support offline maps with iOS 17, which is something a lot of people weren’t expecting. Offline maps will make it easier to prepare for a long drive where a signal may not always be available.
Pets in the Photos app
If you have a pet, then you probably take a lot of photos of them. We have a Siberian Husky, and I can confirm — I have a lot of dog photos. My sister also has two miniature pinschers and a Doberman, and yes, I also take a lot of photos of them.
Now, pets will finally be recognized in the Photos app, just like people. With iOS 17, the “People” album becomes the “People and Pets” album, and iOS 17 can recognize when an animal is important to you based on the number of photos you have of it. From what I’m hearing, it’s also pretty good at distinguishing between similarly colored animals. However, it does seem to be limited to cats and dogs for now.
I’m excited for our husky, Wolf, to get his own album on my iPhone!
Expanded Visual Look Up
One of the cool hidden features in iOS recently is Visual Look Up. With this, iOS can identify popular landmarks, statues, art, plants, animals, and more in your photos. I use it often to check on what that cool plant or flower is when I’m talking about a photo in my camera comparisons, for example.
Visual Look Up will now expand to identify food in your photos and, if it’s able to, give you a recipe for something similar. You can also use Visual Look Up to get information about subjects in a photo when you remove the background, and it even works with video now.
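Developers can hook into related machinery through VisionKit’s image analysis APIs, which since iOS 16 can surface Visual Look Up results on an image view. A rough UIKit sketch — the image view and function names are illustrative, not from Apple’s samples:

```swift
import UIKit
import VisionKit

// Attach Visual Look Up to an image view so users can tap recognized
// subjects (plants, pets, landmarks, and now food) for more information.
@MainActor
func enableLookUp(on imageView: UIImageView, image: UIImage) async throws {
    let interaction = ImageAnalysisInteraction()
    imageView.addInteraction(interaction)

    let analyzer = ImageAnalyzer()
    let configuration = ImageAnalyzer.Configuration([.visualLookUp])
    let analysis = try await analyzer.analyze(image, configuration: configuration)

    interaction.analysis = analysis
    interaction.preferredInteractionTypes = [.automatic]
}
```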
Better search in Messages
I use Messages a lot. It’s my primary way of staying in contact with my husband, family, and close friends, as well as some old co-workers and acquaintances. I use Messages to have silly conversations and serious chats, and to just send cool links or share ideas about what to check out next time we go out.
With all of the messages that I send and receive daily, I often have to look back for something specific. Right now, search in Messages is kind of a mess. But iOS 17 brings new search filters, allowing you to combine keyword searches to help narrow down the results, kind of like in the Photos app. I look forward to being able to find specific messages even faster.
A level in the Camera app
An Android feature that has been incredibly helpful, at least with Pixel phones like the Pixel 7a, is the integrated level feature in the Camera app. When you attempt to take a straight-on photo, a line appears in the middle that will show whether the scene in the viewfinder is straight or crooked, making it easy to correct.
While iOS has long had a leveling aid if you use the Grid mode, iOS 17 separates the level so you can use it without the grid. It appears that Apple’s version will show a broken horizontal line in the middle, and as you adjust your camera, the line turns yellow once the shot is level.
It’s a simple little feature that will be helpful when trying to get the perfect straight shot.