Last year, Apple showcased exciting advancements in artificial intelligence at WWDC. This year, the focus shifted away from AI and towards updates across its operating systems, services, and software. Among these updates are a fresh design dubbed “Liquid Glass” and a new naming scheme that numbers each operating system by year, starting with iOS 26.
Still, Apple managed to catch attention with some AI updates. Let’s look at what’s new and interesting.
Visual Intelligence
Apple’s Visual Intelligence is an AI tool that helps you learn more about what you see. It can identify plants, restaurants, or even clothing. With the upcoming iOS 26, it will also work with whatever is displayed on your iPhone’s screen. For example, while scrolling through social media, you can search Google or ask ChatGPT for details about an image you come across.
To use Visual Intelligence on what is on your screen, press the same buttons you normally use for a screenshot, or launch it from the Control Center or a customized Action button.
ChatGPT in Image Playground
Apple has integrated ChatGPT into the Image Playground app, making it easier to create images in styles such as “anime” or “watercolor.” You can also describe what you want in a prompt to get more tailored results.
Your Personal Workout Coach
Apple’s new AI workout coach is designed to motivate you while exercising. The app uses voice technology to encourage you throughout your workout, offering feedback on your pace and heart rate. After you finish, it summarizes your results, making it feel like you have a personal trainer by your side.
Live Translation Feature
A new live translation feature allows you to communicate in real-time, whether through Messages, FaceTime, or phone calls. This technology translates spoken words or text into your preferred language instantly. During FaceTime calls, you’ll see live captions, and phone conversations are translated aloud.
Enhanced Call Features
Apple has rolled out two helpful features for phone calls. The first is call screening, which answers calls from unknown numbers and tells you the caller’s name and reason for calling before you pick up. The second, hold assist, detects hold music when you call a support line, keeps your place in the queue, and lets you get on with other things until a live agent picks up.
Polls in Messages
The Messages app now supports polls, and Apple Intelligence can suggest one based on the conversation. If you and your friends can’t decide where to eat, for example, it may propose starting a poll.
AI Shortcuts
The Shortcuts app is becoming smarter with AI. Users can pick an AI model, whether it runs on the device, in Apple’s private cloud, or via ChatGPT, and use it in a shortcut for tasks like summarizing text, making common workflows easier and quicker.
Spotlight Updates
Spotlight, the search tool for Macs, is getting a contextual upgrade. It will now understand your typical actions better, providing suggestions tailored to what you’re currently doing.
Foundation Models for Developers
Apple is opening its on-device AI models to third-party developers. The new Foundation Models framework lets programmers add AI features that run entirely on the device, so they keep working offline and stay within Apple’s privacy model, positioning Apple as a competitor in the burgeoning AI landscape.
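To give a sense of what that looks like in practice, here is a minimal Swift sketch of asking the on-device model for a summary. It assumes the session-style API Apple demonstrated for the Foundation Models framework; names like LanguageModelSession and respond(to:) are taken from those demos and may differ in the shipping SDK.

```swift
// Minimal sketch of calling Apple's on-device model through the
// Foundation Models framework. API names follow Apple's WWDC demos
// and should be treated as assumptions until checked against the SDK.
import FoundationModels

func summarize(_ text: String) async throws -> String {
    // A session wraps the on-device language model; no network is needed.
    let session = LanguageModelSession(
        instructions: "You are a concise assistant that summarizes text."
    )
    // Send the prompt and wait for the generated response.
    let response = try await session.respond(to: "Summarize: \(text)")
    return response.content
}
```

Because the model runs locally, a call like this works without a network connection, which is the framework’s main selling point for developers.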
Siri’s Setback
One disappointing takeaway from the event was the continued delay of Siri’s upgrade. Attendees were hoping for the long-promised AI-powered Siri, but Apple’s software chief, Craig Federighi, said there would be no more news until next year. This raises questions about Siri’s future in a market that is evolving rapidly.
In summary, while Apple may have toned down its focus on AI, the new features show a commitment to making everyday tasks easier through technology. As these advancements roll out, they can significantly enhance user experience across various platforms.
For more details on these updates, you can check out Apple’s official announcements.