
WWDC 2025: Visual Intelligence Turns Screens into Smart Spaces
Apple’s Worldwide Developers Conference (WWDC 2025) took place on 10 June, streamed live from Apple Park in Cupertino, California, and turned out to be less about flashy hardware and more about deliberate software refinement, powerful AI integrations, and a bold new design language Apple calls Liquid Glass.
The keynote drew developers, tech enthusiasts, and investors eager to see how Apple would respond to the rapid evolution of generative AI and the growing design expectations of its user base. “This year, we’re not just upgrading features—we’re elevating the entire Apple experience,” said Tim Cook, setting the tone for the keynote.
Let’s take a look at the most important updates from WWDC 2025.
Visual intelligence: Apple’s most tangible AI leap
While last year was all about playing with generative features under the Apple Intelligence umbrella, this year’s spotlight belongs firmly to Visual Intelligence: the ability to interact with what you see on your screen. From identifying objects in screenshots to summarising content and triggering context-aware actions, it is Apple’s most tangible and widely integrated AI experience yet. With Visual Intelligence, users can:
- Tap on screenshots to identify objects, translate text, or create calendar events.
- Use Genmoji and image generation within Messages and Notes.
- Recognise content in any app and get instant actions, such as saving, searching, or sharing.
- Run everything on-device, or via Private Cloud Compute for intensive tasks.
It powers everything from photo recognition and live translations to app-specific automation, making it the most visible expression of Apple’s intelligence strategy in 2025.
iPhone: AI-powered everyday use
The iPhone gains the most direct benefits from Apple Intelligence. From Visual Intelligence, which lets users tap into screenshots and images to perform actions, to real-time Live Translation in phone calls, Apple made the iPhone smarter and more helpful. Siri now integrates with ChatGPT (powered by GPT-4o), letting users generate stories, answer questions, and get summaries from web content. Features like Hold Assist and Call Screening upgrade the Phone app, making everyday calls less frustrating and more productive.
- Ask Siri to summarise web articles or emails.
- Generate bedtime stories, email replies, or text drafts.
- ChatGPT integration is opt-in and privacy-first.
- Responses can be contextual, creative, or research-driven—based on your request.
Key iPhone apps that got smarter:
- Phone: New features include Hold Assist, which waits on hold for you during customer service calls, and Call Screening, which transcribes the caller’s message live so you can decide whether to answer.
- Messages: Users can now create polls, apply text effects, and use AI to translate or summarise conversations in real time. It supports Genmoji for playful personalisation and deeper Siri suggestions.
- Mail: AI tools summarise emails and suggest smart replies. Writing Assist helps users rephrase or expand messages, especially useful in professional communication.
- Photos: Redesigned to include Visual Intelligence, the app now recognises objects and people, auto-tags content, and allows users to search using natural language.
- Safari: Users can summarise web pages with Apple Intelligence, translate content instantly, or generate quick previews of articles or reports.
- Notes and Calendar: Automation and smart suggestions make planning more efficient. For example, Visual Intelligence can recognise dates or tasks from screenshots and offer calendar integration.
These updates make the iPhone more than a smart device—it becomes a context-aware companion that reduces friction in daily tasks.
macOS 26 (Tahoe): Intelligence for workflows
macOS 26, codenamed Tahoe, introduces a more intelligent and fluid desktop experience. It brings AI-driven file search, Mail summaries, and smart automation throughout native apps. The redesigned Control Centre and smarter Notes and Calendar integration create a more seamless workflow.
The Photos app has been redesigned to match its iOS counterpart, featuring Visual Intelligence search and auto-tagging. Apple Intelligence is deeply embedded across macOS to speed up repetitive tasks, draft text, and enable cross-app actions.
“macOS 26 is designed for flow. It’s fast, smart, and deeply integrated,” said Craig Federighi, Apple’s senior vice president of Software Engineering.
iPadOS 26: Towards a Mac-like future
iPadOS 26 narrows the gap between the iPad and Mac by introducing windowed multitasking, enhanced keyboard shortcuts, and full external display support. With Liquid Glass and Apple Intelligence, the experience becomes smoother and smarter. Visual Intelligence also works in Split View and Slide Over, allowing users to tap on items, translate text, or set reminders, all from within running apps. “With iPadOS 26, we want to give users the power of a Mac, with the freedom of touch,” said Federighi.
Apple Watch: Now with more brainpower
watchOS 26 brings the spotlight to Workout Buddy, a personalised coaching assistant that uses Apple Intelligence to provide real-time feedback, motivation, and progress tracking.
- Offers tailored motivation based on your recent performance.
- Adapts your fitness goals and intensity dynamically.
- Provides spoken encouragement during workouts.
- Integrated with Apple Health and Siri for contextual updates.
Gesture-based controls are more intuitive—users can now flick their wrist or pinch to navigate apps without touching the screen. This makes workouts, calls, and messages more accessible on the go.
Home and tvOS 26
Apple didn’t spend long on tvOS 26, but it did get updates under the hood. Users can expect faster responses from Siri, better AirPlay continuity, and a redesigned Home app UI with shared profiles and suggested automations based on user routines. Scene recognition and AI-driven alerts enhance home security integrations with compatible smart devices.
AirPods and audio upgrades
AirPods Pro now support voice isolation during video recording, making them ideal for creators. Users can also nod to answer calls or shake their head to reject them—new motion gestures that bring hands-free control to everyday tasks. AirPods can now act as remote shutters for the iPhone camera, giving creators more flexibility when shooting.
Vision Pro and visionOS 26: Small upgrades, big potential
visionOS 26 builds on last year’s Vision Pro debut. While no new hardware was introduced, Apple added several meaningful software enhancements that enrich the spatial computing experience. Support for PlayStation VR2 controllers allows for more immersive gaming and navigation. The smoother Mac Virtual Display now enables better multi-window productivity for professional workflows. But the standout is clearly Spatial Widgets, which bring glanceable, floating UI components right into your environment.
- Lets users pin widgets around their space for glanceable updates.
- Integrates Calendar, Reminders, and Fitness apps in your environment.
- Works in both work and home modes for adaptive utility.
New SDKs allow developers to create shared spatial experiences, ideal for education, training, and collaborative design.
Other major highlights
Beyond the major platform updates, Apple introduced several noteworthy developments that shape the broader vision of WWDC 2025. These include support for developers, naming standardisation, and hardware compatibility guidelines—all critical for understanding the roadmap ahead.
Developer highlights: Opening the AI doors
For the first time, developers can access foundation models through Apple Intelligence APIs. This unlocks new capabilities for third-party apps. They can now summarise long-form content like articles or documents with precision. Apps can suggest replies or responses based on user behaviour and context, making interactions smarter. Perhaps most notably, developers can incorporate Visual Intelligence to analyse visual content within app interfaces—turning images and screenshots into actionable insights.
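For illustration, a call into the new on-device model might look like the Swift sketch below. The `FoundationModels` framework and `LanguageModelSession` API reflect what Apple previewed for developers; treat the exact names and signatures as provisional until the final SDK documentation ships.

```swift
import FoundationModels

// A hypothetical summarisation helper built on the Foundation Models
// framework Apple previewed for third-party developers.
func summarise(_ articleText: String) async throws -> String {
    // A session wraps the system-wide on-device language model,
    // configured here with standing instructions.
    let session = LanguageModelSession(
        instructions: "Summarise long-form content in two or three sentences."
    )
    // Request a response; processing happens on-device.
    let response = try await session.respond(
        to: "Summarise the following article:\n\(articleText)"
    )
    return response.content
}
```

Because the model runs on-device, with Private Cloud Compute as a fallback for heavier tasks, the article text in this sketch never needs to leave the user’s device.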
Etsy and TripIt showcased how their apps now integrate Visual Intelligence to enhance product discovery and travel planning. “The ability to meet shoppers right on their iPhone with Visual Intelligence is a meaningful unlock,” said Etsy’s CTO.
Name game: Year-based OS naming convention
Apple introduced a consistent naming scheme across its platforms: iOS 26, iPadOS 26, macOS 26, watchOS 26, tvOS 26, and visionOS 26. This streamlines product understanding and reflects a more unified ecosystem.
A word on performance and compatibility
Apple Intelligence features require an iPhone with the A17 Pro chip or newer, or a Mac or iPad with an M1 chip or newer. Older devices will still receive other OS features, but not the full AI suite. “We want to ensure every interaction is fast, reliable and private,” said Federighi.
Beta timeline and public release
Developer betas are live now. Public betas roll out in July 2025. The official release will follow in September alongside new hardware.

Distilled
WWDC 2025 didn’t bring mind-blowing gadgets or a reinvented Siri—yet. But it showed Apple at its most mature and methodical. From Liquid Glass UI to the cross-platform rollout of Apple Intelligence, every move was designed for stability, security, and smart integration. Apple chose polish over hype—and in doing so, reminded everyone why it leads not just in design, but in trust.
If WWDC 2024 teased Apple’s AI ambitions, then WWDC 2025 quietly but firmly laid the first real foundation. The future is being built, glass by glass, line by line, and now, smartly so.