June 10, 2025
Key Points:
1. An Apple insider revealed that last year's WWDC "didn't even have prototypes for those demo features. Engineers didn't even know if they could be implemented. The only real technology in the keynote was the colorful light band effect that flashed on the screen when users woke up Siri."
2. This year's WWDC, Apple appeared even more restrained, with AI seamlessly integrated into many product features. The conference focused more on Apple's strengths in OS and design. In simple terms: Not enough AI, so UI makes up for it.
3. Why is Apple's AI lagging behind? The reason is not a single factor but a combination of technical, strategic, organizational, and cultural issues.
Author: Lin Yi
Editor: Key Point Master
At 1 AM today, WWDC 2025 arrived on schedule. This year's conference was officially titled "Sleek Peek." It must be said: Apple is a master at managing expectations.
When Apple first announced its Apple Intelligence plan last year, it propped up the new AI narrative with slick demos and slide decks. Within two days, Apple's stock surged 10%, and analysts exclaimed that Apple had finally caught the AI wave. Insiders later revealed, however, that "those demo features didn't even have prototypes. Engineers didn't know if they could be implemented. The only real technology in that keynote was the colorful light band effect that flashed on the screen when users woke up Siri."
Apple had to admit that some of its most ambitious AI features, such as the context-aware, multitasking Siri demonstrated at WWDC 2024, would take more time to ship. John Giannandrea, Apple's AI chief, who previously led Google's search and AI organizations and drove AI into its core products, was also relieved of most of his management responsibilities.
This year's WWDC, Apple appeared even more restrained, with AI seamlessly integrated into many product features. The conference focused more on Apple's strengths in OS and design. In simple terms: Not enough AI, so UI makes up for it.
Apple Intelligence is further integrated into iPhone, Apple Watch, Apple Vision Pro, Mac, and iPad.
The main advancements include:
Expanded Language Support: The generative models supporting Apple Intelligence are now more powerful and efficient and support more languages.
Enhanced Communication Tools: Features like Genmoji and Image Playground offer new ways to express creativity, letting users mix existing emojis or add descriptions to generate unique visuals. Image Playground can now also call on ChatGPT for additional image styles, with the user's permission.
Real-time Translation: Conversations can be instantly translated in Messages, FaceTime, and phone calls. Because the feature runs on-device, personal conversations stay private.
Smart Summaries and Replies: Users can quickly get the gist of emails, notifications, and notes or write replies with just a tap. In addition to Apple Pay purchases, the Wallet app can also summarize order tracking details from emails.
Visual Intelligence: Building on previous generations, Visual Intelligence has now expanded to the iPhone screen, enabling users to search, take action, and ask questions about anything they see in apps. It can even integrate with the iPhone camera for real-world object recognition and search.
Workout Buddy (watchOS 26): Leveraging Apple Intelligence, this new fitness feature provides personalized motivation during workouts, analyzes past performance, and offers real-time encouragement through dynamic, engaging voices.
Smart Shortcuts and Spotlight (macOS Tahoe): Shortcuts now include "Smart Actions" that directly leverage the Apple Intelligence model for powerful personalized automation, such as summarizing text or creating images. Spotlight has received its biggest update yet, becoming a central hub for browsing content, launching apps, and performing system and app operations, with smart suggestions and shortcuts.
From the live demos, most of these features are very practical but far from stunning.
From the perspective of AI developers, WWDC is a grand event for the global developer community. Apple released new tools this year to help them create better apps. For example, the Foundation Models framework provides direct access to large language models on the device, responding with plain text or structured Swift data. App Intents allow developers to integrate their app content and functionality throughout the system, including new shortcuts and visual intelligence experiences.
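For context, App Intents is an existing Swift framework (introduced with iOS 16); what's new this year is that intents surface in more system places, such as the new shortcuts and visual intelligence experiences mentioned above. A minimal intent looks roughly like the sketch below; the intent name and the summarization logic inside `perform()` are hypothetical placeholders, not anything Apple showed.

```swift
import AppIntents

// Exposes one piece of app functionality to the system.
// Once declared, it can appear in Shortcuts, Spotlight actions,
// and other system surfaces without extra glue code.
struct SummarizeNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Summarize Note"

    // Parameters are strongly typed; the system builds UI for them.
    @Parameter(title: "Note Text")
    var noteText: String

    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        // Hypothetical stand-in logic; a real app would call its own
        // summarizer (or an Apple Intelligence model) here.
        let summary = String(noteText.prefix(100))
        return .result(value: summary)
    }
}
```

Because the system discovers intents at build time, declaring one like this is usually all that's needed for the action to show up across the OS.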
Additionally, opening up local models is a key advancement in Apple's AI ecosystem. Developers can now directly invoke built-in AI models on iPhone, iPad, and MacBook devices, which have performance levels close to GPT-3.5. Their core advantage lies in supporting fully offline operation and being completely free for developers, significantly lowering the barrier to entry for AI app development. This is undoubtedly very important for Apple to participate in AI technology and ecosystem competition.
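As a rough illustration of what invoking the built-in model might look like for a developer, here is a minimal Swift sketch using the Foundation Models framework described in the keynote. The type and method names (`LanguageModelSession`, `respond(to:generating:)`, the `@Generable` and `@Guide` macros) follow Apple's announced API but may differ in the shipping SDK, so treat this as a sketch rather than reference code.

```swift
import FoundationModels

// A structured output type: the framework can fill this in directly
// instead of returning free-form text, which is the "structured Swift
// data" response mode mentioned above.
@Generable
struct QuizQuestion {
    @Guide(description: "A short question based on the user's notes")
    var question: String
    @Guide(description: "The correct answer")
    var answer: String
}

func makeQuiz(from notes: String) async throws -> [QuizQuestion] {
    // A session wraps the on-device large language model:
    // no network round-trip, no cloud API cost, works offline.
    let session = LanguageModelSession(
        instructions: "You generate study quizzes from lecture notes."
    )
    let response = try await session.respond(
        to: "Create three quiz questions from these notes:\n\(notes)",
        generating: [QuizQuestion].self
    )
    return response.content
}
```

Because the model runs entirely on the device, a call like this works even in airplane mode, which is the property that lowers the barrier to entry for AI app development.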
Once a pioneer, why is Apple now lagging behind in AI? By the prevailing industry view, the cause is not any single factor but a combination of technical, strategic, organizational, and cultural issues.
First, Siri's flawed technical architecture. Although Siri started early, its closed architecture planted problems for later development. As the codebase grew bloated and entangled, adding new features became exceptionally difficult and slow. In an era of large models that demand rapid iteration and flexible extension, this "first-mover advantage" turned into a heavy burden, leaving Siri struggling to compete. It is also the biggest reason the new Siri has had such a difficult birth.
Second, strict privacy policies. Apple's long-standing pride in its privacy principles, especially the on-device processing strategy of "data never leaving the device," has inadvertently become a constraint on its AI development. Unlike competitors such as Google and Amazon, which optimize models with massive amounts of cloud data, Apple's reluctance to retain large-scale user interaction records has left it with a severe shortage of AI training data.
Third, turbulent internal organization. The Siri team has undergone multiple reorganizations, with a wavering direction and a long-term lack of effective collaboration with the company's core machine learning research team, resulting in severe internal friction. The senior decision-making layer also underestimated the disruptive potential of generative AI for a time, leading to insufficient resource investment. It wasn't until competitive pressure loomed that they hastily caught up. This top-down judgment error and strategic swing ultimately harmed the product iteration rhythm and market credibility.
Fourth, a conservative innovation culture. Apple's culture of extreme secrecy has become a shackle on innovation in the AI field. Compared to competitors like Google and Meta that actively embrace open-source communities and academic exchanges, Apple's closed nature has caused it to miss opportunities to absorb top external talent and cutting-edge ideas. At the same time, the limited openness of its ecosystem (such as SiriKit) has restricted the expansion of AI application scenarios, failing to fully leverage the power of global developers to build a thriving intelligent ecosystem.
From Steve Jobs unveiling Siri in the hope of ushering in a new era of human-computer interaction, to executives admitting strategic mistakes and scrambling to catch up more than a decade later, Apple's AI journey has swung dramatically from a high start to a low ebb. Apple has now recognized its shortcomings and is fully committed to catching up, determined not to miss the next bus. With its unmatched brand appeal, strong ecosystem stickiness, and distinctive commitment to privacy and humane experiences, a comeback is not impossible. We'll wait and see.
Below is a partial transcript of the 2025 WWDC Keynote by Tim Cook and Apple executives:
Tim Cook: Good morning, and welcome to WWDC. Here's Craig.
Apple Intelligence
First, Apple Intelligence. We are opening up access so that any app can directly tap into the large language models on the device, the core of Apple Intelligence, through the new Foundation Models framework. This allows developers to directly access powerful, fast, private, and even offline intelligent capabilities. For example, if you're preparing for an exam, you can use an app like Kahoot! to create personalized quizzes based on your notes, making learning more engaging. And because it uses an on-device model, there are no cloud API costs.
We are introducing a universal design across platforms for the first time, starting with a new expressive material we call Liquid Glass. It transforms based on your content and even your context, making navigation and controls clearer. It refracts light and dynamically responds to your movements with specular highlights. Elements once designed for rectangular displays have been redrawn to be perfectly concentric with the rounded corners of the hardware.
The lock screen has been updated: the time and controls are now made of Liquid Glass, with beautiful glass edges and a new sense of responsiveness, whether you use Dark Mode or one of our new styles, like the All Clear look. I'll switch to my photo shuffle wallpaper, and watch as I switch images: the time now flexibly adapts to the available space. Its unique San Francisco typeface is meticulously designed to dynamically scale the thickness, width, and height of each digit to blend into the scene. When new information appears, such as a message from a friend or an email from a colleague, it still preserves the best part of my photo and view. And using advanced computer vision running on the Neural Engine, we can generate spatial scenes from your 2D photos, creating delightful 3D effects that bring your favorite memories to life as you move your iPhone in your hand.
Next, let's talk about the camera. A simplified, streamlined design highlights your two most frequently used capture modes, Photo and Video. Just swipe left or right to reveal other modes like Cinematic and Portrait, and reach all settings, like aspect ratio and timer, with a simple upward swipe. Photos now has separate tabs for your Library and Collections, where you can find your favorites and albums and easily access Search. In Safari, web pages now extend edge to edge all the way to the bottom of the screen, so you can see more content, and the tab bar has been redesigned to float above the page and surface frequently used actions, such as Search and Refresh.
During calls, your most important controls now float in the bottom right corner, seamlessly receding when you don't need them. We've also redesigned the FaceTime onboarding page. Now, it's a space that celebrates your closest relationships with beautifully personalized contact posters.
CarPlay Updates
This year, we've made some significant updates to make CarPlay even more beautiful and easier to use, including icons that look great in both light and dark modes. We've also added a compact design so you can still see on-screen content, like upcoming routes, when you receive a call. As well as tap-to-return and pinned conversations in Messages. iOS 26 also gives you widgets in CarPlay. They're a quick and at-a-glance way to get information and Live Activities, so you can stay on top of what matters to you, like your friend's flight status. For developers, widgets and Live Activities you develop for iPhone can also be used in CarPlay. All these updates are also available for CarPlay Ultra. CarPlay Ultra allows you to choose the layout and design of the information that matters most to you and adds vehicle controls for features like radio and climate to the CarPlay experience.
iOS 26
The Phone app is foundational to the iPhone experience. This year, we're offering a new unified layout option that integrates your Favorites, Recent Calls, and Voicemails. Your favorites are always prominently displayed, so you can start a call with just a tap. Your recent calls and voicemails now appear in a convenient list below, so you can easily scroll through and see who you've contacted and what they've said. And with Apple Intelligence, new voicemail summaries surface the most important information.
Now, when you get a call from an unknown number, Call Screening can help you quickly determine whether it's important or just another annoying telemarketer. Call Screening automatically and silently answers calls from unknown numbers in the background; once the caller gives their name and reason for calling, your phone rings. (Ringtone) You can review their message and decide to answer or ignore. And the next time you're stuck on hold changing a flight or dealing with a utility, the Phone app automatically detects hold music and asks if you'd like it to wait on hold for you. While you wait, the music stops but the call stays connected, so you can keep using your iPhone, or even put it away and get on with your day. Once a live agent becomes available, we'll call you back and let the agent know you're on your way. Another way we keep you connected is through Messages.
This year, we're giving you an exciting new way to personalize these shared spaces with backgrounds that show off your personality. When you choose a background, it displays for everyone in the conversation. Group chats also have some age-old problems, so we're introducing polls. You can create polls for anything, and with Apple Intelligence, Messages can detect when a poll might be useful and suggest it, for example, if someone asks, "Where should we go for our annual trip?" Anyone in the group can contribute new options and watch the live voting results. You can now also request and send Apple Cash directly from a group chat to settle up for concert tickets or dinner bills. We've brought typing indicators to group chats, so you know who's about to join the conversation. And to help you cut through the noise, we're introducing the ability to filter messages from unknown senders, so your conversation list isn't cluttered with spam.
Beyond that, you can mix two emojis together to create something new, like a sloth and a lightbulb. And when you're the last one in a group chat to get the joke, you can change your Genmoji's expression, making your friend look shocked or making them laugh. You can also adjust hairstyles and more, and there are new ChatGPT styles, like oil painting, that you can apply to a friend's contact poster.
Live Translation is integrated into Messages, FaceTime, and Phone, and is powered by Apple-built models that run entirely on the device. On a FaceTime call, for example, you can keep hearing Grandma's voice while reading real-time translated captions. (Speaking in a foreign language) And when you're on a phone call, your speech is translated and the translation is read aloud to the person on the other end.
[Voiceover] Hello, are you available to cater a wedding on December 6th? (Soft music) (Speaking in a foreign language)
In Apple Music, we've added lyrics translation to help you understand the meaning behind your favorite songs. And with lyrics pronunciation, everyone can sing along, no matter the language. (Upbeat music) We've also introduced Auto Mix, which uses smart technology to seamlessly mix one song into the next like a DJ, with perfect timing for tempo stretching and beat matching. (Upbeat music) With Music Pins, just pin your favorite artist's album or playlist to the top of your library.
Let's talk about Maps. Your iPhone will now learn your preferred routes and offer them in Maps alongside other options. With the Maps widget, you can check your commute route before you leave to see how long it will take. We're introducing visited places. You can choose to have your iPhone detect when you're somewhere, like a restaurant or store, and see your visited places in the Maps gallery. Or just tap to search and share.
Next up is Wallet. With driver's licenses and IDs in Wallet, we've created the most private and secure way to present your identity. This is currently available in nine states and Puerto Rico, and starting this fall, you can use your U.S. passport to create a digital ID. It doesn't replace your physical passport, but it can be used at supported TSA checkpoints for domestic travel, as well as in apps and in person for age and identity verification. At the airport, you'll also love the updated boarding passes in Wallet, which now include convenient indoor maps to help you navigate the terminal quickly, along with shareable real-time flight status and Find My bag tracking, so loved ones know when you've landed.
Apple Pay
In addition to using rewards and installments when shopping online with Apple Pay, you can now choose to redeem points or pay in installments when shopping in person. And with Apple Intelligence, Wallet can now identify and summarize order tracking details even beyond Apple Pay orders: it finds emails sent by merchants or delivery carriers and conveniently surfaces your order details, progress notifications, and more.
Introducing the Game app, where you can see the most relevant content, like updates and can't-miss events in the games you're playing. On the Library tab, you can see all the games you've downloaded from the App Store. Head to the Play Together tab to see what your friends are playing, compare scores and achievements, and invite them to play.
We're extending visual intelligence to your iPhone screen, so you can search and take action on whatever you're looking at in your apps. Say I open a social media app, see a great jacket, and want to know where I can buy something like it. At the top, I have screenshot controls like markup and share; at the bottom, I have visual intelligence tools, like the image search button in the bottom right, which I can tap to find similar images on Google or in other apps I frequently use. Or say I come across an event I'm interested in and want to add it to my calendar. Visual intelligence makes this easier than ever: notice the Add to Calendar suggestion at the bottom. Apple Intelligence extracts the date, time, and location and prefills them. I tap the event, and it's that simple.
watchOS 26
Apple Watch is indispensable, and with watchOS 26, we're continuing to push it forward. It features a gorgeous new Liquid Glass design, bringing a more expressive experience to the Smart Stack, the numerals on the Photos face, Control Center, and in-app navigation and controls. watchOS also introduces an exciting new fitness feature.
Workout Buddy. Workout Buddy is pretty incredible. It takes in all your workout data in real time and quickly analyzes your vast fitness history. Combined with Apple Intelligence on your iPhone, it provides just the right encouragement as you start your run, and Workout Buddy can be activated with a tap.
[Voiceover 2] Great start to your run. This is your second run this week, and you're doing great. Six days in a row closing that Move ring. Let's build on that achievement with today's run. You'll be hearing some tracks from Mura Masa. It acknowledges your effort after you finish your workout. Great run. Your average pace was nine minutes and seven seconds per mile over 6.3 miles.
It's personal, private, and will launch with the most popular workout types in English. Features like custom workouts and race routes are now just a tap away, so you can easily create custom workouts with work and recovery intervals. For your music, you can keep it simple and let Apple Music choose the best playlist based on the workout you're doing and the music you love, or choose from playlists and podcasts recommended based on what you've listened to during that type of workout, all surfaced as smart suggestions.
This year, the Smart Stack improves its predictive algorithms by fusing on-device data like points of interest, location, and more, so it can accurately predict which features might be useful to you right now. When you walk into the gym for your regular morning workout, a Smart Stack hint appears subtly, and when you open the Smart Stack, the widgets are ready for you to start your workout. Or, when Apple Watch detects you're in a remote location without a connection, a hint appears to start Backtrack and support you while you're off the grid.
We've also made Apple Watch more proactive by enhancing notifications. You can already use Silent Mode to avoid loud alerts at inopportune moments, but now Apple Watch can interpret the ambient noise in your environment and automatically adjust the volume of incoming calls and notifications. And when you receive a notification you want to deal with later, the new wrist-flick gesture dismisses it and takes you back to the watch face. You can also use wrist flick to silence incoming calls, timers, and alarms, or even dismiss the Smart Stack.
The Messages app gets even better with Apple Intelligence Live Translation, making it easier to stay in touch with loved ones who speak another language: just raise your wrist. Conversation context enhances the experience too: based on the content of your messages, you can now quickly perform new actions powered by on-device machine learning, like sharing your location via Find My when someone asks where you are. We're also giving you a new way to quickly jot down important things, now available on Apple Watch; it's perfect when you want to save a quick note or view existing ones. watchOS also introduces new APIs for developers. The Smart Stack can intelligently display a widget, like the Slopes widget when you arrive at a ski resort, using the improved location API, and now Dark Noise can add custom controls to Control Center, so you can turn on sleep soundscapes without picking up your iPhone.
tvOS 26
With tvOS 26, we're making Apple TV more enjoyable than ever. With layered design and new specular highlights, app icons look more vibrant, bringing depth and detail to every edge. You'll immediately notice how the play controls refract the content below, perfectly complementing the action without distracting from the story. This unobtrusive design carries through to the Control Center, and as you browse, you'll see a bold new look and exquisite movie poster art. Now, you can set your Apple TV to display your profile when it wakes from sleep. For developers, tvOS also introduces a new API for automatic sign-in that links app sign-ins to your Apple account. This means you no longer need to sign in on every Apple device. Apple Music is another popular choice. Singing along with Apple Music is even better with tvOS. Your iPhone becomes the microphone, amplifying your voice through the TV and creating visual effects that light up the big screen. Everyone can join in with their own iPhone and add songs to the queue, react with on-screen emojis, or take turns singing along with their favorite artists. (Upbeat music)
macOS Tahoe
Let me introduce you to macOS Tahoe. Widgets, the Dock, and app icons have been refined with Liquid Glass, and sidebars and toolbars reflect the depth of your workspace, providing subtle cues as you scroll so you can easily reach your content. The menu bar is now fully transparent, making your display feel larger, and there are more ways to customize which controls appear in the menu bar and Control Center and how they're laid out. You can also change the color of folders and add symbols or emojis to give them unique identities.
Let's talk about the power of Continuity. We're bringing Live Activities to the Mac, so if you place an Uber Eats order on your iPhone, the Live Activity also appears in the menu bar. When you click on it, the app opens in iPhone Mirroring so you can take action directly on your Mac. We're also enhancing the phone experience by bringing the Phone app to the Mac. You get convenient access to familiar content like recent calls, contacts, and voicemails, synced from your iPhone, and can make calls with just a click. Shortcuts help people get more done faster every day, and in macOS Tahoe, you can run them automatically, say, at a specific time of day, or when you perform an action like saving a file to a folder or connecting to a display. With Apple Intelligence, we're also introducing Smart Actions. For example, if you're a student, you can set up a shortcut to compare your lecture recordings with your notes and add any points you missed. You can access Apple Intelligence models on-device, with Private Cloud Compute, or using ChatGPT.
Spotlight
Spotlight is getting its biggest update ever. It puts your most relevant files within easy reach, including smart suggestions based on your daily routine or what you're currently working on. We're also bringing system and app actions to Spotlight: you can take hundreds of actions, from creating events to starting a recording, even playing a podcast. You can even fill in parameters for actions. Say you want to send an email: you can compose the message, set the recipients, and add a subject, all without taking your hands off the keyboard. Plus, you can access your apps' menu bar items directly from Spotlight. And we're introducing quick keys: for example, just type sm to send a message, or ar to add a new reminder, and then just type the reminder. Now, I'll copy an image to the clipboard.
Let me paste that in. Perfect. Next, I'll search for background removal and find the page menu item; I just hit Return, and there we go. I can now get my clipboard history from Spotlight; it's on the right. This is great for going back to images, text, or links I copied earlier in the day. And Spotlight knows I regularly add documents to this idea board, so it offers that as an action here. All set, the files are queued up.
The new Games app elevates the gaming experience on the Mac, too. Click the controller button to bring up the new game overlay, where you can adjust the most important system settings in-game. macOS Tahoe also introduces Metal 4, which brings next-generation rendering technologies to games, like frame interpolation, denoising, and more.
visionOS
Let's start with some amazing new spatial experiences in visionOS, like widgets. visionOS now has beautiful new widgets: Clock, with unique and refined face designs; Weather, which adapts to today's forecast; and Music, for quick access to your favorite tunes.
All of this and more can be found in the new Widgets app. iOS introduced us to spatial scenes, and they're coming to visionOS too.
They use a new AI algorithm that leverages computational depth to create multiple perspectives from your 2D photos, making an image feel more immersive. Spatial scenes will also make web browsing more engaging: just select spatial browsing to transform supported articles, hiding distractions and revealing vivid inline photos as you scroll. We first released Personas as a beta feature to represent you on video calls; with visionOS 26, Personas take a huge leap forward and represent you far more authentically. Now you and your friends can watch the latest movie or play spatial games together. This is great for businesses too, as enterprise users can collaborate in apps like Dassault Systèmes' 3DLive to visualize 3D designs in person or with remote colleagues.
With visionOS 26, organizations can easily share a common pool of devices among team members: you can securely save your eye and hand data, vision prescription, and accessibility settings to your iPhone, so you can quickly get started on a shared team device, or on a friend's Vision Pro as a guest user. The new protected content API is essentially a view-only mode, ensuring that only those with access can see confidential materials like medical records or business forecasts. We're also excited to share the Logitech Muse, purpose-built for Vision Pro, which unlocks a whole new creative workflow, letting you draw and collaborate precisely in three dimensions in apps like Spatial Analog. Sony is bringing support for PlayStation VR2 Sense controllers to visionOS 26. And there are more ways to create amazing new content: the all-new Adobe app for visionOS, powered by Premiere, lets you edit and preview spatial videos directly on Vision Pro, and we're partnering with GoPro, Insta360, and Canon to support native playback of 180-degree, 360-degree, and wide field-of-view videos on visionOS.
iPadOS 26
iPadOS 26 is a major release, and this is what you've all been waiting for: the new windowing system on iPad. When I first open an app, it's full screen. I can swipe home and open another, and it's still full screen; if you like using your iPad this way, it's just as simple as before. But now I can also smoothly resize apps into windows with this grab handle in the bottom right corner. If I've resized an app before, it reopens at exactly the same size and position, and I can place it wherever I want. It's designed to work perfectly with touch or trackpad.
I also have a more precise and responsive pointer now. When I hover over these new buttons in the top left corner, they expand into familiar controls I can use to close or minimize my windows. Another way to arrange windows is tiling. Designed around the unique qualities of iPad, I just flick a window to the edge to tile it, and there's even a little grabber here to resize both windows at the same time. If I click and hold on the window controls, I see more tiling options. Let's go with this one. I just swipe to see the Home Screen, and tapping an app opens it along with all my previous windows for a quick look at everything I have open. And with Exposé, I swipe up from the bottom and hold, my windows spread out, and I tap to bring the window I need to the front. We're also adding a menu bar on iPad. It's always accessible from the top of the screen, arranged into familiar, clearly labeled options, and a great way to quickly find the functions you need.
The new windowing architecture allows us to bring it to every iPad running iPadOS 26, including iPad and iPad mini.
We're supercharging the Files app. It now includes features like an updated list view that shows more detail, with resizable columns and collapsible folders. It also has the same folder customization options as macOS, with custom colors and icons that sync between devices, making it easy to find the folder you need. You can now choose which app opens a file and set it as the default, like choosing Photoshop, VSCO, or Darkroom for images. And to make your files even easier to reach, you can now put folders in the Dock: just drag any folder from the Files app directly into the Dock, and when you tap one, it expands to show the files inside.
To help with that, we're bringing Preview, a beloved app from macOS, to iPadOS. It includes familiar tools to edit images and export them in different formats and sizes. We're also enabling more sophisticated audio and video workflows. To make it easy to specify which microphone to use, we've added an audio input selector, and we offer Voice Isolation no matter what app you're using, keeping your sound clear and crisp. Local Capture lets you create amazing content with high-quality recordings directly from your iPad.
It works with any video conferencing app you love. Just turn it on in Control Center, and your own local high-quality audio and video are captured to your iPad, with echo cancellation for other participants' audio to keep your voice front and center, so you can clearly record your side of the call. Once it's over, each person using local capture can easily share their own audio and video files. When you need to edit and export content, we've made this all much easier by enabling background tasks on iPad. If you start an extended process like an export and switch to another app, the task will continue to run. Background tasks appear as Live Activities, so you always know what's running and always have full control.
We don't have time to cover everything today, like advanced 3D graphics in Math Notes, a new reed pen that brings a traditional calligraphy experience, and the Journal app, which lands on iPad with Apple Pencil support. Icon Composer helps developers and designers create layered icons that look stunning in light, dark, tinted, and clear appearances, with intuitive visualizations of how light interacts with Liquid Glass. We're also bringing more generative intelligence to Xcode: predictive code completion becomes more accurate by using a larger context of your code, and we're expanding our vision for Quick Help, letting developers interact with their code using natural language directly within Xcode.
Our new operating system versions will be released today as developer betas, public betas will be available next month, and will roll out to all users this fall.
Cook: Thank you all for joining us today. We have a big week ahead. Let's make it a great WWDC. (End)