What is WWDC?
🗣 The Apple Worldwide Developers Conference (WWDC) is an information technology conference held annually by Apple Inc., usually at Apple Park in California. The event is used to showcase new software and technologies in the macOS, iOS, iPadOS, watchOS, and tvOS families, as well as other Apple software, and brings together third-party developers who build apps for iPhones, iPads, Macs, and other Apple devices. Attendees can also participate in hands-on labs with Apple engineers and attend in-depth sessions covering a wide variety of topics.
The first-ever WWDC was held in 1983, coinciding with the introduction of Apple Basic, but it wasn’t until 2002 that Apple started using the conference as a major launchpad for new products; it eventually became Apple’s primary media event of the year.
The conference is held annually from Monday to Friday for one week in June and consists primarily of a keynote address, presentation sessions, one-on-one “lab” consultations, and special get-togethers and events. The presentations cover programming, design, and other topics and range from introductory to advanced. Almost all regularly scheduled presentations are delivered by Apple employees and streamed live.
During the event, the annual Apple Design Awards celebrate apps and games that excel in the categories of Inclusivity, Delight and Fun, Interaction, Social Impact, Visuals and Graphics, and Innovation. There are also digital lounges where Apple engineers and designers host text-based Q&As, session watch parties with the presenters, community icebreakers, and more. Finally, the event encourages cross-over work through the Apple Developer Forums, where anyone can sign in with an Apple ID to ask questions and share their thoughts.
WWDC 2022: Still no glasses?
🗣 Fans expressed their disappointment as Apple’s WWDC event ended without a single mention of a mixed reality headset. While Apple was widely rumored to be revealing its first smart glasses, the company instead gave details on its latest iPhone and iPad software updates, as well as a new MacBook Pro. But still, no glasses.
Now, Apple’s software is very good, generally speaking. Even as the company has spread its focus among more platforms than ever, those platforms have continued to be excellent. It’s been a while since there was any Apple Maps-style fiasco; the biggest mistakes Apple makes now are much more on the level of putting the Safari URL bar on the wrong part of the screen.
What all that success and maturity breeds, though, is a sense that Apple’s software is… finished — or at least very close. Over the last couple of years, the company’s software announcements at WWDC have been almost exclusively iterative and additive, with few big swings. Last year’s big iOS announcements, for instance, were some quality-of-life improvements to FaceTime and some new kinds of ID that work in Apple Wallet. Otherwise, Apple mostly just rolled out new settings menus: new controls for notifications, focus mode settings, and privacy tools.
Let’s be clear: this is not a bad thing! Neither is the fact that Apple is the best fast-follower in the software business, remarkably quick to adapt and polish everybody else’s new ideas about software. Apple’s devices are as feature-filled, long-lasting, stable, and usable as anything you’ll find anywhere. Too many companies try to reinvent everything all the time for no reason and end up creating problems where they didn’t exist. Apple is nothing if not a ruthlessly efficient machine, and that machine is hard at work honing every pixel its devices create.
But the truth is, we’re at an inflection point in technology that will demand more from Apple. It’s now fairly clear that AR and VR are Apple’s next big thing, the next supposedly earth-shakingly huge industry after the smartphone.
AR will demand software that does more but also gets out of the way more
🗣 Apple has been showing off AR for years, of course. But all it’s shown are demos, things you can see or do on the other side of the camera. We’ve seen very little from the company about how it thinks AR devices are going to work and how we’re going to use them. The company that loves raving about its input devices is going to need a few new ones and a new software paradigm to match. That’s what we were supposed to see this year at WWDC, though we didn’t.
Last year, Apple showed that you could take a picture of a piece of paper with your iPhone and it would automatically scan and recognize any text on the page. Live Text is an AR feature through and through: it’s a way of using your phone’s camera and AI to understand and catalog information in the real world. The whole tech industry thinks that’s the future — that’s what Google’s doing with Maps and Lens and what Snapchat is doing with its lenses and filters. Apple needs a lot more where Live Text came from.
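For developers, the building block behind this kind of camera-to-text feature is Apple’s Vision framework. Here’s a minimal sketch of recognizing text in a captured image; the `cgImage` input and the `recognizeText` function name are assumptions for illustration, not Apple’s Live Text implementation itself:

```swift
import Vision

// A minimal sketch of on-device text recognition in the spirit of Live Text.
// `cgImage` is an assumed input — in a real app it would come from the camera
// feed or a captured photo.
func recognizeText(in cgImage: CGImage) throws -> [String] {
    var results: [String] = []
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        // Keep the top candidate string for each detected text region.
        results = observations.compactMap { $0.topCandidates(1).first?.string }
    }
    request.recognitionLevel = .accurate // favor accuracy over speed

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request]) // synchronous for still-image handlers
    return results
}
```

Everything runs on-device, which is presumably part of why Apple leans on features like this: the camera-plus-AI pipeline works without sending images to a server.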
From a simple UI perspective, one thing AR will require is a much more efficient system for getting information and accomplishing tasks. Nobody’s going to wear AR glasses that send them Apple Music ads and news notifications every six minutes, right? And full-screen apps that demand your singular attention are increasingly going to be a thing of the past.
Given that, we are expecting Apple to continue to bring its devices much closer together in terms of both what they do and how they do it to make its whole ecosystem more usable. With a nearly full line of Macs and iPads running on Apple’s M chip — and maybe a full line after WWDC if the long-awaited Mac Pro finally appears — there’s no reason for the devices not to share more DNA. Universal Control, which was probably the most exciting iOS 15 announcement even if it didn’t ship until February, is a good example of what it looks like for Apple to treat its many screens as part of an ecosystem. If iOS 16 brings true freeform multitasking to the iPad, an iPad in a keyboard dock is basically a Mac. Apple used to avoid that closeness; now, it appears to be embracing it. And, if it sees all these devices as ultimately companions and accessories to a pair of AR glasses, it’ll need them all to do the job well.
The last time Apple had a truly new idea about how we use gadgets was in 2007
👉 The last time Apple had a truly new idea about how we use gadgets was in 2007 when the iPhone launched. Since then, the industry has been on a yes-and path, improving and tweaking without ever really breaking from the basics of multitouch. But AR is going to break all of that. It can’t work otherwise. That’s why companies are working on neural interfaces, trying to perfect gesture control, and trying to figure out how to display everything from translated text to maps and games on a tiny screen in front of your face. Meta is already shipping and selling its best ideas; Google’s are coming out in the form of Lens features and sizzle videos. Now, Apple needs to start showing the world how it thinks an AR future works.
Interested in discovering how AR can help your business? 📲 Check it out here and learn more about the ways our technology can take your company to the next level!