This Week in Apps: WWDC 21 highlights, Instagram Creator Week recap, Android 12 beta 2 arrives


Welcome back to This Week in Apps, the weekly TechCrunch series that recaps the latest in mobile OS news, mobile applications and the overall app economy.

The app industry continues to grow, with a record 218 billion downloads and $143 billion in global consumer spend in 2020. Consumers last year also spent 3.5 trillion minutes using apps on Android devices alone. And in the U.S., app usage surged ahead of the time spent watching live TV: the average American now watches 3.7 hours of live TV per day but spends four hours per day on their mobile devices.

Apps aren’t just a way to pass idle hours — they’re also a big business. In 2019, mobile-first companies had a combined $544 billion valuation, 6.5x higher than those without a mobile focus. In 2020, investors poured $73 billion in capital into mobile companies — a figure that’s up 27% year-over-year.

This week, our series takes a dive into the key announcements from WWDC 21 that impact app developers.

This Week in Apps will soon be a newsletter! Sign up here: techcrunch.com/newsletters

Image Credits: Apple

Apple’s WWDC went virtual again this year, but it didn’t slow down the pace of announcements. This week, Apple introduced a slate of new developer tools and frameworks, changes to iOS that will impact how consumers use their devices and new rules for publishing on its App Store, among other things. We don’t have the bandwidth to dig into every dev update — and truly, there are better places to learn about, say, the new concurrency capabilities of Swift 5.5 or what’s new with SwiftUI.

But after a few days of processing everything new, here’s what’s jumping out as the bigger takeaways and updates.

Xcode Cloud

Apple’s development IDE, Xcode 13, now includes Xcode Cloud, a built-in continuous integration and delivery service hosted on Apple’s cloud infrastructure. Apple says the service, born out of its 2018 Buddybuild acquisition, will help speed up the pace of development by combining cloud-based tools for building apps with tools to run automated tests in parallel, deliver apps to testers via TestFlight and view tester feedback through the web-based App Store Connect dashboard. Beyond the immediate improvements to the development process (which developers are incredibly excited about, judging by #WWDC21 tweets), Xcode Cloud represents a big step by Apple further into the cloud services space, where Amazon (AWS), Google and Microsoft have dominated. While Xcode Cloud may not replace solutions designed for larger teams with more diverse needs, it’s poised to make app development easier and to deliver a new revenue stream to Apple. If only Apple had announced the pricing!

Swift Playgrounds 4

Image Credits: Apple

Swift Playgrounds got a notable update in iPadOS 15: it will now allow developers to build iPhone and iPad apps right on their iPad and submit them to the App Store. In Swift Playgrounds 4, coming later this year, Apple says developers will be able to create the visual design of an app using SwiftUI, see a live preview of the app as they write its code and run their apps full screen to test them out. App projects can also be opened and edited with either Swift Playgrounds or Xcode.

While it’s not the Xcode on iPad system some developers have been requesting, it will make app building more accessible because of iPad’s lower price point compared with Mac. It could also encourage more people to try app development, as Swift Playgrounds helps student coders learn the basics then move up to more challenging lessons over time. Now, they can actually build real apps and hit the publish button, too.
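For a sense of what building on iPad looks like in practice, here’s a minimal sketch of the kind of SwiftUI app a developer could write, live-preview and submit from Swift Playgrounds 4 (or open later in Xcode). The app and its names are purely illustrative, not an Apple sample:

```swift
import SwiftUI

// A tiny SwiftUI app of the sort Swift Playgrounds 4 can build,
// preview live and submit to the App Store. Names are illustrative.
@main
struct HelloPlaygroundsApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

struct ContentView: View {
    @State private var tapCount = 0

    var body: some View {
        VStack(spacing: 16) {
            Text("Built on iPad")
                .font(.largeTitle)
            Button("Tapped \(tapCount) times") {
                tapCount += 1
            }
            .buttonStyle(.bordered) // the bordered button style is new in iOS 15
        }
        .padding()
    }
}
```

Because the project format is shared, the same code can be handed off to Xcode on a Mac when a project outgrows the iPad.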

App Store

Antitrust pressure swirling around Apple has contributed to a growing sentiment among some developers that Apple doesn’t do enough to help them grow their businesses — and therefore, is undeserving of a 15%-30% cut of the revenues the developers themselves worked to gain. The new App Store updates may start to chip away at that perception.

Soon, developers will be able to create up to 35 custom product pages targeted toward different users, each with its own URL for sharing and its own analytics for measuring performance. The pages can include different preview videos, screenshots and text.

Image Credits: Apple

Apple will also allow developers to split traffic between three treatments of the app’s default page to measure which one converts best, then choose the percentage of the App Store audience that will see one of the three treatments.

Meanwhile, the App Store will begin showing customers in-app events taking place inside developers’ apps, like game competitions, fitness challenges, film premieres and more, effectively driving traffic to apps and re-engaging users. Taken together, these updates make Apple’s case that its App Store can drive discovery beyond just offering an app listing page.

Beyond the App Store product itself, Apple overhauled its App Store policies to address the growing problem of scam apps. The changes give Apple permission to crack down on scammers by removing offenders from its Developer Program. The new guidelines also allow developers to report spam directly to Apple, instead of, you know, relying on tweets and press.

Apple has historically downplayed the scam problem, noting, for example, that the App Store stopped over $1.5 billion in fraudulent transactions in 2020. But even if scams represent a small percentage of the App Store, scam apps with fake ratings don’t just cheat users out of millions of dollars; they also erode consumer trust in the App Store and in Apple itself, which has longer-term consequences for the health of the ecosystem. What’s unclear, however, is why Apple is seemingly trying to solve its App Review issues with forms for reporting fraud (and now, for appealing rulings, too), when it’s becoming apparent that the company needs a more systematic way of keeping tabs on the app ecosystem beyond the initial review process.

Notifications overhaul

The App Store discovery updates mentioned above also matter more because developers may need to reduce their reliance on notifications to send users back into their apps. Indeed, iOS 15 users will be able to choose which apps they don’t need to hear from right away — these will be rounded up into a new Notification Summary that arrives on a schedule they configure, where Siri intelligence helps determine which apps get a top spot. If an app was already struggling to re-engage users through push notifications, getting relegated to the end of a summary is not going to help matters.

And users can “Send to Summary” right from the Lock Screen notification itself, in addition to the existing options to “Deliver Quietly” or turn an app’s notifications off entirely. That means any ill-timed push could be an app developer’s last.

Image Credits: Apple

Meanwhile, the clever new “Focus” modes let iOS users configure different quiet modes for work, play, sleep and more, each with its own set of rules and even its own home screens. But making this work across the app ecosystem will require developer adoption of four “interruption levels,” ranging from passive to critical. A new episode of a fav show should be a “passive” notification, for example. “Active” is the default setting, and it doesn’t get to break into Focus. “Time sensitive” notifications should be reserved for alerting users to more urgent matters, like a delivery that’s arrived on your doorstep or an account security update. These may be able to break through Focus, if allowed.

Image Credits: Apple

“Critical” notifications would be reserved for emergencies, like severe weather alerts or local safety updates. While there is a chance developers may abuse the new system to get their alert through, they risk users silencing their notifications entirely or deleting the app. Focus mode users will be power users and more technically savvy, so they’ll understand that an errant notification here was a choice and not a mistake on the developer’s part.
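For developers, adopting the new system should mostly be a matter of tagging notifications honestly. Here’s a rough sketch, using the iOS 15 additions to the UserNotifications framework, of how an app might mark a delivery alert as time sensitive; what actually gets through at delivery time still depends on the app’s entitlements and the user’s Focus settings:

```swift
import UserNotifications

// A minimal sketch of opting into iOS 15 interruption levels when
// scheduling a local notification. The title, body and identifier
// are placeholders.
func scheduleDeliveryAlert() {
    let content = UNMutableNotificationContent()
    content.title = "Package delivered"
    content.body = "Your order was just dropped off at your door."

    // New in iOS 15: .passive, .active (the default), .timeSensitive, .critical.
    // Time-sensitive alerts also need the matching entitlement, and critical
    // alerts require special approval from Apple.
    content.interruptionLevel = .timeSensitive

    let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 1, repeats: false)
    let request = UNNotificationRequest(identifier: "delivery-alert",
                                        content: content,
                                        trigger: trigger)

    UNUserNotificationCenter.current().add(request) { error in
        if let error = error {
            print("Failed to schedule notification: \(error)")
        }
    }
}
```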

Image Credits: Apple

Augmented Reality

Apple has been steadily pushing out more tools for building augmented reality apps, and this WWDC it introduced a huge update that will make it easier for developers to get started with AR. With the launch of RealityKit 2, Apple’s new Object Capture API will allow developers to create 3D models in minutes using only an iPhone or iPad (or a DSLR or drone, if they choose).

Apple explains this will address one of the most difficult parts of making great AR apps: creating the 3D models. Before, this could take hours and cost thousands of dollars; now, developers with just an iPhone and a Mac can participate. The impact of this update will be seen in the months and years ahead, as developers adopt the new tools for AR shopping, games and other AR experiences, including ones we may not have seen yet but that are enabled by more accessible AR tools and frameworks.
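To give a flavor of the workflow, here’s a rough sketch of the Mac side of Object Capture: pointing RealityKit 2’s new PhotogrammetrySession at a folder of photos shot on an iPhone and asking for a USDZ model. The paths, detail level and error handling are illustrative only:

```swift
import Foundation
import RealityKit

// Sketch of RealityKit 2's Object Capture on macOS Monterey:
// turn a folder of photos into a reduced-detail USDZ model.
func buildModel(from imagesFolder: URL, to outputURL: URL) async throws {
    let session = try PhotogrammetrySession(
        input: imagesFolder,
        configuration: PhotogrammetrySession.Configuration()
    )

    // Ask for a model file at the given output URL.
    try session.process(requests: [
        .modelFile(url: outputURL, detail: .reduced)
    ])

    // Progress and results arrive as an async sequence of messages.
    for try await output in session.outputs {
        switch output {
        case .requestProgress(_, let fraction):
            print("Progress: \(Int(fraction * 100))%")
        case .requestComplete(_, let result):
            print("Finished: \(result)")
        case .requestError(_, let error):
            print("Failed: \(error)")
        case .processingComplete:
            return
        default:
            break
        }
    }
}
```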

SharePlay

This update is unexpected and interesting, despite missing what would have been an ideal launch window: mid-pandemic back in 2020. With SharePlay, developers can bring their apps into what Apple is calling “Group Activities” — or shared experiences that take place right inside FaceTime.

If you were co-watching Hulu with friends during the pandemic, you get the idea. But Apple isn’t tacking on some co-viewing system here. Instead, it’s introducing new APIs that let users listen to music, stream video or screen share with friends in a way that feels organic to FaceTime. There was a hint of serving the locked-down COVID-19 pandemic crowd with this update, as Apple talks about making people feel as if they’re “in the same room,” a nod to those many months when that was not possible. And that may have inspired the changes, to be sure. Similarly, FaceTime’s support for Android and scheduled calls feels like a clear case of Zoom envy, with Apple playing catch-up.
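For developers, adopting SharePlay means defining a “group activity” and offering it to the current FaceTime call through the new GroupActivities framework. The sketch below is loosely modeled on Apple’s WWDC21 examples; the activity type and its fields are hypothetical, and the exact API surface may differ slightly from this approximation:

```swift
import GroupActivities

// A hypothetical SharePlay activity for watching a movie together.
// The type and its fields are illustrative, not an Apple sample.
struct MovieNightActivity: GroupActivity {
    let movieTitle: String

    var metadata: GroupActivityMetadata {
        var metadata = GroupActivityMetadata()
        metadata.title = movieTitle
        metadata.type = .watchTogether
        return metadata
    }
}

// Offer the activity to the current FaceTime call, if there is one.
func startWatchingTogether(title: String) async throws {
    let activity = MovieNightActivity(movieTitle: title)

    switch await activity.prepareForActivation() {
    case .activationPreferred:
        // The user is on a FaceTime call and chose to share the activity.
        _ = try await activity.activate()
    case .activationDisabled:
        // No FaceTime call in progress; just play the movie locally.
        break
    case .cancelled:
        break
    @unknown default:
        break
    }
}
```

Once activated, each participant’s copy of the app joins the shared session and handles synchronized playback on its end, which is why developer adoption matters so much here.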

Image Credits: Apple

The immediate demand for these sorts of experiences may be dulled by a population that’s starting to recover from the pandemic; people are now going out and seeing others in person again thanks to vaccines. But the ability to use apps while FaceTime’ing has a lifespan that extends beyond the COVID era, particularly among the iPhone’s youngest users. Those growing up with smartphones at ever-younger ages don’t place phone calls; they text and FaceTime. Some argue Gen Z even prefers the latter.

Image Credits: Apple

With its immediate support for Apple services like Apple Music and Apple TV+, SharePlay will hit the ground running, but it will only fully realize its vision with developer adoption. Such a system also seems possible only because of Apple’s tight control over its platform, and it gives the default iOS apps a big advantage over third parties.

More

There were, of course, hundreds of updates announced this week, like Spatial Audio, Focus modes, AirPods updates, iPadOS improvements (widgets! multitasking!), Health updates, iCloud+ with Private Relay, watchOS improvements, Spotlight’s upgrade, macOS 12 Monterey (with the Universal Control Continuity feature), HomePod updates, StoreKit 2, Screen Time APIs, ShazamKit, App Clips improvements, Photos improvements and others.

Many, however, were iterative updates, like a better version of Apple Maps or Siri support for third-party devices. Others are Apple’s attempt to catch up with competitors, like the Google Lens-style “Live Text” feature for taking action on things snapped in your photos. Meanwhile, the more significant changes aren’t here yet, like the plan to add driver’s licenses to Wallet and the shift toward passwordless authentication systems. These will change how we use our devices for years to come.

Platforms: Google

✨ Not to be outdone by WWDC (ha), Google this week launched Android 12 Beta 2. The release brings users more of the new features and design changes that weren’t yet available in the first beta, which debuted at Google I/O. That includes the new privacy dashboard; the mic and camera indicators that show when an app is using those features; an indication when an app reads from the clipboard; and a new panel that makes it easier to switch between internet providers or Wi-Fi networks.

Google also this week released its next Pixel feature drop, which brought new camera and photo features, privacy features, Google Assistant improvements and more. Highlights included a way to create stargazing videos, a car crash detection feature and a way to answer or reject calls hands-free.