Why Apple Is Watching Google’s AI Progress Carefully

By Mark Sullivan, May 19, 2017

If you see the tech world as a competition between major platforms like Amazon, Apple, Facebook, Google, Microsoft, and Samsung–as I do–Google made several announcements at its I/O developer conference on Wednesday that could affect the balance of power.

As the maker of the second biggest mobile operating system in the world, Apple may be the company Google’s AI advancements impact the most. Many will be watching to see what sorts of AI capabilities Apple announces at its WWDC developer conference coming up June 5.

In this season of developer conferences (Microsoft’s Build was just last week), it’s become clear that all the companies that make up Big Tech have been working hard to leverage artificial intelligence in their products. More specifically, scientists and engineers at these companies are teaching computers to talk and see, through technologies such as natural-language processing and machine vision.

Many of the things announced here at Google’s conference hit on these dominant AI themes in some way.

Google Lens, for example, brings sight to the Google Assistant service, identifying and analyzing images you snap with a smartphone so that the Assistant can act on the data. Lens can read a restaurant menu written in another language and translate it; the Assistant might then provide example pictures of the food choices. There could be a business here as well as a useful tool: For instance, a restaurant might pay Google for the right to display marketing language around the image of its storefront as seen by Lens. I expect the computer-vision aspect of the Assistant to get far more interesting over time.

Above Avalon analyst Neil Cybart likes the concept but not the use cases Google chose to showcase it with. “It’s not enough to just say you can walk around taking pictures of storefronts to get Yelp ratings,” he says. “It’s fair to assume Apple will be playing in this area going forward.”

Google’s Photos app uses AI to pick your best shots, label them, and even suggest people you could share the photos with based on the people it’s recognized in the shots. Another feature makes whole albums full of photos show up in someone else’s Photos app, again based on the people detected in the photos.

For many users, the new features in Photos might help with a common problem: having hundreds of storage-consuming photos on one’s phone and never really using them for anything.

Apple is very likely to talk about how it’s bringing AI further into its own Photos app to fix some of the same problems Google fixed, says Global Data analyst Avi Greengart.

Apples And Apples?

Right now Google appears to be well ahead of Apple in delivering useful AI-powered experiences.

Comparing Google AI and Apple AI is relevant because both companies want their apps and services to get as much phone screen time every day as possible. The screens of the more than one billion iOS devices are a big battleground for both companies. Apple wants these devices to spur consumption for its digital services business, which it wants to double in the next four years.

It might hurt Apple if Google could leverage superior AI to create more compelling apps and services, says Tech Knowledge analyst Carolina Milanesi. That might cause significant numbers of iPhone users to adopt them, displacing Apple apps. Apple abhors the idea of being just a hardware company; it wants to supply the whole experience, including software and services.

And iPhone users do have a choice. Google also announced Wednesday that its Assistant is now available on iOS, as are its Allo messaging app and lots of other Google software. People who prefer Google’s assistant over Siri can easily use it–although the fact that it’s not built into the operating system means that you can’t wake it by saying “OK Google” as you can on an Android handset.

Apples And Oranges

Siri is clearly behind other assistants, both in her ability to comprehend and in her ability to act on information she hears from the user. But Siri is just one end point for AI (albeit a big one). To say Apple is “behind” in AI doesn’t quite capture the nuance of the situation.

At WWDC, Apple might announce enhancements to the AI behind things like Photos, Maps, and Messages in ways that solve real problems. They might be different from the ones Google and others have used AI to solve.

As Moor Insights & Strategy analyst Patrick Moorhead points out, Apple doesn’t like to be forced into pushing out new products or services purely because of competitive pressure, and it has rarely done so. It has lots of people working on AI, but is being very thoughtful about where and how it exposes it.

Apple also isn’t likely to talk as loudly about AI as Google does. Google, after all, is a software and services company–you’d expect it to focus on AI. Apple is mainly a hardware company (and, yes, increasingly a services company). It’s more likely to focus on functionality that AI helps improve, rather than dwelling much on technical underpinnings.

Apple Vs. Google

One clear advantage Google enjoys in AI is access to lots of personal user data that it pulls from services such as Google Docs, Gmail, and Google Calendar. Google will leverage as much user data as privacy concerns permit to help the Google Assistant become an expert on the user’s life, something like the way human assistants become more valuable as they learn more and more.

For example, the Assistant running on the Google Home smart speaker could notify you that traffic on the way to your next appointment is bad and suggest allotting extra time. It might know that a flight listed in an email has been delayed and alert you to that.

Apple, by contrast, has shunned collecting personal data because of privacy and security concerns. There’s no doubting that’s a good thing from a privacy perspective, but it might hinder the company from arming Siri with the information she needs to be an expert on the user’s life.

Apple, however, has more control over the chips running the on-device AI computations on its gadgets than Google does. Apple is highly competent in optimizing chips–which it designs itself–for the needs of its own software. Apple may simply be able to leverage more computing power in its devices than Google can in third-party Android devices. Some AI computation can happen in the cloud, but that costs milliseconds and could raise security risks.

After watching this week’s announcements about the new AI powers in Google’s assistant, I’m reminded that there’s far more to releasing winning apps and services than sheer technological prowess. Google has on numerous occasions released products that were technically impressive but betrayed a woeful misunderstanding of what people might find useful (Google Wave, Google Glass, etc.).

Apple doesn’t need to demonstrate superiority over Google in the pure science part of artificial intelligence. It can win by coupling its AI chops with a superior understanding of how people will best benefit from AI in day-to-day life.

It all still comes down to the battle for time on the user’s phone screen throughout the day.
