Google Debuts Shopping Graph, Machine Learning Tool to Make AI Easier

The Google I/O developer conference, which kicked off on Tuesday, tends to be more than just an extravaganza for the techie set. It’s also a spotlight for the company’s vision and priorities — and shopping is clearly among them.

The search giant just took the wraps off its new Shopping Graph, a resource that pulls in and connects massive volumes of product data, in addition to a new Google Cloud tool that gives brands, developers and other companies an easier and faster way to use machine learning and artificial intelligence.

The Shopping Graph is similar to Google’s popular Knowledge Graph, an information resource that pulls in large amounts of data from across the Google ecosystem. Here, the tool focuses on products, drawing from Google resources as well as information from published reviews or from the merchants and brands themselves.

“A key takeaway for the Shopping Graph is that the technology works in real-time, so people can discover and shop for products that are available right now. That’s a capability that wasn’t available just a couple of years ago,” Bill Ready, Google’s president of commerce, payments and NBU, or “next billion user” group, explained to WWD.

“For instance, if you’re looking for a certain brand of sneaker in a certain size and colorway that’s available to pick up near you, and you add those parameters to your search, the Graph will connect those attributes and surface the most relevant results,” he continued. “Retailers update their inventory, and the Shopping Graph responds in real time to account for the pair you bought, so the next person looking for them will have an updated view of the inventory. The Graph can also recognize products mentioned in YouTube videos, so we can also surface helpful influencer reviews or fashion content about that same pair of sneakers.”
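
In practice, the matching Ready describes behaves like a product index keyed on attributes such as brand, size and colorway, one that updates the instant stock changes. Here is a toy sketch of that logic in plain Python; the listings, stores and stock counts are invented for illustration and are not Google’s actual implementation.

```python
from dataclasses import dataclass

# Toy stand-in for the Shopping Graph's attribute matching.
# All products, stores and stock counts here are invented.
@dataclass
class Listing:
    brand: str
    size: float
    colorway: str
    store: str
    stock: int

inventory = [
    Listing("Acme", 10.5, "triple white", "Downtown", stock=2),
    Listing("Acme", 10.5, "triple white", "Uptown", stock=0),
]

def search(brand: str, size: float, colorway: str) -> list[Listing]:
    """Connect the query's attributes and surface only in-stock matches."""
    return [item for item in inventory
            if item.brand == brand and item.size == size
            and item.colorway == colorway and item.stock > 0]

def purchase(item: Listing) -> None:
    """Real-time update: the next search reflects the pair just bought."""
    item.stock -= 1

hits = search("Acme", 10.5, "triple white")  # only the Downtown store matches
purchase(hits[0])                            # stock there drops from 2 to 1
```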


This is no fly-by-night project: Google has been working on the Shopping Graph for a couple of years. It’s the sort of project the tech company relishes — using data and technology to solve a fundamental problem that has dogged businesses, in this case how to bring essential product info to the surface.

It takes a lot of data to do that, and according to Google, the comprehensiveness of this effort gives merchants and consumers capabilities that simply weren’t available before.

One such feature is a computer vision-based search tool. When users view any screenshot in Google Photos, the platform now puts up a suggestion to search the photo with Lens, so they can find a particular product. The technology behind it isn’t necessarily new, but even the best, most innovative search tools won’t work well if there’s not enough data to parse.

This approach seems particularly handy, considering how often people snap photos of interesting looks or products on the go, then leave the pics to languish without following up.

Ultimately, the goal is to allow for more robust product search results that connect products, expert and user reviews, nearby inventory, useful videos and top brands and retailers, Ready added. “For instance, if you’re looking for a mid-length, blue jacket in size medium that’s made from cotton, the connected attributes help us find and then surface products that have those exact qualities, plus news articles about relevant trends or YouTube videos on how to style a blue jacket.”

Google’s focus in shopping lies in product discovery. Courtesy image

Alongside the announcement, Google also revealed that it simplified its Shopify integration to make it easier for merchants to upload products and inventory in a few clicks. Previously, it could take hours or even days, depending on the complexity of their catalog. Naturally, this move benefits the Shopping Graph, too — more updated product information going online means the graph has more to work with.

The effort boils down to the overarching goal of boosting product discovery, with Google flexing its muscles in data to do it. Already the Google-verse serves more than a billion shopping sessions per day, and it’s not done yet. It’s continuously looking for new tech-driven shopping experiences, like Lens using the smartphone camera to search what people see and “make the world around you your personal showroom,” said Ready.

“You may have also seen recent integrated shopping beta tests we launched with YouTube,” he added. “You can expect to see more new shopping experiences for consumers across Google surfaces.”

Some of that work also comes from different areas of the company.

On Tuesday, Google Cloud also announced general availability for Vertex AI, its managed machine learning platform that brands and other companies can use to speed up deployment and maintenance of artificial intelligence models.

In everyday terms, it does the heavy lifting for ML and AI processes, so companies can quickly develop intelligent features without building everything from scratch. According to Google Cloud, Vertex AI requires nearly 80 percent fewer lines of code to train models compared with other platforms.
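
To give a sense of what that looks like, here is a minimal sketch of training and deploying an AutoML image classifier with Vertex AI’s Python SDK. The project ID, bucket path and display names are placeholders, and the workflow is a generic illustration rather than any particular customer’s setup.

```python
from google.cloud import aiplatform

# Placeholder project and region; swap in real values.
aiplatform.init(project="my-project", location="us-central1")

# A labeled image dataset, imported from a CSV manifest in Cloud Storage.
dataset = aiplatform.ImageDataset.create(
    display_name="product-photos",
    gcs_source="gs://my-bucket/manifest.csv",
    import_schema_uri=aiplatform.schema.dataset.ioformat.image.single_label_classification,
)

# Vertex AI handles architecture search and tuning behind this one call.
job = aiplatform.AutoMLImageTrainingJob(
    display_name="product-classifier",
    prediction_type="classification",
)
model = job.run(dataset=dataset, budget_milli_node_hours=8000)

# One more call stands up a managed prediction endpoint.
endpoint = model.deploy(machine_type="n1-standard-4")
```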

The toolkit, which spans computer vision, natural language processing and other systems, aims to make it easier and faster for brands to create better experiences for their customers.

Modiface, the L’Oréal-owned augmented reality provider for the beauty industry, uses Vertex AI, and it’s clearly impressed by what the Google tool can do.

AI is often used in beauty augmented reality to make the digital overlays — whether for trying out different hair colors or shades of lipstick — look and respond more realistically. It can also drill into deeper analysis based on what the computer vision sees, before providing product recommendations.

If Modiface built out all of the AI itself, it would have taken years, said chief operating officer Jeff Houghton.

As an example, he pointed to L’Oréal’s skin diagnostic experience: “You can go on a brand website, upload a selfie image and it will look at your face [and identify] different clinical concerns that can be addressed with products that that brand carries,” he explained to WWD. “How we actually go about making that recommendation from just an image is, we’ve trained our AI algorithm with thousands of different data points. And to train that AI algorithm, we actually need to run billions of different computations — that way, the algorithm can understand the best recommendation, when it sees an image it’s never seen before.

Modiface is a beauty tech provider that uses AI, ML and AR for features like beauty try-ons and skin analysis. L’Oréal acquired the company in 2018, and it’s been using Google Cloud Platform for months for AI and ML. Courtesy photo

“That’s actually the piece where we’re leveraging Google Cloud Platform and their AI tools,” Houghton said.
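
As a rough illustration of the kind of inference Houghton describes, a multi-label classifier trained on annotated selfies could be queried along these lines. The model file, concern labels and product mapping below are all hypothetical, sketched in Python with TensorFlow; Modiface’s actual pipeline is not public.

```python
import numpy as np
import tensorflow as tf

# Hypothetical concern labels and catalog mapping, invented for this sketch.
CONCERNS = ["fine_lines", "dark_spots", "redness", "dullness"]
PRODUCTS = {
    "fine_lines": "retinol serum",
    "dark_spots": "vitamin C concentrate",
    "redness": "soothing moisturizer",
    "dullness": "exfoliating toner",
}

# Assumed artifact: a multi-label model trained on annotated selfie images.
model = tf.keras.models.load_model("skin_diagnostic.keras")

def recommend(image_path: str, threshold: float = 0.5) -> list[str]:
    """Score a selfie for each concern, then map the hits to products."""
    img = tf.keras.utils.load_img(image_path, target_size=(224, 224))
    batch = np.expand_dims(tf.keras.utils.img_to_array(img) / 255.0, axis=0)
    scores = model.predict(batch)[0]  # one sigmoid score per concern
    return [PRODUCTS[c] for c, s in zip(CONCERNS, scores) if s >= threshold]
```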

Previously, Modiface would have had to do all the computations on local machines, and it would have taken a lot longer and had limitations on the complexity of what the AI could do. “But now all of that training — feeding images to the AI and letting it iterate billions of times — it’s all done on the cloud,” he said.

Google Cloud has been doing all of Modiface’s AI work for months now. Not only has it sped things up, it’s also opened up new ways to develop features.

One way uses what’s called generative adversarial networks, or GANs. “Basically, you’re teaching an AI to generate an image for you by showing it a bunch of example images. And it’s trying to generate a new image that fits those examples,” Houghton said. “Two years ago, that was not an area that anyone was exploring. And now, if you’re trying any of those more complex filters on the web, it’s actually what’s happening behind the scenes.”

GANs take a lot of computational power, as they involve two AI algorithms that compete to reach the best result — which is something that’s extremely hard to do on a traditional machine, he noted. By offloading that to Google, Houghton’s data scientists and developers are free to focus on other things, like building beauty services that appeal to consumers, he said.
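
Concretely, the competition Houghton describes pairs a generator, which produces images, against a discriminator, which judges them. A bare-bones PyTorch sketch of one training step follows; the network sizes and data shapes are illustrative, not anything Modiface ships.

```python
import torch
import torch.nn as nn

latent_dim, image_dim = 64, 784  # illustrative sizes (e.g., 28x28 images)

# Generator maps random noise to an image; discriminator scores realness.
G = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(),
                  nn.Linear(256, image_dim), nn.Tanh())
D = nn.Sequential(nn.Linear(image_dim, 256), nn.LeakyReLU(0.2),
                  nn.Linear(256, 1), nn.Sigmoid())

criterion = nn.BCELoss()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)

def train_step(real: torch.Tensor) -> None:
    batch = real.size(0)
    ones, zeros = torch.ones(batch, 1), torch.zeros(batch, 1)

    # Discriminator learns to label real images 1 and generated images 0.
    fake = G(torch.randn(batch, latent_dim))
    d_loss = criterion(D(real), ones) + criterion(D(fake.detach()), zeros)
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator learns to make the discriminator call its fakes real.
    g_loss = criterion(D(fake), ones)
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```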

That must be music to Google’s ears.

“We had two guiding lights while building Vertex AI: Get data scientists and engineers out of the orchestration weeds, and create an industry-wide shift that would make everyone get serious about moving AI out of pilot purgatory and into full-scale production,” said Andrew Moore, vice president and general manager of cloud AI and industry solutions at Google Cloud.

He believes that the new tool will enable “serious deployments for a new generation of AI that will empower data scientists and engineers to do fulfilling and creative work.”
