Google Lens Brushes Up Against Contextual Commerce


Combine visual smarts with digital context and you get an online shopping method that responds to consumer impulse and discovery in ways that feel pleasingly intimate – and that can even lead to significant profit boosts.

That’s one angle through which you can view a recent development involving Google Lens.

New Role for Lens

The product, which is Google’s object recognition platform, has taken on new duties. It will function as part of Google Image Search to, in the words of one report, tap into “the company’s computer vision work to figure out the contents of an image and provide more details about exactly what you’re looking at.”

Take one Google-approved example of how this might work: A consumer who searches online via the keyword “nursery” may still struggle to find the exact type of crib desired. But with Google Lens technology, Google can analyze an image of a crib more deeply to find the wanted model in the right color — or, perhaps, the lamp standing behind the crib in the nursery image, nudging the consumer toward making a purchase.

“On Google Images results, a new Lens button will appear at the bottom of each picture. Tapping on that will show you what it thinks are interesting parts of each photo, and show you similar products,” explained another report. “You can also draw on another part of the image, like a bookcase in the background, to tell Lens to find similar products. Sadly, Lens for Google Images is only available on mobile devices for now, but hopefully it’ll come to desktop browsers soon.”

Visual + Contextual Commerce

Using cameras and camera technology in this way is becoming an increasingly important part of retail, and is part of the rising trend of visual commerce. The most recent example is the deal announced earlier this week between Amazon and Snap, an agreement that will enable consumers to buy products — or shop for similar ones — via images of those products or their barcodes.

But these developments and product launches aren’t limited to visual commerce. There is another trend at work, called contextual commerce, and it’s easy to see how more sophisticated visual commerce offerings can fuel this other online retail trend as well.

What is contextual commerce?

At its most basic, it is a process that revolves around discovery. A potential consumer might visit a social media site, or a site devoted to, say, home cooking, with the main goal of reading content, learning something or digitally hanging out with like-minded people. That person may have only a vague intention of buying something — or no intention at all — but follows an impulse and buys an item tied to the content and the original desire for discovery. That product might be a dress worn by an actress to an awards show, or — to return to the example above — a lamp for a nursery.

Contextual commerce is a real and growing force.

According to research in the Contextual Commerce Report from PYMNTS and Braintree, 48 percent of consumers have tried that shopping experience at least once. And those who have tried contextual commerce are generally seeking efficiency, as 59 percent of consumers who have tried it report using the online retail method for a faster buying experience. “Who doesn’t like an expedited experience?” asked Azita Habibi, business development lead at Braintree.

And major digital operators are embracing the concept. Netflix recently announced its hire of Disney veteran Christie Fleischer as head of the company’s global products team, a new position. She’ll reportedly “oversee retail and licensee partnerships, publishing, interactive games, merchandising and experiential events.” Her job involves “developing the consumer products portfolio across all categories for Netflix original series and films.”

In short, she’ll be doing contextual commerce. And so, it seems, will shoppers who use Google Lens in the future for image-centered searches.

Source: PYMNTS
