The upsurge of mobile development has significantly changed how shoppers make purchases. Visual commerce and dynamic product recommendations are reshaping how shoppers want to shop. As shoppers spend more time on mobile devices, they expect to use on-device features such as the camera or voice interaction while shopping. For example, a shopper might see a friend wearing a pair of gloves and want to buy the same (or a similar) pair simply by taking a photo of it.
These new approaches have transformed how shoppers browse and search for merchandise.
At a recent two-day hackathon, we set out to provide a modern search experience to shoppers by integrating Elastic Path's Cortex APIs with Google's Vision Product Search. As part of this project, we successfully integrated Google's Cloud Vision Product Search service, including image recognition, with Cortex APIs. Shoppers can upload any picture through a PWA (Progressive Web App) or React Native mobile app. Google's Cloud Vision Product Search then recognizes the objects within the uploaded image and manages the association between product images and product codes. Using data from our existing catalog, we perform a search within the index that Google's Product Search service creates for us and retrieve the product's information (SKU, description, etc.). Finally, we use Cortex's 'Add to cart' API to automatically add the most relevant product to the cart, creating a seamless shopping experience.
We considered a number of computer vision services. Amazon Rekognition and IBM's Watson Visual Recognition were strong contenders, but under the hackathon's time crunch we had no time to train a custom model for our catalog. We also wanted to use the Product Search features Google provides to find the best-fitting results for our image search flow. The ideal option was a service to which we could upload images of our products, and which could then use those base images to match subsequent photos and resolve them to products in our catalog. This is where Google's Cloud Vision came into play.
Step 1: Google’s Vision Product Search (Backend Experience)
Our first step was to project our sample catalog onto Google's Cloud Vision Product Search service using Elastic Path's Import/Export tool. We created the necessary CSV files from our catalog data and uploaded them to Google to create the corresponding product sets.
CSV of our sample Vestri catalog's Accessories category, uploaded to Google Product Search:
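To illustrate the import step, here is a minimal sketch of building that bulk-import CSV. The column order follows Google's documented Vision Product Search bulk-import format (no header row); the SKUs, display names, and image URIs below are invented stand-ins for our Vestri accessories data, not the actual catalog.

```python
import csv
import io

# Hypothetical sample products from the Vestri "Accessories" category.
# SKUs, names, and image URIs are illustrative, not from the real catalog.
PRODUCTS = [
    {"sku": "VESTRI_GLOVES_RED", "name": "Vestri Driving Gloves (Red)",
     "image_uri": "gs://vestri-catalog-images/gloves-red.jpg"},
    {"sku": "VESTRI_CAP_BLACK", "name": "Vestri Cap (Black)",
     "image_uri": "gs://vestri-catalog-images/cap-black.jpg"},
]

def build_import_csv(products, product_set_id="vestri-accessories",
                     category="apparel-v2"):
    """Build a Vision Product Search bulk-import CSV (no header row).

    Column order follows Google's documented bulk-import format:
    image-uri, image-id, product-set-id, product-id, product-category,
    product-display-name, labels, bounding-poly.
    """
    buf = io.StringIO()
    writer = csv.writer(buf)
    for p in products:
        writer.writerow([
            p["image_uri"],   # image-uri: GCS location of the reference image
            "",               # image-id: blank lets Google assign one
            product_set_id,   # product-set-id: the set to index into
            p["sku"],         # product-id: our catalog SKU, so matches map back
            category,         # product-category (e.g. apparel-v2 for gloves)
            p["name"],        # product-display-name
            "",               # labels (optional key=value filters)
            "",               # bounding-poly (optional; blank = whole image)
        ])
    return buf.getvalue()

print(build_import_csv(PRODUCTS))
```

Using the catalog SKU as the `product-id` is the key design choice: it is what lets a later visual match be resolved straight back to a product in Cortex.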
We uploaded our catalog images to Google Cloud Storage and validated that our products, along with their images, were indexed by Google. We confirmed that our catalog images were scanned successfully by Cloud Vision Product Search (indexing and classification took a couple of hours), creating reference images that visually describe each product in the catalog. We then leveraged the Cloud Vision Product Search API to perform image recognition against our catalog and return the appropriate SKU for each match. Shoppers use the system by taking a picture of an actual product, which is then queried against the product catalog. For more information on Google's Vision Product Search, see https://cloud.google.com/vision/product-search/docs/. The system then leverages Elastic Path's Cortex API to add the most relevant result to the cart, with no manual add-to-cart click required.
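The "most relevant product" step above comes down to picking the highest-confidence result from the Product Search response. The real lookup goes through the Cloud Vision API (an `images:annotate` call with a `PRODUCT_SEARCH` feature); the sketch below covers only the response handling, using a mocked result list shaped like the API's `productSearchResults.results` field. The scores and product names are invented.

```python
# Mocked Product Search results; in reality these come back from the
# Cloud Vision API. Scores and resource names are invented examples.
MOCK_RESULTS = [
    {"score": 0.61, "product": {
        "name": "projects/demo/locations/us-west1/products/VESTRI_CAP_BLACK"}},
    {"score": 0.87, "product": {
        "name": "projects/demo/locations/us-west1/products/VESTRI_GLOVES_RED"}},
]

def best_match_sku(results, min_score=0.5):
    """Return the SKU of the highest-confidence match, or None.

    Product Search returns full resource names; the trailing path segment
    is the product-id we set to our catalog SKU at import time, so it can
    be handed straight to Cortex.
    """
    candidates = [r for r in results if r["score"] >= min_score]
    if not candidates:
        return None
    best = max(candidates, key=lambda r: r["score"])
    return best["product"]["name"].rsplit("/", 1)[-1]

print(best_match_sku(MOCK_RESULTS))  # → VESTRI_GLOVES_RED
```

The `min_score` threshold is our own addition: below some confidence it is better to fall back to a normal keyword search than to auto-add a wrong item to the cart.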
Step 2: Elastic Path's Reference Storefront PWA (Frontend Experience)
As part of the Elastic Path storefront, we added an image upload component that allows shoppers to use pictures saved on their phones to find similar items on a customer's website. Revisiting the earlier scenario of a shopper searching for a similar pair of gloves, we demonstrated this by incorporating the new visual search feature directly into the shopper's existing browse/search flow.
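On the server side of that upload flow, the shopper's photo is turned into a Vision `images:annotate` request with a `PRODUCT_SEARCH` feature. The sketch below builds that JSON body; the product-set resource name, category, and image bytes are placeholders standing in for the set created during catalog import.

```python
import base64
import json

def build_visual_search_request(image_bytes, product_set,
                                categories=("apparel-v2",)):
    """Build the JSON body for a Vision images:annotate call with a
    PRODUCT_SEARCH feature, from a shopper-uploaded image.

    The product-set resource name and category here are assumptions; in
    our setup they would point at the set created during catalog import.
    """
    return {
        "requests": [{
            # Inline image content must be base64-encoded in the JSON body.
            "image": {"content": base64.b64encode(image_bytes).decode("ascii")},
            "features": [{"type": "PRODUCT_SEARCH"}],
            "imageContext": {
                "productSearchParams": {
                    "productSet": product_set,
                    "productCategories": list(categories),
                },
            },
        }]
    }

body = build_visual_search_request(
    b"\xff\xd8\xff placeholder jpeg bytes",  # not a real photo
    "projects/demo/locations/us-west1/productSets/vestri-accessories",
)
print(json.dumps(body)[:80])
```

Once a SKU comes back from this lookup, the storefront follows the normal Cortex hypermedia flow, looking up the item by code and POSTing to its add-to-cart form, which is what removes the manual add-to-cart click.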
During the hackathon at Elastic Path, more than ten teams presented innovative ideas. It was fascinating to experience the excitement, energy, enthusiasm, and inventiveness of each team.
At the time of this implementation, we used catalog snapshots to generate the data for uploading our products to Google Product Search. With Elastic Path's Catalog Syndication tool now available, we could improve this implementation further by pushing catalog feeds to Google directly rather than creating the snapshots ourselves.
We implemented an end-to-end visual search experience using Google's Vision Product Search, Elastic Path's Reference Experience components, the Cortex API, and Elastic Path's Commerce platform. Along the way, we learned that the latest commerce trends, such as visual commerce, can be implemented easily on a truly headless commerce platform without making any changes to the platform itself.
Kudos to our excellent team for the implementation:
Shreyas Sali, Shaun Maharaj, Alvin Chan, Dusan Radovanovic, Alan Kc Wong, Matthew Kelly, Michael Szyszko