Established in 1995, eBay is a global marketplace operating in over 190 markets with 1.4 billion listings. With massive amounts of data, how does eBay do AI at such a scale? And how does eBay identify the needs of millions of buyers and sellers to fulfill their shopping intent?
Volume and velocity are important factors in the Big Data world. In the US and Australia, something is sold online every five seconds; in the UK, every two seconds; and in Germany, every second. New items, fixed-price items, auction items, free shipping - there’s a lot of shopping opportunity.
Today’s managed and adaptive eBay is powered by AI, which can be looked at as a system of concentric circles.
- User experience (outermost circle) includes buyers and sellers.
- User AI (middle circle) is a set of AI close to buying, selling, and shipping experiences.
- Core AI pillars (innermost circle) include natural language processing for search relevance, recommendation engine, computer vision, and more.
With 1.4 billion listings, eBay is dealing with a large amount of data: buyer and seller data, behavioral data, transactional data, and more, all powered by an AI platform engine.
In this article, we'll cover topics like:
Core AI pillars
- Search and recommendation
- Computer vision
- Natural language
- Personalization
- Economics and logistics
1. Search and recommendation
Knowing what’s relevant for buyers, ranking items, organic listings, ads, premium listings, suggesting relevant products while buyers are looking at items, and more.
2. Computer vision
With over 50 million images a day being uploaded, eBay is dealing with petabytes of data. Enabling computer vision at scale involves using a deep understanding of images, allowing for experiences that are magical, trusted, and frictionless. Can shopping experiences be fun? Can eBay detect counterfeit items just by the image? How can eBay make it easier for sellers to list items? With computer vision, all these things are possible.
3. Natural language
Understanding query intent, structure in the data, named entity recognition, translation, and more. If a listing is created in the UK and a German buyer wants to view it in German, eBay needs to be able to enable that through natural language processing.
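To make the query-understanding idea concrete, here is a toy sketch of extracting structure from a buyer query with simple keyword matching. Production systems use trained named-entity-recognition models; the vocabularies and field names below are illustrative assumptions, not eBay’s.

```python
# Toy query-structure extraction: split a free-text query into a brand,
# a color, and residual keywords. The vocabularies are illustrative.
KNOWN_BRANDS = {"nike", "adidas", "apple"}
KNOWN_COLORS = {"red", "blue", "black", "white"}

def parse_query(query: str) -> dict:
    """Map a free-text query to a small structured record."""
    parsed = {"brand": None, "color": None, "keywords": []}
    for tok in query.lower().split():
        if tok in KNOWN_BRANDS and parsed["brand"] is None:
            parsed["brand"] = tok
        elif tok in KNOWN_COLORS and parsed["color"] is None:
            parsed["color"] = tok
        else:
            parsed["keywords"].append(tok)
    return parsed
```

A query like "red nike running shoes" then yields a structured record that downstream search and ranking can filter on, rather than a bag of raw tokens.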
4. Personalization
Knowing who a buyer is, their intent, historical purchases, and commerce behavior.
5. Economics and logistics
Letting sellers know what the best prices are, what’s in demand, whether items can be shipped, delivery estimates, and more.
Enabling commerce experiences
- Visual shopping
- Image clean-up
- AI platform
Object detection and object recognition are especially important when buyers are looking at items and want to find something similar at a better price. eBay’s visual shopping experience lets buyers find these items across its 1.4 billion listings, which isn’t easy at such a scale. There’s a model training lifecycle that goes into it, along with a model inferencing and indexing lifecycle, so that eBay can recognize, detect, and find the correct objects in uploaded images.
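The indexing-and-inference side of visual search can be sketched in a few lines: each listing image is mapped to an embedding vector (the stand-in arrays below would come from a trained vision model in practice), normalized into an index, and queried by cosine similarity. This is a minimal sketch, not eBay’s production retrieval stack.

```python
import numpy as np

def build_index(embeddings: np.ndarray) -> np.ndarray:
    """L2-normalize rows so cosine similarity reduces to a dot product."""
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    return embeddings / norms

def search(index: np.ndarray, query: np.ndarray, k: int = 3) -> list:
    """Return indices of the k listings most similar to the query embedding."""
    q = query / np.linalg.norm(query)
    scores = index @ q                      # cosine similarity per listing
    return [int(i) for i in np.argsort(-scores)[:k]]
```

At eBay’s scale the brute-force dot product would be replaced by an approximate nearest-neighbor index, but the embed-normalize-search shape of the pipeline is the same.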
With computer vision, you can take an image, understand its aspects, match them to billions of listings, and help buyers fulfill their shopping journey. By mixing in the detected image text and categories, you can take this multimodal approach and narrow down items that make sense to buyers.
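The multimodal narrowing described above can be sketched as a weighted blend of a visual similarity score with text and category matches. The field names and weights here are illustrative assumptions for the sketch, not eBay’s ranking function.

```python
# Blend visual, text, and category signals into one candidate score.
def multimodal_score(candidate: dict, query: dict,
                     w_visual: float = 0.6, w_text: float = 0.25,
                     w_category: float = 0.15) -> float:
    # Fraction of query terms that appear in the candidate's title.
    overlap = len(set(candidate["title_terms"]) & set(query["terms"]))
    text_score = overlap / max(len(query["terms"]), 1)
    # Hard category match from the detected category.
    category_score = 1.0 if candidate["category"] == query["category"] else 0.0
    return (w_visual * candidate["visual_similarity"]
            + w_text * text_score
            + w_category * category_score)
```

Ranking candidates by this blended score surfaces items that look similar, read similar, and sit in the right part of the catalog.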
Image clean-up is a recent feature that’s gaining prominence in the US, available on both iOS and Android. eBay’s selling team found that clean-looking images help, especially ones with a white background. With ads on Google and Facebook, for example, cleaner images get better clicks and engagement.
Taking an image and cleaning up the background helps buyers click more, but in some cases, having a background could be important. So it’s essential to know when to target specific cases in which image clean-up would make sense.
Only about 5% of images have a white background. Of the rest, half are simple scenarios and half have a complex composition. In simple scenarios there’s usually just a plain background that contrasts with the foreground.
But with complex scenarios, it’s not always obvious what the primary object is. These backgrounds are typically cluttered or have improper lighting, which is easy to see with a human eye but not always with a computer. Loops, thin regions, reflections, and transparency make it even trickier.
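A toy version of the simple-versus-complex distinction: sample the image border as "background" and compare it to the center region. A low-variance border with strong border/center contrast suggests a simple, separable scene. The thresholds are illustrative assumptions, not eBay’s production values, and real separability checks use learned segmentation rather than this heuristic.

```python
import numpy as np

def is_simple_background(gray: np.ndarray,
                         max_border_std: float = 10.0,
                         min_contrast: float = 40.0) -> bool:
    """Heuristic check on a 2-D grayscale array (0-255 values)."""
    h, w = gray.shape
    m_h, m_w = h // 4, w // 4
    # Treat the one-pixel frame as a background sample.
    border = np.concatenate([gray[0], gray[-1], gray[:, 0], gray[:, -1]])
    # Treat the central half of the image as the foreground sample.
    center = gray[m_h:h - m_h, m_w:w - m_w]
    contrast = abs(float(center.mean()) - float(border.mean()))
    return bool(border.std() < max_border_std and contrast > min_contrast)
```

A white sheet behind a dark product passes; a cluttered desk photo, with a noisy high-variance border, does not, and would fall through to the complex-composition path.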
Built in collaboration across the design, product, computer vision, and engineering teams, the engine runs through a series of checks when cleaning an image. The algorithm also involves the user in the process by giving them cues, like taking photos under better lighting:
- Can the image actually be cleaned?
- Can the background and foreground be separated?
- Can the image be cropped?
After the user focuses on the main part of the image they want to clean up, eBay can then do it automatically. But for trickier images, eBay has a manual touch-up process, which gives users brush, eraser, and zoom-in/zoom-out options.
But how did eBay create the experience and combine it with computer vision technology?
Step 1: Collaboration between teams:
- AI: computer vision
- Mobile architecture
- Product and design
Step 2: The metrics
- Product level metrics. How many sellers actually use automatic or manual clean-ups? How many buyers click on the cleaned-up images?
- Algorithm-level metrics. What’s the quality score? Is the foreground separable? How many images qualify, and how intensive is the processing?
Step 3: The tech
- Computer vision engineering
- Architecture: server or on-device?
- Mobile definition-of-done (security, performance, etc.)
Step 4: The launch
- Employee beta
- Seller beta
- Open access
Attributes from image
Deep learning helps eBay infer characteristics from images, such as brand, type of item, what occasion clothes are for, and other categories. Not all sellers list attributes on their items, so eBay can infer attributes automatically and match them to the category in the eBay taxonomy, helping buyers find what they’re looking for.
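The matching step can be sketched as filtering inferred attributes against a category schema: only attributes the taxonomy defines for that category are attached to the listing. The toy taxonomy and the hard-coded attribute dictionary below are illustrative stand-ins for a classifier’s output, not eBay’s actual taxonomy.

```python
# Toy taxonomy: category -> set of attribute names that category supports.
TAXONOMY = {
    ("clothing", "shoes"): {"brand", "color", "size"},
    ("electronics", "phones"): {"brand", "model", "storage"},
}

def fill_attributes(category: tuple, inferred: dict) -> dict:
    """Keep only inferred attributes that the category's schema allows."""
    allowed = TAXONOMY.get(category, set())
    return {k: v for k, v in inferred.items() if k in allowed}
```

An inferred "model" attribute on a pair of shoes would be dropped, while "brand" and "color" survive and become searchable facets even though the seller never typed them in.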
With trust being extremely important, eBay has to be able to detect explicit content, counterfeit items, or items that are illegal in specific countries. From the image alone, eBay can detect these characteristics, and even catch sellers who try to circumvent the system by completing transactions offline to avoid paying fees through eBay.
Looking at patterns helps eBay scale up: by analyzing the millions of images flowing through the AI platform and computer vision systems, eBay can map categories, extract attributes, and match them to items. On the experience side, computer vision makes it possible to evolve these processes and scale them globally.
For computer vision to analyze the millions of images that flow through eBay, its practitioners need to be able to continuously train models. This starts in their workspaces in an interactive mode but can also be done through automated deep learning jobs with hyperparameter tuning.
They run training in pipelines on the workflow engine, record their experiments in the experiment management system, and push the models to the model repository for versioning. Afterwards, they deploy to a model-serving architecture where inference takes place.
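The train-record-version flow described above can be sketched as a small pipeline: run a training job, log the experiment, and push the result to a versioned repository. The class and function names are illustrative, not eBay’s internal platform APIs.

```python
# Minimal sketch of a training pipeline with experiment tracking
# and a versioned model repository.
class ModelRepository:
    def __init__(self):
        self.versions = []

    def push(self, model, metrics) -> int:
        """Register a model and return its new version number."""
        self.versions.append({"version": len(self.versions) + 1,
                              "model": model, "metrics": metrics})
        return self.versions[-1]["version"]

def run_pipeline(train_fn, params, repo, experiments) -> int:
    model, metrics = train_fn(params)                   # training step
    experiments.append({"params": params,               # experiment tracking
                        "metrics": metrics})
    return repo.push(model, metrics)                    # versioned registration
```

Each pipeline run leaves behind both an experiment record (for comparing hyperparameters) and an immutable model version (for rollback and deployment).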
The serving engine supports different types of model architectures, so inference can be CPU-based, GPU-based, or run on TensorRT. But the key is to have a feedback loop, to make sure the model is performing well and those learnings are fed back into the training lifecycle. The internal AI cloud infrastructure is powered by Intel and NVIDIA.
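A minimal sketch of that serving engine: dispatch each request to a named backend (CPU, GPU, or a TensorRT-optimized runtime would each be one entry) and log every prediction so the feedback loop can feed results back into training. Backend names and the handler signature are illustrative assumptions.

```python
# Serving dispatcher with a prediction log for the feedback loop.
class ServingEngine:
    def __init__(self, backends: dict):
        self.backends = backends     # name -> callable(inputs) -> outputs
        self.feedback_log = []       # replayed into the training lifecycle

    def infer(self, backend: str, inputs):
        outputs = self.backends[backend](inputs)
        self.feedback_log.append((backend, inputs, outputs))
        return outputs
```

Keeping backends behind one interface lets the same model be served from whichever runtime is cheapest or fastest, while the log gives training a steady stream of real-world examples to learn from.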
Computer vision enables magical, trusted, and frictionless experiences by using continuously evolving technologies that are automating how both sellers and buyers interact with e-commerce platforms.