Carsales is beginning to explore how conversational AI interfaces like chatbots and voice-activated virtual assistants could be used by consumers when buying and selling vehicles.
AI and machine learning technical development manager Agustinus Nalwan told Adapt’s recent Cloud & DC Edge 2018 conference - which is co-organised by iTnews - that the company was “still very early” in its testing of AI use cases outside of image recognition.
So far, Carsales’ most recognisable AI output is Cyclops, an image recognition tool that classifies the angle of photos uploaded to the Carsales website.
The project landed Carsales the consumer IT project of the year award at the iTnews Benchmark Awards.
Its success means the company’s AI capability has the backing to explore a broader range of use cases for artificial intelligence and machine learning technology.
“At the moment we are heavily invested into image recognition because the business can actually already see the return on investment,” Nalwan told iTnews.
“It’s successful, it can solve many problems, it can do many cool things, and it can really separate us from our competitors.
“But then we are starting to do other things as well on [the likes of] chatbots and voice with Alexa [on the] Echo dot, but those things we are still very early on [and] not really that practical yet.
“It is a good time to start thinking about them and seeing what we can do, but it’s hard to build something serious at the moment.”
Carsales’ first foray is a Zendesk-powered chatbot that helps classify after-hours customer support requests and questions lodged via its website.
“They start to type in a question like ‘I’m trying to sell my car, I’ve already paid [the listing fee] but how come it’s not up on our site?’ and then suddenly they will receive an email suggesting three topics that might be relevant,” Nalwan said.
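The topic-suggestion step Nalwan describes can be sketched as a simple keyword-matching ranker. This is a minimal illustration only: the topic names, keyword sets and scoring below are assumptions for the sketch, not Carsales’ or Zendesk’s actual implementation, which would typically use a trained text classifier.

```python
# Illustrative sketch of ranking help topics against a customer's question.
# Topic names and keyword sets are hypothetical, not the real Zendesk setup.
TOPIC_KEYWORDS = {
    "Listing not appearing": {"sell", "listing", "paid", "site", "live"},
    "Payment and fees": {"paid", "fee", "payment", "refund", "invoice"},
    "Editing an ad": {"edit", "photo", "change", "update", "ad"},
}

def suggest_topics(question: str, top_n: int = 3) -> list[str]:
    """Rank topics by keyword overlap with the question; drop zero matches."""
    words = set(question.lower().replace("?", "").replace(",", "").split())
    scored = sorted(
        TOPIC_KEYWORDS.items(),
        key=lambda item: len(words & item[1]),
        reverse=True,
    )
    return [topic for topic, keywords in scored[:top_n] if words & keywords]
```

Running it on a question like the one in Nalwan’s example surfaces the listing-related topics first, mirroring the “three suggested topics” email the customer receives.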
More sophisticated use cases for natural language interfaces, whether text or voice, will rest on the ongoing evolution of the underlying technology.
Helping a prospective buyer find a car would likely involve a lot of back-and-forth between the customer and the natural language interface.
“It’s very chit-chatty and hard to build,” Nalwan said.
“You can’t really fit all the possible conversation flows into your head.
“There’s so many different ways people could ask questions about what kind of car they want to buy: the price, the fuel efficiency, safety.
“Some people might also just not like the ‘look’ of a car [they are recommended], but what does ‘look’ mean from that person’s perspective? You need to understand the person first in order to know the meaning of the word.”
Nalwan believes the challenges are not insurmountable, but that it would take “at least three-to-five years” for the “brain” behind natural language to develop to the point where it could handle complex chat with a human with a high degree of accuracy.
“I always believe that anything is solvable,” he said.
“But the [conversation] has to be generated by the AI, not [fall back on] predefined questions and answers: if they say this go here, if they say that, go there. That won’t work.
“You need an AI that can think like us.”
Nalwan used the conference to demonstrate the “superhuman ability” of the Cyclops 2.0 image recognition technology that Carsales has built.
Using a variety of images of the exterior of a car - some clearly showing the make or badge, others where that information was completely obscured - Cyclops was able to accurately recognise the car on each occasion.
Cyclops was created a year-and-a-half ago to help the company’s digital photography team classify images they took at car dealerships.
The company employs photographers to service dealerships in a particular area, taking photos of the cars those dealerships want to list through Carsales.
“Once the photographer got home they needed to upload the photos into our library, and classify the angle of every single one from dropdown menus,” Nalwan said.
“Classifying a day’s worth of photos could take 30 minutes; multiply that by the number of photographers and working days, and it cost the business $250,000 a year for a million mouse clicks.
“Those were very expensive mouse clicks.”
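Nalwan’s figure is a back-of-envelope calculation. The article only gives the 30-minutes-a-day effort and the roughly $250,000 / one-million-clicks annual totals, so the staffing numbers and hourly rate below are purely illustrative assumptions chosen to show the order of magnitude.

```python
# Back-of-envelope cost of manual photo-angle classification.
# All inputs are assumptions for illustration; only the ~30 min/day figure
# comes from the article.
def annual_classification_cost(
    photographers: int,
    minutes_per_day: float,
    working_days: int,
    hourly_cost: float,
) -> float:
    """Total yearly labour cost of manually classifying photo angles."""
    hours = photographers * (minutes_per_day / 60) * working_days
    return hours * hourly_cost

# Hypothetical staffing and rate that land near the quoted ~$250,000.
cost = annual_classification_cost(
    photographers=45, minutes_per_day=30, working_days=230, hourly_cost=48
)
```

With those assumed inputs the sketch lands at roughly $248,000 a year, which is the shape of the saving Cyclops 1.0 delivered by automating the classification step.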
Cyclops 1.0 solved that problem, but it soon became clear that the technology had even more potential.
“We have integrated Cyclops widely across our imaging pipeline,” Nalwan said.
“All the photos [in our library] come from the website, but they are not just the ones from photographers.
“We also have photos coming [directly] from dealers and also from private sellers. These two combined upload about four times the volume of photos coming from the photographers themselves.”
Cyclops image recognition is being used as the foundation for several new products on the Carsales platform.
One, called 'visual comparison', allows customers to easily compare photos taken from a particular angle - for example, the back seat, to gauge room for a child seat - across several makes of car.
A second product, called 'product quality advisor', prompts sellers to upload photos to their listing based on angles they have forgotten to show.
“Because we know every photo that is uploaded by the user, now we can actually remind them what’s missing [to help them sell their car],” Nalwan said.
A third image recognition capability is dubbed 'snap and sell'.
In the past, when a user wanted to list their car for sale, they had to navigate a series of lengthy dropdown menus to locate the make, model and badge of their car in order to start the listing process.
“That process was very annoying,” Nalwan said.
“Now with Cyclops 2.0 this is all gone. All you need to do is take a photo of the back of your car, and Cyclops already knows what it is.
“If you think it’s wrong you can correct it, otherwise click next.”
Cloud & DC Edge is an annual conference by Adapt Ventures and iTnews.