Amazon Web Services (AWS) will ditch Nvidia chips responsible for the processing of Alexa queries and will instead use its own in-house silicon, the company confirmed on Friday.
Alexa queries, issued through Amazon’s Echo line of smart speakers, are sent to the company’s data centres, where they undergo several stages of processing, including converting the processed text into audible speech, before an answer is returned to the user.
The company said that the “majority” of this processing will now be handled by Amazon’s own “Inferentia” chips. First launched in 2018, these were Amazon’s first custom-designed silicon chips for accelerating deep learning workloads.
Amazon said that the shift to Inferentia for Alexa processing has resulted in 25% lower latency and a 30% cost reduction. The firm hopes for similar gains with its Rekognition image-recognition service, which has also started to adopt the Inferentia chip.
The cloud giant didn’t specify whose chips previously handled Rekognition processing, but the service has come under scrutiny from civil rights groups over its use by law enforcement. Police were temporarily banned from using it earlier this year, following the Black Lives Matter protests.
Nvidia and Intel are two of the biggest providers of computing chips, particularly for data centres, counting companies like Amazon and Microsoft among their clientele. However, a number of firms have begun to move away from these vendors and bring chip design in-house. Apple, for example, has recently begun transitioning its Mac line away from Intel chips in favour of its own Apple-designed processors.