Amazon.com on Tuesday announced a new artificial intelligence chip for its cloud computing service as competition with Microsoft to dominate the AI market heats up.
At a conference in Las Vegas, Amazon Web Services (AWS) Chief Executive Adam Selipsky announced Trainium2, the second generation of its chip for training AI systems. Selipsky said the new version is four times as fast as its predecessor and twice as energy efficient.
The AWS move comes weeks after Microsoft announced its own AI chip called Maia. The Trainium2 chip will also compete against AI chips from Alphabet's Google, which has offered its Tensor Processing Unit (TPU) to its cloud computing customers since 2018.
Selipsky said that AWS will start offering the new training chips next year. The proliferation of custom chips comes amid a scramble to find the computing power to develop technologies such as large language models that form the basis of services similar to ChatGPT.
The cloud computing firms are offering their chips as a complement to those of Nvidia, the market leader in AI chips, whose products have been in short supply for the past year. AWS also said on Tuesday that it will offer Nvidia's newest chips on its cloud service.
Selipsky on Tuesday also announced Graviton4, the cloud firm's fourth custom central processor chip, which it said is 30% faster than its predecessor. The news comes weeks after Microsoft announced its own custom chip called Cobalt designed to compete with Amazon's Graviton series.
AWS and Microsoft are using technology from Arm Ltd in their chips, part of an ongoing trend away from chips made by Intel and Advanced Micro Devices in cloud computing. Oracle is using chips from startup Ampere Computing for its cloud service.