Nvidia launches A100 GPU with 80 GB of HBM2e memory and PCIe 4.0 interface
Nvidia is working on a new variant of the A100 GPU with 80 GB of HBM2e memory and a PCIe 4.0 interface. Currently, the PCIe variant of this accelerator ships with only 40 GB of video memory.
The PCIe A100 with 80 GB of HBM2e memory recently appeared on Nvidia's data center website, without an official announcement. Nvidia earlier made an 80 GB variant of the A100 GPU, but it shipped only as an SXM4 module, which mounts directly to the motherboard. The company had not previously offered the 80 GB A100 as a PCIe interface card.
Nvidia has yet to announce a release date for the new PCIe variant, but anonymous sources say the GPU is expected to be available next week. The new 80 GB variant will have 2 TB/s of memory bandwidth, the same as the 80 GB SXM4 module.
The Nvidia A100 is a data center GPU based on the Ampere architecture, which is also used in the company's GeForce RTX 30 graphics cards. The chip has an area of 826 mm² and is made up of 54 billion transistors. The A100 has 6,912 CUDA cores. Although these cores differ in structure from the CUDA cores in Nvidia's recent GeForce consumer graphics cards, that core count is unmatched by any of the company's RTX 30 cards.
|Nvidia A100 specifications|A100 PCIe|A100 SXM4|
|---|---|---|
|Memory|40 GB / 80 GB|40 GB / 80 GB|
|Memory bandwidth|40 GB: 1555 GB/s, 80 GB: 2039 GB/s|40 GB: 1555 GB/s, 80 GB: 2039 GB/s|
|TDP|40 GB: 250 W, 80 GB: not yet known|40 GB: 400 W, 80 GB: 400 W|
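The bandwidth figures above follow from the A100's 5120-bit HBM2e memory interface: peak bandwidth is the bus width (in bytes) times the per-pin data rate. A minimal sketch; the per-pin rates used here are assumptions chosen to reproduce the quoted numbers, not official specifications:

```python
def bandwidth_gb_s(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) x (per-pin rate)."""
    return bus_width_bits / 8 * pin_rate_gbps

# 40 GB models: ~2.43 Gbps per pin on a 5120-bit bus -> ~1555 GB/s
print(round(bandwidth_gb_s(5120, 2.43)))   # 1555
# 80 GB models: ~3.19 Gbps per pin on the same bus -> ~2039 GB/s
print(round(bandwidth_gb_s(5120, 3.186)))  # 2039
```

This illustrates why the 80 GB variants are faster despite using the same bus width: the capacity bump comes with faster HBM2e stacks, not a wider interface.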