Inspur releases new AI server products to support the latest NVIDIA A100 PCIe gen4

According to the introduction, the NF5468M6 and NF5468A5 can each be configured with eight dual-width PCIe A100 GPUs in a 4U chassis. The cards support the latest PCIe Gen4, with a bidirectional communication bandwidth of 64 GB/s. In addition, 40 GB of HBM2 memory raises memory bandwidth by 70% to 1.6 TB/s, and an NVLink Bridge design provides peer-to-peer (P2P) communication of up to 600 GB/s between two GPU cards. For multi-task training and development scenarios, the MIG (Multi-Instance GPU) feature can partition a single A100 into up to seven independent GPU instances, each handling a different computing task. Inspur's AI servers supporting the Ampere architecture are already being delivered in mass production.

Source: Netease Technology Report; editor in charge: Xue Jingyu, NBJS10393
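As a rough sanity check of the quoted figures, the following sketch works out where the 64 GB/s and 1.6 TB/s numbers come from. The baseline values (PCIe Gen4 per-lane signaling rate and a previous-generation HBM2 bandwidth of roughly 900 GB/s) are assumptions for illustration, not taken from the article.

```python
# Sanity check of the bandwidth figures quoted above.
# Assumed baselines (not from the article): PCIe Gen4 signals at 16 GT/s
# per lane with 128b/130b encoding, and the previous GPU generation's
# HBM2 delivered roughly 900 GB/s.

PCIE_GEN4_GT_PER_LANE = 16          # GT/s per lane
ENCODING_EFFICIENCY = 128 / 130     # 128b/130b line encoding
LANES = 16                          # x16 slot

# One direction: 16 GT/s * 16 lanes * encoding / 8 bits per byte ~ 31.5 GB/s
one_way_gb_s = PCIE_GEN4_GT_PER_LANE * LANES * ENCODING_EFFICIENCY / 8
bidirectional_gb_s = 2 * one_way_gb_s   # ~63 GB/s, quoted as "64 GB/s"

PREV_GEN_HBM2_GB_S = 900                        # assumed baseline
a100_hbm2_gb_s = PREV_GEN_HBM2_GB_S * 1.70      # +70% ~ 1.53 TB/s, quoted as "1.6 TB/s"

print(round(bidirectional_gb_s), round(a100_hbm2_gb_s))
```

Under these assumptions the arithmetic lands close to the article's rounded figures, which suggests the quoted numbers are the usual marketing roundings of the raw link and memory rates.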