Samsung Unveils HBM2E Memory for GPUs, B/W Up to 1.64 TB/s

    Although high-bandwidth memory has its strengths, namely a wide bus, a smaller die footprint and low power consumption, it never quite caught on in the consumer market. Samsung, however, hasn’t given up on HBM: the company has revealed the industry’s first HBM2E memory. The so-called Flashbolt HBM2E increases performance by 33% and doubles capacity, both per die and per package.

    Samsung’s HBM2E memory was showcased at NVIDIA’s GTC 2019, and considering that AMD isn’t doing too well with its HBM2-based Radeon VII, it’s safe to say this memory is aimed at NVIDIA’s future workstation and datacenter GPUs.

    HBM2E Memory

    The memory uses TSVs (through-silicon vias) to connect eight 16Gb dies in an 8-Hi stack configuration, for 16 GB per package. Each package offers a 1,024-bit bus width coupled with a massive 3.2 Gbps data transfer rate per pin, resulting in a bandwidth of roughly 410 GB/s per KGSD (known good stack die).
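    The quoted per-stack figure follows directly from the bus width and the per-pin rate. Here is a minimal back-of-the-envelope check in Python (the constant names are ours, purely for illustration):

```python
# Rough check of Samsung's quoted per-stack HBM2E bandwidth.
PIN_RATE_GBPS = 3.2    # data rate per pin, in Gbit/s
BUS_WIDTH_BITS = 1024  # I/O width per HBM2E stack

# Total bits per second across the bus, converted to gigabytes per second.
stack_bandwidth_gbps = PIN_RATE_GBPS * BUS_WIDTH_BITS / 8
print(f"Per-stack bandwidth: {stack_bandwidth_gbps:.1f} GB/s")  # ~409.6 GB/s, i.e. ~410 GB/s
```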

    It’s quite clear that this memory won’t be coming to the consumer market; it’s aimed at datacenters, AI and neural-network workloads, HPC and, of course, content creation. Samsung’s HBM2E will allow for unprecedented GPU bandwidths thanks to the denser stacking and the sky-high transfer speed: with a 4,096-bit memory interface and 64 GB of memory, bandwidths of up to 1.64 TB/s can be achieved, a figure that at the moment sounds almost too good to be true.
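    Assuming four of these stacks on a single GPU (four 1,024-bit interfaces making up the 4,096-bit bus), the headline numbers work out as follows; again, just a sketch using the figures quoted above:

```python
# Aggregate capacity and bandwidth for a hypothetical GPU with four HBM2E stacks.
STACKS = 4                    # 4 x 1,024-bit = 4,096-bit memory interface
STACK_BANDWIDTH_GBPS = 409.6  # per-stack bandwidth (3.2 Gbps/pin * 1,024 pins / 8)
STACK_CAPACITY_GB = 16        # 8-Hi stack of 16Gb dies

print(f"Capacity:  {STACKS * STACK_CAPACITY_GB} GB")                  # 64 GB
print(f"Bandwidth: {STACKS * STACK_BANDWIDTH_GBPS / 1000:.2f} TB/s")  # 1.64 TB/s
```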

    Samsung HBM2E

    AMD’s Fiji and Vega cards were the only mainstream graphics cards to use HBM, and frankly, they didn’t do all that well, getting crushed by NVIDIA’s GDDR5X/GDDR6-based flagship offerings. HBM shortages and its high cost didn’t help either, driving card prices up further. We expect AMD’s future cards, especially the much-anticipated Navi lineup, to move to GDDR6, not just to keep pricing in check but also to avoid supply shortages.
