Memory’s Role in Advancing AI – Six Five On The Road at Dell Technologies World

By Patrick Moorhead - May 22, 2024

On this episode of the Six Five On The Road, we are joined by Samsung Semiconductor’s Christina Day, Director of DRAM Product Marketing, for a conversation on how memory technology is essential for the advancement of AI.

Their discussion covers:

  • The critical role of memory in accelerating AI, with a focus on high-density DRAM and SSDs
  • How Samsung memory is integrated into Dell Servers to power advanced computing solutions
  • The future of memory technology, emphasizing increased bandwidth, density, and efficiency to support faster, smarter AI development
  • The impact of Processing-In-Memory (PIM) technology on reducing energy consumption and enhancing AI processing speed and performance

Learn more at Samsung Semiconductor.

Disclaimer: The Six Five Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we ask that you do not treat us as such.

TRANSCRIPT

Lisa Martin: Hey everyone. Welcome to Six Five On The Road from Dell Technologies World 2024. Lisa Martin here with Dave Nicholson. We’ve had some great conversations over the last couple of days. Michael Dell walked on stage yesterday to literally thunderous applause from tens of thousands of people. And one of the things besides AI, which you may have heard of, that was really evident in the keynote that we saw yesterday was the strength of Dell’s partner program and the depth of it as well. We’re very pleased to welcome Christina Day to the program, DRAM Product Marketing at Samsung Semiconductor. It’s great to have you, Christina. Thank you for joining us.

Christina Day: Thank you so much. We’re very happy to be here.

Lisa Martin: So you guys had a great session yesterday, moderated by my esteemed colleague here, which was so much fun. You really talked about the emerging memory products for AI. I was in the audience, packed house. What are the top three things that you think the audience took away from that to share with our audience here?

Christina Day: Yeah, absolutely. I think what people really took back was that there's very high demand for HBM, DRAM, and even SSDs, because all three of these solutions are critical to AI. AI is generating so much content, and all three of these products have to work in conjunction with each other for the AI to work.

Lisa Martin: Yesterday I mentioned AI. Dave likes to joke about how many times we've said that term in conversations. But lots of news on AI yesterday in the keynote. Talk a little bit about the long-standing partnership that Samsung Semiconductor has had with Dell, and the symbiosis that you have in helping it really fulfill and manifest its AI ambitions.

Christina Day: Sure, absolutely. As you heard yesterday in the keynote, Dell provides end-to-end solutions for their customers, and Samsung is here collaborating and partnering with them. We have to understand their challenges and their workloads, their data-intensive requirements. So across all of our solutions, we partner together, understand those challenges, and help solve them, for data access capability, for storage capability. So Samsung's leading memory technology and innovation really helps out.

Dave Nicholson: Okay, let’s get down to it. I want to hear some cool numbers. What’s the state-of-the-art now, in terms of what you’re delivering today and what’s coming down the line that you can share, specifically in the DRAM and the DDR space?

Christina Day: Yes.

Dave Nicholson: Where are we now? Are we, what? 4K of RAM on the motherboard? Is that about right?

Christina Day: No.

Dave Nicholson: Now not 4K?

Christina Day: We’re talking about-

Dave Nicholson: How much?

Christina Day: In general-purpose servers, you're looking at an average of about 400 gigabytes of DRAM.

Dave Nicholson: Okay.

Christina Day: Now we’re talking about generative AI servers. We’re talking about terabytes of DRAM requirements.

Dave Nicholson: And how big are those individual modules that you’re shipping now?

Christina Day: So now we can ship 256 gigabytes of a memory module.

Dave Nicholson: Okay. So a DIMM, Dual In-Line Memory Module, meaning there’s two of them. There’s a pair of them together always, or no?

Christina Day: No, not always.

Dave Nicholson: Okay. But the single unit is 256 at this point.

Christina Day: 256, correct.

Dave Nicholson: So four of them is a terabyte.

Christina Day: That’s right.

Dave Nicholson: Of memory.

Christina Day: Yes.

Dave Nicholson: Not storage, not SSD.

Christina Day: No. Of DRAM memory. Yes.

Dave Nicholson: Okay.

Christina Day: So you're looking at up to about two terabytes of memory used in AI servers now. And then with SSDs, with storage, there's an explosion of data being generated right now. So you and I, I'm sure you've played with ChatGPT, and I have too-

Dave Nicholson: Oh, yeah.

Christina Day: Right? And you’re expecting that quick response-

Lisa Martin: Yes.

Christina Day: And the imagery that everybody's looking for. So all of that content has to be stored somewhere, and not only on DRAM, but also on SSD, because certain content doesn't need to be accessed constantly or consistently.

Lisa Martin: Right.

Christina Day: But there's so much content, and it has to be stored on SSD, for which people are now requiring 128 terabytes of SSD space.
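
The capacities in the exchange above are easy to sanity-check. A quick sketch (module size is from the conversation; the slot counts passed in are illustrative assumptions, not Samsung configurations):

```python
# Quick arithmetic behind the DRAM capacities discussed above.
DIMM_GB = 256  # largest DRAM module mentioned in the conversation: 256 GB

def server_dram_tb(dimms: int, dimm_gb: int = DIMM_GB) -> float:
    """Total DRAM in terabytes for a server populated with `dimms` modules."""
    return dimms * dimm_gb / 1024

# Four 256 GB DIMMs give one terabyte, as in the exchange above;
# eight give the ~2 TB cited for AI servers.
print(server_dram_tb(4))  # 1.0
print(server_dram_tb(8))  # 2.0
```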

Lisa Martin: What are some of the ways AI is impacting high-density DRAM? I think of speed, latency, but walk us through some of the key improvements that what you're talking about is delivering.

Christina Day: So the CPU core counts are increasing, and Samsung memory is supporting the next generation of core counts as well. As our innovation continues to develop, we have to support all those core counts, the high bandwidth requirements, and the performance. So we are certainly developing and progressing in that way.

Dave Nicholson: We talked about referencing the session that we had. That was a lot of fun, by the way. We talked about this concept of the memory wall, the idea that despite the advances in memory that you provide us with, thank you very much, there’s still a disparity between how hungry these XPUs are and how quickly memory can deliver. Now, part of the way you address that is through high bandwidth memory that resides more closely to those XPUs. What about next-gen PCIe technology? Does that help us? How would you characterize the current state-of-the-art when it comes to the memory wall?

Christina Day: Yeah. So we're also addressing it with new technologies such as CXL. CXL memory has a PCIe interface, and it's in the E3.S form factor, so it actually fits into the normal slots you would put an SSD into. So that memory wall, yes, with all this content you can certainly hit a memory wall, and CXL helps to expand or extend that memory capability.

Dave Nicholson: Now, is that for expanding the aggregate amount of memory that an individual server can access, or is there a shared component to that? I don’t know the answer. Does CXL allow pooling of memory to be shared by servers or not?

Christina Day: No, actually, people are looking to expand the memory.

Dave Nicholson: Okay. So it’s just because you can have it located a little bit further away.

Christina Day: That’s right.

Dave Nicholson: Because we’re talking centimeters of difference.

Christina Day: Sure.

Dave Nicholson: You can have more because you can physically put more memory.

Christina Day: Exactly.

Dave Nicholson: And any idea of what the projected numbers are there? You’re talking terabytes of memory, right?

Christina Day: Yes. I don’t know.

Dave Nicholson: No, no. It’s a big number, folks, but it is mind-boggling when we start talking about the amount of actual memory and how fast the performance is.
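
The point of the exchange above — CXL as capacity expansion rather than pooling — can be sketched conceptually. This is not a real CXL API; the CXL module capacity is an illustrative assumption:

```python
# Conceptual sketch (not a real CXL API): a server's addressable memory as
# directly attached DIMMs plus CXL memory modules in E3.S (SSD-style) bays.
from dataclasses import dataclass

@dataclass
class Server:
    dimm_slots: int           # DIMM sockets populated on the board
    e3s_slots_for_cxl: int    # E3.S bays holding CXL memory instead of SSDs
    dimm_gb: int = 256        # 256 GB modules, per the conversation
    cxl_module_gb: int = 128  # illustrative CXL module capacity (assumption)

    def total_memory_tb(self) -> float:
        direct = self.dimm_slots * self.dimm_gb
        expanded = self.e3s_slots_for_cxl * self.cxl_module_gb
        return (direct + expanded) / 1024

# Eight DIMMs alone give 2 TB; adding four CXL modules extends capacity
# without needing any more DIMM sockets.
print(Server(8, 0).total_memory_tb())  # 2.0
print(Server(8, 4).total_memory_tb())  # 2.5
```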

Lisa Martin: Well, the impact of memory on the success of AI is incredible. Talk a little bit about that.

Christina Day: Yes. So if you look at ChatGPT adoption, it took five days to reach 1,000,000 users. So huge success, and it’s continuing to grow. And if you think about it, memory is a critical component of AI. Without memory, there is no AI.

Lisa Martin: Right.

Christina Day: So absolutely, it’s a huge success. And as AI continues to grow and develop and innovate, Samsung’s there, we’re constantly innovating as well. So yes, and if you think about the new technologies that are coming, such as autonomous driving, that takes about one to eight petabytes of content a day. So think about all the memory that is required to not only process that content, but also store that content.
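
Connecting the two figures quoted in the conversation — one to eight petabytes of autonomous-driving data per day, and 128 TB SSDs — gives a sense of the storage scale involved:

```python
# How many 128 TB SSDs would one day of autonomous-driving data fill?
# Both figures come from the conversation; using binary prefixes (1 PB = 1024 TB).
import math

SSD_TB = 128  # per-drive capacity cited above

def drives_per_day(petabytes: float, ssd_tb: int = SSD_TB) -> int:
    """Drives needed to hold `petabytes` of daily data, rounded up."""
    return math.ceil(petabytes * 1024 / ssd_tb)

print(drives_per_day(1))  # 8
print(drives_per_day(8))  # 64
```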

Lisa Martin: You guys showed that slide yesterday. You just referenced this in terms of ChatGPT reaching, what? 1,000,000 users?

Christina Day: 1,000,000 in five days.

Lisa Martin: In five days.

Dave Nicholson: In five days.

Lisa Martin: Versus Netflix, which was what? 1,700 days.

Dave Nicholson: 1,700 days.

Christina Day: 1,700 days.

Lisa Martin: So the speed, and it's only been about 18 months since ChatGPT was launched, and it's been revolutionizing everything. But you also talked about the speed. We have this expectation in our personal lives, in our business lives, that we can access anything, we can transact. It's going to be right there when I want it, when I need it, and it's going to be relevant to me. I think that expectation isn't going away. The demand for being able to deliver relevant information and content like that will only increase.

Christina Day: Absolutely.

Lisa Martin: Or faster.

Christina Day: Yes. Yes. And I’ll go back to the example of autonomous driving. It’s instantaneous, right?

Lisa Martin: It has to be.

Christina Day: You’re driving automatically without steering the wheel, without even pressing on the gas pedal. That’s instantaneous content that is being delivered to you. So think about the high bandwidth and the performance that is required to automate that.

Lisa Martin: Well, talking about autonomous driving makes me think of the criticality of that speed. It could be life-and-death situations. Same thing with impacts to other organizations, healthcare and life sciences for example, where information needs to be relevant and accurate in sub-seconds.

Christina Day: Yes. Yeah. It could be a life or death situation. So absolutely, yes.

Dave Nicholson: So does the DRAM that you work with, does it generate power or does it consume power?

Christina Day: So it consumes power.

Dave Nicholson: It consumes power?

Christina Day: Yes.

Dave Nicholson: Is that a problem, Christina? Is power going to be a constraint moving forward? A lot of people are telling us that. What are you doing at Samsung to address this problem?

Christina Day: So yes, power is certainly a humongous challenge. We had a cloud service provider tell us that it took them 20 years to reach about 500 megawatts of power, but in the next two and a half years, they're going to reach two gigawatts. So absolutely, power is a huge constraint and a huge concern. And what we're doing at Samsung is constantly innovating. One example is DRAM: we introduced the 32-gigabit DDR5. It's a mono die, so you don't have to stack the memory to reach higher capacity, and that saves up to 40% of power.
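
The cloud provider's trajectory quoted above implies a striking acceleration, which a couple of lines of arithmetic make explicit (both power figures are from the conversation):

```python
# Implied power-growth acceleration from the figures quoted above.
MW_20_YEARS = 500   # megawatts reached over the first 20 years
MW_TARGET = 2000    # two gigawatts expected 2.5 years later

historical_rate = MW_20_YEARS / 20                # 25 MW per year
projected_rate = (MW_TARGET - MW_20_YEARS) / 2.5  # 600 MW per year

# The projected rate is 24x the historical one.
print(projected_rate / historical_rate)  # 24.0
```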

Lisa Martin: Oh, wow. That’s significant.

Christina Day: It’s significant.

Dave Nicholson: Which translates into less latent heat also. Right?

Christina Day: Yes. Less power consumption, more efficiency, better bandwidth.

Lisa Martin: If we have a crystal ball or sometimes I like to say a magic eight-ball, and we look into the future of memory technology innovations. We talked about the demand, we talked about the speed at which we’ve gotten there. What do you see on the near horizon?

Christina Day: Oh my gosh, there’s so much. I’m excited for the future. I think AI is just going to continue to grow. Anything’s possible. Anything’s possible.

Lisa Martin: Yeah. I feel like we look at the horizon and it just keeps going and going and going.

Christina Day: That’s right.

Lisa Martin: And there’s probably problems that we haven’t even thought of yet that we’re going to be working to solve very soon.

Christina Day: And if you think of-

Lisa Martin: Which is exciting.

Christina Day: Yeah. ChatGPT, it just took everyone by surprise. And in the future, I think there’s going to be something else that’s going to take everybody by surprise.

Lisa Martin: It is. Speaking of the horizon, what are some of the next things for the Samsung Semiconductor-Dell relationship?

Christina Day: So we’re continuing to innovate and grow and develop together. We are looking at things such as processing in memory or PIM.

Lisa Martin: Yes.

Christina Day: And that basically integrates the processing and the memory together so that it reduces the data movement, which is good for performance and bandwidth, as well as lower power consumption.

Lisa Martin: I was going to say, energy consumption has got to be a benefit there.

Christina Day: Yes, absolutely.

Lisa Martin: Excellent. Well, Christina, it’s been a pleasure having you on the program. Thank you for joining us and explaining what you guys are doing with Dell, the future of memory, its impact to AI, its impact to DRAM. We appreciate your conversation.

Christina Day: Thank you so much.

Lisa Martin: All right. For our guest and for Dave Nicholson, I’m Lisa Martin. You’re watching Six Five On The Road from Vegas, baby. This is Dell Technologies World, 2024 coverage. Stick around. More great content coming up next.

Patrick Moorhead

Patrick founded the firm based on his real-world technology experiences and an understanding of what he wasn't getting from analysts and consultants. Ten years later, Patrick is ranked #1 among technology industry analysts in terms of "power" (ARInsights) and "press citations" (Apollo Research). Moorhead is a contributor at Forbes and frequently appears on CNBC. He is a broad-based analyst covering a wide variety of topics including the cloud, enterprise SaaS, collaboration, client computing, and semiconductors. He has 30 years of experience, including 15 years of executive experience at high-tech companies (NCR, AT&T, Compaq (now HP), and AMD) leading strategy, product management, product marketing, and corporate marketing, including three industry board appointments.