The Six Five On the Road with Bratin Saha from AWS at re:Invent 2022


The Six Five On the Road at AWS re:Invent 2022. Patrick Moorhead and Daniel Newman sit down with Bratin Saha, VP & GM, AI & ML, AWS.

Their discussion covers:

  • Announcement of SageMaker & Omics
  • Progress on Amazon Monitron
  • Key trends driving machine learning & artificial intelligence

Be sure to subscribe to The Six Five Webcast so you never miss an episode.


Disclaimer: The Six Five On the Road is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded, and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we do not ask that you treat us as such.


Patrick Moorhead: Hi, this is Pat Moorhead and we are live at AWS re:Invent 2022 and it is rocking Daniel and I. We are talking about our most favorite thing when it comes to tech and that is the cloud. Dan, how are you?

Daniel Newman: I’m doing great. It’s always great to have the Six Five on the road here at AWS. It’s such a popular conversation, the topic, cloud, the enterprise, digital transformation. It’s all happening here. By the way, just trying to get down here, you’re bumping shoulders, you’re running in because I think everybody in IT is here.

Patrick Moorhead: Pretty much. I mean, I always gauge all shows here by CES crowds and this is a monster show and you know when you can barely walk through the hallway, something is going on and a huge attraction. One of the biggest attractors in IT and digital transformation is taking data and doing meaningful things with it through AI and machine learning. We happen to have a repeat guest here on the Six Five. How are you doing, Bratin?

Bratin Saha: I’m good. Thank you for having me. It’s always nice talking to you.

Patrick Moorhead: Yeah, I mean last time we talked, we had this great conversation about democratizing AI, making it simple for people, putting it in the hands of, I don’t know, people like Daniel and I even, right, who don’t necessarily get down to the metal on a GPU or a Trainium or an Inferentia, but abstracting it so even the citizen developer can get access to it. But we are here, it’s another event. Big announcements.

Daniel Newman: Yeah. We had the chance to talk to Bratin actually at re:MARS. So this is a second time sitting down. You’re now a Six Five alumni and you have a big mandate at AWS and I would love for you just to quickly talk a little bit about the work you do for the company, just to set the context out there because we’re going to get into everything Pat just said, but you are highly qualified.

Bratin Saha: Well, I hope so. Thank you for the kind words. So at AWS, I run all of our AI and machine learning businesses: the AI services at the top of the stack, SageMaker in the middle of the stack, and then the deep learning AMIs and engines at the bottom of the stack. We have built one of the fastest growing businesses in the history of AWS. More customers do machine learning on AWS than anywhere else. So it’s been a fun ride so far.

Patrick Moorhead: What have you done with your life? I mean, I’m like what he’s done with his career, I’ve done with, I mean-

Daniel Newman: Are you really going to make me assess this and-

Patrick Moorhead: I don’t know. I mean we’re live.

Daniel Newman: I’ve written seven books.

Patrick Moorhead: Okay, I got you. You’re good. But you don’t run the largest AI/ML cloud service on the planet.

Daniel Newman: That’s why he’s on our show. He’s the guest.

Patrick Moorhead: Okay, let’s go. Sorry, keep going, Dan.

Daniel Newman: Oh no, absolutely. So let’s start with the launches. I mean look, we like to have a little fun, little banter here, but this is really important stuff and the problems you’re working on solving are big problems that pretty much every company, every government entity, universities, everyone’s kind of looking to AWS at least as part of the solution. So what’s your big exciting moments here that you’re announcing this week at re:Invent?

Bratin Saha: Yeah, so when we look at machine learning, one of the things we look at is: what are the key trends that drive machine learning innovation? If you are a customer, what should you be paying attention to? What should you be leveraging? Two of them we talked about at re:MARS: ML industrialization and ML democratization. I think they remain important, and we are going to continue to push them.

Then there is this new class of models coming up called foundation models, and we use them internally at Amazon. Just to give you an idea: if you went back to, say, 2017, the state-of-the-art models had 20 million parameters. In 2019, they had about 300 million parameters. Now they have more than 500 billion parameters, right? So that’s more than 1,600 times growth in just three years. What these models have done is open up amazing capabilities. You’re now looking at a world where, potentially in the near future, you could have AI and machine learning be an assistant for creative tasks, okay?
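The scale-up Saha cites checks out with quick arithmetic. The parameter counts come from his remarks; everything else below is just the division:

```python
# State-of-the-art model sizes mentioned in the conversation, by year.
params = {2017: 20e6, 2019: 300e6, 2022: 500e9}

growth = params[2022] / params[2019]
print(f"2019 -> 2022: ~{growth:,.0f}x larger")  # ~1,667x, i.e. "more than 1,600 times"
```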

And so the companies creating the most popular of these generative foundation models have now chosen AWS as their cloud. They’re using SageMaker, and there’s just an enormous amount of innovation that’ll come out of that. I’m really excited by the capabilities that we are going to open up for our customers there.

Then there is the variety of data. We have talked about the volume of data, but what I’m talking about is the variety: multi-modal data. One of the things that we launched is SageMaker’s geospatial machine learning capabilities. Up until now, machine learning was good at answering the who and the what, but you couldn’t really answer the where and the when, okay?

With this geospatial launch, you can now answer the where and the when. And so we have customers like BMW, who are actually using it to say, “Where should I be having electric charging stations?” We have customers like Zario that are using it for agriculture like, “How should I be planting? Where should I be planting?” And so on and so forth.

Then the other thing that I’m really excited about, because it’s really taking off now in terms of maturity, is machine-learning-powered use cases: document processing, contact centers, and Monitron, which is for industrial equipment. Think about what we are doing with contact center automation. Today, when a customer calls in to a call center, contact center supervisors listen in on a fraction of the calls to see whether the customer is satisfied or needs more information, but they can’t listen in on every call. What we have done now is enhance Amazon Transcribe, which is used for contact center automation, and we’re using speech recognition models to detect customer sentiment. So we can detect things like raised voices, or someone saying repeatedly, “I want to talk to a manager,” or someone saying, “I want to cancel my subscription.” That’s a clear indication of a frustrated customer. When that happens, you can send a real-time alert to the contact center supervisor.
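The supervisor-alert rule Saha describes can be sketched in a few lines of Python. To be clear, this is only an illustration, not the actual Transcribe logic; the phrase list and threshold are invented for the example, and the real service uses trained sentiment models rather than keywords:

```python
# Hypothetical frustration phrases; real call analytics relies on trained
# sentiment models, not a fixed keyword list.
FRUSTRATION_PHRASES = ("talk to a manager", "cancel my subscription")

def needs_supervisor_alert(turns: list[str], threshold: int = 2) -> bool:
    """Flag a call when enough caller turns match a frustration phrase."""
    hits = sum(
        1 for turn in turns
        if any(phrase in turn.lower() for phrase in FRUSTRATION_PHRASES)
    )
    return hits >= threshold

calm_call = ["Hi, I have a billing question.", "Thanks, that fixed it."]
angry_call = ["I want to talk to a manager.", "I said I want to talk to a manager!"]
print(needs_supervisor_alert(calm_call))   # False
print(needs_supervisor_alert(angry_call))  # True
```

In a real pipeline, a condition like this would fire on streaming transcripts and route a notification to the supervisor’s dashboard.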

Patrick Moorhead: Okay. So we have talked, I’m going to ask a question of, I mean there’s so much content coming out here. First of all, for the neophytes in AI and ML out there, what are the benefits of a larger model?

Bratin Saha: You can think of it this way: a larger model has more learning capacity. I’m simplifying, because ultimately both the model and the data matter, including the amount of data you’re training it on, right? But all things being equal, a larger model has more learning capacity. What that means is it’s able to learn more from the data, identify more patterns, and make better predictions.

Patrick Moorhead: Now, I appreciate that. For, let’s say Monitron, now Monitron’s been out for two years now?

Bratin Saha: Couple of years ago.

Patrick Moorhead: Couple of years now. Do large models help things like Monitron?

Bratin Saha: Large models in general are applicable in a variety of places, and Monitron could be one of them. The special thing about Monitron is that it’s its own end-to-end solution; you need no machine learning expertise for it. It comes with its own sensors, gateway, and all that. So you attach the sensors to your equipment, they stream your equipment’s temperature and vibration data, and then machine learning models detect problems with your machines. We have customers like Koch Industries using it, Baxter using it. Digital transformation is a cliche, but I think this is one of those things that’ll really transform the manufacturing industry.

Patrick Moorhead: Yeah. By the way, not even just from having smarter factories and smarter manufacturing, smarter transportation, but completely changing business models because, when you know and you can predict what’s going to happen before it happens, you can get into doing some of the things that you might sell as a good, selling as a service, right?

Bratin Saha: A lot of just in time things can become…

Patrick Moorhead: In fact, you might design it to be different if you’re selling the widget versus offering the widget as a service. You might make it last longer. You might make it last less time because you know exactly when that thing needs to be replaced or serviced as opposed to a person coming out every month, checking it out, and by the way, half the time it’s going to be wrong. I’m just imagining every servo motor, every thing that goes around and makes a noise inside of a distribution center or a manufacturing plant. I feel like that’s game changing.

We see a little bit of that transformation as a service in odd things like oxygen. There’s a company that I interviewed two or three years ago that used to supply oxygen bottles to a manufacturing site and now it’s as a service and putting sensors everywhere to make that happen and their profit margins have gone up, their customers are incredibly happy because all they want is oxygen. Some of these things aren’t necessarily sexy. I think they’re sexy because it’s moving forward, making more money and tech doing cool stuff, but I like Monitron because it’s so transformational in industries that haven’t transformed yet.

Bratin Saha: Yeah. It makes a lot of things easier. Like you said, I don’t just have to guess what is going on; I can really be predictive about things. The other thing we have is Amazon Textract, which we use for document processing. Now, a lot of customers have been telling us they would like help with mortgage processing, because a mortgage loan package can have 500 pages and can take 45 days to close. Almost half of that time, 20 days, is just extracting information from those pages. And so what Analyze Lending now does is you give it a form and it extracts all of the relevant information, and not just that: it can find pages that require review by a human underwriter. So if you have a mortgage form and it’s missing a signature, it can flag that page for an underwriter.
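The page-flagging step Saha describes can be sketched as follows. The field names below are illustrative stand-ins, not the actual Textract Analyze Lending response schema:

```python
# Mock per-page extraction results; a real response would come from the
# Textract Analyze Lending API and use a different schema.
pages = [
    {"page": 1, "fields": {"borrower_name": "J. Doe", "signature_detected": True}},
    {"page": 2, "fields": {"borrower_name": "J. Doe", "signature_detected": False}},
]

def pages_needing_review(pages):
    """Route any page with a missing signature to a human underwriter."""
    return [p["page"] for p in pages if not p["fields"]["signature_detected"]]

print(pages_needing_review(pages))  # [2]
```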

Daniel Newman: It’s literally unstructured data becoming structured data in real time.

Bratin Saha: Exactly. It’s a great analogy, actually, because people talk about no-ETL and all that, and this is basically taking unstructured data and converting it into information. It’s doing what you’re saying. And this whole notion of machine-learning-powered use cases, we have all talked about how it’s going to transform industries. Now that’s really happening. Pennymac used to spend hours every day processing documents. Now they can process 3,000-page PDFs in less than five minutes.

Patrick Moorhead: That’s amazing. Another great example is our healthcare system in the United States, where we ship almost all of those papers to a different country to be processed. That is crazy, it takes a long time, and it also creates the frustration of a customer thinking they paid when they didn’t, or a transcription error causing a mistake in some sort of outcome. So there’s this time delay, and it costs money. In a perfect world it would all be digital, but because every state, every county, every country, every region has different rules, it could be 50 to a hundred years before we move off of paper.

Bratin Saha: And they have different formats and the same templates don’t work. With AI, all of this goes away because you’re just semantically getting information out.

Patrick Moorhead: Well, and the smarter it is. So for instance, we’ve had robots in manufacturing, gosh, I worked in a factory 35 years ago and we had robots. But the big difference though is the cost and time to set up and the tolerance errors were nil. But with today’s machine learning and the ability that the machine acts more like a person and if that window is slightly off, it knows to-

Bratin Saha: Adjust.

Patrick Moorhead: … to adjust. That is brand new, and with 5G connectivity you can actually move the devices and the robots around. In an odd way, that’s very similar to Textract and what it’s doing, because it’s taking this old brownfield type of environment and modernizing it, not by saying you need to throw everything out and start over, but by living in the reality of where we are today and making it better. That is pretty awesome.

Bratin Saha: Yeah. Every form has handwritten stuff, and templates don’t work on that. That’s why you need AI. All of these have corner cases that don’t work, and that is where bringing in AI, making the corner cases work, makes the solution [inaudible 00:14:21], and that is what makes widespread deployment possible. The notion that it’s no longer a brittle thing that only works in one scenario, that it generalizes and actually works in the real world, I think is transformational.

Daniel Newman: There’s so many instances of where you’ve got all this amazing technology and all this compute and processing and this killer UX and then in the end, it ends up printing out a 500 page mortgage that ends up being at the end of this thing. So you can do the whole process on an app on your phone and it can actually read your credit. But in the end, like I said, there’s still some analog manual process that just destroys-

Patrick Moorhead: Flood…

Daniel Newman: … the whole experience. By the way, destroys their sustainability, because all that printing and paper is totally unnecessary. I mean you got industries like real estate mortgage, you got industries like automotive where they still print all the forms every time. I mean, they still have the dot matrix printers in automotive dealers because that’s-

Bratin Saha: Healthcare too. One healthcare company has used Textract to automate claims processing, and they have been able to automate almost 90% of it. Anthem has been using Textract for automation as well. So like you were saying, there’s the automation aspect, the improved-productivity aspect, the less-paper aspect, the “I can go invest in other things” aspect. So it’s really transformational.

Daniel Newman: So let’s talk about a smarter application, because you guys were talking about these kind of dull and boring applications, but I think someone alluded to the words supply chain at some point in this conversation. I really do wonder, with someone like yourself who thinks about this all day long: we’ve had a massive supply chain problem, and fixing it, I think, is partly structural and partly policy. But where does something like AI/ML help with what we’ve been through in the last two years? Where do you start to see, and I’m sure you guys are involved in this, solving problems to get stuff where it needs to go faster?

Bratin Saha: I think there’s a lot of machine learning and AI we can apply to things like congestion management. A lot of this has been congestion, things just being congested, and we can help things flow: better prediction of demand and supply, and of where inventory needs to go to relieve bottlenecks. So I think we can apply a lot of that to make everything flow a lot more smoothly.

Patrick Moorhead: Oh, you did make a major announcement here at the show called Amazon Supply Chain. So using AIML, so that’s-

Daniel Newman: I was keying him up.

Patrick Moorhead: Oh well, okay.

Daniel Newman: No, you did good. You did it, you did it, you did it.

Patrick Moorhead: Which is just smart. I mean ETL out of SAP or Oracle and pull it in there and start having fun. I did want to talk about Omics. Am I saying it correctly?

Bratin Saha: Yeah.

Patrick Moorhead: I mean that seemed to be a giant. By the way, we’re running out of brand names. That’s why these names start to get funkier and funkier. But tell us about Omics. What does it do? What problem is it solving?

Bratin Saha: So fundamentally, what we are trying to do with Omics is make it easier to analyze genetic information. There’s the genome, which is genomics, but there are also other forms of analysis you can do, like proteomics and others. Fundamentally, what a lot of our healthcare customers want to do, especially in the field of precision medicine, is look at genetic information, correlate it with your health information, and then use that for personalized treatments or other forms of therapy tailored to your needs.

Patrick Moorhead: That is absolutely the future.

Bratin Saha: Of medicine.

Patrick Moorhead: That is an absolute global game changer.

Bratin Saha: Yes. It promises to make things a lot more effective, and it’ll probably make them a lot more efficient too. So for what Amazon Omics does, you can think of genomics processing as really three things going on. One is you have a genomic sequencer that spits out all of the genome data. Then you take that genome and compare it with what we call reference genomes.

So let’s say I’m taking a person’s genome and comparing it with the reference, and I’m seeing where the two of them differ, because then I know, “Okay, there was a mutation here.” Those differences are called variants. Once I’ve found all the variants, all the mutations, then I try to correlate them. I say, “Okay, there is a mutation in this position that might have resulted in this condition.” So what Amazon Omics does is make it really easy for you to go from genomes down to these variants. And then we have Amazon HealthLake, which you can use for storing electronic health records and for storing X-rays and so on. So now you can go in and do an analysis and say, “Okay, here is the genetic information, here are the electronic health records. How do I correlate and analyze the two and get to personalized treatments?”
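The comparison step Saha walks through, lining a sample genome up against a reference and recording where they differ, can be illustrated with a toy scan. Real pipelines align sequencing reads first and handle insertions and deletions; this sketch only shows the substitution check on already-aligned strings:

```python
def find_variants(reference: str, sample: str):
    """Report (position, reference_base, sample_base) wherever the
    aligned sample differs from the reference."""
    return [
        (i, ref_base, alt_base)
        for i, (ref_base, alt_base) in enumerate(zip(reference, sample))
        if ref_base != alt_base
    ]

# Toy 8-base sequences; a real human genome has ~3 billion bases.
print(find_variants("ACGTACGT", "ACGTTCGT"))  # [(4, 'A', 'T')]
```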

Patrick Moorhead: Okay, dumb question, but I have to ask, I’m assuming it’s AI and ML, is it truly a big data problem?

Bratin Saha: The Omics one or-

Patrick Moorhead: Right, Omics?

Bratin Saha: Yeah. These genome files can be petabytes.

Patrick Moorhead: Okay, thank you. I didn’t know that, but thank you.

Bratin Saha: They can be petabytes. The human genome is about 3 billion bases, but typically we’ll oversample it, so you can have many billions of bases, and that’s petabytes of data. Then you’re analyzing that and trying to see where those variants are.
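As a rough sanity check on those sizes: the 3 billion bases come from Saha’s remarks, while the 30x oversampling factor and one byte per base below are illustrative assumptions, so the per-genome figure is only a ballpark. It shows why a large cohort of oversampled genomes lands in petabyte territory:

```python
GENOME_BASES = 3e9      # human genome size, per the conversation
COVERAGE = 30           # assumed oversampling factor (illustrative)
BYTES_PER_BASE = 1      # rough, uncompressed, ignoring quality scores

per_genome_bytes = GENOME_BASES * COVERAGE * BYTES_PER_BASE
genomes_per_petabyte = 1e15 / per_genome_bytes
print(f"~{per_genome_bytes / 1e9:.0f} GB per oversampled genome")
print(f"~{genomes_per_petabyte:,.0f} such genomes per petabyte")
```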

Daniel Newman: It’s super interesting. I’m kind of sitting here trying to visualize putting, I’m like building it, I’m trying to put it all into a container and moving around. I’m kidding. But the direction of Amazon, by the way, and you leading in the role you are is clearly going more towards healthcare. Obviously-

Bratin Saha: I don’t know if I would call it going towards. It’s not quite going towards healthcare. This is helping our healthcare customers, providing them the compute, the AIML, the storage capabilities so they can innovate faster, right?

Daniel Newman: But I mean, the company’s made a lot of investments. The portfolio of Amazon is diversifying; they’ve made some big buys in everything from primary care on down. So the company sees healthcare as a big part of its future. And obviously with AWS, let’s face it, the company has a history of building new businesses to solve business problems it already had. AWS was a bit born out of solving its own problems.

Bratin Saha: 90% of our roadmap is really based on what our customers tell us to do. Especially in the case of Omics, we have been working with a lot of our healthcare customers, and they asked us to build this infrastructure. Think of that petabyte-scale data: you need a storage infrastructure to handle it efficiently, and you need a query infrastructure to query those petabyte-size files. And so what they’re saying is, “There’s a lot of infrastructure work that we have to handle. Why don’t you guys do all that undifferentiated infrastructure work so we can focus truly on the healthcare work?” So we aren’t really doing any of the healthcare discoveries here. We aren’t doing any of the healthcare-domain-specific work. We are simply building the infrastructure so that our customers can get away from building infrastructure and focus on the healthcare solutions.

Patrick Moorhead: The last question that I think we have time for here is, by the way, I’m really enjoying this discussion.

Bratin Saha: Thanks.

Patrick Moorhead: This is great. What are the different ways that your customers can build applications for AI and ML?

Bratin Saha: So we see our customers approaching machine learning in one of three ways. There are some customers who say, “You know what? I’m going to build my own machine learning infrastructure, my own machine learning models, and my own apps.” For them, we give them optimized hardware and optimized software, and they build the whole thing. Then there are most customers, who say, “I don’t want to get into the infrastructure. I just want to build the machine learning models.” For them, we have SageMaker, where we build the infrastructure and they build the machine learning models. Then we have some customers who say, “You know what? I don’t even want to be building machine learning models. Give me APIs that I can use in my applications for document processing, personalization, and all that.” And so for those customers who don’t want to build models, who don’t want to get into machine learning, we have the higher-level AI services with APIs. So we cater to all of these customers, and we meet them where they are.

Patrick Moorhead: I mean, as analysts, we have to put everything in a box and then put a name on it, and then many times we don’t even use the same names. But at least we have narrowed in on IaaS, PaaS, and SaaS. I like your three-layer cake. In fact, I wrote about it years ago, and I like the consistency that you’re bringing to the table. I think Daniel and I both believe that simplicity is really going to be the thing that makes the change. So more PaaS and more SaaS to simplify it. SageMaker is dramatically improving every element of the workflow, from data prep, prepping all the data, garbage in and garbage out, all the way to running the models on the smallest little machine on the edge. So-

Bratin Saha: AstraZeneca, they migrated to SageMaker and they were able to reduce the lead time to do machine learning from three months to one day. Three months to one day.

Patrick Moorhead: That’s just mind boggling.

Bratin Saha: Mind boggling.

Patrick Moorhead: I mean, there’s very few things in life that have an order of magnitude shift like that. So-

Bratin Saha: But that’s to your point, which is there are various levels of simplification and us simplifying the infrastructure part with SageMaker or the infrastructure thing with Trainium and things or even the AI layers-

Patrick Moorhead: A little editorial, and sorry to hog this here, but I think what’s really missing is the enterprise apps. Nothing’s off the shelf, but pick your favorite enterprise SaaS app: it really needs to get on board quicker to enable this. Because some enterprises aren’t going to go build this themselves, right? They’ve SaaS-ified or were born in the cloud. Small businesses are almost all SaaS, or headed to SaaS. I think that is one of the bigger breakthroughs that we as an industry need to roll forward. I’m going to look forward to what you can do there.

Bratin Saha: Sure. Happy to. Oh, go ahead.

Daniel Newman: No, I was just going to say, I really appreciated the opportunity. I was thinking, as you were saying that: in the end, a lot of people just want to use natural language to ask a question, and then a query finds the data, creates the model, and starts outputting and visualizing data. That is kind of the future. There’s going to be a small subset of PhD types that have been doing this for decades, but a lot of it’s going to be me asking, “Where should I be sourcing these materials from?” and the software is going to go out and write itself.

Bratin Saha: And in fact, if you look at a lot of the latest generative stuff-

Patrick Moorhead: My favorite kind of software.

Bratin Saha: … then prompts, and you’re just asking questions and the thing is generating it. Now with CodeWhisperer, it can generate code like a developer can. So all the things that you guys are talking about are coming to fruition.

Patrick Moorhead: By the way, I don’t love all your names, but I love CodeWhisperer.

Bratin Saha: Thank you.

Patrick Moorhead: Can I just tell you that? Anyways, more editorial. Who’s asking the questions here?

Daniel Newman: I don’t know.

Patrick Moorhead: Just a guess.

Daniel Newman: I’m going to interview you. But Bratin, thank you so much for joining the Six Five here-

Bratin Saha: Thank you. It’s always a pleasure, always a pleasure.

Daniel Newman: … on the road at AWS re:Invent. It’s great to have you. You are an alumnus. You will be coming back more often.

Bratin Saha: Happy to.

Daniel Newman: We’re going to make sure to have you on the show.

Bratin Saha: Happy to.

Daniel Newman: Everyone out there, there’s a lot to learn here. You may have to watch this one twice, maybe three times. But we really do appreciate you tuning in. If you like what you heard, hit that subscribe button. Join us for all of our sessions here for Six Five on the Road at AWS re:Invent. We have a lot of great guests, a lot more great content. But for this one, it’s time to say goodbye, Pat.

Patrick Moorhead: We’re out of here.

Daniel Newman: We’ll see you later.