The Six Five team discusses the recent Zoom AI Terms of Service dust-up.
If you are interested in watching the full episode you can check it out here.
Disclaimer: The Six Five Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we ask that you do not treat us as such.
Daniel Newman: When you’re not understood and your strategy is sometimes not understood, the market tends to want to tell stories for you. And Zoom had a week, this week. Zoom, let’s talk topic two. Basically got absolutely eviscerated in the press media and by users online for something that I would like to say… I’ll just start with the end. Stephen Covey is one of my favorite authors. One of The 7 Habits of Highly Effective People is to begin with the end in mind. Well, the end is that this big story ends up being a nothing burger, but let’s just talk a little bit about what happened.
Patrick Moorhead: I almost put that in the headline.
Daniel Newman: Nothing burger?
Patrick Moorhead: Yeah, it was good.
Daniel Newman: I put that in my-
Patrick Moorhead: Oh, I know. I’m sorry. I changed it. I probably should have left that in there. It was good.
Daniel Newman: Yeah, sure. Then it would have been able to give me credit for awesomeness.
Patrick Moorhead: I’m giving you credit right now, buddy. If I were there with you, I would pat you on the back.
Daniel Newman: Can I get a high five? Everybody? And that’s just listening. We just high-fived. It’s important sometimes that you walk people through what’s happening when they’re not able to watch the video. The video’s so good, you should watch it. We’re very fun to watch. Sometimes we’re typing or tweeting, often not paying any attention to one another. It’s really funny. I’m kidding. I’m kidding. I digress.
All right, Zoom. So here’s the thing, it’s weird when the outrage of people starts to consume over things that to me seem pretty obvious. Now, basically Zoom came out with new terms of service. And in their new terms of service, there was some opaqueness in the language of how it would be able to utilize data from meetings to train AI. Now, I’m going to come back to that in a second. Before I even come back to that, I just want to explain, none of you ever read your terms of service. So I’m very interested in who this person was that actually read this in the first place. Kudos to you for taking the time. Secondly-
Patrick Moorhead: It was Hacker News by the way.
Daniel Newman: Second, now, please go read the terms of service of the rest of the apps that you’re using to find out that the data that you are creating and using for your email, for your web browsing, for your productivity tools, probably your CRM, is being used at least in some anonymized capacity to help train and improve the products that you’re using. It’s pretty universal in SaaS. This is why, at least originally, I was like, “Whoa, why are people so pissed off?” But I think what happened here is there was some big ambiguity in the language that made people believe that Zoom was training its generative tools with the actual meeting content. Meaning it wasn’t clear whether there was any anonymization. It wasn’t clear that this was specifically related to the generated outputs that were created when people use generative AI tools from Zoom. It just read like, “Oh, if you’re using Zoom, then it’s recording your meetings and it’s training on that specific data.”
No real clarity on whether your data was safe, whether you’re protected, whether your meeting data could be used or get into the hands of competitors or somehow be used to train a large language model. I parallel this a little bit to some of the distress and outrage that OpenAI faced in its early days, because there’s a significant amount of clarity lacking as it pertains to when you put data into an LLM or you put data into a generative tool, how that data then gets captured. The overall consensus is don’t put anything in that you don’t want to be out there in the wild. And I think that’s created some paranoia, including for me, and I’m sure for you, Pat, about what data gets put into a Bard or an OpenAI or a Zoom for that matter.
But in the end, like I said, what mostly happened here is it’s a detail: all this is about is Zoom using service-generated data. Meaning when people are using ML and AI to create generative outputs, it can be telemetry, product usage, diagnostics or generated text, not the actual meeting content but generated summaries and stuff like that. It could then take a summary of that data to help it train and improve the model. So Pat, I’ve got to be honest with you, I could spend a little bit more time beating around this bush. The more I looked at it, the more I realized it was a lot of faux outrage. It was a lot of, oh my God, this is the worst thing that’s ever happened. And it felt to me very targeted, very unrealistic. And anybody that’s using any app, whether it’s a social media app, a SaaS app or any other data app, is probably allowing their data to be used for similar functions, features and utilities.
So suck it up people when you’re using these apps, this is what’s going to happen. If you want the apps to get better, they’re going to use your data. If you have a big problem with it, put the app on-prem and use some terrible on-prem based collaboration software where you have 100% control. There we go.
Patrick Moorhead: That was good analysis, Dan. And I want to take us back to 2020 when we were all remote working, doing our thing, and there was a big dust-up that Zoom had on security. Zoom said it had certain features it didn’t actually have, some traffic was being routed through China, and it was inaccurate on the type of encryption it said it was using. It was a different type of encryption, and people were really leaning in, including myself, on how many bits were there. And then that turned into this giant issue. Now three-plus years later, right? I mean in 2022, Zoom for Government got Department of Defense Impact Level 4 certification, right? And that doesn’t even come up. Security rarely, if ever, comes up. And then you attach the history of that to this new thing, where quite frankly I do believe that there are writers out there who are looking for a scoop, even if they know that there’s no malintent.
Now, you and I have the distinct privilege of being able to meet directly with the most senior management of Zoom. I wasn’t able to make their analyst event, you were. I was there last year, spent a lot of time with Eric Yuan. And for this, I spent some time researching and meeting with Zoom’s chief operating officer and chief product officer to really get the inside scoop on what was going on. And it’s interesting, when I first saw what Google and Microsoft came out with, there was really no talk about how your corporate data would be used to make stuff better. And I’m like, “Using corporate data is going to make this better. This is what people actually want to do.” And I feel like Microsoft and Google held back, and Zoom trudged forward into this.
The only reason, again, I brought up the security thing is they handled this so much better this time, right? They hit it very quickly. We’re not going to have Eric Yuan every week, or whatever he did, talking about security updates, because this is a giant nothing burger, okay? And in the end, Eric came out, and I put a LinkedIn post in the notes that really shows, coming from Eric Yuan, what the deal is. So I think they handled the issue really well. There is now follow-up work, though. I just think that the company needs to educate, try not to over-rotate. And quite frankly, if Zoom could overcome what happened in 2020, 100% they’re going to overcome this. My fear is that every Zoom AI story from the press for the next three years will cite back to, “And back in 2023, right? The terms of service.” But quite frankly, that’s just the world we live in. If technology weren’t so important, shaping the world and shaping economies, nobody would care.
Daniel Newman: Yeah, I love that. And thanks for the history lesson; how quickly we forget. It didn’t even-
Patrick Moorhead: We don’t talk about Zoom and security anymore. Exactly.
Daniel Newman: Well, the Zoom bombings, that was a headline story for a long time. There was some whitelisting that you mentioned that people have quickly forgotten. And to their credit, they made the fixes and they got the right security authorizations. I spent time with Eric last week. The company’s investing, and he’s got a great track record he’s built. If you don’t remember, he was the brains behind WebEx as well. This is not his first rodeo, and I think the company’s done a good job of continuing to make it easy. So this is a hill to climb, but I don’t think it’s one they can’t surmount.