Season 01 • Spotlight on AI
Episode 03

Ada’s Mike Murchison on how AI is revolutionizing customer service

A conversation with the Co-Founder & CEO of Ada

Ada’s founding story is one the Accel team has always loved. Before its launch, founders Mike Murchison and David Hariri convinced seven customer service teams to let them work as support agents. After a year of exhaustive intake, they knew that for Ada’s customer service platform to add real value, AI was imperative. This was 2016 and at that time, an AI-first approach to customer service was unique—but the results were astounding. We discuss their journey on this episode of Spotlight On.

“We realized if we can put AI in the hands of customer service teams, allow them to help more customers and help them resolve more – we can transform the future of customer service. And that's what we've been focused on ever since.” - Mike Murchison

Let’s back up a bit. In 2014, Mike and David had just launched a different startup. The company grew quickly, but it had a major problem: keeping up with customer service demand. Their team went from wanting to talk to their customers a lot to avoiding customer contact. That bothered them. But these problems weren’t new. Customer service is notoriously cumbersome and hard to scale. The problems also weren’t going away, so Mike and David saw an opportunity to build a solution. Two years later, Ada was born.

Today, enterprises like Meta, Verizon, Yeti, and Square use Ada to automatically resolve customer inquiries in any language or channel. In this "Spotlight On: AI" episode, Mike discusses the early reactions to Ada’s customer service AI approach, challenges they’ve faced due to recent demand, advice for other founders on how to use technology to deliver real value, and the impact AI will have on the customer service workforce. Conversation highlights:

  • 00:00 - Intro
  • 04:00 - The core learnings from Mike’s experience working as a customer service agent, and how they approached AI in the early days  
  • 07:32 - Why Ada made an early decision to bring AI capabilities in-house, and how they built a strong technical team
  • 14:09 - How the rapid development of ChatGPT and LLMs impacted Ada
  • 25:00 - Advice on how to build an AI product that outlasts the hype

Host: Ben Fletcher, Partner at Accel

Featuring: Mike Murchison, CEO and Co-Founder of Ada

Learn more at Accel.com/SpotlightOn/Ada

Ben (00:17):

Today, I'm your host, Ben Fletcher, Partner at Accel, and I'm really excited because I'm joined here today with Mike Murchison, the CEO of Ada. Thanks for coming in, Mike.

Mike (00:26):

Great to be here, Ben.

Ben (00:27):

How are you doing? I'm doing really good. I'm excited to talk about Ada. I'm excited to talk about what it means to be an AI native company. I think we’ve got a bunch to cover today.

Ada’s origins, and understanding the core problems of customer service

Ben (00:35):

Awesome. So Mike, we're going to talk a lot today about AI and about LLMs and how you all have been building for the future and taken advantage of a lot of the new technological advancements. But maybe take us back to the founding story of Ada and how things got started because this was 2016, 7 years ago and you all started as an AI-first customer support company. But walk us through the founding story and how things got started.

Mike (00:59):

Ada started in 2016, but two years before that we were working on a completely different product, a B2C social search engine called Volley. And the long story short is that Volley was growing very quickly and we encountered a customer service problem. We couldn't keep up with our customer service demand and scale our operations accordingly. We started to witness our relationship with our customers change as a function of our growth. And specifically, we saw our team and ourselves go from wanting to talk to our customers a lot to really being focused on avoiding customer contact. And I think, simply put, that really bothered us.

(01:45): We are very product-oriented people. We care so much about the feedback our customers give us and it felt very antithetical to that core belief to be focused on rejecting our customers' input, which is what our customer service operations were focused on.

Ben (02:05):

And I remember the story: you called all these folks. You were thinking about how do you scale this? And then you asked all these folks, can we just sit in your call centers and can we learn? Can we figure out what your customers are asking, and let's learn before we launch our next product, before we build our product.

Mike (02:21):

That's right. We actually joined seven different customer service teams from a dingy office on the east end of Toronto, and David and I performed customer service for what ended up actually being almost the first year of this company. We learned, first of all, that 30% or more of the inquiries we were responding to were repetitive and mundane questions. In some cases it was actually upwards of 80%, depending on the vertical the company worked in.

Ben (02:49):

It’s like, where's my package? Or password reset? You're just getting that over and over again. 

Mike (02:54):

That's right. Two, we learned that the agent experience of responding to customers inside incumbent software, what I now call human-first software, is highly negative. No one is waking up out of bed in the morning going, I can't wait to spend more time in my agent desktop. It's not a compelling product experience. And specifically, the software companies that make those agent desktops, they're actually not incentivized to make the agent more productive because they sell agent seat licenses. And so automation and AI have always been this bolt-on consideration, not a core, deeply considered value proposition that was being offered to the end user. And then third, our colleagues, they all wanted to talk to their customers in more modern channels. They wanted to talk to them over social channels, they wanted to text them. And the idea of turning on a social channel was always sort of rejected because, again, the strategy was to talk to your customers less and not more. So it was on the back of those three core learnings that we set out to figure out how we could solve this. And, simply put, we just worked really hard. We became the number one or number two agents on each one of these teams, just manually. We woke up all hours of the night. We were the first people to answer the simple inquiries.

Ben (04:30):

Was this crafting a better agent platform that you wanted to build? Was this crafting how you were going to respond, how you're going to build the company? Was it always the goal of we're now going to go launch a better way to do this?

Technology as a value-delivery vehicle, and Ada’s AI approach

Mike (04:40):

The goal was always how do we improve the customer service experience? The technology just became a vehicle for delivering the value. It wasn't the value itself. I think that's something that's been so true since our founding, this early emphasis: the technology is the vehicle. It's not inherently valuable. What's valuable is that we help an end customer, a user, we help them get the help they're looking for. We help them resolve their inquiry. And so the fastest way for us to do that initially was literally manual, we did it ourselves, but over time, when we became, again, some of the most productive agents on these teams, we started to become really familiar with the problems we were facing as agents that, if we solved them, would allow us to resolve more inquiries for customers. And that's when the first real product of Ada was born. We let it run inside these accounts. We took an AI approach, we had access to so much data. We focused on making our AI as easy to use as possible because all our colleagues were non-technical. And when we let it run, we knew we were onto something because we didn't get fired. Our managers in these companies, they didn't care. What they cared about was that value was being delivered and more customers were being helped.

(06:02): And that was almost the perfect AB test that really led us to realize, okay, if we can put AI in the hands of customer service teams, allow them to help more customers, help them resolve more, I think we can transform the future of customer service. And that's, as you know, what we've been focused on ever since.

Ben (06:18):

Yeah, I remember early on when you were walking us through the product and it was very deep on the integrations that you had built not just for the agent handoff. So you had all the data, you had the conversation history, you had the information about the customer, but it was also the deep hooks you had into the end systems. I can change your billing plan, I can change your ERP, I can update your CRM. There were a lot of bespoke integrations that you all had built into the product, so it was pretty much plug-and-play for your end customers when they wanted to actually take any of these actions. And I think even today, the big unlock for ChatGPT was when they launched their ecosystem and they launched plugins, and in the early days of Ada, that's what you all were building for customer support: how do you actually take action and how do you actually do these things if you are going to have an automated agent that's helping your end customers?

Mike (07:10):

That's exactly right. It turns out that when a customer is asking for help, they're really looking for you to do something for them and the quality of customer experience tends to improve when you don't just tell a customer what they should do, you actually do it for them. Totally, totally. Instead of instructing someone how to reset their password, it's a lot better experience just to reset it for them.

Mike’s approach to building your own models, and bringing on technical talent

Ben (07:32):

And walk me through, I mean, this is 2016 to 2019, the technical capabilities that you all are building out of the University of Toronto, you're going and grabbing folks that are coming from the Valley, from New York. How did you think about getting that technical muscle so that you could build all these capabilities?

Mike (07:50):

Ada initially was really an NLU engine that made it exceptionally easy for you to understand the intent of a customer inquiry and then to run a workflow, a conversational workflow, that is matched or paired with the intent that's classified. And our approach to this has always been, again, what's the fastest way for us to increase resolution. Initially we used off-the-shelf NLU engines that were available. I think we used Facebook technology at one point, a company called wit.ai. I'm going back now to the early days. We used Google's technology, Dialogflow, and then over time we built our own NLU models that outperformed what was available off the shelf. And we got very, very good at that. We were using large language models as soon as they were really available, initially to augment the bot builders inside Ada who were creating conversational flows that could help resolve inquiries for their customers. And what happened over time, as those models became more capable, is we made the leap to put the large language model at the core of Ada and replace the NLU model.

Ben (09:18):

These are your own models? You're weaning off of some of these other open source or some of these larger models and you all are thinking through, okay, how do we make sure we bring that in-house? And as the capabilities are coming out for large language models, you're making sure you're incorporating them into the brain or into the core systems and infrastructure for how Ada is built.

Mike (09:40):

That's right, and I think for us, that transition wasn't very challenging for us to make because, again, we've been an AI native company from day one. Initially, the AI model or models at the core of Ada were NLU models. Now they are large language models. It's never been a human who's been at the core of our software. So I think that transition is something that is very natural for us to do. It's still not without its challenges, which we can get into. Our technological path with large language models will likely mirror the path we took with NLU, where we were using some third-party models off the shelf and at the same time running an internal process of building and training our own that we benchmark against them.

Ben (10:27):

Yeah, I remember it was always paramount for you all to own these capabilities, and it was three, four years ago you all hired some very senior, very expensive folks that were focused on doing AI research, and you built a team in Seattle, in Israel, and these are folks that are doing cutting edge AI research. How did you all think about that in terms of setting the groundwork? Did you see this coming? Did you know that there was going to be the explosion of what we now call generative AI? How did you all navigate that, even in the early days, 2019, 2020?

Mike (11:06):

So I think we were very fortunate in that our vision has always been to resolve every customer service conversation. And that meant that we knew we had to automate conversations not only in messaging channels but also over the phone, in voice channels, too. So we were very fortunate to be able to bring in a lot of exceptional talent who are particularly skilled at voice AI. And there are unique challenges to voice AI that are quite distinct from messaging AI. And so there were some core folks we brought in who really helped us see around the corner with what was happening. That sort of foresight, it turns out, coincided with a major shift that's happening in our industry, which is that contact center software and CX software are really converging. It turns out that businesses, they want their agents in one place.

(12:09): And in order to provide a great customer experience, I think we know this as consumers, it's really annoying to chat with a business and then for the person you speak to on the phone to have no idea what you were talking about in that previous interaction. So an omnichannel, unified experience is what we're really after, and we think that our AI stands to be able to deliver that. So I think some of those folks that we brought in, and we were fortunate to have the foresight to bring in, really helped us see around the corner and kind of prepare for that unified future. In terms of the ChatGPT moment in particular, which I think is also what you're asking about, we did not see that coming. The rate at which large language models improved was not something that anyone anticipated, I don't think. We always envisioned a future where, with Ada, you would never have to build an AI agent. You would only ever just manage one.

(13:03): We would take care of the generation for you. The rate at which we were actually able to deliver that really surprised everyone inside Ada, myself included. That's one of the reasons why we couldn't be more excited about what we're building now. Our product is at a place today where I thought it would've been years from now. I had this conversation a couple of years ago; I thought it would be like 2028 or 2029 that we'd be here.

Maintaining differentiated value after the AI explosion

Ben (13:32):

Well, fast forward, you all have built a sizable company, you've scaled the team, you've scaled revenues, you serve some incredible customers today, Facebook, and then this hits overnight, or not overnight, but pretty quickly, where every company can now go and build their own FAQ bot. You know, you all started in the early days by automating FAQs. Now you can go and grab any type of large language model, train it on your own dataset and go launch your own. How did you all deal with that? How are you navigating it, and then what does that mean for the future of what you all want to build?

Mike (14:09):

It's never been easier to automate conversations with your customer base, basic conversations. I think for us, again, the way we think about value is we are delivering the AI platform that helps businesses resolve the most customer service inquiries with the least effort, and the advent of ChatGPT and the rate at which large language models are now capable of generating accurate language, I think that's really just opened the world's eyes to the shift occurring in our industry. And that shift is one in which businesses are transitioning from a world that is governed, at least digitally, by humans at the core of a software product to one in which AI is at the core of the software product more broadly. I think we're starting to experience the equivalent of AI transformation, where businesses need to learn what it means to run their company when AI models are at their core and not human beings. And so I'd say that we've seen a real acceleration in interest in how to become – in the customer experience world – how do you become an AI-first company? How do you make your customer service operations AI first? And there are a few things that we focus on that our customers really appreciate in that regard that are pretty distinct from trying to do this on your own.

(15:56): The first is that we are very good at understanding what resolution even means. So we use large language models to automatically detect whether or not a conversation between you and your customer is truly resolved. That's a really big deal. We can now do this with greater-than-human accuracy. We can automatically review a transcript and understand: did this customer actually get the help they were looking for? That's the first thing, and we enable all our customers to experience that level of measurement. And then secondly, we focus on enabling our customers to hire an AI agent. They could hire their own agent relatively easily if they wanted to, but it turns out the big problem to solve is how do you make this agent really smart over time? Much like when you hire a human employee, they're not that productive on day one. You've got to onboard them, you've got to equip them with your strategy and clear goals, you've got to measure their performance over time, and then you've got to give them ongoing feedback to truly unlock the most value out of your employee. It just turns out that the exact same thing is true in the AI native paradigm.

(17:18): Your AI isn't that smart on day one. It needs to be coached and given feedback regularly to truly have maximal impact. So that's what we've really focused on as a company.

How Ada’s business model evolves, and monetizing the value of AI labor

Ben (17:31):

One of the things for us is that it was changing so rapidly: our customers wanted what the industry was expecting, and then also the capabilities for what was happening in our industries. I know for us, we had many conversations around what we wanted to do, the products that we wanted to build, but one of the things was really around the economic model and around pricing. Maybe talk a little bit about how the business model started to change, and anything else that kept you up at night or was a big challenge for you.

Mike (18:00):

Definitely pricing has been something you and I have spent a lot of time together on, and that's been a huge focus for our company. The big change for us has been really zeroing in on the core unit of value that Ada provides its customers and monetizing that. We've always been focused on resolution. We always want to resolve the most, but for most of our company's history, until relatively recently, we've charged per conversation. And what happened during the pandemic as a result of rising customer service volumes is that our business did exceptionally well. But what also happened during the economic recession is that some businesses overestimated the amount of volume that they actually needed, and that was hard. That was really challenging for us to support our customers through. We showed up for them.

Ben (19:00):

And that's in terms of I would say retrades on pricing or them wanting to recut contracts?

Mike (19:08):

That's right. I mean, our business is rooted in, our business model is connected to, the volume of customer service conversations that we power for you. When our customers' businesses are suffering, they have less customer service. We see that in our contract values.

(19:24): Now, while that was really challenging, we took it as an opportunity to understand, hey, how could we solve this for our customers and better align ourselves with our customers? What we came up with was a focus on not just any conversation that we power; we decided to focus on monetizing only the conversations that are really valuable. We call that an automated resolution, and we use language models to actually measure, more accurately than a human now, whether or not a conversation was resolved. And so what I'd say is, I think for folks who are thinking about their own pricing and packaging, and for AI native companies who are thinking about, hey, how do I think about monetizing the AI labor that I'm providing? It's valuable to consider: what is the core unit of value that your software provides?

Ben (20:23):

So you all have had to shift, and I think it's been a segment of the customer base, but you've had to go from pricing per conversation to now pricing on whether you actually resolved the inquiry from an end customer.

Mike (20:37):

That's right. And in some cases for us, that means that the total volume, the initial ACV that we monetize could be smaller, but over the long run, we know it's going to be far bigger and we know that it aligns the incentives between us and our customers and their customers in a way that wasn't possible before. So it was a tough call to make, but it's absolutely the right call for our customers and for our company.

Maintaining organizational focus in the age of AI, and the labor implications of automation

Ben (21:06):

Yeah. What are the things still that keep you up at night or make you nervous about where the industry's going and Ada's business model in whatever the new age of AI is going to look like?

Mike (21:17):

Focus keeps me up a lot at night. I think I am always sort of thinking, do we have the right level of focus as a company? We are focused on resolving the most customer service conversations, but at times in our company's history, I've defined customer service pretty broadly. And so we've built product that goes beyond customer service to internal customer service. For example, we help with, call it, employee experience, not just customer experience. And so I think one thing that keeps me up at night is do we have the right focus? We certainly have ambitions of powering the entire customer journey. Right now we are focused exclusively on customer service, and I'm making sure that our definitions align with what the right focus is. And I think you only achieve the right level of focus by just beating the drum and making sure that you're uncomfortable with it. My heuristic is, I should always feel, and our company should always feel, uncomfortably focused.

Ben (22:33):

One side is, hey, AI is going to help bring us out of a recession because of the disruption that's happening and there's new economic models, there's going to be new jobs, there's going to be a lot more that's going to be coming out of it. And then the flip side of that is, well, there's a ton of automation that's happening, so you're going to eliminate jobs, you're going to have to reallocate some of the resources to other things. I'm curious for you, which way do you see it? And it's probably mixed or in the middle or some of both, but where do you land on the future and how does AI play a role?

Mike (23:07):

I think it's a massive productivity unlock. I just think we're going to look back on this period, at least in the customer service industry, and we're going to be shocked that we used to perform customer service the way that we used to. I think that one of the uncomfortable realities of that shift, however, is that there is a tremendous amount of labor displacement that will happen as a result of it. And while it's so exciting to see our customers create new roles that didn't exist before, AI management roles inside our software, and provide a new career path for customer service agents, it's also true that a lot of our customers elect to realize the savings they get from our software in the form of smaller customer service teams. And I think that that's the history of automation, and that is something that, for me, I find really uncomfortable.

Ben (24:00): 

I'd say, as we've learned more and as we've seen the value, one of the things a lot of folks have asked is whether this is a hype cycle. And for me, one of the things that always goes against that argument is just the sheer value and efficiency gains that we are seeing. I think about the ability to handle more of the inquiries that are coming in, to increase CSAT while you're able to handle more customer conversations. I think about the different applications that are happening and where you're able to get productivity gains from your engineering teams, from your customer support teams. And there's a lot of value that's being created. And so if you take away any of the hype, or maybe any of the buzzy apps that will come and go, the underlying value is that you can get more done with less, and you're able to build that. And I think you all are a great example of powering that and pioneering and bringing that into, specifically, customer support.

Separating the wheat from the chaff in a new generation of AI companies

Mike (25:01):

How do you think about cutting through the hype? I'm sure every company that is pitching you right now has positioned themselves as a generative AI company. How do you cut through to what is actually going to provide more value to customers versus something that's just a marketing line?

Ben (25:22):

Yeah. Well, one of the things that I feel very fortunate about is that I've been at Accel now for a good while, and one of the things that Accel always values is underlying good unit economics in the business. And so I think about when we've been able to partner with Webflow and Qualtrics and Atlassian, those were great businesses before they ever took capital, and we focused on great underlying unit economics. And so when we're looking at this next wave of artificial intelligence, the way that we've looked at it has been companies that have been able to really provide value, and they've been able to provide that economic value and the unit economics make sense. Now, it may be fuzzy in the beginning in the way that they're having to invest a lot in these models. They're having to gather a lot of data and forward invest. But when you fast forward and you can see that there's true value, and that over time that economic value is going to be realized, and they're good underlying businesses, that gets us really excited. And so we just announced and partnered with a company called Synthesia.

Ben (26:28):

And they're building 3D avatars. And when folks are looking at that, it's, hey, they've come into this AI wave and a lot of folks have started to look at the business, but the underlying business is this: they've gone into enterprises and sold the ability to not need to hire a production crew, or to not need your antiquated consultants that are going to come in and train your workforce or do your onboarding or create a new learning manual. They've automated all of this with software and built a very durable enterprise business. Those are the things that get me really excited, and you can cut through and see where the real value is.

Mike (27:06):

I love how you put that. It strikes me that there's almost like a Juilliard music audition analogy to this. The way that music auditions, from my understanding, work is they're always done blind. The evaluators are just listening to the musicians. They can't see the musician. And what you're doing is effectively the equivalent. You're looking at the P&L of the business. You're understanding – irrespective of the marketing and the flash and what the business looks like – and you're seeing, does it have sound unit economics? Is the value actually being realized? It just so happens that AI is the way that may be delivered, and it may be leading indicators and there may be other things, but you can actually see that there's true value that's being created, and that's kind of the way that we've approached it.

Mike (27:56):

I think that's likely. My suspicion would be that what's going to be common amongst the top performing companies in this next generation of AI native companies is a relentless focus on the value that they create, not the technology that delivers it. And the reason I think that's so true is because if you're really anchored on the value itself, you are open to other means of actually delivering it. And I think that's where business durability is built. It's the businesses that are truly anchored on the value they provide that are open to coming up with new ways of delivering it in perpetuity.

AI coaches, data ownership and the long-term vision for Ada

Ben (28:43):

So for Ada, as you all are differentiating around the customer support stack, is it still going to be around the integrations and taking action, or is it going to be around how you train the agent and how you do your automation or your AI so that it can respond? I mean, that's one of the biggest things: how do you create these feedback loops around the data in production and then gather it back, and what are the guardrails? What are the weights? What are the biases? How do you want to build the model? How do you see Ada, in the new AI-first world, continuing to be the authority there and bringing that forward?

Mike (29:20):

So I think definitely in the midterm, there's still a lot of value we provide by enabling your agent to take action. Teaching your AI to actually do something is pretty hard to do on your own, and Ada makes that way easier. In the distant future, however, I think the vast majority of the value we provide is really as the coaching layer for your AI. I almost think of Ada, in the long term, almost like an HR application for your AI employees. It turns out you need this interface to be able to understand how well is this AI agent performing and how do I help it improve faster? And I think it's going to be the company that drives the most rapid acceleration of the learning of the AI that's going to accumulate most of the value and will, by virtue of that, power the most compelling customer experiences, let's say.

Ben (30:19):

Yeah, I think one of the most interesting things about this is there's just so much data that is going to be created, and those companies are going to become reliant upon that data. So whoever owns it, is training on it, is able to plug that back in, that's where there's going to be so much value that's created. And so when I think about that, if I am a company that's going to help run your support, where is that data going to sit? Is it going to sit with Ada? Are you going to be able to help these companies as they're training? Or is there going to be a third-party provider that helps them to clean up the data, to own the data, to bring it back in for how they're going to train their AI models? I think so much of the economic value is going to be reliant and dependent on who owns that data and who helps you to make those decisions. And you have to figure out how you do the data exchange, how you can handle privacy, how you can store it. The data you have is so valuable, but it may not be that the customer support company is the one that knows how to own and be able to train and use that data.

How AI agents will change procurement and customer lock-in

Mike (31:28):

Yeah, I agree with that. I think that the way we're seeing this play out is a customer hires an AI agent with Ada. That AI agent is the equivalent of an intern to start, but the business uses our software to make that intern a top-performing employee over time. And what's looking like it's starting to happen is that the relationship with that AI employee becomes one in which firing that employee sort of becomes the equivalent of letting go of like 10,000 people. That's a really, really difficult thing to walk away from, because you've turned this one employee into an army of 10,000 and you've sort of earned this productivity unlock, and switching vendors or moving to a different application means letting that go. And so I suspect that that'll be the case across most software categories.

Ben (32:33):

Well, it's nice creating a lot more lock-in then.

Mike (32:38):

I think it will. But I think it'll also change procurement processes. I think CIOs will have questions about this and what happens with the data, the instructions and coaching that they've given to improve AI over time.

Ben (32:51):

Yeah, I think there's going to be a big unlock there, and I think we're still in the early innings of figuring that out, and how that data transfer works and how that lock-in is going to be created.

Mike (33:00):

And how the businesses even think about value. I think the psychology of buying AI native software is very different. We've spoken so much about the psychology of building an AI native company. There needs to be deeper consideration about the psychology of purchasing AI native software. It's actually pretty different. It's fundamentally different to buy a piece of software and to expect its value to be way greater in the future than it is today. And that requires a different type of modeling, and it requires a different type of education on behalf of sellers to the key stakeholders that they're engaging. And I think, I agree with you, we're in the early innings of understanding how to do this.

Ben (33:45):

Thank you for doing this. It was awesome. Really appreciate you coming in. 

Mike (33:49):

Thanks for having me, Fletcher. It was fun.

Meet your host

Ben Fletcher
Partner
Focus: Cloud/SaaS, Enterprise IT, AI

Ben Fletcher is a Partner at Accel, a leading venture capital firm. He focuses on investments in enterprise IT, consumer and SaaS companies.
Read more on Accel.com
Season 1 of the Spotlight On podcast, by Accel
Artificial Intelligence
Cloud/SaaS
Growth Stage