Today, I’m talking with Prashanth Chandrasekar, who is the CEO of Stack Overflow. I last had Prashanth on the show in 2022, one month before ChatGPT launched. While the generative AI boom has had lots of impact on all sorts of companies, it immediately upended everything about Stack Overflow in an existential way.
Stack Overflow, if you’re not familiar with it, is the question-and-answer forum for developers writing code. Before the AI explosion, it was a thriving community where developers asked for and received help with complex programming problems. But if there’s one thing AI is good at, it’s helping developers write code, and not just write code but generate entire working apps. On top of that, Stack Overflow’s forums themselves became flooded with AI-generated answers, bringing down the quality of the community as a whole.
You’ll hear Prashanth explain that it was clear more or less from the jump how big a deal ChatGPT was going to be, and his response was pure Decoder bait. He called a company emergency, reallocated about 10 percent of the staff to figure out solutions to the ChatGPT problem, and made some pretty huge decisions about structure and organization to navigate that change.
Verge subscribers, don’t forget you get exclusive access to ad-free Decoder wherever you get your podcasts. Head here. Not a subscriber? You can sign up here.
Three years later, Prashanth says Stack Overflow is now very comfortable primarily as an enterprise SaaS business, which provides AI-based solutions that are tailored to different companies’ internal systems. Stack Overflow also operates a big data licensing business, selling data from its community back to all those AI companies, large and small.
That’s a pretty big pivot from being seen as the place where everyone can go to get help with their code. So I had to ask him: does Stack Overflow even attract new users anymore, in 2025, when ChatGPT can do it all for you? Prashanth said yes, of course. You’ll hear him explain that while AI can handle simple problems, for thorny, complex ones, you really want to talk to a real person. That’s where Stack Overflow still brings people together.
You’ll hear us come back to a single stat in particular: Prashanth says more than 80 percent of Stack Overflow users want to use AI or are already using AI for code-related topics, but only 29 percent of that population actually trusts AI to do useful work.
That’s a huge split, and it’s one I see all over in AI right now. AI is everywhere, in everything, and yet huge numbers of people say they hate it. We see it in the Decoder inbox, in the comments on The Verge, and on our videos on YouTube. Everyone says they hate it — and yet numbers don’t lie about how many millions of people are using it and apparently deriving some benefit.
It’s a big contradiction and hard to unpack. But Prashanth was pretty willing to get into it with me. I think you’ll find his answers and his insight very interesting.
Okay: Prashanth Chandrasekar, CEO of Stack Overflow. Here we go.
This interview has been lightly edited for length and clarity.
Prashanth Chandrasekar, you’re the CEO of Stack Overflow. Welcome to Decoder.
Wonderful to see you again. It’s been a hot minute. It has been three years since the last time we spoke, so it’s great to see you again.
I should have said welcome back to Decoder. You were last on the show in October 2022. One month later, ChatGPT launched.
[Laughs] That was an interestingly timed interview, right before the world changed.
Right before the world changed. Software development is certainly the thing that has changed the most since AI models hit. There are a lot of new products in your universe to talk about, and there’s what Stack Overflow itself is doing in the world of AI. So I want to talk about all of that.
But first, take me back to that moment. We had spent an entire conversation in 2022 talking about community and moderation, how you were going to build a funnel of people learning to code, and learning to use Stack Overflow. That was a big part of our conversation. The engineer pipeline, people both learning to write software and being a part of the software development community, was very much on your mind. And then, all of software development changed because of the AI tools. Describe that moment for me because I think it contextualizes everything that happened afterwards.
It was definitely a very, very surprising moment. It wasn’t an unexpected moment in many ways because here came this technology that obviously some people knew about, but not in a way that captured everybody’s imagination using this pretty interface. We were in the middle of wrapping up our calendar year, and at that point, we were thinking about our priorities for the next year.
It became very clear what we needed to focus on because this was going to be this very, very huge change to how people consume technology. Welcome to technology. It’s constantly changing, and I think this wave especially is totally unprecedented. I don’t think there is any kind of analogy or prior wave that I could look to, including the cloud and maybe the internet. I don’t think we’ve fully digested what this is at the moment.
So, we went into what is the equivalent of a code red situation inside the company. It was an existential moment, especially for our public platform because the primary job, if you will, is all about making sure people get answers to their questions. Now, you have this really, really slick natural language interface that allows you to do that at a moment’s notice. We had to sort of shape our thoughts, and what I ended up doing was carving out 10 percent of the company’s resources to specifically focus on a response to this.
We set a specific date to respond in a meaningful fashion, so we said the summer of 2023. I was going to go speak at the WeAreDevelopers Conference in Berlin, and I effectively told the company, “We’ve got six months to go and produce our response.” At least, it would be our first response because this is going to keep iterating.
That’s how we mobilized the company. We acknowledged it was a code red moment, we carved out a team of 10 percent, so about 40 people or so since we were a somewhat medium-sized company. Then, we got to work. That was the moment.
Take me inside that room. Very few people ever get to send the code red memo, right? This is not a thing most people ever get to do. Maybe you think about doing it, but no one’s going to read your memo. Everyone has to read your memo. You’re the CEO.
Take me inside that room where you said, “Okay, I have identified an existential threat to our company. People have come to us for answers to software development questions.” Again, the last time you were on the show, you were talking about the idea that there were objectively correct answers to software development questions and that the community could provide them and vote on them. Well, now you’ve got a robot that can do it and can do it as much as you want, as long as you want. There are tools like Cursor AI and Claude Code that can run off and do it for you.
So you’ve got all that, and you say, “I need to take 10 percent of the company.” I’m curious how big the company is. I know there have been some changes, but 10 percent of the company is 40, 50 people. How did you identify and say, “This is the moment I need to pull these people into the room. I’m making this decision, and the right answer is 40, 50 people are going to set aside their time to deliver me a plan by this time next year”?
The instinct has come from a couple different experiences. My experience right before this was at Rackspace in the cloud services space. The business I was running at Rackspace was actually about responding to Amazon Web Services as a cloud technology threat. I was on the team that eventually built that business from the ground up, and it was effectively 10 percent of Rackspace’s population that went and created that. So, I had some practice seeing and responding to a disruptive threat that you encounter. It was my turn now to put that into motion at Stack by appointing someone like myself when I was at Rackspace to do exactly the same thing.
The other data point goes all the way back a couple decades or more when I was in business school. My professor was Clayton Christensen, and he wrote the book The Innovator’s Dilemma. I’ve always thought about that in the context of technology. In technology, it is a very consistent theme that every so often, you will have disruptive threats, and there’s a very specific way in which you need to respond to that. It’s very much about how history suggests you should carve out an autonomous team that has very different incentives and can pursue things in a very different way relative to the rest of your business.
And remember, Stack Overflow is really two parts. We have our public platform, which is this big web disruptor and which we should talk about more broadly in regards to the internet. The other side is the enterprise business, where we are serving large companies with a private version of Stack Overflow. Thankfully, people continue to see value in having a knowledge base that’s very accurate. Increasingly over the past few years, it’s actually even become more valuable because you need really good context for AI agents and assistants to work. I’ve got plenty of examples, so we can talk about that.
So, that’s where that response came from. I had been through it in a couple different dimensions prior to that. And just in terms of how I communicated to the team, the memo was actually like a series of memos. Every Friday, I send a company email. I just sent one right before I got on here. I am pretty transparent in those: “Here’s what’s on my mind, here’s what we should be doing, here are some big things that happened, here are some people who demonstrated core values.”
I’ve done that religiously for… I’ve been at the company for six years, and I do that every Friday. The team basically knows what’s on my mind, and so it wasn’t one big memo to activate it. It was a series of emails leading up to this moment saying, “here’s what we’re going to be, we’ve got to respond to this, here’s what we’re thinking about now,” and so on and so forth. This went on until I could plant the flag and say, “Hey, by the WeAreDevelopers Conference, we have to produce a meaningful response on the public platform as well as on the enterprise front because it’s a big opportunity to integrate AI into our SaaS application, which is a different vector.” Hopefully that helps.
Did you actually type the words “code red?”
I think I definitely used “disruptive.” I used “existential moment.” I used all those things, but I don’t know if I used the exact words, “code red.”
[Laughs] I just think about that moment where you’re like, “All right, I’m going to hit the C and the O… I’m saying these words, it’s happening.”
It was very clear. We have a very specific communication cadence with the company, like many others. The tone and seriousness of what we were working on was very evident to people, especially when you carve out resources and take people away from certain teams. People are going to ask, “Wow, what about my staff?” Here you go, there’s the reason. So, it becomes very obvious.
How did you make those decisions to pull people away? How did you decide which people, how did you decide which teams? Those are all trade-offs, right?
Yeah, no doubt. I think this is a hard problem to solve. You certainly want very talented people, but I think you want the types of people who are willing to break glass or go against the grain and not be encumbered by historical norms. I very specifically picked a mix of people. The people who were leading it were newer and came from outside of the company because remember, we were going through a transformation.
I joined a company that was engineering-led in 2019 and all about this public platform, and we were transforming into this product-led organization. We specifically appointed a newer person who had come from the outside, was interested in building highly innovative, fast-iterating products, and had the DNA and the drive to do it.
I also personally stayed much closer to it. I, in fact, ran product for an interim period of time with that person reporting directly to me. That was another way to stay very, very close to what was happening on the ground until the actual launch. The rest of the team was a mix of very talented engineers, designers, and people who had context on how the site worked in the past and who could provide us with all the unlocks that we needed.
I think about Stack Overflow in what are probably two reductive terms in this context. You have inputs, you have outputs. The inputs are users answering questions. The outputs are the answers to those questions that people come and search for. There’s a whole community that makes that system run. The software platform manages that community with the volunteer moderators, but it’s really inputs and outputs. There are people who are asking questions and people who are answering questions.
Both sides of that are deeply affected by AI. I think we have come to the open web part of the conversation, where the input side is being flooded by AI-generated slop. In 2022, you had to ban AI-generated answers on Stack Overflow. Then, on the output side, the ability of AI tools to supply the answers is overwhelming.
So, let’s just break it into two parts. How did you think about the input side, where there’s a flood of people saying, “Oh, I can answer these questions faster than ever by just asking ChatGPT and pasting the answer in. Maybe that’s not good enough, but I can just do it.” Then, how did you think about the output side?
We noticed two things right out of the gate. One was the number of questions that were being asked and answered on Stack went through the roof because people started using, to your point, ChatGPT to answer these questions. That fueled this spike, which is kind of counterintuitive, but I think people just felt like, “Wow, I can game the system, so let me go do it.” Very quickly, we had to be extremely shrewd, and our community members are amazing at figuring out what’s real and what’s not. They were able to call out very quickly that these posts were actually ChatGPT-generated. That’s kind of what initiated the ban, which we totally supported and still support, by the way. You still cannot answer any of the questions on Stack Overflow with AI-generated content.
The reason for that, Nilay, is because our proposition is to be the trusted critical source for technologies. That’s our vision for the company. So for us, it’s all about making sure that there are only a few places where you can go and not deal with AI slop, where a community of experts has actually voted and curated it so you can trust it for various purposes. On the input side, it made sense to do that, and we continue to do that.
Fast-forward a little bit to now, and we have created all sorts of new entry points onto the site, even though we’ve had high standards to ask a question on Stack Overflow. We just launched our AI Assist feature into general availability earlier this week, and it’s been super exciting to watch how users are using that. It is effectively an AI conversational interface grounded in our 90 million questions and answers.
Then, there’s the ability for people to ask subjective questions, going back to our last conversation three years ago. Now people are able to ask open-ended questions because there’s a place for Q&A, which is the canonical answer to a question. There’s also a place for discussion and conversation because there’s so much changing. It’s not like all the answers have been figured out, so let’s just make sure that people have an ability to do that. That’s aligned with our mission of “cultivating community,” which is one of the three parts of our mission. The other ones are “power learning” and “unlocking growth.” So, we have done all these things to make sure that we’re not restrictive on the entry point and the question-asking experience.
The other thing on the answer side is that we realized it’s very important to go where the user is spending time. Now that the world has changed and people are in fact using Cursor AI and GitHub Copilot to write their code, our goal is to be the critical source for technology. So let’s show up where our users are. We’ve actually become a lot more headless.
For example, we recently launched MCP servers for both our public platform and our enterprise product. What people are using our platform to do now is to not only invoke those MCP servers — let’s say if they’re writing code in Cursor and want to know the difference between version 1 and version 2 — but to also be able to write back to our platform straight from Cursor if they want to engage and get a deeper answer, which is very unique in the industry.
So, that’s been our product principle: just go anywhere the user is. But ultimately, we just want to be that trustworthy critical source for technologists, whether it’s inside companies or outside companies.
How do you monetize in a world where you’re headless, where you’re just another database that someone’s querying from Cursor? How does that make you money?
We make money primarily in two ways. We have a third way, thankfully, but the third way is the smallest part, so I’ll start with the biggest. Our enterprise business, what we call Stack Internal, is now used by 25,000 companies around the world. Some of the world’s largest organizations, banks, tech companies, and retail companies use this product to share knowledge internally. Increasingly, they’re now able to use that trustworthy knowledge to power their AI assistants and AI agents to go do various things.
A good example of this is Uber, which is a customer of Stack Overflow Internal and has Uber Genie. It has thousands of questions and answers on our platform. Uber Genie plugs into that content through our APIs, and then it’s able to go into things like Slack channels to automatically answer questions and drive productivity so that you’re not bothering people. It’s rooted in the organization’s knowledge on our platform.
So, the enterprise business is our primary business. The second business is our data licensing business, which we actually built only over the past couple of years. One of the things we also noticed was that a lot of the AI labs were leveraging our data for LLM pre-training and post-training needs, along with retrieval-augmented generation (RAG) indexing, so we put up a whole bunch of anti-scrapers. We worked with third-party companies, and very quickly we got calls from a lot of them saying, “We need access to your data. Let’s work together to formally get access.” We had to do that, and now we’ve struck partnership agreements with every single AI lab that you can think of, every cloud hyperscaler that you can think of — companies like Google, OpenAI — and even partnerships with Databricks and Snowflake, even though they’re not doing LLM pre-training. That’s been our second business more recently.
And the third one, which is the smallest part of our company, is advertising. I think most people assume that Stack Overflow is supported entirely by advertising, but it’s only about 20 percent of our company revenue. We have a very captive, very important audience of developers who spend time on the site, so we have large advertisers that want to get their attention on various products. In fact, there’s a lot of competition now, so they increasingly want to do that. That’s how we make money.
So, in the context of becoming headless, for us it’s about our enterprise product. It works on a subscription and with hybrid pricing. That’s how we make money there. The data licensing is similar in that if people want access, they’ve got to pay for that. Then, yes, advertising is limited to some of the largest companies, and they pay us for that. But there’s always going to be… I would say it’s an “and” versus an “or.” We’re not going to be entirely headless. I think we just want to give the user the option to be headless. Plenty of people still come to the site, and in that case, we’re able to balance that out with these mechanisms.
Do you think new users are going to come to Stack Overflow? Stack Overflow is a product of the mobile era. There’s an explosion of software development. There’s an explosion of community. There’s a culture around the value of building apps and services, and there are new tools. Stack Overflow is one of the central gathering points for that community in that era.
New developers today might just open Claude Code, Cursor, GitHub, or whatever, and just talk to that. They might never actually venture out into a community in a similar way. Do you think you can get people to come to Stack Overflow directly and seek out answers from other people, or are they just going to talk to the AIs?
I think that for simple questions… By the way, when we saw the questions decline in early 2023, what we realized is that pretty much all those declines were with very simple questions. The complex questions still get asked on Stack because there’s no other place. If the LLMs are only as good as the data, which is typically human-curated, we’re one of the best places for that, if not the best for technology. It’s still a very active site with a lot of engagement and a lot of monthly active usage.
The questions being asked are quite advanced, I would say. What we’re also increasingly seeing through the new mechanisms that we’ve opened up… because to answer your question, we want to give people other reasons to come to the site besides just getting their answers. So, we have had to broaden our site’s purpose, hence the mission of “cultivate community, power learning, and unlock growth.”
What we’ve done is open up new entry points, new ways for people to engage. We, for example, unlocked the ability for humans to chat with each other to get directional guidance. That’s been a very popular feature on the site where people are engaging with other experts. For example, we have people asking OpenAI API questions, and they can go into the OpenAI chatroom and engage with other people who have similar questions, or Python experts.
We also opened up the ability for people to demonstrate their knowledge with challenges, effectively like hackathons. We’ve opened up a whole series of challenges, which are a very popular feature now. People spend time to go and solve these challenges that we post, and that way, they can showcase their understanding of the fundamentals, which I think is very important in terms of where the world is going.
If people are just using vibe coding tools and code gen tools, companies bringing in young talent need to know that they’re relying on people who not only took the shortcut, but also understand the fundamentals. We’re one of the few places where you can actually prove that you’ve learned the fundamentals. So, that’s the other reason why we’ve opened up these new mechanisms.
Then, there’s the third part of the mission, which is unlocking growth. There’s going to be a lot of job disruption because of all this. If people’s jobs are going to change quite dramatically, junior developers are going to need a home, even though I think it’s a shortsighted decision by many companies to stop hiring them considering you need a pipeline. They’re going to need to connect with other people, to be able to progress, learn, and get jobs. Jobs are a very important part. We struck a partnership with Indeed this past year to partner on tech jobs. It’s just to broaden the scope of our site so that there are many other reasons other than asking the questions. They still do, but we also want to give them more reasons to come to the site.
This comes to the big tension in all of this. I see it playing out in all kinds of different communities. I see it playing out in our own comments in a lot of ways. You want to build a community of people who are helping other people get better, and that is being disrupted on every side by AI. Communities that are built around people are pretty resistant to the incursion of AI.
This has definitely happened on Stack Overflow. Your moderators have basically revolted over the ability to remove AI-generated answers as fast as they want to. When you partnered with OpenAI, a bunch of users started deleting content so it wouldn’t be fed into OpenAI for training, and you had to ban a bunch of them. How are you managing that balance? Because if you build communities around people, I would say the culture — right now anyway — is that those communities will push back against AI very hard.
I would say one of the most important things that we’ve focused on (and that I’ve spent time on over the past few years) is this whole push and pull of how we think about AI in the context of our site. Because it’s pretty clear to us that if we don’t modernize the site in the context of us leveraging AI as an entry point, it’s going to be less relevant over time. That’s not good. So, we’ve taken a very aggressive stance by incorporating AI into the public platform with AI Assist, which has been wonderful to see. I’ll walk you through the decision on why we did that. Then, we did the same thing on the enterprise side.
If I think about the user base at Stack Overflow, it’s kind of like a big nation, right? We’ve got 100 million people, and there are definitely people on both sides of the spectrum. We have something called the 1-9-90 rule. One percent are the hardcore users who have spent a lot of time with their blood, sweat, and tears curating knowledge, spending their time on the site, and contributing. Nine percent are doing it in a moderate way, and 90 percent are consuming and mostly lurking.
We ask people on the site whether or not they’re using AI. Our own surveys basically say, if you took a look at the Stack Overflow 2025 Developer Survey, over 80 percent of our community members are using AI or intend to use AI. Eighty percent. But the trust level when they’re using AI is only about 29 percent. Only 29 percent of our user base actually trusts what’s coming out of AI, which is actually quite appropriate considering where we are because there should be skepticism of this new technology.
So, there’s enthusiasm to try it but not to fully trust it. And with this 1-9-90 rule, I think what we have is a core group of users that is always going to be the protectors of the company’s original mission, which was to create this completely accurate knowledge base and do nothing more. Then, we have a very large number of people who are, let’s say, the next generation of developers, who are looking to leverage the latest and greatest tools. It’s very clear to us based on surveys and further research that they want to use natural language as the interface to be able to do this.
It is the most meaningful change in terms of computer science development. If you look all the way back to object-oriented programming many decades ago, that wasn’t actually such a huge boom. It didn’t actually create this kind of change. But now, we’re in this moment where everything’s been unlocked. It’s a huge change effort, and we’ve had to decide to respect the original mission and keep accuracy at the heart of it. We’re not comfortable using AI for answers, for example, because it will create slop. It hallucinates, hence why the trust score is low. But why don’t we incorporate natural language interfaces so that’s the preferred way to engage? So, we ended up doing that, both on the public side as well as on the enterprise side.
That’s been good received by the immense bulk of users, but determination volition ever beryllium a vocal number who volition propulsion backmost against incorporation. Beyond the site, there’s conscionable a level of broader interest astir what each this does to jobs, and what’s going to hap if we fto the feline retired of the bag. So,there’s that interest also, which is understandable.
Let me put a fine point on that. I think I understand that in a sharper way. If I am someone in your 1 percent who spends a lot of time on Stack Overflow helping other people, the reason I answer questions for free on your platform, which you monetize in lots of ways, is because I can directly see that my effort helps other people grow and that I’m helping other people solve problems. That is one very self-contained dynamic. The last time you were on the show, our whole conversation was about that dynamic and how you got people to participate in that dynamic and the value of it.
Then suddenly, there’s a very clear economic benefit to the company that owns the database because it’s selling my effort to OpenAI, which is happening across the board. It’s going to do these data licensing deals with all these AI providers, they’re going to train on the answers that I have painstakingly entered into this database to help other people, and now the next generation of software engineers is going to get auto-complete that’s based on my work and I’ve gotten nothing. I’ve heard that from lots and lots of people. I’ve heard that in our own community, and I think I have felt that as various media companies have made these deals.
How do you respond to that? Because it feels like you were providing a database that you had to monetize in some ways, but the work people put in was the value, and now there’s another kind of economic value that is maybe overshadowing, recasting, or re-characterizing the work that people have done.
There are a couple of points there. One is about this company’s original DNA and why people came together to do this thing. When I joined the company, I asked a question like, “What’s people’s incentive to spend time doing this?” I asked the founders, specifically [co-founder] Joel Spolsky, about this. His point was that the software development community is very altruistic. People just want to help each other out because people understand how frustrating… I used to write code many years ago. I recently picked it back up with some of the code-generation tools, which is interesting to compare and contrast. I just remember how frustrating it was if you got stuck on something. Stack was a huge boon when it was created to unlock this. It was genuinely out of that. That was the reason.
Even before ChatGPT, we also asked the question, “Should we incentivize users by paying them? Should we give them a monetary benefit?” That wasn’t a big ask from the user base. We went and researched people. People were not in it for the money. Plus, it complicates things because how do you judge the payout for a particular JavaScript question relative to a particular Python question? It goes down a rabbit hole, which is untenable. So that’s one. What was the original reason people got together? It was about the mission.
Secondly, in terms of why we have to do this and whether it’s unfair. The primary reason we have to go down the licensing path is because the model of the internet has literally been turned upside down. I know you talk about this, Nilay, with the “DoorDash problem.” People relied on the model of the internet where people go to search engines and websites and you monetize off of ads. I really empathize with content sites that are heavily dependent on advertising because I think most content sites’ traffic is down 30 or 40 percent, something like that. There’s this huge resulting shift where companies that support these platforms have to… we’re a business ultimately. So, what do we have to do? We have to do what is necessary and adopt a new business model to survive, thrive, and do all the things.
Thankfully for us, we had an enterprise business, which is independent of all of this. Thankfully for us, we still had the advertising business, and large advertisers still cared about our community. So, data licensing only felt right in terms of making sure that we can effectively capitalize on the moment and be able to invest back into our community so that people who are there for the right reasons saw the benefits. We’ve invested with all these new features I just mentioned. Whether it’s these new content types, challenges, chat, AI Assist, any of these things, they all take resources to go and build. So, we had to go and leverage the funds we received to be able to go do that.
Now in the future, we may consider other ways. For example, should we pay our users, give them a piece of the data licensing revenue? Perhaps. We always ask that question. There are always ways for us to continue, but this is the current setup that we have right now. It’s balancing a lot of things.
You mentioned that to get to the data licensing deals, you had to put up a bunch of anti-scraper tools. You had to go into secondary and tertiary layers of the stack to get deals from Databricks and other kinds of providers. The AI companies were just scraping your site before. They probably still are. Whether or not they’re paying you, they’re probably still just going through the front door because all of them seem to be doing that. Did you have to say, “We’re stopping you,” and then go get the deal? Or did you say, “Hey, we know you’re doing this, but you have to pay us or we’re going to start litigating?”
It’s somewhere in between. We put up the anti-scrapers very quickly. We even changed the way in which people received our data dumps. Again, there’s a balance because we never wanted to prevent our community users from grabbing our data for their legitimate needs, like their school projects or PhD theses. So, we’ve continued to be open about our data for our community members, but they have to be community members, and there can be companies looking to commercialize off the data.
We were very specific about the policy terms. We put up technology that prevented people from grabbing it, so we knew exactly who is scraping and who’s not scraping. We reached out to some of those folks and said, “Look, stand down because you’re putting a lot of pressure on our servers by doing what you’re doing, so take it easy in there.”
But I think my characterization of those companies is that they don’t care. Some of them care and want to be good citizens, and some of them absolutely do not care and they want the smoke. You can just categorize them. There’s a reason Amazon is suing Perplexity. They told Perplexity to stop it and Perplexity won’t. The New York Times, as we’re speaking today, is suing Perplexity. Then, there are other players acting in different ways and striking different kinds of deals.
Walk me through one of those deals. When you went and struck your deal with OpenAI, was it, “We’re going to stop you, and if you want the door to be open again, you have to pay us?” Or was it, “You know this is wrong. We can take all the technical and legal measures, but we should really just get to the deal correctly?” Walk me through that conversation.
We were incorporating something like OpenAI into our product. Remember the code red situation where we were about to announce our AI response. So, we were actually using that technology to do what we had to do to incorporate AI into the public platform and our enterprise product. We had a relationship with them, and we also said, “Look, this is not going to work. It’s not tenable, and this is the new way of working. Maybe we need a new business agreement for you to use the data. Let’s actually have a conversation.” And credit to them, they were very partner-centric about that. I was very impressed by OpenAI and companies like Google that are all very open to engaging on this topic and wanted to be responsible AI partners.
They got it immediately, even before we asked them. It wasn’t this big, “Let’s go have this conversation from the ground up and justify why it had to be done.” We just said, “Look, this is what needs to happen because this is a new business model.” We got into the conversation pretty quickly. “What exactly are you looking for? In which format do you want the content? Do you want bulk uploads? Do you want API calls? What do you want?”
So, we got into that whole mix. And mind you, Nilay, these are recurring revenue-type deals. These are not one-time payments. If you want access and you want continued access in the future, you’ve got to keep paying, even for the historical data. So that’s how these are set up.
So yes, they were very collaborative partners. But you’re right. There are contradictory players. They say one thing and their actions prove other things in terms of how they’ve engaged. There are holdouts for sure and people who are not exactly consistent with their word, and that’s unfortunate. I think every company like us has to decide what to do about that. We’re in various stages of these conversations with people on how to make sure we reasonably get them to do the right thing.
Now you have to name one of those companies. Who do you think is holding out differently than their public posture?
I’d rather not be specific, but all the usual suspects that you’re covering are the usual suspects that we are encountering. That’s how I would put that.
Let me ask you about the recurring revenue piece and then I want to get into the Decoder questions because I think they’ll be illuminating after this conversation. There’s a sense that we’ve done all the pre-training that we’re going to do, right? Scraping the internet is not the future of these models, and there needs to be some other leap.
Stack Overflow’s existing corpus of information is the valuable thing. There’s a lot of data. There’s 20 years of stuff in that database. What’s the value of “you have to pay us again to train the next version of Gemini or GPT” and the value of “there’s incremental information being added to the existing database?” Because that seems like a clear split to me.
The way we’ve thought about this is that every model that’s being trained is trained on some corpus of data. You’re going from GPT X to Y. If you’re leveraging our original data or any derivative of that from a prior model in the new model that you’re training, then you have to pay us for it. That’s effectively the legal requirement for doing that. So, it’s a cumulative aspect. Let’s not forget that. People have to pay for the cumulative data. It’s not just that it was used back in the day. And yes, relative to 20 years, one year’s worth of information is going to be less, but that’s why you’re getting 20 plus one. That’s the idea. So, that’s the way the legal agreement has been set up.
Is it per year? Is it that each year’s worth of data is a chunk of money? How does that work?
No, it’s cumulative, like the whole corpus: historical data as well as anything going forward for the following year. All that is one accumulated data set, and that’s effectively charged as one.
So, this year’s data doesn’t get pulled into Gemini 3’s data set, which just came out, right? Every new question and answer on Stack Overflow since Gemini 3 came out is not incorporated in Gemini 3’s training.
Correct.
So you’re kind of betting that they’re just going to train ever-bigger models. Is that how it’s structured in your mind?
Yeah. And some companies have asked for broad use cases. There are pre-training use cases. Even beyond that, you can leverage the data in many different ways for AI and non-AI use cases, like search use cases. But correct. There may be scenarios where much larger models are built, and our data is going to be useful for those scenarios. But there’s going to be RAG indexing, post-training needs, all sorts of scenarios. It’s quite interesting to see some of the frontier labs ask for very specific slices of data that they find useful.
Remember, there’s not only questions and answers. We’ve got the comment history, the metadata history, the voting history, the history of User A going down this path. So, it’s a lot of excellent context for things like reasoning and being able to mimic the human brain. It’s almost like one human brain that’s been documented.
This, I think, brings me to the Decoder questions. You’ve restructured the company. There have been some rounds of layoffs. You’ve refocused on the SaaS business in a real way. I think we should talk about that. But there’s the idea that we’re going to train ever-bigger models and that will be the growing part of the business, versus wanting some slices, versus RAG actually being the future for a lot of these other businesses. You would make different decisions based on which one of those is going to grow faster, and I don’t think anybody knows. Maybe you know. You can tell me if you know or you know someone who knows.
But we’re in a very nascent period for all of this development. How have you structured the company to even out all of that risk and be prepared for how people will actually need the data in the future? Maybe you know; I don’t think I know.
It’s hard to predict clearly. We’ve got some brilliant minds, like Demis Hassabis at Google and others, who are coming up with the next generational leap of whatever the equivalent of transformer technology is to keep going toward this ultimate goal, which is AGI.
So yes, you’re right. It’s hard to know exactly what shows up and when. However, the way we are structured as a company is in effectively two parts. One part is the enterprise business. We have a product team, engineering team, and a go-to-market team focused on that. The enterprise products business is very clear. Then, the other side of the house is what we call “community products.” That team focuses on the public platform, all the features that we’ve talked about so far, AI Assist and all the new question types, and chat. This is the community side. The data licensing business sits in that group, and so they are tied to the engagement of the site. So, there’s a virtuous cycle there.
So, that’s how we’re split. Again, that includes product resources, a small go-to-market team, engineering folks, etc. Also, there’s our community management team, which spends a lot of time with the moderators to engage there. So, it’s split down the middle, and our other functions support both.
How big is Stack Overflow today? I know you laid off about a quarter of the company in 2023 because of traffic declines. You’ve built other businesses. How big are you today?
We are about 300 people or so. Yeah, that’s our size.
Do you think the revenue you’re going to see from data licensing or your SaaS business is going to allow you to grow again?
We believe so. We’re a growing company. We’re profitable. So financially, we’re thankfully in a very good spot. Now, it’s all about placing bets on the highest growth opportunities. We believe creating this knowledge intelligence layer inside the enterprise through our Stack Internal product is a phenomenal growth opportunity because customers are pulling us in that direction, which is fantastic to see.
That’s where you’re headed. Those are some of your announcements. I want to talk about that in a second, but I’m just focused on the idea that the future is the SaaS business for enterprise and the future is data licensing. Are you still seeing declines on the public website?
I would say it’s stabilized for a few months. I think the engagement and the activity on the site are actually pretty stable. The drop in questions that I mentioned previously was all the simple questions, and it seems to have gotten to a place where complex questions are being asked. We have a consistent number of people on the site every day. We have something called a heartbeat. In fact, anybody can go check it out. If you go to StackOverflow.com and scroll to the bottom, you’ll see how many users are online at the moment. And so you’ll always see a very consistent number there. I think it’s hard to predict the future, but certainly I think the worst of it was back in 2023, 2024, for sure.
The question I ask everybody on Decoder, as you well know, is how do you make decisions? The last time you were on the show, you said that you wanted to be on the front lines as much as possible, and you wanted to be informed by people who are on the ground. Has that changed in the past three years? What’s your decision-making process?
Not really. I think it’s very important for leaders and people like CEOs to have the full context because you can’t have filtered information. I spent a lot of time with users and customers really understanding what they care about. That’s how we even decided on something as controversial as the AI Assist feature. It wasn’t obvious to say, “let’s go and build that” if you were not listening on the ground. Because if you just listen to the headline statements, it seemed like people didn’t want to integrate AI into Stack Overflow. But the reality is that many, many users — the 90 percent I mentioned — wanted a natural language interface. That’s what they are comfortable using these days, and that’s what they wanted. So that’s why we decided to do that.
One of the things that I see everywhere is that split. You mentioned 1-9-90 before. There’s a very vocal minority. We see it in our own traffic on The Verge. We cover AI deeply, and we are told that everyone hates it. I understand why. I understand the comments. That’s what I’ll say. I get it.
Then, I see the usage numbers. I see the traffic on our coverage of AI tools. I see companies like yours saying that everyone’s using it. There’s some gigantic split there that is unlike any other I think I’ve ever encountered covering technology over the past 15 years. Everyone says they don’t like it, and then they’re using the hell out of it.
The only other one I can think of that is somewhat comparable is how people feel about Adobe. Everyone uses the tools and everyone’s mad at the Creative Cloud subscription fee. It’s basically the only comparison I have. It’s not a good one, it doesn’t map one-to-one, but it’s as close as I’ve come to that split.
What in your mind accounts for that split with AI where people don’t like it, are very vocal about not liking it, and then we see the numbers and everyone’s kind of using it anyway?
I think it comes down to that data point I shared earlier: that 80-plus percent of our user base wants to use AI or is already using AI for code-related topics, but only 29 percent of that population trusts AI. Trust is a very deep word. Why don’t you trust something? You don’t trust something because you don’t think it’s producing high-integrity, accurate answers. You may not trust it because it may replace you one day, and you don’t like that either.
But at the same time, you’re going to be curious about what’s going to be such an iconic force. So, you want to keep trying and using it and maybe getting better to ultimately, hopefully, leverage it to your benefit so that you can be relevant as an individual developer or to go a lot faster in the future.
I think that’s probably the reason, especially with the developer audience. I think they’re a very discerning, let’s say analytical audience, and they can be prickly if things are not deterministic in the way they have been for a very long time. This is a very probabilistic kind of technology. It’s almost like going to a casino and using a roulette wheel. You’re going to get a different answer every time. It’s not necessarily comforting for someone who’s writing very specific code and looking for very specific outcomes.
I think people will get used to that over time. It is a mindset shift for people writing software. That may be the reason why people are intrigued, because it is so powerful as a technology. Don’t get me wrong. We use vibe coding all over the place at Stack. All the features I mentioned to you, our designers and product managers vibe coded them first to show them and get user feedback before we went and built them. We’ve embraced these tools internally for that benefit. So, there will be ways in which you feel comfortable using it, but I think that’s the core reason.
I actually want to talk about that dependency. You know that it’s not trustworthy, but you are building products with it. You are building products to enable it. The big announcement this week is Stack Overflow AI Assist. You’ve talked about it several times throughout this conversation. You’re betting that this is what people want, right?
You’re betting that an AI-powered tool on Stack Overflow will help more people. Maybe that thing is going to hallucinate like crazy and give people the wrong answer. Maybe it’s going to, I don’t know, tell Kevin Roose that it loves him on the front page of The New York Times. I just like teasing Kevin. Hi, Kevin.
How do you make that bet when you know that the users don’t trust it, but you still have to roll out the tools because that’s where the industry is going?
We believe we’ve actually unlocked a very important aspect of that trust issue and responded to it. Our AI Assist feature is a RAG plus LLM solution. Effectively, it provides an answer that first goes out into our corpus of tens of millions of questions and answers. We have 80 to 90 million. Those are first used to produce a response. If they don’t, there’s a fallback option where it goes and leverages our partner OpenAI, for example, to be able to go and produce trustworthy knowledge from other parts of the web. So, it first searches through our trusted, attributed knowledge base. It produces the links so people can go down that path and learn more about it. Attribution is very important to us, right?
That’s how we are navigating the hallucination element. We’re constantly testing it, and it’s not perfect. There will always be improvements. But we’re also looking at where the world’s headed, and if these models continue to get better, then we should benefit from those improvements. Ultimately, we should have the best solution because you’re getting grounded human context plus the LLM strengths.
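The flow described here — search a trusted, attributed corpus first, and only fall back to an external model when nothing good is found — can be sketched in a few lines. This is purely illustrative: Stack Overflow has not published AI Assist’s internals, so the scoring, the threshold, and all names below are assumptions, and the LLM fallback is a stubbed callable rather than a real API client.

```python
import re

# Hypothetical sketch of a RAG-first answer pipeline with an LLM fallback.
# All function and field names are invented for illustration.

def tokens(text: str) -> set[str]:
    """Lowercased word tokens, punctuation stripped."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def keyword_score(query: str, text: str) -> float:
    """Fraction of query tokens that also appear in the candidate text."""
    q = tokens(query)
    return len(q & tokens(text)) / max(len(q), 1)

def answer(query: str, corpus: list[dict], llm_fallback, threshold: float = 0.5) -> dict:
    """Return the best corpus answer with its attribution link, or defer
    to the fallback model when no trusted entry scores above threshold."""
    best = max(corpus, key=lambda qa: keyword_score(query, qa["question"]))
    if keyword_score(query, best["question"]) >= threshold:
        # Grounded path: trusted answer plus the link for attribution.
        return {"answer": best["answer"], "source": best["link"], "grounded": True}
    # Fallback path: no good match in the knowledge base.
    return {"answer": llm_fallback(query), "source": None, "grounded": False}

# Toy corpus with an illustrative (not real) attribution link.
corpus = [
    {"question": "How do I reverse a list in Python?",
     "answer": "Use list.reverse() in place, or reversed() for an iterator.",
     "link": "https://stackoverflow.com/q/example"},
]

result = answer("reverse a list in Python", corpus,
                lambda q: "(model-generated answer)")
```

A production system would use embedding search over millions of posts instead of keyword overlap, but the control flow — grounded answer with a source link first, ungrounded model output only as a fallback — is the part the passage describes.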
I think the thing that I’m most curious about is the faith that the models will continue to get better. I’m not 100 percent sure that’s true. I’m not sure that LLM technology as a core technology can actually be intelligent. As you’re saying, people are very attracted to the natural language component of these models and the interface shift that’s happening. There’s the platform shift where we’re all going to create software with natural language or let the LLMs reason and basically self-prompt themselves into an answer.
There’s something there that seems risky. Are you perceiving that risk today? Are you factoring that in or are you saying, “This is where we’re at now and we have to proceed until something changes?”
I think the first thing about the LLM improvement plateau is that I’m with you. It’s hard to know how things are going to improve. When you just think about the past six months and that it’s plateaued, boom, here comes Gemini 3. Again, we’re proud partners of Google. It is a real shift. It blows every other model out of the water. Now, we’ve got a code red situation in other companies, other LLM competitors. So, it’s —
Sam Altman did type the words “code red,” by the way. I want to say that. So that’s very good.
Perhaps that was my way of putting it back in the day. The point is that it is amazing that you’re able to produce that kind of a leap when things have seemingly plateaued. I don’t know. I can’t predict that because these folks like Demis are deep, deep in the science. So, that’s true, but there are also going to be other innovations that we are not privy to. Like I was explaining previously, transformers were obviously a huge development in this space. There may be something that these AI research labs come up with that we’re not even aware of that’ll push things. Ultimately, we know that the compounding effects are very real. We’ve got unlimited compute, extremely powerful chips, and GPUs that are now lowering their costs.
I was at AWS re:Invent this week where they talked about the Trainium chips, with Trainium3 and Trainium4 being built out. There’s going to be just this proliferation, and then you’ve got access to data, which we’ve already talked about. So, when these things combine and compound together, it’s going to produce very magical outcomes. I think that’s the belief and why my own assumptions are rooted in the idea that it’s going to improve overall.
The reason I’m asking about this in the context of the tools you’re building and everyone using it but only 29 percent of people trusting it is because you’ve got to bring that number way up to reach the returns that every company investing in AI is trying to reach, including yours. I don’t know if the core technology can do that. I don’t know if you can stack a bunch of technologies to do that.
But I do know that one version of the future of software development is everything vibe coding all the time. Another version of the future of software development looks like writing intensely long prompts for models, prompts that are pages and pages themselves, which seems ridiculous to me but maybe that is the future.
Another version looks like, “We stick with humans at the cutting edge of software development, and they are co-developing with an AI model and then maybe asking Stack.” That feels like a richer, more interesting future, but it’s unclear to me where we are on that spectrum or how that even plays out.
I deliberation that we person not lone a bird’s oculus presumption successful the discourse of our nationalist assemblage with this 29 percent information point, but our endeavor customers springiness america a wide presumption connected wherever they are. The ROI question is being asked precise heavy wrong companies, and I deliberation 2026 is going to beryllium the twelvemonth of rationalization. If 2025 was the twelvemonth of agents — wherever each tool’s being tried retired wrong these companies — there’s a precise unfastened scenery for CTOs to spell and bargain and trial assorted tools. So, I deliberation it’s been a tremendous clip for immoderate of the companies gathering these tools.
But 2026, the CFOs volition unit with, “Okay, productivity improvements person to travel from these. We’re going to prosecute little radical arsenic yearly readying happens.” There’s going to beryllium tremendous unit successful the strategy to beryllium what the existent worth is. Everybody astatine the elder level that I’ve talked to acknowledges that this is simply a large shift, and they each are leaning into it beauteous hard. They’re each waiting for the improvements. I deliberation astir companies volition accidental that they person seen improvements successful the tiny groups wherever they’ve tested these tools, but that’s a self-selecting radical due to the fact that they’re enthusiasts. They volition spot large productivity gains, which is astir apt true, but there’s an implicit drop-off successful productivity arsenic you deliberation astir the adoption crossed the enterprise. Probably for the aforesaid reasons, by the way. You’re telling employees to usage tools which whitethorn enactment them retired of a job, truthful wherefore would they privation to bash it? Or much fundamentally, if these tools are not cleanable and they’re hallucinating and volition beryllium held accountable, that’s not bully either.
Then of course, there’s the process change, the mindset change, that you person to wholly alteration your workflows of however you work, each the endeavor alteration absorption work. That is wherefore our solution, Stack Internal, is gathering this quality curation layer, oregon cognition quality layer. With the MCP server connected top, the cognition basal successful the middle, and past our quality to ingest cognition from different parts of the company, we tin make these atomic Q&A that are highly adjuvant to basal your endeavor cognition successful these AI agents. That is that solution. That’s wherefore we’ve truly gone hard astatine producing that.
We've seen a really strong response from our customers. We have some of the world's largest companies leveraging, testing, and building this with us, like HP, Eli Lilly, Xerox, all these companies. It's been amazing to see them gravitate to us because they want to satisfy that ROI point that you're making. What are the gaps? It's trust again. So, they want this trust layer through a company like Stack that they can actually insert in between their data and their AI tools.
When you say the era of rationalization, what I hear is that you think the bubble's going to pop in 2026.
I think the exuberance of just trying out various tools and these unlimited AI budgets will eventually come home to roost. I'm not sure about the bubble bursting. There are definitely going to be corrections along the way; there's no question about it if you look at history and the number of vendors that are selling into these companies. What I'm very surprised by is that there's similar functionality, and there are four of them being tested inside these companies.
All these companies may have gotten to $100 million in annual recurring revenue (ARR), but at some point there's going to be churn when the CTO decides, "You know what, I'm only going to maybe use one and maybe a second one as a backup." No different from the cloud. If you think about the multi-cloud world back in the day, people didn't have three clouds out of the gate. Now you have maybe one primary cloud and one secondary cloud. This is different, but at the same time, you're going to have, like, four different vibe coding tools, in my opinion.
When you think about Stack Overflow as being that trust layer, that's the value add, that's what you can maybe charge a premium for over time. You're still dependent on a technology that only 29 percent of your users trust. And I know you're talking about RAG and that your other systems are doing that. How do you think you bring that number up with Stack Overflow? Is that possible for you to do, or does the ecosystem have to do it for you?
I think it's an ecosystem point, generally speaking, because that is more a reflection of what people have access to beyond Stack, right? They've got access to all these other options. What we can focus on is being the most critical source for technologists. So for us, it's about making sure this content is excellent and high quality, but that it's also a great place for people to cultivate community, connect with each other, and learn and grow in their careers.
I think the way we can do it is through the fact that we are working with all these big AI labs, and the fact that our trustworthy knowledge, which has been painstakingly human curated, is going to flow into these LLMs, which ultimately produce trustworthy answers. We're kind of one layer behind, but that's where we operate. We operate in that trust layer, or the data layer if you will, in the context of LLMs. So, that's our indirect contribution to that 29 percent.
Prashanth, this has been a great conversation. You're going to have to come back sooner than three years next time. What's next for Stack Overflow? What should people be looking for?
Our biggest focus will be making sure that we build this enterprise knowledge intelligence layer for companies to truly use AI agents in a trustworthy way. We are very, very excited about our Stack Internal product on the enterprise side, which we launched a couple of weeks ago at Microsoft Ignite, as well as helping our community users connect with each other to really learn and grow their careers on the public platform, as I've mentioned throughout.
There are going to be so many avenues and new entry points, like our AI Assist, our subjective content, chat, and other things that people hopefully find useful as things change around them very quickly. They can be part of this amazing community and help each other out. So those are the two focuses: enterprise as well as our public community.
All right. Well, when the bubble pops next year, we're going to have you come back and we're going to say you predicted it.
Thank you, Nilay. I appreciate it.
Questions or comments about this episode? Hit us up at [email protected]. We really do read every email!