Reed Albergotti
Feb 15, 2023, 12:41pm EST

Startup Replit launches a ChatGPT-like bot for coders




The Scoop

Software developers will have access to a powerful new technology starting Wednesday: the ability to converse with an AI bot, similar to ChatGPT, that has been trained by startup Replit to help write computer code.

The first-of-its-kind technology, called Ghostwriter Chat, from the software development service is another way the latest generation of AI is shaking up the way we live and work.

Founded in 2016, Replit is like Google Docs for developers, allowing them to write and collaborate on software in real time. It also offers a “bounty” service in which companies and individuals can ask the Replit community to create specific code for a fee.

“Coding turned out to be almost the perfect use case for LLMs,” Replit co-founder and CEO Amjad Masad told Semafor in an exclusive interview, referring to the “large language models” that power technology such as ChatGPT.

Masad believes the software will give the most skilled coders the ability to create “one-person billion-dollar companies” by exponentially increasing their ability to build complex software programs, including new artificial intelligence-enabled businesses.


The updated AI features include a “debugger” that will tell programmers when they’ve made a mistake in their code and offer suggested changes. Ghostwriter can also test-run code to check that it works.

The chat bot and real-time debugger are features that haven’t yet been introduced in Copilot, a feature in Microsoft’s GitHub that can autocomplete lines of code by predicting what a developer is planning to write. Copilot is powered by OpenAI’s Codex AI model.

“There’s definitely a race between us and Microsoft,” Masad said. “We’ll be first. They’ll probably copy us.”

He said the end goal for Replit is to create a “completely autonomous” AI coder that can be treated like a human software engineer. It’s aiming to create an early version of that within a year.


Reed's view


Replit is an example of the kinds of new companies that will emerge from the latest advances in AI.


Replit’s advantage isn’t a new breakthrough in artificial intelligence itself. It is data.

Dozens of companies use Replit’s “bounty” service, describing in plain English the software they want built, and the 20 million coders who have used the platform have written 240 million software programs. Replit also offers software hosting, so it can see how consumers use the finished products. Together, that makes for a gold mine of data for training AI models to properly generate computer code.

Right now, the company says its Ghostwriter is like a junior programmer. But if Replit can amass more data and continue to improve the model, Ghostwriter will get better over time.

Replit hopes that head start will give it a lead, or a “moat” to fend off competitors, even Microsoft’s GitHub, an industry juggernaut.

If Replit is right, and it’s possible to give developers an unprecedented leap in capabilities, it could lead to massive change.


Juan Ruiz, a 21-year-old developer in Phoenix, Arizona, has been using a beta version of the Ghostwriter chatbot to build software for his own AI startup, called Crear.AI, a content creator assistant that is particularly good at translating to and from Spanish.

Ruiz said he’s tried GitHub Copilot but that Ghostwriter is a more powerful tool, allowing him to instantly “spin up” software. About 70% of the time, the code just works. When it doesn’t, he rephrases the description or tweaks it manually. “It’s been a game changer for me,” he said. Building the startup “would have been harder and slower,” were it not for the AI help.


Room for Disagreement

Replit will face myriad challenges on the road ahead.

Researchers have found that code generated by OpenAI’s Codex contained security vulnerabilities 40% of the time.

Stack Overflow, a question-and-answer site for coders, temporarily banned users in December from sharing responses generated by ChatGPT. “The primary problem is that while the answers which ChatGPT produces have a high rate of being incorrect, they typically look like they might be good and the answers are very easy to produce,” Stack Overflow wrote.