Enterprise-focused AI startup Cohere launches demo chatbot Coral and Chat API

An office is depicted underwater in front of a colorful coral reef and fish.
Credit: VentureBeat made with Midjourney

Toronto, Canada-based Cohere, founded by ex-Googlers, has emerged as one of the leading startups in the increasingly crowded generative AI market thanks to its focus on developing foundation models and other AI-powered technologies for enterprises.

Today, the company jumped into the fray of the AI chatbot race by releasing a new application programming interface (API) that allows third-party developers at other enterprises to build chat applications on top of Cohere’s proprietary large language model (LLM), Command.

“Whether you’re building a knowledge assistant or customer support system, the Chat API makes creating reliable conversational AI products simpler,” wrote Cohere in a blog post announcing the service. It joins Cohere’s existing APIs for content generation (Generate) and text summarization (Summarize).
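For developers, wiring a product up to the new endpoint is a short exercise. Below is a minimal sketch using Cohere’s Python SDK; the API key and message are placeholders, and the exact parameters should be checked against Cohere’s current documentation.

```python
# Minimal sketch of a call to Cohere's Chat API via the Python SDK.
# The API key and message below are placeholders, not values from Cohere.
import cohere

co = cohere.Client("YOUR_API_KEY")

response = co.chat(
    message="Draft a short answer to a customer asking about our refund policy.",
)

print(response.text)  # the model's reply as plain text
```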

In addition, Cohere has provided a free chatbot demo on the web, the Coral Showcase, so users can test out its chatbot on their own. However, you’ll need to sign in with your Google or Cohere credentials to access the environment. Cohere initially introduced the Coral chatbot for customers in July; the new API allows them to build it into their own internal or external-facing apps.

In VentureBeat’s tests of the system, the Coral chatbot powered by Command was noticeably slower at returning responses than some competing closed-source chatbots such as OpenAI’s ChatGPT and Anthropic’s Claude 2, taking two or more seconds to generate them. However, the responses were largely accurate, up to date, and clearly written, with no visible hallucinations.

Screenshot of Cohere’s Coral chatbot demo environment. Credit: Cohere

It also cited sources and included links back to them. However, it failed to find some of the most recent information when asked about a specific company.

Screenshot of Cohere’s Coral chatbot demo environment. Credit: Cohere

RAG Time

Cohere touted the fact that its new chatbot API features Retrieval-Augmented Generation (RAG), a method of controlling a chatbot’s information sources. RAG allows developers to constrain those sources to their own enterprise data, or expand them to scan the entire world wide web, while still taking advantage of the chatbot’s original training and its power to interpret and generate text in natural language.

As Cohere writes in its blog post announcing the new Chat API, “RAG systems improve the relevance and accuracy of generative AI responses by incorporating information from data sources that were not part of pre-trained models.”

In the case of Cohere’s new Chat API with RAG, developers can add only two supported sources of additional information: a web search implementation, or plain-text documents from their enterprise (or another source).

“For example, a developer building a market research assistant can equip their chatbot with a web search to access the latest news about trends and competitors in their space,” wrote Cohere in its blog post, later noting, “We train Command specifically to perform well on RAG tasks. This means you can expect high levels of performance from Cohere’s model.”
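In the Python SDK, those two grounding options correspond to separate arguments on the chat call. The sketch below assumes the documents and connectors parameters described in Cohere’s documentation, with placeholder content standing in for real enterprise data.

```python
# Sketch of the two RAG grounding options on the Chat API (parameter
# names assumed from Cohere's Python SDK; content is illustrative).
import cohere

co = cohere.Client("YOUR_API_KEY")

# Option 1: ground the answer in plain-text documents you supply.
doc_grounded = co.chat(
    message="What does our returns policy say about refunds after 30 days?",
    documents=[
        {"title": "Returns policy",
         "snippet": "Refunds are issued within 30 days of purchase..."},
    ],
)

# Option 2: let the model pull in web results via the web-search connector.
web_grounded = co.chat(
    message="What are the latest trends among our competitors this quarter?",
    connectors=[{"id": "web-search"}],
)

print(doc_grounded.text)
print(web_grounded.text)
```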

Yet based on VentureBeat’s initial tests, the reliability was not always up to what one might expect from a market research assistant, as the chatbot failed to return some current news. However, those tests were extremely limited, consisting of only a few queries so far.

More features available now and coming up

In addition to the RAG-enabled Chat API, Cohere noted that its platform also allows third-party developers to plug in three modular components from the startup.

These include a “document mode,” which lets the developer specify which documents they want their Cohere-powered chatbot to reference when answering user prompts; a “query-generation mode,” which instructs the chatbot to return search queries based on the information the user submits in their prompt; and a “connector mode,” which lets the developer connect their chatbot to the web or another information source.
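As a rough illustration of how the query-generation and connector modes differ, the sketch below uses the search_queries_only flag and the web-search connector exposed by Cohere’s Python SDK; treat the exact field names on the response as assumptions to verify against the SDK docs.

```python
# Sketch of query-generation mode vs. connector mode (parameter names
# assumed from Cohere's Python SDK; verify against current docs).
import cohere

co = cohere.Client("YOUR_API_KEY")

# Query-generation mode: ask only for the search queries the model would
# issue, so they can be fed to your own retrieval layer.
queries = co.chat(
    message="How are rivals pricing their enterprise plans this quarter?",
    search_queries_only=True,
)
print(queries.search_queries)  # list of generated search queries

# Connector mode: let Cohere run the retrieval itself against the web.
answer = co.chat(
    message="How are rivals pricing their enterprise plans this quarter?",
    connectors=[{"id": "web-search"}],
)
print(answer.text)
```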

Cohere also noted that it plans to expand this connector/modular ecosystem.

The announcement comes hot on the heels of rival OpenAI’s move yesterday to reintroduce web browsing capabilities to ChatGPT, which had been restricted for a long period after their initial March 2023 release because users were bypassing website paywalls with the feature. It also follows OpenAI’s earlier move to court enterprise users more directly with the announcement of its ChatGPT Enterprise subscription tier.