Marqo has raised $12.5 million in a Series A funding round to drive adoption of its search platform, which helps businesses build generative artificial intelligence (AI) applications that deliver more relevant and up-to-date results.
The company’s vector search platform unlocks the value of unstructured data across end-user search, retrieval-augmented generation and other business-critical applications, Marqo said in a Tuesday (Feb. 13) press release.
“We founded Marqo because we recognized that vector search was going to be instrumental in realizing the full potential of AI in our day-to-day lives, but it is far too complicated for developers and enterprises to deploy,” Tom Hamer, CEO and co-founder of Marqo, said in the release. “We saw a need to invent a platform that not only generated superior vector embeddings but also empowered customers to build advanced search experiences within minutes, not months.”
Marqo’s vector search platform uses machine learning (ML) models to return “hyper-relevant” search results in real time, according to the release. It handles the entire process from embedding generation to storage and retrieval and can be implemented through a single application programming interface (API).
This implementation is less costly and complex than existing approaches to deploying vector search, which require stringing together embedding models and separate databases, the release said.
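To make that concrete, the snippet below is a minimal sketch of that single-API workflow using Marqo's open-source Python client: one set of calls indexes documents while the platform generates and stores the embeddings behind the scenes, and another runs a natural-language search against them. The index name, sample documents and model choice are illustrative assumptions, and exact parameter names may vary by client version.

import marqo

# Connect to a locally running Marqo instance (assumed default address/port).
mq = marqo.Client(url="http://localhost:8882")

# Create an index; Marqo generates vector embeddings with the chosen model internally.
mq.create_index("products", model="hf/e5-base-v2")

# Add documents through the same API; embedding generation and storage are handled for you.
mq.index("products").add_documents(
    [
        {"title": "Trail running shoes", "description": "Lightweight shoes with aggressive grip for muddy trails"},
        {"title": "Waterproof hiking jacket", "description": "Breathable shell jacket for wet alpine conditions"},
    ],
    tensor_fields=["description"],
)

# Run a natural-language query; retrieval is performed against the stored embeddings.
results = mq.index("products").search(q="what should I wear for a rainy mountain hike?")
for hit in results["hits"]:
    print(hit["title"], hit["_score"])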
The new round of funding was led by Lightspeed and brings Marqo’s total financing to $17.8 million, per the release.
Lightspeed Partner Hemant Mohapatra said in the release that the adoption of products like ChatGPT is driving the world of search from “keyword-based” to “natural language-based.”
“Marqo’s mission is to bring this transformational technology to every company in the world through a simple developer API and an enterprise-grade platform offered on-prem and on cloud,” Mohapatra said. “Their early growth has been phenomenal and we are excited to back this amazing team helping consumers to search the way they think.”
In another recent development in this space, Patronus AI and MongoDB said in January that they have partnered to bring automated large language model (LLM) evaluation and testing capabilities to enterprise customers.
The collaboration will combine the strengths of Patronus AI and MongoDB’s Atlas Vector Search product to enable enterprises to develop reliable document-based LLM workflows.