Algolia launches AI-powered NeuralSearch vector and keyword search API

Algolia introduces NeuralSearch, an AI-powered search engine that combines vector and keyword search for improved relevance, utilizing a large language model and Neural Hashing technology to enhance scalability and reduce computational costs.
November 30, 2022


Algolia Inc., the hosted enterprise search and discovery platform, today launched NeuralSearch, a vector and keyword search engine delivered through a single application programming interface that provides end-to-end artificial intelligence processing for every query.

Algolia NeuralSearch uses an advanced large language model, the same AI technology that OpenAI LP’s ChatGPT is built on, to understand natural language queries and constantly learn from them to improve results. That means users can ask questions and receive fast answers without having to carefully or repeatedly guess what to type into the search form to get the response they want.

“NeuralSearch provides users with a smarter and more intuitive way to discover the most relevant content they want, when they need it, irrespective of the type of query presented,” explained Algolia Chief Executive Bernadette Nixon.

The platform works by analyzing the relationships between words and concepts and generating vector representations in a database that capture their meaning in an abstract, contextual way. Vector matching uses “nearest-neighbor” contextual tracking: it focuses on the meaning of a word or phrase and searches for other words and phrases that are similar to it in meaning. The engine can then combine this contextual vector-based logic with Algolia’s already powerful keyword matching engine.
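The nearest-neighbor idea described above can be sketched in a few lines. This is a toy illustration with hypothetical four-dimensional embeddings and cosine similarity, not Algolia's actual model, which derives its vectors from a large language model.

```python
# Minimal sketch of nearest-neighbor vector matching over a toy,
# hand-made embedding table. Real systems use high-dimensional
# embeddings learned by a language model.
import math

# Hypothetical embeddings for a few phrases (illustrative values only).
EMBEDDINGS = {
    "banana pudding": [0.9, 0.1, 0.4, 0.2],
    "banana custard": [0.85, 0.15, 0.45, 0.2],
    "running shoes":  [0.1, 0.9, 0.2, 0.7],
}

def cosine(a, b):
    """Cosine similarity: how closely two vectors point the same way."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def nearest_neighbors(query, k=2):
    """Rank the other stored phrases by similarity to the query vector."""
    q = EMBEDDINGS[query]
    scored = [(phrase, cosine(q, v)) for phrase, v in EMBEDDINGS.items()
              if phrase != query]
    return sorted(scored, key=lambda s: s[1], reverse=True)[:k]

print(nearest_neighbors("banana pudding", k=1)[0][0])  # → banana custard
```

Because the two dessert phrases point in nearly the same direction in vector space, they match each other even though their keywords differ, which is exactly the behavior keyword-only search misses.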

To address the scalability issues that come with vector search, such as its high computational cost, Algolia said it pioneered a technology called Neural Hashing, which compresses search vectors from long strings of roughly 2,000 decimal numbers into compact fixed-length expressions, making them far cheaper to compute with. Prior to the company’s breakthrough, it said, vector computation was too computationally expensive to run at scale.
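The general technique behind hashing dense vectors into compact codes can be sketched with simple sign binarization and Hamming distance. Algolia's actual Neural Hashing scheme is proprietary and not shown here; this only illustrates why binary codes make similarity comparisons cheap.

```python
# Sketch of compressing float vectors into fixed-length binary codes.
# This is generic sign binarization, NOT Algolia's proprietary method.
import random

def binarize(vec):
    """Collapse each float to one bit: 1 if positive, else 0."""
    bits = 0
    for x in vec:
        bits = (bits << 1) | (1 if x > 0 else 0)
    return bits

def hamming(a, b):
    """Count differing bits -- a cheap proxy for vector distance."""
    return bin(a ^ b).count("1")

random.seed(0)
v = [random.uniform(-1, 1) for _ in range(256)]
close = [x + random.uniform(-0.05, 0.05) for x in v]  # slightly perturbed copy
far = [random.uniform(-1, 1) for _ in range(256)]     # unrelated vector

h, hc, hf = binarize(v), binarize(close), binarize(far)
print(hamming(h, hc) < hamming(h, hf))  # the near vector stays nearer
```

A 256-float vector shrinks to a 256-bit integer, and comparing two codes becomes a single XOR plus a bit count, which is the kind of economy the article attributes to hashing.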

“By adding Neural Hashing of vectors to its existing keyword-based search within a single index, leveraging a single API, Algolia has the potential to disrupt AI-powered search with significantly better precision and recall, in a manner that requires less manual work to set up and update, while incurring fewer storage and processing costs,” said Hayley Sutherland, a research manager for conversational AI at the market intelligence firm IDC.

Customers in e-commerce and retail will find NeuralSearch and vector search particularly significant, Sutherland added, because product discovery is often hampered when users fail to find what they’re looking for through traditional keyword searches. If a shopper types in something such as “banana pudding” and the search returns nothing, they abandon it, even though they might have accepted “banana custard.” A vector search can open up discovery for numerous similar and related products that would otherwise be neglected simply because the keywords didn’t quite match the intent.

As an easy-to-use API, the service gives customers access to Algolia’s AI-search software-as-a-service platform directly in their applications, which means they can put it into production quickly. “Specifically, we provide the set-up, scaling and management of all search capabilities and services — all of which helps accelerate and power discovery,” said Nixon.

Because NeuralSearch incorporates AI, the engine learns and adjusts automatically as the index changes: as new products and services are added to an enterprise’s offerings, new content is uploaded or existing items take on new meaning. There is no need to retrain the underlying search algorithms manually, because the system constantly learns from user searches and fine-tunes its keywords and concepts on its own.
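Learning from user behavior without manual retraining can be illustrated with a toy feedback loop: each click on a result nudges a query-to-item association upward, so future rankings adapt. The names, weights and learning rate here are purely illustrative assumptions, not Algolia's internals.

```python
# Toy sketch of adapting rankings from user clicks, with no manual
# retraining step. All names and values are illustrative assumptions.
from collections import defaultdict

weights = defaultdict(float)  # (query, item) -> learned boost

def record_click(query, item, lr=0.1):
    """Nudge the association weight up each time a user clicks a result."""
    weights[(query, item)] += lr

def boosted_score(query, item, base_score):
    """Combine the engine's base relevance score with the learned boost."""
    return base_score + weights[(query, item)]

# Five users searching "banana pudding" click the custard product.
for _ in range(5):
    record_click("banana pudding", "banana custard")

print(boosted_score("banana pudding", "banana custard", 1.0))  # ≈ 1.5
```

The key property is that the ranking improves as a side effect of normal usage: no one retrains or redeploys anything, which mirrors the automatic adjustment the article describes.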
