With billion-vector performance, Zilliz Cloud anchors emerging AI stack
Other databases boast about crunching millions of vectors per second. Zilliz Cloud is the first to process a billion. That could be the key to making large language models (LLMs) like ChatGPT tell the truth.
Zilliz Cloud is the managed service from Zilliz, the inventors of Milvus, the open source vector database used by more than 1,000 enterprises around the world. It’s purpose-built for AI and other applications powered by unstructured data. It represents data as high-dimensional vectors, or embeddings — the kind generated by machine-learning models — making it a perfect fit for uses including:
- E-commerce: Powering product recommendation engines with multiple sources of unstructured data, such as search history and past purchases
- Semantic text search: Processing and querying text across multiple vectors such as intent, location and previous search history
- Targeted advertising: Improving the relevance and effectiveness of ad delivery
- UGC recommendation: Identifying related content even if other users haven’t liked or engaged with it
- Risk-control and anti-fraud: Spotting vulnerabilities, anomalous behavior and illicit activities
- Drug discovery: Identifying promising candidate compounds through chemical-similarity search
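All of these use cases reduce to the same core operation: ranking items by the similarity of their embeddings. A minimal sketch of that idea, using NumPy with made-up three-dimensional vectors (real embedding models emit hundreds or thousands of dimensions, and a vector database replaces the brute-force loop with ANNS indexes):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 3-D embeddings; a production model would generate these from
# text, images, or user behavior.
query = np.array([0.9, 0.1, 0.2])
products = {
    "running shoes": np.array([0.8, 0.2, 0.1]),
    "coffee maker":  np.array([0.1, 0.9, 0.3]),
}

# Rank catalog items by similarity to the query embedding.
ranked = sorted(products,
                key=lambda k: cosine_similarity(query, products[k]),
                reverse=True)
```

Here `ranked[0]` is `"running shoes"`, the item whose embedding points in nearly the same direction as the query, which is exactly the judgment a recommendation or semantic-search engine makes at scale.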
With its vector-native architecture and hyperscale performance, Zilliz Cloud is also an ideal data source for LLMs like ChatGPT. LLMs have a well-known tendency to fabricate answers in the absence of information — a phenomenon known as ‘hallucination.’ However, hallucination can be minimized by supplying LLMs with external sources of domain-specific data.
Zilliz Cloud makes that easy. Using OpenAI plug-ins to connect to ChatGPT, it provides the basis for the emerging CVP (ChatGPT/Vector DB/Prompts-as-Code) technology stack.
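In outline, the CVP pattern retrieves the most relevant stored vectors first, then folds the matching text into the prompt so the model answers from supplied facts rather than inventing them. A simplified sketch of that flow, where the in-memory list and function names are illustrative stand-ins, not Zilliz Cloud's actual API:

```python
import numpy as np

# Illustrative in-memory "vector store"; in the CVP stack this role is
# played by Zilliz Cloud, reached via its API or an OpenAI plug-in.
DOCS = [
    ("Milvus supports CPU and GPU indexes.", np.array([0.9, 0.1])),
    ("Paris is the capital of France.",      np.array([0.1, 0.9])),
]

def retrieve(query_vec: np.ndarray, k: int = 1) -> list:
    # Nearest-neighbor search by dot product; a real deployment uses
    # the database's ANNS index instead of scanning every document.
    scored = sorted(DOCS, key=lambda d: float(query_vec @ d[1]),
                    reverse=True)
    return [text for text, _ in scored[:k]]

def build_prompt(question: str, query_vec: np.ndarray) -> str:
    # Ground the LLM in retrieved context to reduce hallucination.
    context = "\n".join(retrieve(query_vec))
    return (f"Answer using only this context:\n{context}\n\n"
            f"Question: {question}")
```

The prompt produced by `build_prompt` carries the retrieved domain facts alongside the question, so the LLM's answer is anchored to data it was never trained on.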
Zilliz Cloud offers all the capabilities necessary to operate LLM applications at scale. It retains the low latency, superior scalability and heterogeneous compute (CPU + GPU) capabilities of Milvus and adds enterprise-class features such as auto-indexing, MFA, enhanced security and premier support.
It performs up to 40X faster than competing solutions. It scales to billions of vectors. It’s available on multiple clouds. It offers automatic ANNS indexing. And it’s backed by a world-class operations team that ensures it’s always up to date and secure, with a 99.99% uptime SLA and a zero-data-corruption guarantee.
“Previously, only companies like Microsoft or Google could take full advantage of tools like vector search. The rest could store millions of vectors. They just couldn’t search them. Now anyone can,” said Charles Xie, CEO, Zilliz. “Zilliz Cloud’s vector-native architecture makes massive scale and enterprise features available to everyone.”
Zilliz Cloud runs on AWS and GCP and is available on the AWS Marketplace to streamline procurement and billing. It encrypts data in transit, complies with the SOC 2 Type II standard and will soon support Role-Based Access Control.
Customers enjoy flexible payment options, paying only for what they use. New customers receive a $400 credit and discounts are available for high-volume workloads.
“A billion vectors per second is more than a number; it’s a necessity,” Xie said. “AI has vast potential, but to realize it, AI must be trusted. Hallucinations and wrong answers erode that trust. With the billion-vector performance of Zilliz, we can help address that by expanding context and accelerating data retrieval.”