V2EX › Share & Create

An essential tool for building LLM applications: possibly the fastest AI-native database available today, with support for both vector and full-text search

librae8226 · 335 days ago · 2321 views
This topic was created 335 days ago; the information in it may have since changed.

Link:
    https://github.com/infiniflow/infinity

    The AI-native database built for LLM applications, providing incredibly fast vector and full-text search

    Roadmap 2024 | Twitter | Discord | YouTube |

    Infinity is a cutting-edge AI-native database that provides a wide range of search capabilities for rich data types such as vectors, full-text, and structured data. It provides robust support for various LLM applications, including search, recommenders, question-answering, conversational AI, copilot, content generation, and many more RAG (Retrieval-augmented Generation) applications.

    🌟 Key Features

Infinity delivers high performance, flexibility, and ease of use, with many features designed to address the challenges facing next-generation AI applications:

    ⚡️ Incredibly fast

• Achieves 0.1-millisecond query latency on million-scale vector datasets.
• Sustains up to 10K QPS on million-scale vector datasets.

    See the Benchmark report for more information.

    🔮 Fused search

    Supports a fused search of multiple embeddings and full text, in addition to filtering.
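The post does not say how Infinity combines the vector and full-text rankings into one result list, so as an illustration only, here is a minimal sketch of reciprocal rank fusion (RRF), one common technique for this kind of hybrid search. The function name and the document IDs are hypothetical, not part of Infinity's API.

```python
# Illustrative only: Infinity's actual fusion strategy is not specified in
# this post. Reciprocal rank fusion (RRF) is one common way to merge a
# vector-search ranking and a full-text ranking into a single result list.

def rrf_fuse(rankings, k=60):
    """Combine several ranked lists of document IDs into one fused ranking.

    Each document scores sum(1 / (k + rank)) over the lists it appears in;
    k dampens the influence of top ranks (60 is the commonly cited default).
    """
    scores = {}
    for ranked_ids in rankings:
        for rank, doc_id in enumerate(ranked_ids, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical result lists from a vector search and a full-text search:
vector_hits = ["doc3", "doc1", "doc7"]
fulltext_hits = ["doc1", "doc9", "doc3"]
fused = rrf_fuse([vector_hits, fulltext_hits])
print(fused[0])  # a document ranked highly in both lists comes out on top
```

RRF is attractive for fused search because it needs only ranks, not raw scores, so BM25 full-text scores and vector distances never have to be put on a common scale.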

    🍔 Rich data types

    Supports a wide range of data types including strings, numerics, vectors, and more.

    🎁 Ease-of-use

    • An intuitive Python API. See the Python API docs.
    • A single-binary architecture with no dependencies, making deployment a breeze.

1 reply · 2023-12-27 10:03:39 +08:00
qW7bo2FbzbC0 · 335 days ago
How does this compare with Milvus, and what unique advantages does it have?