Public Embeddings

Download and use the Brian public knowledge embeddings.



Public embeddings play a significant role in many fields and applications. These are the reasons why we wanted to create such a repository:

  • Knowledge Sharing: Public embeddings facilitate the sharing of knowledge and information in a structured and computationally efficient way. They encode semantic information, making it easier to understand and utilize data across different applications;

  • Interoperability: Public embeddings provide a common framework for interoperability among diverse systems, platforms, and languages. They enable data and models to be understood and used consistently across the AI and machine learning community;

  • Facilitating Research: Researchers in natural language processing (NLP), computer vision, and other AI domains heavily rely on pre-trained embeddings. A repository of public embeddings can serve as a valuable resource for researchers to access and compare different embeddings for their experiments and models;

  • Reducing Redundancy: Instead of reinventing the wheel and training embeddings from scratch, developers and researchers can use publicly available embeddings as a starting point. This reduces redundancy, saves computational resources, and accelerates the development of AI applications;

  • Enhancing Accessibility: By making public embeddings readily accessible, the repository can democratize AI and make advanced technologies available to a wider audience. This inclusivity can foster innovation and creativity in various domains;

  • Open Data and Transparency: Hosting public embeddings in a centralized repository promotes transparency and openness in AI research and development. It allows for scrutiny and peer review, fostering trust in the AI community;

  • Cross-Domain Applications: Public embeddings are versatile and can be used across different domains and applications, from text analysis and recommendation systems to image recognition and sentiment analysis. This versatility enhances their value;

  • Community Collaboration: A repository for public embeddings encourages collaboration within the AI community. Developers, researchers, and practitioners can contribute their embeddings, share their knowledge, and collectively improve the quality of available embeddings;

  • Education and Learning: Public embeddings can be a valuable resource for educational purposes. Students, educators, and AI enthusiasts can access these embeddings to learn and experiment with AI techniques without the need for extensive computational resources.

Storing and retrieving public embeddings on Swarm serves as a catalyst for advancing AI research, fostering collaboration, and making AI more accessible and transparent. It plays a pivotal role in the development of AI technologies, with far-reaching implications across industries and domains.

Embeddings table

Here's a table containing the Swarm reference, the date the embeddings were uploaded, and their current size. You can download them using the Swarm Gateway, or any Swarm client of your choice (see the sketch after the table). You can expect this table to be updated every month.

| Reference | Date | Size |
| --- | --- | --- |
| abdb324d4e37a18220d7d3408c2dc281faf59c88df2bc81b0a9be1ea471aee14 | 18th of May 2024 | 61.2 MB |
| 18e950a2b50c5a01b988c343d8449cbb080ec5b93f2493a18569bb8e2984b6cf | 18th of June 2024 | 67.7 MB |
| c5c9229c0affcdb83d47b51475396e80429f3434f20932117ee5f76ea94eef2b | 17th of July 2024 | 67.7 MB |
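
Below is a minimal download sketch, not an official client. It assumes the embeddings are published as a single retrievable file and fetches them by Swarm reference over the Bee `/bzz` endpoint; the gateway base URL and the output filename are placeholders, so substitute the public gateway or local Bee node you actually use.

```typescript
// Minimal sketch: download the Brian public embeddings from Swarm by reference hash.
// GATEWAY and the output filename are assumptions -- point GATEWAY at any public
// Swarm gateway or local Bee node. Requires Node 18+ (global fetch).
import { writeFile } from "node:fs/promises";

const GATEWAY = "https://api.gateway.ethswarm.org"; // assumed public gateway URL
const REFERENCE =
  "c5c9229c0affcdb83d47b51475396e80429f3434f20932117ee5f76ea94eef2b"; // 17th of July 2024 reference from the table above

async function downloadEmbeddings(reference: string, outFile: string): Promise<void> {
  // Bee nodes and gateways serve content addressed by its Swarm reference via /bzz.
  const res = await fetch(`${GATEWAY}/bzz/${reference}/`);
  if (!res.ok) {
    throw new Error(`Swarm gateway returned ${res.status} ${res.statusText}`);
  }
  // Write the raw payload to disk; inspect it afterwards to confirm the embeddings format.
  const bytes = Buffer.from(await res.arrayBuffer());
  await writeFile(outFile, bytes);
  console.log(`Saved ${bytes.length} bytes to ${outFile}`);
}

downloadEmbeddings(REFERENCE, "brian-public-embeddings.bin").catch((err) => {
  console.error(err);
  process.exit(1);
});
```

Swap the reference for any row in the table to pull an earlier snapshot, or pass the same hash to whichever Swarm client you prefer.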