Vectara Agentic (py-vectara-agentic) is a Python library for building customizable AI assistants and agents with Vectara's Agentic-RAG technology on top of the LlamaIndex Agent framework. It simplifies agent development by providing helper functions that create tools connected to Vectara corpora in minimal code, enabling rapid development of intelligent assistants that reason over specialized knowledge bases.
Key Features:
Rapid Tool Creation: Create Vectara retrieval-augmented generation (RAG) tools or search tools with a single line of code to connect agents to specific knowledge corpora.
Agent Flexibility: Supports multiple agent types, including ReAct, OpenAIAgent, LATS, and LLMCompiler, for diverse reasoning and interaction styles.
Pre-Built Domain Tools: Offers ready-made tools for verticals such as finance and legal to jumpstart specialized agent development.
Multi-LLM Integration: Connects with multiple large language model providers, including OpenAI, Anthropic, Gemini, Groq, Together AI, Cohere, Bedrock, and Fireworks.
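To illustrate the "single line of code" tool-creation idea above, here is a self-contained stdlib sketch, not the library's real implementation: the class and method names (`CorpusToolFactory`, `create_rag_tool`, the `corpus_key` field) are hypothetical stand-ins showing how a factory bound to a corpus can hand an agent a ready-made callable tool.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical sketch of the factory pattern behind one-line tool
# creation; names are illustrative, not the library's actual API.
@dataclass
class CorpusToolFactory:
    corpus_key: str  # identifier for a knowledge corpus (assumed)

    def create_rag_tool(self, name: str, description: str) -> Callable[[str], str]:
        def tool(query: str) -> str:
            # A real RAG tool would retrieve passages from the corpus
            # and generate an answer; here we just echo for illustration.
            return f"[{self.corpus_key}] retrieved answer for: {query}"
        tool.__name__ = name
        tool.__doc__ = description
        return tool

# One call yields a corpus-backed tool ready to register with an agent.
ask_finance = CorpusToolFactory("finance-docs").create_rag_tool(
    "ask_finance", "Answers questions about financial filings."
)
print(ask_finance("What was 2023 revenue?"))
```

The payoff of this pattern is that each corpus becomes a named, documented tool the agent can select among, rather than one undifferentiated retrieval backend.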
Use Cases:
Building AI assistants that answer financial or legal queries by connecting to domain-specific document corpora.
Creating customer support or research assistants that rely on up-to-date, structured data from Vectara.
Developing multi-step or sequential AI interactions with custom workflows tailored to business logic or user needs.
Technical Specifications:
Programming Language: Python 3.10 or higher.
Core Framework: Built on the LlamaIndex Agent architecture, enhanced with Vectara’s Agentic-RAG.
Deployment: Distributed as a Python package for local use or integration into applications; requires API keys for Vectara and the chosen LLM provider.
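A minimal setup might look like the following; the package name is assumed from the repository name, and the environment variable names are illustrative placeholders, so check the project's own documentation for the exact values.

```shell
# Install the library (package name assumed from the repo name)
pip install vectara-agentic

# Provide credentials via environment variables (names illustrative)
export VECTARA_API_KEY="<your-vectara-api-key>"
export OPENAI_API_KEY="<your-llm-provider-key>"
```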
Agent Interaction Modes: Supports synchronous chat, asynchronous chat, and streaming responses for real-time applications.