Grooper 2025 - LLM Connector Config
LLM Connector is a Repository Option that enables large language model (LLM) powered AI features for a Grooper Repository.
About
LLM Connectors enable Grooper's AI-based features, including AI Extract and AI Assistants. An LLM Connector links Grooper to large language models (LLMs) such as OpenAI's GPT models and models in Microsoft Azure's Model Catalog (including Azure OpenAI models). An LLM Connector is enabled by adding it in the Grooper Root's Options editor.
LLM Connector is a Repository Option in Grooper. Repository Options enable optional features, so whether to enable LLM connectivity is entirely up to you and your organization. Grooper's AI features require an LLM Connector, but Grooper operates fully without one.
Once added to a Grooper Repository, LLM Connector is configured by adding an LLM Provider. The LLM Provider connects Grooper to service providers that offer LLMs, such as OpenAI, Microsoft Azure, and other providers that use OpenAI's API standard.
LLM Provider Options
- OpenAI – Connects Grooper to LLMs offered by the OpenAI API, or by compatible APIs that implement its chat/completions and embeddings endpoints.
- Azure – Connects Grooper to LLMs offered by Microsoft Azure in the Model Catalog, including Azure OpenAI models.
- GCS – Grooper Cloud Services (prototype). This provider is still under development; most users can ignore it.
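The OpenAI provider above targets two endpoint types: chat/completions for conversational model calls and embeddings for vector representations. The sketch below shows the general shape of the JSON bodies an OpenAI-compatible API expects at each endpoint; the model names and prompt text are illustrative assumptions, not Grooper defaults.

```python
import json

# Illustrative request body for the chat/completions endpoint.
# Model name and messages are placeholder examples.
chat_request = {
    "model": "gpt-4o",
    "messages": [
        {"role": "system", "content": "You are a document extraction assistant."},
        {"role": "user", "content": "List the invoice totals in this document."},
    ],
}

# Illustrative request body for the embeddings endpoint.
embeddings_request = {
    "model": "text-embedding-3-small",
    "input": ["Governing Law. This Agreement shall be governed by ..."],
}

# Each body is POSTed as JSON to /v1/chat/completions or /v1/embeddings.
print(json.dumps(chat_request, indent=2))
```

Any provider that accepts these two request shapes can, in principle, serve as an OpenAI-compatible backend.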
LLM-enabled extraction capabilities
- Ask AI – LLM-based Value Extractor for natural language responses.
- AI Schema Extractor – Schema-driven JSON extraction from unstructured or semi-structured documents.
- AI Extract – LLM-based Fill Method for large-scale data extraction using Data Models.
LLM-enabled Data Section Extract Methods
- AI Collection Reader – Multi-instance section extraction optimized for large documents.
- AI Section Reader – Single-instance section extraction for complex or ambiguous layouts.
- AI Transaction Detection – Segments documents into transactions and extracts structured data.
- Clause Detection – Detects clauses using embeddings similarity.
- AI Table Reader – Extracts tabular data from semi-structured or unstructured documents.
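Clause Detection's "embeddings similarity" refers to comparing vector embeddings of text passages; a common metric for this is cosine similarity. Grooper's internal scoring is not documented here, so the following is only a minimal sketch of the general idea, using toy 3-dimensional vectors in place of real model embeddings.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" (real models return hundreds of dimensions).
clause_template = [0.9, 0.1, 0.2]    # embedding of a known clause example
candidate_a     = [0.88, 0.12, 0.2]  # near-duplicate wording -> high similarity
candidate_b     = [0.1, 0.9, 0.4]    # unrelated text -> low similarity

print(round(cosine_similarity(clause_template, candidate_a), 3))
print(round(cosine_similarity(clause_template, candidate_b), 3))
```

Passages whose embeddings score above a chosen similarity threshold against a clause template would be flagged as matches.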
LLM-enabled separation and classification capabilities
- AI Separate – LLM-based document separation using natural language understanding.
- LLM Classifier – Classifies documents using LLM reasoning.
- Mark Attachments – Uses generative AI to attach documents to parent documents.
Other LLM-enabled capabilities
- AI Assistants – Chat-based AI access to documents, databases, and web services.
- AI Generator – Generates text-based documents from search results.
- AI Productivity Helpers – Assist designers with regex, queries, data models, and more.
LLM connection options
Grooper primarily connects to LLMs using OpenAI and Azure providers, supporting a wide range of models and hosting options.
OpenAI API
Grooper's LLM features were designed around OpenAI models. Connecting to the OpenAI API is considered the standard method.
An API key and active payment method are required.
- Go to the Grooper Root node.
- Open the Options editor.
- Add the LLM Connector option.
- Open Service Providers and add OpenAI.
- Enter your OpenAI API Key.
- Enable Use System Messages (recommended).
- Save changes.
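The steps above amount to giving Grooper an API key to send with each request. The OpenAI API authenticates with a Bearer token in the Authorization header, and enabling Use System Messages corresponds to including a system-role message that steers the model on every call. The sketch below illustrates both; the key and prompt text are placeholders.

```python
# Sketch of an authenticated OpenAI API request. The key is a placeholder;
# never hard-code real keys in source files.
api_key = "sk-...your-key..."

headers = {
    "Authorization": f"Bearer {api_key}",  # OpenAI API auth header
    "Content-Type": "application/json",
}

# A system-role message (what "Use System Messages" enables) sets standing
# instructions that apply to every call, separate from the user prompt.
body = {
    "model": "gpt-4o",
    "messages": [
        {"role": "system", "content": "Respond with JSON only."},
        {"role": "user", "content": "Extract the invoice date."},
    ],
}
```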
Azure AI Foundry deployments
Grooper connects to Azure OpenAI and Azure AI Foundry model deployments using the Azure provider.
Both Chat Services and Embeddings Services can be configured, depending on which features you use.
- Go to the Grooper Root node.
- Open the Options editor.
- Add the LLM Connector option.
- Add an Azure provider under Service Providers.
- Configure Chat Service and/or Embeddings Service deployments.
- Set Model Id, URL, and Authorization method.
- Enter the API Key or configure Bearer authentication.
- Enable Use System Messages (recommended).
- Save changes.
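Unlike the OpenAI API, which selects the model in the request body, Azure addresses a specific deployment in the URL. The sketch below shows the documented Azure OpenAI URL pattern and the two authorization styles the provider's Authorization setting maps to; the resource name, deployment id, and api-version are placeholder assumptions.

```python
# Sketch of how an Azure OpenAI deployment is addressed.
# Resource name, deployment id, and api-version are placeholders.
resource    = "my-resource"    # Azure OpenAI resource name (assumption)
deployment  = "gpt-4o-deploy"  # deployment name / Model Id (assumption)
api_version = "2024-02-01"

url = (f"https://{resource}.openai.azure.com/openai/deployments/"
       f"{deployment}/chat/completions?api-version={api_version}")

# Two authorization methods:
headers_api_key = {"api-key": "<your-azure-key>"}               # API Key
headers_bearer  = {"Authorization": "Bearer <entra-id-token>"}  # Bearer

print(url)
```

The Model Id and URL configured in the Azure provider together identify one such deployment endpoint.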