Developer Quickstart Guide

Version: 5.5.x and later

This guide helps developers quickly understand EDDI's architecture and start building bots.

Understanding EDDI in 5 Minutes

What EDDI Is

EDDI is middleware for conversational AI—it sits between your app and AI services (OpenAI, Claude, etc.), providing:

  • Orchestration: Control when and how LLMs are called

  • Business Logic: IF-THEN rules for decision-making

  • State Management: Maintain conversation history and context

  • API Integration: Call external REST APIs from bot logic

Key Concept: The Lifecycle Pipeline

Every user message goes through a pipeline of tasks:

Input → Parser → Rules → API/LLM → Output

Each task transforms the Conversation Memory (a state object containing everything about the conversation).

Bot Composition

Bots aren't code—they're JSON configurations:
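A bot definition is nothing more than a list of package references. A minimal sketch (the URI scheme follows EDDI's `eddi://` resource convention; the package ID is a placeholder):

```json
{
  "packages": [
    "eddi://ai.labs.package/packagestore/packages/<packageId>?version=1"
  ],
  "channels": []
}
```

The packages it references are themselves JSON documents that wire together parser, behavior, and output extensions, as shown later in this guide.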

Quick Setup

Prerequisites

  • Java 21

  • Maven 3.8.4

  • MongoDB 6.0+

  • Docker (optional, recommended)

Run with Docker (Easiest)
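A typical startup, assuming Docker is installed and using the compose file shipped in the EDDI repository (file name and ports may differ in your checkout):

```shell
# Clone the repository and start EDDI plus MongoDB via Docker Compose
git clone https://github.com/labsai/EDDI.git
cd EDDI
docker compose up -d

# Dashboard and REST API are then served on http://localhost:7070
```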

Run from Source
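A sketch of the source workflow, assuming the standard Maven/Quarkus setup (EDDI 5.x runs on Quarkus) and a MongoDB instance already running locally:

```shell
# Requires Java 21 and Maven (see prerequisites); MongoDB must be reachable
git clone https://github.com/labsai/EDDI.git
cd EDDI
mvn clean install

# Start in Quarkus dev mode (hot reload)
mvn quarkus:dev
```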

Configuring AI Tools

If you plan to use the Web Search or Weather tools in your bots, you need to set up API keys in your environment or application.properties.

Web Search (Google):

  • eddi.tools.websearch.provider=google

  • eddi.tools.websearch.google.api-key=...

  • eddi.tools.websearch.google.cx=...

Weather (OpenWeatherMap):

  • eddi.tools.weather.openweathermap.api-key=...

See LangChain Documentation for details.

Your First Bot (via API)

1. Create a Dictionary

Dictionaries define what users can say:
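A minimal dictionary, POSTed to /regulardictionarystore/regulardictionaries (field names follow EDDI's regular-dictionary schema; verify against your version):

```json
{
  "language": "en",
  "words": [
    { "word": "hello", "exp": "greeting(hello)", "frequency": 0 },
    { "word": "bye",   "exp": "goodbye(bye)",    "frequency": 0 }
  ]
}
```

Each `exp` is the expression the parser writes into conversation memory when the word is matched; behavior rules match on these expressions, not on raw text.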

Response: Dictionary ID (e.g., eddi://ai.labs.parser.dictionaries.regular/regulardictionarystore/regulardictionaries/abc123?version=1)

2. Create Behavior Rules

Rules define what the bot does:
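A sketch of a behavior set, POSTed to /behaviorstore/behaviorsets (condition types and field names follow EDDI 5.x's behavior schema; double-check them for your version):

```json
{
  "behaviorGroups": [
    {
      "name": "Smalltalk",
      "behaviorRules": [
        {
          "name": "Greeting",
          "actions": [ "greet" ],
          "conditions": [
            { "type": "inputmatcher", "configs": { "expressions": "greeting(*)" } }
          ]
        }
      ]
    }
  ]
}
```

When the parser has stored a `greeting(...)` expression, this rule fires and records the action `greet`, which the output step resolves to a response.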

Response: Behavior set ID

3. Create Output Templates
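Output templates map actions to responses. A sketch of an output set, POSTed to /outputstore/outputsets (schema may vary slightly by version):

```json
{
  "outputSet": [
    {
      "action": "greet",
      "timesOccurred": 0,
      "outputs": [
        { "type": "text", "valueAlternatives": [ "Hello! How can I help you?" ] }
      ]
    }
  ]
}
```

Multiple `valueAlternatives` let the bot vary its wording; `timesOccurred` allows different phrasing when the same action repeats.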

Response: Output set ID

4. Create a Package

Packages bundle extensions together:
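A sketch of a package, POSTed to /packagestore/packages, wiring the dictionary, behavior set, and output set created above (the nesting follows EDDI's package schema; IDs other than the example dictionary ID are placeholders):

```json
{
  "packageExtensions": [
    {
      "type": "eddi://ai.labs.parser",
      "extensions": {
        "dictionaries": [
          {
            "type": "eddi://ai.labs.parser.dictionaries.regular",
            "config": { "uri": "eddi://ai.labs.parser.dictionaries.regular/regulardictionarystore/regulardictionaries/abc123?version=1" }
          }
        ]
      }
    },
    {
      "type": "eddi://ai.labs.behavior",
      "config": { "uri": "eddi://ai.labs.behavior/behaviorstore/behaviorsets/<behaviorSetId>?version=1" }
    },
    {
      "type": "eddi://ai.labs.output",
      "config": { "uri": "eddi://ai.labs.output/outputstore/outputsets/<outputSetId>?version=1" }
    }
  ]
}
```

The order of extensions matters: it is the order in which the lifecycle tasks run.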

Response: Package ID

5. Create a Bot
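A sketch of the bot document, POSTed to /botstore/bots (the package ID is a placeholder):

```json
{
  "packages": [
    "eddi://ai.labs.package/packagestore/packages/<packageId>?version=1"
  ],
  "channels": []
}
```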

Response: Bot ID (e.g., bot-abc-123)

6. Deploy the Bot
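Deployment happens through the administration API; a sketch, assuming the default host/port and the unrestricted environment:

```shell
# Deploy version 1 of the bot (substitute the bot ID from the previous step)
curl -X POST "http://localhost:7070/administration/unrestricted/deploy/<botId>?version=1"

# Poll deployment status until it reports READY
curl "http://localhost:7070/administration/unrestricted/deploy/<botId>?version=1"
```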

7. Chat with Your Bot
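A sketch of the conversation flow over the REST API (paths follow EDDI's bot engine API; verify the exact endpoints for your version):

```shell
# Start a conversation; the new conversation ID is returned in the Location header
curl -i -X POST "http://localhost:7070/bots/unrestricted/<botId>"

# Send a user message to that conversation
curl -X POST "http://localhost:7070/bots/unrestricted/<botId>/<conversationId>" \
  -H "Content-Type: application/json" \
  -d '{ "input": "hello" }'
```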

Adding an LLM (OpenAI Example)

1. Create LangChain Configuration
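An illustrative sketch of a LangChain configuration, POSTed to /langchainstore/langchains. The field names here are an approximation; check the LangChain documentation referenced above for the exact schema, and prefer injecting the API key via environment configuration rather than inlining it:

```json
{
  "tasks": [
    {
      "id": "openai-chat",
      "type": "openai",
      "actions": [ "send_message" ],
      "parameters": {
        "apiKey": "<OPENAI_API_KEY>",
        "modelName": "gpt-4o-mini",
        "systemMessage": "You are a helpful assistant."
      }
    }
  ]
}
```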

2. Add LangChain to Package

Add this extension to your package:
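A sketch of the extension entry (the LangChain configuration ID is a placeholder); it goes into the package's `packageExtensions` list, typically after the behavior extension so that actions are already decided when the LLM task runs:

```json
{
  "type": "eddi://ai.labs.langchain",
  "config": {
    "uri": "eddi://ai.labs.langchain/langchainstore/langchains/<langchainId>?version=1"
  }
}
```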

3. Create Behavior Rule to Trigger LLM
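A sketch of a rule that emits the action the LangChain task listens for. Here `send_message` is an action name chosen for illustration; it must match an action listed in your LangChain configuration:

```json
{
  "name": "AskLLM",
  "actions": [ "send_message" ],
  "conditions": [
    { "type": "inputmatcher", "configs": { "expressions": "question(*)" } }
  ]
}
```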

Now when users ask questions, the LLM is automatically called!

Understanding the Flow

Let's trace what happens when a user says "hello":

1. API Request

2. RestBotEngine

  • Validates bot ID

  • Creates/loads conversation memory

  • Submits to ConversationCoordinator

3. ConversationCoordinator

  • Ensures sequential processing (no race conditions)

  • Queues message for this conversation

4. LifecycleManager Executes Pipeline

Parser Task: matches the raw text "hello" against the dictionaries and writes the recognized expressions (e.g. greeting(hello)) into memory.

Behavior Rules Task: evaluates the rules against those expressions and records the resulting actions (e.g. greet) in memory.

Output Task: resolves each action to its output template and appends the rendered response to memory.

5. Save & Return

  • Memory saved to MongoDB

  • Response returned to user

Key Architectural Components

IConversationMemory

The state object passed through the pipeline:
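A simplified sketch of the shape of the interface (method names abbreviated; see the EDDI source for the real signatures):

```java
// Conceptual sketch only; EDDI's actual interface is richer.
interface IConversationMemory {
    IWritableConversationStep getCurrentStep(); // data produced during this turn
    IConversationStepStack getPreviousSteps();  // history of earlier turns
}
// Tasks communicate exclusively through this object: the parser stores
// expressions that the behavior-rules task later reads, and so on.
```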

ILifecycleTask

Interface all tasks implement:
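A conceptual sketch (names simplified from EDDI's actual ILifecycleTask):

```java
// Conceptual sketch only; consult the EDDI source for exact signatures.
interface ILifecycleTask {
    String getId();                                // unique extension identifier
    void execute(IConversationMemory memory,
                 Object component);                // read memory, compute, write results back
    void configure(Map<String, Object> config);    // per-package configuration
}
```

Because every task only depends on this contract and on conversation memory, tasks can be freely reordered and recombined in packages.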

ConversationCoordinator

Ensures messages are processed in order:
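A minimal sketch of the idea (not EDDI's actual class): a single-threaded executor per conversation guarantees that messages are handled strictly in submission order, even if callers submit concurrently.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// One queue per conversation: tasks run one at a time, FIFO,
// so two messages for the same conversation can never race.
class ConversationQueue {
    private final ExecutorService worker = Executors.newSingleThreadExecutor();
    private final List<String> processed = new ArrayList<>();

    public void submit(String message) {
        worker.submit(() -> processed.add(message)); // executes in submission order
    }

    public List<String> drain() {
        worker.shutdown();
        try {
            worker.awaitTermination(5, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return processed;
    }
}
```

Messages for different conversations still run in parallel, since each conversation owns its own queue.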

Common Patterns

Pattern 1: Conditional LLM Invocation

Only call LLM for complex queries:
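A sketch of such a rule: it fires (and triggers the LLM action) only when the input is not one of the simple, dictionary-covered cases. The `negation` condition type follows EDDI's behavior schema; verify it for your version:

```json
{
  "name": "ComplexQuery",
  "actions": [ "send_message" ],
  "conditions": [
    {
      "type": "negation",
      "conditions": [
        { "type": "inputmatcher", "configs": { "expressions": "greeting(*), goodbye(*)" } }
      ]
    }
  ]
}
```

Simple greetings keep using cheap templated output, and only everything else reaches the LLM.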

Pattern 2: API Call Before LLM

Fetch data, then ask LLM to format it:
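Extension order in the package determines execution order, so placing the httpcalls extension before the langchain extension means the API response is already in memory when the LLM runs. A sketch of the relevant fragment (IDs are placeholders):

```json
{
  "packageExtensions": [
    {
      "type": "eddi://ai.labs.httpcalls",
      "config": { "uri": "eddi://ai.labs.httpcalls/httpcallsstore/httpcalls/<httpCallsId>?version=1" }
    },
    {
      "type": "eddi://ai.labs.langchain",
      "config": { "uri": "eddi://ai.labs.langchain/langchainstore/langchains/<langchainId>?version=1" }
    }
  ]
}
```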

The LLM receives the API response in memory and can format it naturally.

Pattern 3: Context-Aware Responses

Use context passed from your app:
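Context can be sent alongside the user input. A sketch of the request body (the entry shape, a `type` plus `value`, follows EDDI's context format; `userName` is a made-up key for illustration):

```json
{
  "input": "what's my order status",
  "context": {
    "userName": { "type": "string", "value": "Alice" }
  }
}
```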

Access in output template:
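A sketch of an output entry reading that context, assuming EDDI's Thymeleaf-style template expressions (the exact expression path may differ by version; `userName` is the made-up key from above):

```json
{
  "action": "greet",
  "outputs": [
    {
      "type": "text",
      "valueAlternatives": [ "Hi [[${context.userName}]], welcome back!" ]
    }
  ]
}
```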

Next Steps

Learn More

Use the Dashboard

Visit http://localhost:7070 to:

  • Create bots visually

  • Test conversations interactively

  • Browse configurations

  • Monitor deployments

Explore Examples

Check the examples/ folder for:

  • Weather bot (API integration)

  • Support bot (multi-turn conversations)

  • E-commerce bot (context management)

Build Your Own Task

Create a custom lifecycle task:
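A sketch of what such a task could look like (method names simplified; align them with the real ILifecycleTask interface in the EDDI source — `sentiment` is a made-up data key for illustration):

```java
// Hypothetical custom task: tags each turn with a naive sentiment label.
@ApplicationScoped
public class SentimentTask implements ILifecycleTask {

    @Override
    public String getId() {
        return "ai.labs.sentiment";
    }

    @Override
    public void execute(IConversationMemory memory, Object component) {
        var input = memory.getCurrentStep().getLatestData("input");
        String text = input == null ? "" : String.valueOf(input.getResult());

        // Placeholder "analysis" — a real task would call a proper classifier.
        String sentiment = text.contains("!") ? "excited" : "neutral";

        // Store the result so later tasks (rules, output) can react to it.
        memory.getCurrentStep().storeData(new Data<>("sentiment", sentiment));
    }
}
```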

Register it in CDI and it becomes available as an extension!

Troubleshooting

Bot doesn't respond

  1. Check deployment status: GET /administration/deploy/{botId}

  2. Check conversation state: GET /conversationstore/conversations/{conversationId}

  3. Check logs for errors

Rules not matching

  • Verify dictionary expressions match your input

  • Check rule conditions are correct

  • Use occurrence: "anyStep" to match across conversation

LLM not being called

  • Ensure behavior rule triggers the LLM action

  • Check LangChain configuration is in the package

  • Verify API key is correct

Memory not persisting

  • Ensure MongoDB is running

  • Check connection string in config

  • Use correct scope (conversation not step)

Getting Help

  • Documentation: https://docs.labs.ai

  • GitHub: https://github.com/labsai/EDDI

  • Issues: https://github.com/labsai/EDDI/issues

Summary

EDDI's power comes from its configurable pipeline architecture:

  • Bots are JSON configurations, not code

  • Everything flows through Conversation Memory

  • Tasks are pluggable and reusable

  • LLMs are orchestrated, not just proxied

Start simple, then add complexity as needed. The architecture scales from basic chatbots to sophisticated multi-API workflows.
