AI Assistant for Nodinite
Transform how you explore and understand your integration landscape with the AI Assistant for Nodinite.
Ask questions in natural language and get instant, contextual answers drawn from your live monitoring data, repository model, and business metadata — no SQL queries, no complex filters, just conversation.
Note
The AI Assistant is in active development with experimental features. Today: Repository queries (Entities, Relationships, Custom Metadata). Coming soon: Full CRUD operations, Log Views, BPM insights, and Audit Logs (subject to GDPR, retention, and PII governance).

The Nodinite AI Assistant with integrated chat and configuration options.
What you'll achieve
- Ask questions naturally — "Which Finance integrations failed today?" instead of building complex queries
- Get instant answers — AI understands your entities, relationships, and custom metadata to deliver precise results
- Stay secure — Choose local AI (OLLAMA), cloud AI (Azure OpenAI), or connect external tools via MCP protocol
- Empower all users — From operators to business analysts, anyone can query Nodinite data intuitively
- Connect any AI tool — Use Claude Desktop, Cline, or any MCP-compatible client to access Nodinite insights
Tip
Need AI-powered diagnostics log analysis? See AI Diagnostics for autonomous scanning of monitoring agent logs with local or cloud AI. To enable the integrated AI Assistant, start with the AI System Parameter and set EnableMcpChat to true. If you also want VS Code, Claude Desktop, Cline, or another external client, use the separate MCP Endpoint setup. If you want BizTalk-specific external tools or BizTalk C4 generation, enable EnableAiForBizTalk separately.
Why AI in Nodinite?
Traditional monitoring tools provide raw data — the AI Assistant transforms it into understanding.
By interacting with Nodinite through natural language, users can surface insights, correlate events, and answer business questions instantly.
Business Value
- Smarter, faster decisions — Ask "Which integrations failed for our Finance domain today?" and get contextual answers in seconds
- Reduce mean time to resolution (MTTR) — From 30 minutes of manual log analysis to about 60 seconds with AI, roughly a 97% time saving
- Empower all roles — Technical operators, business analysts, and compliance teams can query Nodinite data intuitively
- Instant documentation — Generate summaries, impact analyses, and audit reports directly from your integration data
- Secure and compliant — Choose isolated local AI (OLLAMA), approved cloud services (Azure OpenAI), or connect external AI tools via MCP protocol
Quick Start — Configure the AI Assistant First
The integrated AI Assistant has its own Web Client setup path. External MCP clients use a separate MCP Endpoint setup and are not registered by the AI system parameter.
Step 1: Enable the AI feature
Open the AI System Parameter and set EnableMcpChat to true.
Step 2: Configure Provider, Model Name, Endpoint, and API Key
Open Administration → Settings → AI Assistant and fill in the provider settings.
| Setting | What to enter | Example values |
|---|---|---|
| Provider | The AI service you want Nodinite to use | OLLAMA, Azure OpenAI, OpenAI, GitHub Models |
| Model Name | The provider-specific model or deployment name | phi3:mini, gpt-4o, gpt-4o-mini |
| Endpoint | The provider base URL | http://localhost:11434, https://your-resource.openai.azure.com/, https://models.inference.ai.azure.com |
| API Key | The secret or token required by the selected Provider | Your provider API key or token |
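The four settings above vary slightly by provider. The helper below is purely illustrative (it is not a Nodinite API); it sketches which fields each provider typically requires, using the example values from the table. The OpenAI endpoint shown is an assumed default, not taken from this page.

```python
# Illustrative sketch only -- not a Nodinite API. Captures which of the four
# AI Assistant settings each provider typically needs, using example values
# from the table above.

PROVIDER_PROFILES = {
    # Local OLLAMA usually needs no API key; the endpoint is the local daemon.
    "OLLAMA": {"endpoint": "http://localhost:11434", "api_key_required": False},
    "Azure OpenAI": {"endpoint": "https://your-resource.openai.azure.com/", "api_key_required": True},
    "OpenAI": {"endpoint": "https://api.openai.com/v1", "api_key_required": True},  # assumed default
    "GitHub Models": {"endpoint": "https://models.inference.ai.azure.com", "api_key_required": True},
}

def validate_settings(provider, model_name, endpoint, api_key):
    """Return a list of human-readable problems; an empty list means the combination looks plausible."""
    problems = []
    profile = PROVIDER_PROFILES.get(provider)
    if profile is None:
        problems.append(f"Unknown provider: {provider}")
        return problems
    if not model_name:
        problems.append("Model Name is required (e.g. phi3:mini or gpt-4o)")
    if not endpoint:
        problems.append("Endpoint is required")
    if profile["api_key_required"] and not api_key:
        problems.append(f"{provider} requires an API Key")
    return problems

# A local OLLAMA setup with phi3:mini needs no key:
print(validate_settings("OLLAMA", "phi3:mini", "http://localhost:11434", None))  # -> []
# A cloud provider without a key is flagged:
print(validate_settings("Azure OpenAI", "gpt-4o", "https://your-resource.openai.azure.com/", None))
```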

Example of the AI Assistant settings dialog with Provider, Model Name, Endpoint, and API Key fields.
Step 3: Validate the integrated AI chat
- Open the AI Assistant from the button in the navigation bar.
- Ask a simple repository question such as "Show me all Services".
- Confirm that the answer comes from your Nodinite Repository Model.
Step 4: Continue to the MCP Endpoint if needed
If you also want to connect Claude Desktop, Cline, VS Code, or another external tool, continue with Set up AI Assistant and MCP Endpoint or go directly to MCP Endpoint. External MCP registration is a separate, always-required step for each client, even when you want to ask the same Repository, Architecture Diagram, or BizTalk questions. BizTalk-specific external capabilities additionally require EnableAiForBizTalk.
Important
The AI Assistant and MCP Endpoint were released in Nodinite 7.4.0. For VS Code and similar MCP clients, use the package-based mcp.json setup described on MCP Endpoint.
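As a rough sketch, a package-based mcp.json entry for VS Code generally follows the shape below. The server name, URL path, and authentication header here are placeholders, not actual Nodinite values; take the real endpoint and credentials from the MCP Endpoint page.

```json
{
  "servers": {
    "nodinite": {
      "type": "http",
      "url": "https://your-nodinite-server/WebAPI/mcp",
      "headers": {
        "Authorization": "Bearer <your-api-key>"
      }
    }
  }
}
```

The key point is that this registration lives with the client (VS Code, Claude Desktop, Cline), not in the Nodinite AI system parameters.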
How It Works
The AI Assistant provides two ways to access your Nodinite data:
Integrated Web Chat
Built directly into Nodinite's web Portal — configure once and start querying.
- What it does: Chat interface in the Nodinite UI with direct access to your repository data
- Best for: Quick queries, operator troubleshooting, team members already working in Nodinite
- Configuration: Administration → Settings → AI Assistant (choose your AI provider and configure settings in the GUI)
MCP Endpoint (External AI Tools)
Nodinite exposes an MCP (Model Context Protocol) endpoint for external AI tools to access your integration data. See MCP Endpoint for available capabilities, compatible clients, and connection details.
Both modes respect your authentication and permissions. The integrated AI Assistant focuses on the Nodinite Web Client experience, while external MCP clients use their own setup path and can also be used for Architecture Diagrams and BizTalk workflows.

Example of the AI Assistant answering questions about your Integration landscape.
MCP Endpoint — Connect External AI Tools
Nodinite exposes a Model Context Protocol (MCP) endpoint for external AI tools. The integrated AI Assistant is configured in the Web Client, while VS Code and similar MCP clients use the package-based mcp.json setup described on MCP Endpoint.
What the AI Assistant Knows
The AI Assistant has comprehensive access to your Nodinite Repository Model, including:
Entities & Configuration
- All entity types: Endpoints, Domains, Applications, Business Processes, Custom Types
- Entity metadata: Names, descriptions, environments, lifecycle stages (Development, Production, etc.)
- Custom metadata fields: Business owners, criticality levels, compliance tags — any custom metadata you've defined
- Relationships: Dependencies, connections, and hierarchies (e.g., "Which Endpoints belong to which Applications?")
Business Context
- Domains and BPM: Business Process Models, organizational structure, which integrations support which business capabilities
- Tags and classifications: Search by any tag or custom classification you've applied
- Technical details: Connection strings, settings, configurations (respecting security permissions)
Current Limitations
- Log data: Not yet available (coming soon, subject to GDPR/retention/PII governance)
- Real-time monitoring: Not yet available (Repository data only for now)
- CRUD operations: Read-only today; create/update/delete capabilities in development
Example Conversations
Troubleshooting
"Which Finance integrations failed today?" → AI identifies failed integrations, suggests investigating dependencies
Business Impact Analysis
"If we take the Payment Gateway offline for maintenance, what will be affected?" → AI lists downstream dependencies and recommends maintenance windows
Compliance & Reporting
"List all integrations tagged as 'GDPR-Critical' in Production" → AI finds tagged entities and offers to export for audit
Onboarding & Documentation
"Explain how our Order Processing flow works" → AI describes integration flow across domains and offers to visualize in Mapify
Security & Governance
Data Privacy
- Local AI option: Use OLLAMA to run AI models entirely on your infrastructure — no data leaves your network
- Cloud AI option: Choose approved cloud providers (Azure OpenAI, OpenAI, GitHub Models) with enterprise agreements
- MCP connections: Authenticated via API keys with same role-based access control as Nodinite Web API
Access Control
- AI Assistant respects your existing Nodinite permissions — users only see data they're authorized to access
- External MCP connections use Nodinite's authentication system
- Audit logs track all AI queries and data access
Best Practices
- Start with local AI — Test with OLLAMA before enabling cloud AI in Production
- Review AI responses — AI-generated insights should be verified, especially for critical decisions
- Monitor usage — Track who's using AI Assistant and what questions are being asked (audit logs)
- Secure external access — If using the MCP endpoint with external tools, ensure proper authentication is configured
Advanced Use Cases
Automated Incident Response
Monitoring alerts trigger AI analysis via MCP → AI suggests remediation → Automation executes fixes → Summary posted to Teams/Slack
Custom Dashboard Generation
Natural language query → AI calculates metrics → Script renders dashboard in PowerBI/Grafana
Development & Testing
Developers query Nodinite environments during development: "Check if Test environment has latest CustomerAPI version"
Measuring Success
Key Metrics
- MTTR (Mean Time to Resolution): Track how AI Assistant reduces incident investigation time
- Self-service adoption: Percentage of queries answered without escalation to senior engineers
- Cross-team usage: Business analysts and non-technical users querying Nodinite data
- Audit compliance: Time savings in compliance reporting and documentation generation
Example ROI Calculation
Before AI Assistant
Average incident investigation: 30 minutes × 100 incidents/month = 50 hours/month, which at $75/hour is $3,750/month
After AI Assistant
Average investigation: 5 minutes × 100 incidents/month ≈ 8.3 hours/month, which at $75/hour is about $625/month
Savings: $3,125/month ($37,500/year)
Total estimated ROI with self-service queries, faster compliance reporting, and reduced escalations: $50,000-$75,000/year for mid-sized integration environments
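The ROI figures above follow from simple arithmetic. A quick worked version (all inputs are the illustrative assumptions from this section, not measured values):

```python
# Worked version of the example ROI calculation above. The rates and volumes
# are the illustrative assumptions from this section, not measurements.

INCIDENTS_PER_MONTH = 100
HOURLY_RATE = 75  # USD

def monthly_cost(minutes_per_incident):
    """Cost of incident investigation per month, in USD."""
    hours = minutes_per_incident * INCIDENTS_PER_MONTH / 60
    return hours * HOURLY_RATE

before = monthly_cost(30)   # 50 hours/month
after = monthly_cost(5)     # ~8.3 hours/month
savings = before - after

print(f"Before: ${before:,.0f}/month")   # Before: $3,750/month
print(f"After: ${after:,.0f}/month")     # After: $625/month
print(f"Savings: ${savings:,.0f}/month (${savings * 12:,.0f}/year)")
# Savings: $3,125/month ($37,500/year)
```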
Tips for Better Results
- Be specific: "Which SAP integrations failed today?" is better than "Show me failures"
- Use business terms: The AI understands Domains, Business Processes, and custom metadata fields
- Ask follow-up questions: AI maintains conversation context — build on previous answers
- Verify critical decisions: AI provides insights fast, but always validate before making configuration changes
Frequently Asked Questions
Q: What AI models does Nodinite support?
A: For integrated web chat: OLLAMA (local), Azure OpenAI (cloud), OpenAI (cloud), GitHub Models (cloud). For MCP mode: Any MCP-compatible client (Claude Desktop, Cline, Zed, custom apps).
Q: Can the AI Assistant change my Nodinite configuration?
A: Not yet — current version is read-only. CRUD operations (create, update, delete) are in development and will be released with admin approval workflows.
Q: Does AI Assistant access log data?
A: Not in the current version. Log data integration is coming soon, subject to GDPR, retention policies, and PII governance. Today: Repository Model only (Entities, Relationships, Custom Metadata).
Q: Is my data sent to external AI providers?
A: It depends on your configuration:
- OLLAMA: No — AI runs entirely on your infrastructure
- Azure OpenAI / OpenAI / GitHub Models: Yes — data is sent to cloud AI providers (ensure your enterprise agreement covers this)
- MCP mode: Depends on which AI client you connect (Claude Desktop sends data to Anthropic, Cline with local models keeps data local)
Q: Can I use AI Assistant in Production?
A: Yes, but with caution:
- Read-only queries: Safe for Production
- Local AI (OLLAMA): Recommended for sensitive environments
- Cloud AI: Ensure compliance with your organization's data governance policies
- External MCP access: Ensure proper authentication and network security if exposing the MCP endpoint
Q: How do I connect external AI tools via MCP?
A: Use the package-based mcp.json setup described on MCP Endpoint. External MCP setup is always required for the client, regardless of whether you want Repository questions, Architecture Diagrams, or BizTalk workflows.
Q: Can I customize the AI's behavior?
A: Not yet, but roadmap includes:
- Custom system prompts (e.g., "Always prioritize Finance domain integrations")
- Domain-specific knowledge bases
- Integration with your organization's runbooks and documentation
Q: What if the AI gives a wrong answer?
A: AI responses are probabilistic and should be verified for critical decisions. If you notice errors:
- Check if the question was ambiguous or lacked context
- Verify the AI has access to current data (Repository Model sync)
- Report issues to Nodinite support for model tuning
Q: How does AI Assistant compare to Mapify?
A: They're complementary tools:
- AI Assistant: Natural language queries, text-based answers, conversational insights, automation-friendly
- Mapify: Visual dependency graphs, interactive exploration, graph-based navigation, high-level architecture view
Use AI Assistant for asking questions and Mapify for visualizing relationships.
Q: What's the difference between AI Assistant and AI Diagnostics?
A: They serve different purposes:
- AI Assistant: Query your integration landscape (Entities, Relationships, Custom Metadata) using natural language — ask questions like "Which Finance integrations failed today?"
- AI Diagnostics: Autonomous analysis of monitoring agent diagnostics logs to detect issues, provide root-cause analysis, and suggest fixes
Use AI Assistant for exploration and queries, AI Diagnostics for automated log analysis and troubleshooting.
Next Step
- Set up AI Assistant and MCP Endpoint
- MCP Endpoint — connect external AI tools like Claude Desktop and Cline