
AI - System Parameter

New 7.4.0

Starting with Nodinite 7.4.0, the AI Assistant and MCP server ship as part of the standard installation. The AI System Parameter is a JSON configuration that controls AI feature flags for the Nodinite Web Client. Today it exposes one flag — EnableMcpChat — which controls whether the AI Chat widget is visible in the Web Client.

  • ✅ AI Assistant and MCP server included in the standard Nodinite license
  • ✅ Disabled by default — enable only when ready to configure an LLM provider
  • ✅ MCP server ships as a standalone website — port assigned per environment in the Nodinite portal
  • ✅ Designed as JSON to accommodate additional AI feature flags in future releases

Important

Enabling EnableMcpChat alone is not sufficient to use the AI Assistant. An administrator must also configure the LLM / AI provider to use. See AI Assistant for provider configuration options.

Example: The AI system parameter in the Nodinite Web Client with EnableMcpChat enabled, showing the AI Chat widget in the bottom-right corner of the portal.

| System Parameter Name | Data Type | Values / Example | Comment |
|---|---|---|---|
| AI | JSON | JSON object (see example below) | Default = `{"EnableMcpChat":false}` |

How It Works

The MCP server is a standalone website that ships with the Nodinite installation. Its port number is assigned per environment entry (customer) directly in the Nodinite portal, allowing multiple environments to run side by side on the same host without port conflicts.

```mermaid
graph LR
  subgraph "Nodinite Installation"
    WC["Web Client"]
    MCP["MCP Server<br/>(Standalone Website)"]
  end
  subgraph "Nodinite Portal"
    ENV["Environment Entry<br/>(Port Assignment)"]
  end
  subgraph "AI Provider"
    LLM["LLM / AI Provider<br/>(OLLAMA, Azure OpenAI, etc.)"]
  end
  WC -->|"EnableMcpChat: true"| MCP
  ENV -->|"Assigned port"| MCP
  MCP --> LLM
```

The MCP server bridges the Nodinite Web Client to your chosen LLM provider. The port is configured per environment in the portal.


JSON Configuration Structure

The default configuration (AI Assistant disabled):

```json
{"EnableMcpChat":false}
```

To enable the AI Chat widget in the Web Client:

```json
{"EnableMcpChat":true}
```
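Because the value must be a well-formed JSON object, it can help to sanity-check an edited value before saving it in the Web Client. The helper below is an illustrative sketch (not part of Nodinite) that verifies the value parses as a JSON object and that EnableMcpChat, when present, is a boolean:

```python
import json

def validate_ai_parameter(raw: str) -> dict:
    """Sanity-check an AI system parameter value before saving.

    Illustrative helper (not a Nodinite API): confirms the value is a
    JSON object and that EnableMcpChat, if present, is a boolean.
    """
    parsed = json.loads(raw)  # raises ValueError on malformed JSON
    if not isinstance(parsed, dict):
        raise ValueError("AI parameter must be a JSON object")
    if not isinstance(parsed.get("EnableMcpChat", False), bool):
        raise ValueError("EnableMcpChat must be true or false")
    return parsed

print(validate_ai_parameter('{"EnableMcpChat":true}'))  # {'EnableMcpChat': True}
```

A typical mistake this catches is quoting the boolean (`{"EnableMcpChat":"true"}`), which would not enable the flag.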

Feature Flags

| Flag | Type | Default | Description |
|---|---|---|---|
| EnableMcpChat | boolean | false | Controls the visibility of the AI Chat widget in the Nodinite Web Client. When true, the chat widget appears in the bottom-right corner of the portal. |

Note

The JSON structure is intentional — future Nodinite versions will add more AI feature flags to this same parameter without requiring a new system parameter.
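Since future flags will be added to the same JSON object, readers of the parameter should treat unknown keys as valid and fall back to the shipped default for missing ones. A minimal sketch of that forward-compatible pattern (hypothetical flag name `FutureFlag` used only for illustration):

```python
import json

def read_ai_flags(raw: str) -> dict:
    """Read AI feature flags with forward-compatible defaults.

    Illustrative sketch: unknown flags from newer versions pass through
    untouched, and EnableMcpChat falls back to the shipped default (false)
    when absent.
    """
    flags = json.loads(raw) if raw else {}
    flags.setdefault("EnableMcpChat", False)
    return flags

print(read_ai_flags('{"EnableMcpChat":true,"FutureFlag":1}'))
```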


Prerequisites

Before enabling the AI Assistant:

  1. Administrator Access – You must be a member of the built-in Administrator Role to change system parameters.
  2. LLM Provider Configuration – Even when EnableMcpChat is set to true, an administrator must configure the LLM / AI provider in Administration → Settings → AI Assistant before the chat is functional. See AI Assistant for provider setup.
  3. Port Assignment – The MCP server port must be assigned in the Nodinite portal for each environment entry. Contact your system administrator or refer to your installation documentation.

For POC and Non-Sensitive Data Environments

For a quick proof of concept or on installations with non-sensitive data, you can use the limited free cloud service at ollama.com to get started without any local infrastructure.

Caution

The ollama.com cloud service is not intended for production-grade systems or any environment containing sensitive, confidential, or personally identifiable data. For production use, run OLLAMA locally on your own infrastructure, or use an approved cloud provider (Azure OpenAI, OpenAI, GitHub Models) with an appropriate enterprise agreement.

For Production Environments

Use one of the supported AI providers with appropriate data governance controls in place:

| Provider | Type | Data Location | Best For |
|---|---|---|---|
| OLLAMA (local) | Local | Your infrastructure | Sensitive data, strict compliance, air-gapped environments |
| Azure OpenAI | Cloud | Microsoft cloud | Enterprise Azure environments with existing agreements |
| OpenAI | Cloud | OpenAI cloud | General purpose with broad model selection |
| GitHub Models | Cloud | GitHub cloud | Developer teams and GitHub-native workflows |

Important Notices

AI Hallucination

Warning

AI-related services may produce incorrect or misleading responses — a phenomenon known as "hallucination." Always verify AI-generated answers before acting on them, especially for critical decisions affecting production systems.

Potential Additional Costs

Note

Using third-party AI services (Azure OpenAI, OpenAI, GitHub Models, ollama.com cloud) may incur additional costs based on the provider's pricing model (typically per-token). Review your chosen provider's pricing before enabling in production. Running OLLAMA locally on your own infrastructure is free.

License

The AI Assistant and MCP server are included in the standard Nodinite license at no additional charge. Third-party AI provider costs are separate and billed directly by the respective provider.


Frequently Asked Questions

Find more solutions and answers in the Nodinite System Parameters FAQ and the Troubleshooting user guide.

How do I change the value?

To update a pre-defined System Parameter, follow the steps in 'How do I change the System Parameters'.

I enabled EnableMcpChat but the chat widget is not visible — why?

Check the following:

  1. Provider not configured — Navigate to Administration → Settings → AI Assistant and verify an LLM provider has been configured with a valid endpoint and model.
  2. Browser cache — Hard refresh with Ctrl+F5 (Windows) or Cmd+Shift+R (Mac).
  3. Role permissions — Verify your user account has the appropriate permissions to use the AI Assistant.

Does enabling AI require a restart?

No service restart is required when changing the AI system parameter. The change takes effect after saving.

Can multiple environments share the same MCP server?

No. Each environment entry in the Nodinite portal has its own port assignment for the MCP server, allowing multiple environments to run independently on the same host.


Next Step

AI Assistant — configure your LLM provider and start querying
MCP Endpoint — connect external AI tools like Claude Desktop

System Parameters
AI Assistant
MCP Endpoint
AI Diagnostics
Web Client