Update Log Database

Maintain your Nodinite Log Databases with confidence using the Update Guide in the Nodinite Web Client. This comprehensive guide empowers SQL DBAs and Nodinite administrators to:

  • Safely update Log Database schemas to the latest version
  • Manage SQL Server recovery modes (SIMPLE/FULL) automatically
  • Verify Logging Service permissions by running scripts with service credentials
  • Apply database updates to both Active and Archive Log Databases
  • Minimize downtime with step-by-step service management

Important

This guide applies to Nodinite version 7.2.0 and later. If you are using Nodinite version 6.x with the deprecated Install and Update Tool, refer to the legacy update instructions.

Note

From time to time, Nodinite releases database schema updates to improve performance, add features, or optimize indexing. The Update Guide automates script generation based on your SQL Server configuration.


Navigate to Update Guide

To access the Update Guide for Log Databases:

  1. Log on to the Nodinite Web Client.
  2. Navigate to Administration | Settings | Log Databases.
  3. Click the Update Guide button at the top of the page.

Update Guide Button
Example: 'Update Guide' button in the Log Databases administration page.

The Update Guide will analyze your current Log Databases and detect if any schema updates are available.


Update Prerequisites

Before starting the update process, ensure you have:

  • SQL Server access – Administrative permissions on the SQL Server hosting the Log Databases.
  • Backup completed – Always backup your databases before applying schema changes.
  • Maintenance window – Schedule updates during low-activity periods to minimize impact.
  • Logging Service credentials – Access to the Windows account running the Logging Service.
  • .DACPAC file access – Default location: C:\Program Files\Nodinite\{EnvironmentName}\LoggingService\ (where {EnvironmentName} is your environment name like DEMO, Production, or Test).

Tip

If the database is in FULL recovery mode, you can temporarily change to SIMPLE recovery mode to speed up the update process. The Nodinite Update Guide respects the LogDatabaseRecovery system parameter.
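If you prefer to check or change the recovery mode manually, a minimal T-SQL sketch is shown below. The database name Nodinite_DEMO_Log is an example; substitute your own Log Database names.

-- Check the current recovery mode of a Log Database (database name is an example)
SELECT name, recovery_model_desc FROM sys.databases WHERE name = 'Nodinite_DEMO_Log';

-- Temporarily switch to SIMPLE recovery to speed up the schema update
ALTER DATABASE [Nodinite_DEMO_Log] SET RECOVERY SIMPLE;

-- Switch back to FULL afterwards if your environment requires it (for example AOAG)
ALTER DATABASE [Nodinite_DEMO_Log] SET RECOVERY FULL;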


LogDatabaseRecovery System Parameter

The Update Guide automatically generates scripts based on the LogDatabaseRecovery System Parameter. This parameter determines the SQL Server recovery mode applied to Log Databases:

  • Always On Availability Groups (AOAG) – FULL recovery mode
  • Standalone or Failover Cluster – SIMPLE recovery mode

  • The Logging Service uses this value when creating new Log Databases.
  • The Update Guide uses this value to set the recovery mode flag when applying schema updates with the .DACPAC file.

Important

If you change the LogDatabaseRecovery system parameter, the Update Guide will automatically apply the new recovery mode for future updates. Existing databases are not automatically changed.

Default .DACPAC file location:

C:\Program Files\Nodinite\{EnvironmentName}\LoggingService\Nodinite.LogDatabase.dacpac

Note

{EnvironmentName} is the name of your Nodinite environment (e.g., DEMO, Production, Test) and varies according to your installation.

Learn more in the LogDatabaseRecovery System Parameter documentation.


Update Process Overview

The Update Guide provides a five-step wizard to safely update your Log Databases:

  1. Stop Logging/Monitoring Services – Prevent active database writes during schema updates.
  2. Put Archive Databases into Read-Only Mode – Protect historical data from accidental modifications.
  3. Command Line Scripts – Generate and execute SqlPackage.exe commands to apply schema updates.
  4. Set Archive Databases Back from Read-Only Mode – Re-enable write access to Archive databases.
  5. Restart Services – Resume normal operations.

Update Guide Workflow
Example: Update Guide workflow in the Nodinite Web Client.


Step 1: Stop Logging/Monitoring Services

Before applying database schema changes, stop the Nodinite Logging Service and Monitoring Service to prevent active writes to the databases.

In the Update Guide:

  1. Click the Stop Services button.
  2. Wait for confirmation that both services have stopped successfully.

Stopped Services
Example: Services successfully stopped.

Warning

No new logs or monitor data will be collected while services are stopped. Schedule updates during maintenance windows.

Once the services have stopped, you can proceed to the next step.

Click the Next button to proceed to Step 2.


Step 2: Put Archive Databases into Read-Only Mode

Setting Archive Log Databases to read-only mode ensures historical data is protected during the schema update process.

In the Update Guide:

  1. Click the Set Read-Only button.
  2. The Update Guide will execute SQL commands to set all Archive databases to read-only mode.

Set Read-Only Mode
Example: Archive databases set to read-only mode.

Note

The Active Log Database is not set to read-only, as it must remain writable to apply schema changes.
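Under the hood, setting an Archive database to read-only corresponds to roughly one T-SQL statement per database, as sketched below. The archive database name is illustrative, and rolling back open transactions immediately is an assumption about how active connections should be handled.

-- Example only: set one Archive Log Database to read-only (database name is illustrative)
ALTER DATABASE [Nodinite_DEMO_Log_Archive_001] SET READ_ONLY WITH ROLLBACK IMMEDIATE;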

Once the read-only mode has been set, you can proceed to the next step.

Click the Next button to proceed to Step 3.


Step 3: Command Line Scripts

In this step, the Update Guide generates SqlPackage.exe command-line scripts for updating database schemas, and you execute them to apply the changes.

Note

This process could use a lot of disk space. If needed, shrink the Log Databases after each update to release allocated disk space.
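If you decide to shrink a Log Database after the update, a hedged T-SQL example is shown below. The database name is illustrative; shrinking is optional and can fragment indexes, so use it deliberately.

-- Optional: release unused space after the schema update (database name is an example)
DBCC SHRINKDATABASE (N'Nodinite_DEMO_Log', 10);  -- target roughly 10 percent free space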

Note

Make sure you are using the latest version of DacFx (the SQL Server Data-Tier Application Framework that provides SqlPackage.exe). If you install a new version of DacFx, make sure your Nodinite instance is configured to use the updated version (for example, if the SqlPackage.exe path changes).

Settings

The Update Guide uses the following settings to build the command-line scripts:

.DACPAC file path (the folder containing Nodinite.LogDatabase.dacpac):

C:\Program Files\Nodinite\{EnvironmentName}\LoggingService

Configuration options:

  • Encryption – Source Connection and Target Connection (enabled by default)
  • Trust Certificate – Source Server Certificate and Target Server Certificate (enabled by default)
  • Use Integrated Security – Uses Windows Authentication for SQL Server connections (enabled by default)
  • Ignore DACPAC Options – Advanced option to skip certain DACPAC deployment properties

Tip

The default settings work for most environments. Only modify these if you have specific security or authentication requirements.

Commands

Scripts are provided for:

  • Active Database – The current database actively receiving new log events.
  • Archive Databases – Historical databases containing older log data (if any exist and require updates).

Generated Scripts
Example: Generated scripts for Active and Archive databases.

Understanding the Generated Scripts

The scripts use SqlPackage.exe (part of SQL Server Data-Tier Application Framework) to deploy the .DACPAC file to your databases.

Example script for Active Database:

REM Update Active Log Database: Nodinite_DEMO_Log
REM Generated: 2026-01-22 10:30:00 UTC
REM Recovery Mode: FULL (controlled by LogDatabaseRecovery system parameter)
REM DACPAC File: C:\Program Files\Nodinite\DEMO\LoggingService\Nodinite.LogDatabase.dacpac

"C:\Program Files\Microsoft SQL Server\170\DAC\bin\SqlPackage.exe" /Action:Publish /SourceFile:"C:\Program Files\Nodinite\DEMO\LoggingService\Nodinite.LogDatabase.dacpac" /TargetServerName:"localhost" /TargetDatabaseName:"Nodinite_DEMO_Log" /SourceEncryptConnection:true /TargetEncryptConnection:true /SourceTrustServerCertificate:true /TargetTrustServerCertificate:true /Properties:RegisterDataTierApplication=True /Properties:BlockOnPossibleDataLoss=False /Properties:BlockWhenDriftDetected=False /Properties:DropIndexesNotInSource=True /Properties:DropConstraintsNotInSource=True /Properties:ScriptNewConstraintValidation=False /Properties:PopulateFilesOnFilegroups=False /v:CustomPathData="null" /v:CustomPathImagedata="null" /v:CustomPathIndex="null" /v:RecoveryMode="FULL"

Note

Replace DEMO with your actual environment name (e.g., Production, Test). Replace localhost with your SQL Server name. The /v:RecoveryMode parameter is automatically set by the Update Guide based on the LogDatabaseRecovery system parameter value (SIMPLE or FULL).

Key script parameters:

  • /Action:Publish – Deploys the .DACPAC schema to the target database.
  • /SourceFile – Path to the .DACPAC file (Nodinite.LogDatabase.dacpac).
  • /TargetServerName – SQL Server instance name (e.g., localhost, SQL-SERVER-01).
  • /TargetDatabaseName – Name of the Log Database to update (e.g., Nodinite_DEMO_Log).
  • /SourceEncryptConnection:true / /TargetEncryptConnection:true – Encrypt connections to SQL Server (recommended for security).
  • /SourceTrustServerCertificate:true / /TargetTrustServerCertificate:true – Trust SQL Server certificates (required for self-signed certificates).
  • /Properties:BlockOnPossibleDataLoss=False – Allows schema changes that might result in data loss (use with caution).
  • /Properties:DropIndexesNotInSource=True – Removes indexes not defined in the .DACPAC (ensures schema consistency).
  • /Properties:DropConstraintsNotInSource=True – Removes constraints not defined in the .DACPAC.
  • /v:RecoveryMode – Sets the database recovery mode (SIMPLE or FULL) based on the LogDatabaseRecovery system parameter. The Update Guide sets this value automatically.

Customizing Scripts for Your Environment

You may need to customize the generated scripts in the following scenarios:

Scenario 1: Running scripts from a different server

If you need to run the update from a different SQL Server or workstation, copy the .DACPAC file to the target machine:

REM Copy .DACPAC file to SQL Server (replace DEMO with your environment name)
copy "C:\Program Files\Nodinite\DEMO\LoggingService\Nodinite.LogDatabase.dacpac" "\\SQL-SERVER-01\C$\Temp\Nodinite.LogDatabase.dacpac"

REM Update script to reference new location
"C:\Program Files\Microsoft SQL Server\170\DAC\bin\SqlPackage.exe" /Action:Publish /SourceFile:"C:\Temp\Nodinite.LogDatabase.dacpac" /TargetServerName:"SQL-SERVER-01" /TargetDatabaseName:"Nodinite_DEMO_Log" /SourceEncryptConnection:true /TargetEncryptConnection:true /SourceTrustServerCertificate:true /TargetTrustServerCertificate:true /v:RecoveryMode="FULL"

Scenario 2: Using SQL Server authentication instead of Windows authentication

If your environment uses SQL Server authentication, add authentication parameters:

"C:\Program Files\Microsoft SQL Server\170\DAC\bin\SqlPackage.exe" /Action:Publish /SourceFile:"C:\Program Files\Nodinite\DEMO\LoggingService\Nodinite.LogDatabase.dacpac" /TargetServerName:"SQL-SERVER-01" /TargetDatabaseName:"Nodinite_DEMO_Log" /TargetUser:"sa" /TargetPassword:"YourSecurePassword" /SourceEncryptConnection:true /TargetEncryptConnection:true /SourceTrustServerCertificate:true /TargetTrustServerCertificate:true /v:RecoveryMode="SIMPLE"

Scenario 3: SQL Server Always On Availability Groups (AOAG)

For AOAG environments, you may need to:

  1. Temporarily remove the database from the availability group.
  2. Apply the schema update to the primary replica.
  3. Allow AOAG to synchronize changes to secondary replicas.
  4. Re-add the database to the availability group.
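A hedged T-SQL sketch of steps 1 and 4 is shown below. The availability group name NodiniteAG is an example, and the exact re-join procedure depends on your seeding mode and replica configuration.

-- Step 1 (on the primary replica): remove the Log Database from the availability group
ALTER AVAILABILITY GROUP [NodiniteAG] REMOVE DATABASE [Nodinite_DEMO_Log];

-- Steps 2-3: apply the schema update with SqlPackage.exe and let backups/seeding catch up

-- Step 4 (on the primary replica): add the database back into the availability group
ALTER AVAILABILITY GROUP [NodiniteAG] ADD DATABASE [Nodinite_DEMO_Log];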

Tip

Consult your SQL Server DBA for AOAG-specific update procedures.

Scenario 4: Custom .DACPAC file location

If you have copied the .DACPAC file to a different folder, update the /SourceFile parameter:

/SourceFile:"D:\NodiniteBackups\DACPACs\Nodinite.LogDatabase.dacpac"

Copy and Execute Scripts

  1. Copy the Active Database script from the left panel by clicking the Copy button.
  2. Copy the Archive Databases scripts from the right panel (one script per archive database).
  3. Execute the scripts to apply the database schema updates.

Note

If you have no Archive databases, or none that require updates, the Archive Databases panel will show: "No archive databases found, or none that require updates."

Important

Execute the commands above using a command prompt (cmd.exe) running as the Nodinite Logging Service account, or as a SQL Server sysadmin user.

Running as the Logging Service account ensures the service has proper SQL Server access rights for future automated database operations.

Run scripts locally on the Nodinite Application Server using the Logging Service credentials.

Why this approach?

  • Verifies permissions – Confirms the Logging Service has proper SQL Server access rights.
  • Future-proofs deployments – The Logging Service may automatically spawn new Log Databases when the Active database grows too large or reaches a time-based retention threshold (controlled by System Parameters).
  • Security best practice – Ensures service account has minimal required permissions.

Execution Steps

Step 3.1: Open an elevated Command Prompt

  1. On the Nodinite Application Server, open Command Prompt as Administrator.
  2. Optionally, run the command prompt as the Logging Service user account using runas:
runas /user:DOMAIN\NodiniteLogSvc cmd.exe

Tip

Alternatively, you can run the commands as a SQL Server sysadmin user if you have those credentials.

Step 3.2: Navigate to the SqlPackage.exe directory

cd "C:\Program Files\Microsoft SQL Server\170\DAC\bin"

Note

The exact path depends on your SQL Server Tools version. Common paths:

  • SQL Server 2025: C:\Program Files\Microsoft SQL Server\170\DAC\bin
  • SQL Server 2022: C:\Program Files\Microsoft SQL Server\160\DAC\bin
  • SQL Server 2019: C:\Program Files\Microsoft SQL Server\150\DAC\bin
  • SQL Server 2017: C:\Program Files\Microsoft SQL Server\140\DAC\bin
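If you are not sure which version is installed, you can locate SqlPackage.exe from the command prompt; the one-liner below searches the SQL Server program folder recursively.

REM Locate SqlPackage.exe (searches recursively under the SQL Server program folder)
where /R "C:\Program Files\Microsoft SQL Server" SqlPackage.exe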

Step 3.3: Execute the Active Log Database script

Paste and execute the script for the Active Log Database (replace DEMO with your environment name and adjust server/database names):

"C:\Program Files\Microsoft SQL Server\170\DAC\bin\SqlPackage.exe" /Action:Publish /SourceFile:"C:\Program Files\Nodinite\DEMO\LoggingService\Nodinite.LogDatabase.dacpac" /TargetServerName:"localhost" /TargetDatabaseName:"Nodinite_DEMO_Log" /SourceEncryptConnection:true /TargetEncryptConnection:true /SourceTrustServerCertificate:true /TargetTrustServerCertificate:true /Properties:RegisterDataTierApplication=True /Properties:BlockOnPossibleDataLoss=False /Properties:BlockWhenDriftDetected=False /Properties:DropIndexesNotInSource=True /Properties:DropConstraintsNotInSource=True /Properties:ScriptNewConstraintValidation=False /Properties:PopulateFilesOnFilegroups=False /v:CustomPathData="null" /v:CustomPathImagedata="null" /v:CustomPathIndex="null" /v:RecoveryMode="FULL"

Important

The /v:RecoveryMode parameter value (SIMPLE or FULL) is automatically set by the Update Guide based on your LogDatabaseRecovery system parameter. Do not change this value manually unless you understand the implications for your SQL Server environment.

Expected output:

Publishing to database 'Nodinite_DEMO_Log' on server 'localhost'.
Creating deployment plan
Initializing deployment
Verifying deployment plan
Analyzing deployment plan
Updating database
Successfully published database.

Step 3.4: Execute the Archive Log Database scripts

Repeat Step 3.3 for each Archive Log Database script.

Important

If you encounter errors during script execution, do not proceed. Review the error messages, consult your SQL Server DBA, and resolve issues before continuing.

Common Errors and Solutions

  • Login failed for user 'DOMAIN\NodiniteLogSvc' – Cause: the Logging Service account lacks SQL Server permissions. Solution: grant the db_owner role to the service account for the Log Databases.
  • Cannot find file: Nodinite.LogDatabase.dacpac – Cause: the .DACPAC file path is incorrect. Solution: verify the file exists at the specified path; copy it from the installation folder if needed.
  • Database is in use – Cause: active connections exist to the database. Solution: ensure the Logging/Monitoring Services are stopped and close any open SQL Server Management Studio connections.
  • Timeout expired – Cause: large database or slow server. Solution: increase the timeout value in the connection string: Connection Timeout=300;

Click the Next button in the Update Guide to proceed to Step 4.


Step 4: Set Archive Databases Back from Read-Only Mode

After successfully applying schema updates, restore Archive Log Databases to read-write mode.

In the Update Guide:

  1. Click the Set Read-Write button.
  2. The Update Guide will execute SQL commands to restore Archive databases to read-write mode.

Set Read-Write Mode
Example: Archive databases restored to read-write mode.

Note

The Active Log Database remains in read-write mode throughout the process.
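As in Step 2, the underlying operation per Archive database is roughly a single T-SQL statement, sketched below with an illustrative database name.

-- Example only: restore one Archive Log Database to read-write (database name is illustrative)
ALTER DATABASE [Nodinite_DEMO_Log_Archive_001] SET READ_WRITE WITH ROLLBACK IMMEDIATE;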

If the operation is successful, the Update Guide will display a success message.
Update Step UnSetReadOnly Success
Example: Archive databases restored to read-write mode.

Click the Next button to proceed to Step 5.


Step 5: Restart Services

Resume normal Nodinite operations by starting the Logging Service and Monitoring Service.

In the Update Guide:

  1. Click the Start Services button.
  2. Wait for confirmation that both services have started successfully.

Started Services
Example: Services successfully started.

Tip

After starting services, verify log collection is working by navigating to Log | Search and confirming new log events are being received.


Frequently Asked Questions

Do I need to update all Archive databases?

Yes. All Archive Log Databases should use the same schema version as the Active database to ensure consistency and prevent errors when querying historical data.

What if I have 50+ Archive databases?

For environments with many Archive databases, consider:

  1. Script automation – Combine all Archive scripts into a single PowerShell script with a loop (a sketch is shown under "Can I automate this process?" below).
  2. Staged rollout – Update critical/recent archives first, then older archives during subsequent maintenance windows.
  3. Archival strategy review – Evaluate if older archives can be exported and removed from active SQL Server.

Can I skip updating Archive databases?

Not recommended. Skipping Archive database updates may cause:

  • Query errors when searching historical log data.
  • Incompatibility with new Nodinite features that rely on schema changes.
  • Difficulty troubleshooting issues across Active and Archive databases.

How long does the update process take?

Update duration depends on:

  • Database size – Larger databases take longer (see the typical times below).
  • SQL Server performance – Faster servers complete updates more quickly.
  • Number of Archive databases – More archives = longer total time.

Typical update times:

  • Active database (10 GB): 2-5 minutes
  • Archive database (50 GB): 5-15 minutes
  • Total for 10 databases: 30-90 minutes

What if the Logging Service account doesn't have permissions?

If the Logging Service account lacks SQL Server permissions, you have two options:

Option 1: Grant permissions (recommended)

-- Grant db_owner role to Logging Service account
USE [NodiniteLog]
GO
CREATE USER [DOMAIN\NodiniteLogSvc] FOR LOGIN [DOMAIN\NodiniteLogSvc]
GO
ALTER ROLE [db_owner] ADD MEMBER [DOMAIN\NodiniteLogSvc]
GO

Option 2: Run scripts as SQL Server administrator

Run the scripts using a SQL Server sa account or domain administrator with elevated privileges. However, this does not verify that the Logging Service account has proper permissions for future automated database operations.

Do I need to restart the Nodinite Application Server?

No. Restarting the server is not required. Only the Logging Service and Monitoring Service need to be stopped and restarted.

Can I automate this process?

Yes. For enterprise environments with multiple Nodinite instances, consider creating a PowerShell script that:

  1. Calls the Nodinite Web API to retrieve Log Database metadata.
  2. Stops services via Windows Service Control Manager.
  3. Executes SqlPackage.exe commands in a loop.
  4. Restarts services.
  5. Sends email notification on completion.
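A hedged PowerShell outline covering steps 2–4 of the list above is shown below. The service names, paths, server name, and database names are illustrative assumptions, and only a subset of the SqlPackage parameters is included; copy the full parameter set from the scripts generated by the Update Guide.

# Hedged sketch only: adjust service names, paths, server and database names to your
# environment, and use the full SqlPackage parameter set from the generated scripts.

$sqlPackage = 'C:\Program Files\Microsoft SQL Server\170\DAC\bin\SqlPackage.exe'
$dacpac     = 'C:\Program Files\Nodinite\DEMO\LoggingService\Nodinite.LogDatabase.dacpac'
$server     = 'SQL-SERVER-01'
$databases  = @('Nodinite_DEMO_Log', 'Nodinite_DEMO_Log_Archive_001')   # illustrative names

# Stop the Nodinite services (service names vary per environment; check services.msc)
Stop-Service -Name 'Nodinite Logging Service DEMO', 'Nodinite Monitoring Service DEMO'

# Apply the schema update to each Log Database
foreach ($db in $databases) {
    $arguments = @(
        '/Action:Publish'
        "/SourceFile:$dacpac"
        "/TargetServerName:$server"
        "/TargetDatabaseName:$db"
        '/SourceEncryptConnection:true'
        '/TargetEncryptConnection:true'
        '/SourceTrustServerCertificate:true'
        '/TargetTrustServerCertificate:true'
        '/v:RecoveryMode=SIMPLE'   # set per your LogDatabaseRecovery system parameter
    )
    & $sqlPackage @arguments
    if ($LASTEXITCODE -ne 0) { throw "SqlPackage failed for database $db" }
}

# Restart the services
Start-Service -Name 'Nodinite Logging Service DEMO', 'Nodinite Monitoring Service DEMO'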

Warning

Always test automation scripts in a non-production environment first.



Next Steps