
FAQ/Troubleshooting the Pickup LogEvents Service Monitoring Agent

Easily resolve issues and get expert help for the Pickup LogEvents Service in Nodinite Logging. On this page, you'll find:

  • ✅ Step-by-step troubleshooting for common problems
  • ✅ Direct support contact and information to speed up resolution
  • ✅ Quick access to installation, configuration, and related resources

If you encounter any problems, ensure you use the latest version and review the Prerequisites.

If you have issues you cannot solve, contact our Support, or email support@nodinite.com. Please include:

  • Installed version (x.x.x.x)
  • Diagnostic files

What is the max size supported?

Nodinite versions 6 and 7 have a maximum supported log event size (single or batch) of 512 MB. The base64-encoded "Body" field in a JSON log event increases the size of the data by about 33%. For example, a 50 MB payload (body) becomes approximately 67 MB when base64-encoded.
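You can compute the encoding overhead for any payload size yourself; base64 emits 4 characters for every 3 raw bytes. A quick illustration (not part of the agent):

```powershell
# Base64 encodes every 3 raw bytes as 4 output characters, so size grows by ~33%.
$rawBytes = 50MB
$encodedChars = [math]::Ceiling($rawBytes / 3) * 4
"{0:N0} raw bytes -> {1:N0} base64 characters (~{2:N1} MB)" -f $rawBytes, $encodedChars, ($encodedChars / 1MB)
```

Use this to check whether an encoded payload will stay under the 512 MB limit before sending it to the Log API.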

If you need to go higher, you cannot use the Nodinite Log API directly. Instead, configure the agent to use the Nodinite Configuration Database.

Test with a very large file

To test with a very large file, you can create one using the following PowerShell script:

Important: Notice the OriginalMessageTypeName field in the example below. This determines which Search Field Expressions will extract your business data. Without a proper Message Type, your event is logged but your data remains unextracted. Learn more about Log Event processing.

# Efficiently create a large JSON file (>50 MB) with a huge base64-encoded "Body" field.
# All JSON fields start with a capital letter.

# Output file path
$outputFile = "large_log.json"

# Get current local time with offset
$logDateTime = [DateTimeOffset]::Now.ToString("o")

# Target size of the raw payload before base64 encoding (in MB);
# the encoded "Body" will be roughly 33% larger (~68 MB for 51 MB of raw data)
$targetSizeMB = 51
$totalBytes = $targetSizeMB * 1MB

$binaryData = New-Object byte[] $totalBytes
$rng = [System.Security.Cryptography.RandomNumberGenerator]::Create()
$rng.GetBytes($binaryData)

# Encode the entire binary blob to Base64 in one shot
$base64Body = [Convert]::ToBase64String($binaryData)

# Compose the final JSON using string interpolation and capitalized field names
$json = @"
{
  "LogAgentValueId": 42,
  "EndPointName": "INT101: Receive Hello World Log Events",
  "EndPointUri": "C:\\temp\\in",
  "EndPointDirection": 0,
  "EndPointTypeId": 60,
  "OriginalMessageTypeName": "Hello.World.File/1.0",
  "EventDirection": 17,
  "LogDateTime": "$logDateTime",
  "ProcessingUser": "DOMAIN\\user",
  "SequenceNo": 0,
  "EventNumber": 0,
  "LogText": "File OK",
  "ApplicationInterchangeId": "",
  "LocalInterchangeId": "00000000-0000-0000-0000-000000000000",
  "LogStatus": 0,
  "ProcessName": "My Process",
  "ProcessingMachineName": "localhost",
  "ProcessingModuleName": "INT101-HelloWorld-Application",
  "ProcessingModuleType": "FilePickup",
  "ServiceInstanceActivityId": "00000000-0000-0000-0000-000000000000",
  "ProcessingTime": 80,
  "Body": "$base64Body",
  "Bodies": null,
  "Context": {
    "CorrelationId": "064205E2-F7CF-43A6-B514-4B55536C2B67",
    "ExtendedProperties/1.0#Filename": "HelloWorld.json"
  }
}
"@

# Write to output file.
# Note: in Windows PowerShell, -Encoding UTF8 writes a byte order mark (BOM);
# in PowerShell 7+ you can use -Encoding utf8NoBOM if your JSON parser rejects a BOM.
Set-Content -Path $outputFile -Value $json -Encoding UTF8


Write-Host "Large JSON file created: $outputFile"

This script creates a large JSON file named large_log.json whose base64-encoded "Body" field represents more than 50 MB of raw data (about 68 MB after encoding). It fills a single byte array with cryptographically random data, encodes it to base64 in one call, and writes the composed JSON to the output file. Note that both the byte array and the encoded string are held in memory at once, so expect peak memory usage of a few hundred MB at these sizes.
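To sanity-check the generated file before dropping it into the pickup folder, you can parse it back and confirm the decoded body size (adjust the path if you changed $outputFile; parsing very large JSON strings is noticeably faster in PowerShell 7 than in Windows PowerShell):

```powershell
# Parse the generated log event and report the decoded payload size.
$evt = Get-Content -Path "large_log.json" -Raw | ConvertFrom-Json
$decoded = [Convert]::FromBase64String($evt.Body)
"Decoded Body size: {0:N1} MB" -f ($decoded.Length / 1MB)
```

If ConvertFrom-Json succeeds and the decoded size matches your target, the file is well-formed JSON and the "Body" field round-trips correctly.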


Next Step

Install Log Agent - Pickup Service
Configure

Pickup Log Events Service Logging Agent
JSON Log Event