Audit Vault Documentation

Complete guide to installing, configuring, and using Audit Vault for tamper-evident audit logging in Jenkins.

Installation

Audit Vault can be installed in two ways:

Option 1: Upload HPI File

  1. Download the latest audit-vault.hpi file from your dashboard
  2. Navigate to Manage Jenkins → Plugins → Advanced Settings
  3. Under "Deploy Plugin", click "Choose File" and select the HPI file
  4. Click "Deploy" to install the plugin

Option 2: Manual Installation

  1. Copy the audit-vault.hpi file to your Jenkins plugins directory:
    cp audit-vault.hpi $JENKINS_HOME/plugins/
  2. Restart Jenkins to load the plugin

Note

Audit Vault uses SQLite for local event storage. The database file is created automatically at $JENKINS_HOME/audit-vault/events.db.
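Because the store is plain SQLite, it can be inspected with any SQLite client. A minimal Python sketch for counting stored events — the table name `audit_events` is an assumption here, so check the actual schema first (e.g. with `.schema` in the sqlite3 CLI):

```python
from contextlib import closing
import sqlite3

def count_events(db_path: str, table: str = "audit_events") -> int:
    """Count rows in the local event store.

    `audit_events` is a hypothetical table name; inspect the real
    schema before relying on it.
    """
    with closing(sqlite3.connect(db_path)) as conn:
        (count,) = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()
        return count
```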

Quick Start

Get audit logging running in minutes:

1. Activate Your License

Navigate to Manage Jenkins → System → Audit Vault. Enter your license key and click "Validate License".

2. Enable Event Capture

Check "Enable Audit Vault" and select which event categories to capture (authentication, jobs, builds, credentials, system).

3. Configure S3 Export (Optional)

Enter your S3 bucket name, region, and AWS credentials. Click "Test S3 Connection" to verify. Set the export interval (default: 60 minutes).

4. Save Configuration

Click "Save". Audit Vault immediately begins capturing events and storing them in the local SQLite database.

5. View the Dashboard

Navigate to Audit Vault in the Jenkins sidebar to view event stats, export status, and chain integrity.

Requirements

  • Jenkins Version: 2.426.x LTS or later
  • Java Version: Java 17 or later
  • Network: HTTPS connectivity for license validation
  • Disk Space: SQLite database grows with event volume (approximately 1 KB per event)
  • S3 Export (optional): AWS credentials with s3:PutObject permission on the target bucket

Global Settings

Configure Audit Vault in Manage Jenkins → System → Audit Vault.

Enable Audit Vault

Master toggle to enable or disable all event capture. When disabled, no new events are recorded but existing data is preserved.

License Key

Your license key (prefixed ozen_av_). Click "Validate License" to activate. The plugin requires a valid license to capture events.

Event Capture Configuration

Control which categories of events are captured. Each toggle controls a group of related event types.

Capture Toggles

  • Authentication Events: Login, logout, failed authentication attempts
  • Job Events: Job creation, modification, deletion, copying, renaming
  • Build Events: Build started, completed, deleted (includes trigger cause and result)
  • Credential Events: Credential creation, modification, deletion, access
  • System Events: Plugin changes, configuration modifications, node activity, security realm changes

Tip

For compliance purposes, enable all event categories. You can always filter events by category when querying or exporting.

S3 Export Configuration

Automatically export audit events to Amazon S3 for long-term storage and SIEM integration.

S3 Settings

  • Bucket Name: Target S3 bucket (must already exist)
  • Region: AWS region (e.g., us-east-1)
  • Key Prefix: Optional path prefix for exported files (e.g., jenkins/audit/)
  • Access Key ID: AWS access key with S3 write permissions
  • Secret Access Key: Corresponding AWS secret key
  • Enable Encryption: Server-side AES-256 encryption on uploaded objects
  • Export Interval: Minutes between export runs (default: 60, minimum: 5)

Required IAM Permissions

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:PutObjectAcl"
      ],
      "Resource": "arn:aws:s3:::your-bucket-name/*"
    }
  ]
}

S3 Path Structure

Events are exported with date-based partitioning for efficient querying:

{prefix}/yyyy/MM/dd/HH/audit-{timestamp}-{count}.ndjson

Example: jenkins/audit/2026/02/17/14/audit-1771336800-207.ndjson
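The partition layout can be reproduced in a few lines, which is useful when scripting downstream consumers. A sketch of how an export key could be derived — the exact timestamp the plugin embeds is an assumption (the plugin's own output is authoritative):

```python
from datetime import datetime, timezone

def export_key(prefix: str, when: datetime, count: int) -> str:
    """Build a date-partitioned S3 key for an export batch.

    Mirrors {prefix}/yyyy/MM/dd/HH/audit-{timestamp}-{count}.ndjson,
    where `when` is the export time (UTC) and `count` is the number
    of events in the batch.
    """
    ts = int(when.timestamp())  # epoch seconds embedded in the file name
    return f"{prefix}{when:%Y/%m/%d/%H}/audit-{ts}-{count}.ndjson"

key = export_key("jenkins/audit/", datetime(2026, 2, 17, 14, 0, tzinfo=timezone.utc), 207)
```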

Testing the Connection

Click "Test S3 Connection" to verify your credentials and bucket access. The test uploads a small test object and confirms write permissions.

Important

Events are buffered locally in SQLite and only marked as exported after successful S3 upload. No events are lost during network outages or S3 failures.

Event Types

Audit Vault captures 35+ event types across 7 categories:

Authentication

  • USER_LOGIN - Successful user authentication
  • LOGIN_FAILED - Failed authentication attempt
  • USER_LOGOUT - User logout

Authorization

  • PERMISSION_CHANGED - Permission modification
  • SECURITY_REALM_CHANGED - Security realm configuration change

Job Activity

  • JOB_CREATED - New job created
  • JOB_MODIFIED - Job configuration changed
  • JOB_DELETED - Job deleted
  • JOB_COPIED - Job created as a copy of an existing job
  • JOB_RENAMED - Job renamed

Build Activity

  • BUILD_STARTED - Build execution started (includes trigger cause)
  • BUILD_COMPLETED - Build finished (includes result: SUCCESS, FAILURE, etc.)
  • BUILD_DELETED - Build record deleted

Credential Events

  • CREDENTIAL_CREATED - New credential added
  • CREDENTIAL_MODIFIED - Credential updated
  • CREDENTIAL_DELETED - Credential removed
  • CREDENTIAL_ACCESSED - Credential accessed by a job or user

System Events

  • PLUGIN_INSTALLED / PLUGIN_UPDATED / PLUGIN_REMOVED - Plugin lifecycle
  • CONFIG_CHANGED - System configuration modified
  • NODE_ADDED / NODE_REMOVED / NODE_ONLINE / NODE_OFFLINE - Node activity
  • SECURITY_CONFIG_CHANGED - Security settings modified

Severity Levels

Each event type has a default severity:

  • INFO - Normal operations (logins, builds, config reads)
  • WARNING - Notable actions (permission changes, credential access)
  • ERROR - Failed operations (login failures, export failures)
  • CRITICAL - Security-sensitive events (security realm changes)

Tamper-Evident Logging

Every audit event includes cryptographic integrity guarantees.

SHA-256 Checksums

Each event's checksum is computed from its core fields:

SHA-256(id | timestamp | eventType | actor | resourceType | resourceId | summary | prevHash)
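The formula can be sketched in Python, assuming the fields are joined with a literal "|" and UTF-8 encoded before hashing (the plugin's exact canonicalization may differ):

```python
import hashlib

# Core fields in the order given by the checksum formula above.
CORE_FIELDS = ("id", "timestamp", "eventType", "actor",
               "resourceType", "resourceId", "summary", "prevHash")

def event_checksum(event: dict) -> str:
    """Recompute an event's SHA-256 checksum from its core fields."""
    payload = "|".join(str(event[f]) for f in CORE_FIELDS)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()
```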

Hash Chaining

Events are linked in a chain where each event includes the checksum of the previous event (prevHash). This creates a verifiable chain of custody:

Event 1: checksum=abc123, prevHash=0000...0000 (genesis)
Event 2: checksum=def456, prevHash=abc123
Event 3: checksum=ghi789, prevHash=def456

If any event is modified or deleted, the chain breaks and integrity verification will detect the tampering.

Genesis Hash

The first event in the chain uses a special genesis hash (0000...0000) as its prevHash, establishing the start of the chain.

Integrity Verification

Verify the integrity of your entire audit trail with one click from the dashboard.

How It Works

  1. Walks all events from oldest to newest
  2. Verifies each event's prevHash matches the previous event's checksum
  3. Recomputes each event's checksum and compares it to the stored value
  4. Reports any broken links or corrupted events
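The same walk can be reproduced offline. A minimal sketch, assuming "|"-joined UTF-8 field canonicalization and an all-zero genesis hash as described above:

```python
import hashlib

FIELDS = ("id", "timestamp", "eventType", "actor",
          "resourceType", "resourceId", "summary", "prevHash")

def verify_chain(events, genesis="0" * 64):
    """Walk events oldest-to-newest; return (True, None) if the chain
    is intact, else (False, index) for the first bad event."""
    prev = genesis
    for i, ev in enumerate(events):
        if ev["prevHash"] != prev:
            return False, i  # broken link: prevHash does not match predecessor
        payload = "|".join(str(ev[f]) for f in FIELDS)
        checksum = hashlib.sha256(payload.encode("utf-8")).hexdigest()
        if checksum != ev["checksum"]:
            return False, i  # corrupted event: stored checksum is stale
        prev = checksum
    return True, None
```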

Verification Result

  • Valid: All events verified, chain is intact
  • Failed: Specific event where the chain breaks is identified

Tip

Run integrity verification periodically and after any system maintenance. The result is displayed on the Audit Vault dashboard.

Dashboard

Access the Audit Vault dashboard from the Jenkins sidebar at Audit Vault.

Dashboard Sections

  • License Status: Current license state and expiration
  • Event Statistics: Total events, unexported count, events by category and severity
  • Chain Integrity: Result of the last integrity verification
  • Export Status: Per-destination export health, last export time, event counts
  • Recent Events: Table of the 50 most recent audit events with type, actor, summary, and timestamp

Export Format

Events are exported in NDJSON (Newline Delimited JSON) format, one event per line. This format is compatible with most SIEM tools including Splunk, Datadog, and Elasticsearch.

Event JSON Schema

{
  "id": "a1b2c3d4",
  "timestamp": "2026-02-17T14:30:00Z",
  "eventType": "USER_LOGIN",
  "category": "AUTHENTICATION",
  "severity": "INFO",
  "actor": "admin",
  "actorIp": "10.0.1.50",
  "resourceType": "user",
  "resourceId": "admin",
  "resourceName": "admin",
  "summary": "User admin logged in",
  "details": "Authentication via password",
  "metadata": {},
  "checksum": "sha256:abc123...",
  "prevHash": "sha256:def456..."
}
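Since each line is an independent JSON document, an export file can be parsed with a few lines; a minimal sketch:

```python
import json

def read_ndjson(text: str) -> list[dict]:
    """Parse NDJSON content: one JSON event per non-empty line."""
    return [json.loads(line) for line in text.splitlines() if line.strip()]
```

For very large export files, swapping the list comprehension for a generator keeps memory usage flat.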

SIEM Integration

Point your SIEM tool at the S3 bucket path to ingest audit events. The date-based partitioning supports efficient time-range queries:

  • Splunk: Use the S3 input with sourcetype _json
  • Datadog: Configure an S3 log source with JSON parsing
  • Elasticsearch: Use Logstash S3 input with JSON codec

Licensing

Activating Your License

  1. Purchase a subscription from the product page
  2. Copy your license key from the dashboard
  3. In Jenkins, navigate to Manage Jenkins → System → Audit Vault
  4. Enter your license key and click "Validate License"

License Validation

Audit Vault validates your license periodically (every 24 hours). If the license server is unreachable, a 72-hour grace period allows continued operation.

Unlicensed Behavior

Without a valid license:

  • Event capture is disabled
  • Existing events remain in the database and can be viewed
  • S3 export is paused
  • A warning banner is shown to admins

FAQ

How much disk space does Audit Vault use?

Each event is approximately 1 KB in the SQLite database. A moderately active Jenkins instance generating 1,000 events/day would use about 1 MB/day or 365 MB/year. Exported events can be purged from the local database to reclaim space.

What happens if Jenkins crashes during event capture?

SQLite with WAL (Write-Ahead Logging) mode ensures durability. Events that were committed to the database before the crash are preserved. The hash chain resumes from the last committed event on restart.

Can I export to destinations other than S3?

Currently, S3 is the supported export destination. Any S3-compatible storage works, including MinIO, DigitalOcean Spaces, and Backblaze B2. Additional destinations (GCS, Azure Blob, SFTP) are on the roadmap.

Does Audit Vault capture build log contents?

No. Audit Vault captures metadata about builds (started, completed, result, trigger cause) but never captures build log contents. This keeps event sizes small and avoids storing sensitive output.

What data leaves my Jenkins instance?

Only two types of outbound communication:

  • License validation: License key and controller ID sent to ozenops.dev
  • S3 export: Audit events sent to your configured S3 bucket (if enabled)

No telemetry, analytics, or usage data is collected.

How do I verify integrity of exported events?

Each exported event contains its checksum and prevHash. You can independently verify the chain by reading the NDJSON files and recomputing the SHA-256 checksums in sequence.

Changelog

Version 1.0.0

Initial Release

  • 35+ event types across 7 categories
  • SHA-256 tamper-evident checksums with hash chaining
  • SQLite local storage with WAL mode
  • Automated S3 export with date-based partitioning
  • NDJSON export format for SIEM compatibility
  • Server-side AES-256 encryption for S3 exports
  • Real-time audit dashboard
  • One-click chain integrity verification
  • Configurable event capture per category
  • License management with 72-hour grace period