Suggestion for a cleaner job-log

(anonymous user)
Product: PowerShell Universal
Version: 2026.1.2


Hi.

I have a job that runs very often, polling our e-services platform for new tickets. 99% of the time there is nothing to fetch, so the job log quickly fills up with uninteresting runs, making it hard to find the jobs that actually did something. Also, even though I’ve set the max history of this script very high, the log entries quickly get rotated out.

Wouldn’t it be nice to have a cmdlet that you could put in a script, telling the PSU environment that this particular job shouldn’t be stored in the job log at all? Is this something that would be possible to implement?
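
Something like the sketch below is what I have in mind. To be clear, Skip-PSUJobLog and Get-NewTickets are made-up names and no such cmdlet exists today; this is purely to illustrate the idea:

```powershell
# Hypothetical usage sketch (Skip-PSUJobLog does not exist in PSU today).
# Get-NewTickets is a placeholder for our e-services polling call.
$tickets = Get-NewTickets

if (-not $tickets) {
    # Proposed: tell PSU not to store this run in the job history at all
    Skip-PSUJobLog
    return
}

# Otherwise process the tickets and keep the log entry as usual
```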

All Comments (3)


This would be a nice feature; I have the same problem in our environment. Until an official solution arrives, I’ve slapped together something custom to solve it. For anyone interested:

A trigger script that runs on ANY job completion:

param ($Job)

$jobId = $Job.Id

Write-Information "$jobId - $($Job.Script.Name) ran by $($Job.Identity.Name) @ $($Job.StartTime) for $(($Job.EndTime - $Job.StartTime).TotalSeconds) seconds."

if (!$Job.Schedule)
{
    Write-Information "Job is not scheduled - aborting."
    return
}

$output = Get-PSUJobPipelineOutput -JobId $jobId -Integrated #-AppToken $apiToken.Token

$HostOutput = Get-PSUJobOutput -JobId $jobId -Integrated

if ($HostOutput)
{
    Write-Information "[Job Output]"
    Write-Information ($HostOutput -join [Environment]::NewLine)
    Write-Information "[/Job Output]"
}

if ($output.Archive -eq $true)
{
    # Archive job

    $apiToken = Get-PSUAppToken -Identity "JobArchiver" -Integrated | Where-Object -Property Revoked -Eq $false | Select-Object -ExpandProperty Token -First 1

    if (!$apiToken)
    {
        Write-Error "Failed to retrieve API token!"
        return
    }

    Write-Host "Archiving job... " -NoNewline

    $response = Invoke-RestMethod -Uri "https://localhost/api/v1/job/archive/$jobId" -Method "DELETE" -SkipCertificateCheck -Headers @{
        Authorization = "Bearer $apiToken"
    }

    if ($response.Archived)
    {
        Write-Host "✅ OK" -ForegroundColor Green
    }
    else
    {
        Write-Host "Not archived?" -BackgroundColor Red
    }
}


Write-Debug "Triggered by job:"
Write-Debug ($Job | ConvertTo-Json -WarningAction Ignore)


and then I return an object with Archive = $true at the end of every job that I want to discard. Example:

return @{
    Archive = ($actionCount -eq 0)
}



Nice workaround! I’ll take a look at it; it should be easily adaptable to my environment.


I’ve had a need to limit log entries too. When the log table gets over a certain size, the queries take too long to complete and cause issues.

I’ve taken a different approach, which is to purge entries in the table older than 5 days by changing the setting in PowerShell Universal Admin > Settings > Data > Database Retention Log.

I have a script scheduled daily that archives the entries to log files. It might need a little tweaking depending on your setup.

<#
.DESCRIPTION
    PowerShell Universal Job Output Export

.NOTES
    Author: Mike Simmons
    Created: 05/03/2026
    Reviewed: 05/03/2026

.CHANGELOG
	<< 01/01/1970 >> << Editor Name >>	<< Change Description >>

#>

# Days to keep log files
$RetentionPeriod = 90

# SQL Server
$SQLServer = '<sql server>'

# SQL Database
$Database = '<database>'

# PowerShell Universal repository root (log files are written under $Repository\Logs)
$Repository = 'C:\ProgramData\UniversalAutomation\Repository'

# Ignore certificate issues on the SQL server
Set-DbatoolsInsecureConnection -SessionOnly | Out-Null

# Disable dbatools' own error/message logging
Set-DbatoolsConfig -FullName logging.errorlogenabled -Value $false -PassThru | Register-DbatoolsConfig
Set-DbatoolsConfig -FullName logging.errorlogfileenabled -Value $false -PassThru | Register-DbatoolsConfig
Set-DbatoolsConfig -FullName logging.messagelogenabled -Value $false -PassThru | Register-DbatoolsConfig
Set-DbatoolsConfig -FullName logging.messagelogfileenabled -Value $false -PassThru | Register-DbatoolsConfig

# Get logs from previous 25 hours 15 minutes (Allows 1 hour overlap and allows for daylight saving issues)
$Query = @"
    SELECT
        j.ScriptFullPath,
        o.Message,
        o.Timestamp
    FROM [$Database].[dbo].[Job] j
    INNER JOIN [$Database].[dbo].[JobOutput] o
        ON j.Id = o.JobId
    WHERE o.Timestamp >= DATEADD(MINUTE, -1515, GETDATE())
    ORDER BY j.ScriptFullPath, o.Timestamp ASC;
"@

$Logs = Invoke-DbaQuery -SqlInstance $SQLServer -Query $Query -Verbose

# Loop through each log and save to a file
# C:\ProgramData\UniversalAutomation\Repository\Logs\<ScriptFolder>\<ScriptName>\<LogDate>.Log
Foreach ($Log in $Logs) {
    $ScriptPath = $Log.ScriptFullPath
    $Message = $Log.Message
    $TimeStamp = Get-Date $Log.Timestamp -Format "dd-MM-yyyy HH:mm:ss"
    # Derive the date from the original timestamp (re-parsing the dd-MM-yyyy string would be locale-dependent)
    $Date = Get-Date $Log.Timestamp -Format "yyyy-MM-dd"

    # If the log path has changed, the date or the script has changed and we need a new file,
    # so we append any existing data, deduplicate and write the content of $Rows to the file.

    if ($ScriptPath -match '\\') {
        $Folder = $ScriptPath.split('\') | Select -First 1
        $ScriptName = ($ScriptPath.split('\') | Select -Last 1).split('.') | Select -First 1
        $LogPath = "$Repository\Logs\$Folder\$ScriptName\$Date.log"
    } else {
        $ScriptName = $ScriptPath.split('.') | Select -First 1
        $LogPath = "$Repository\Logs\$ScriptName\$Date.log"
    }

    if (-not(Test-Path $LogPath)) {
        New-Item -ItemType File -Path $LogPath -Force | Out-Null
        Write-Output "[INFO]:: Created $LogPath"
    }

    if ($LogPath -ne $LastLogPath) {

        Write-Output "[INFO]:: Processing script $ScriptName"

        if ($LastLogPath) {
            if (Test-Path $LastLogPath) {

                $ExistingData = Get-Content $LastLogPath

                if ($ExistingData) {
                    $ExistingDataLineCount = $ExistingData.Count
                    $ExistingData += $Rows

                    $Rows = $ExistingData | Select -Unique

                    Write-Output "[INFO]:: Appending $($Rows.Count) lines to existing $ExistingDataLineCount lines in $LastLogPath"
                }
            }

            Set-Content -Path $LastLogPath -Value $Rows -Force
            Write-Output "[INFO]:: Updated $LastLogPath"

        }

        # Reset Rows variable
        $Rows = @()
    }


    # If we don't have a new file, we continue to build the log by adding to the $Rows array
    $LastLogPath = $LogPath

    $Rows += "[$TimeStamp]:: $Message" -Replace "`n"

}

# Write the final file (skipped entirely if the query returned no rows and $LogPath was never set)
if ($LogPath) {
    if (Test-Path $LogPath) {
        $ExistingData = Get-Content $LogPath

        if ($ExistingData) {
            $ExistingDataLineCount = $ExistingData.Count

            $ExistingData += $Rows

            $Rows = $ExistingData | Select -Unique

            Write-Output "[INFO]:: Appending $($Rows.Count) lines to existing $ExistingDataLineCount lines in $LogPath"
        } else {
            Write-Output "[INFO]:: Creating new file $LogPath with $($Rows.Count) lines"
        }
    }

    Set-Content -Path $LogPath -Value $Rows -Force

    if ($ExistingData) {
        Write-Output "[INFO]:: Appended: $LogPath"
    } else {
        Write-Output "[INFO]:: Created: $LogPath"
    }
}


# Clean up log files older than the retention period (-File so directories are left alone)
Get-ChildItem "$Repository\Logs\" -Recurse -File | Where-Object { $_.CreationTime -lt (Get-Date).AddDays(-$RetentionPeriod) } | Remove-Item -Force