
How to use a scheduled task to query SQL

Several months ago I created a script that would allow someone to extract information from the ConfigMgr SQL database and e-mail that data. I wrote it as a workaround to SSRS subscription reports at the time, since the amount of data being pulled and e-mailed was so large that it needed to be compressed before it could be sent. I then promptly forgot about it, telling myself I would never need to do that again. Oh how wrong I was.

Lately I’ve often heard from customers looking to extract data from ConfigMgr to add to their CMDB. Often the vendor they are working with would like them to create a custom view in the ConfigMgr database to streamline the process, and as we all well know that’s not allowed. So I threw out the idea: why not just create the dataset and dump it to a file share for daily pickup in the widely accepted format of CSV?

Since I’ve now been asked more than once about a way to extract data from ConfigMgr and save it to a CSV on a schedule, I figured I would do a quick write-up on how to use a scheduled task to query SQL and save the data to a CSV.

The logic for this works in a few parts. If you simply want to download the script and review it yourself, you can do so from the ConfigMgr GitHub repository here:

https://github.com/MSEndpointMgr/ConfigMgr/blob/master/Modules/Scripts/get-SQLDataPowerShell.ps1

Report Collection Region

The script is made up of several functions followed by an action region that calls the functions in the appropriate order. The first function, which does a large amount of the work, is Invoke-ReportCommand. When working with the script you will need to change the query string if you want to edit or change the SQL query to store different information. The example below gets some basic hardware information and whether or not the device has a cellular network card.

function Invoke-ReportCommand
{
    param
    (
        [Parameter(Mandatory = $False)]
        [string]$ServerName,
        [Parameter(Mandatory = $False)]
        [string]$DataBaseName
    )
    Write-Log "Setting the CSV path to export the data..."
    $DAILYCSVPATH = 'C:\SCRIPTS\REPORTS\' + (Get-Date).ToString('MM-dd-yyyy') + '.CSV'
    #Sets the path the report will be exported to. This could be turned into a parameter - change as needed.
    Write-Log "Successfully set the export path"
    Write-Log -Message "Invoking SQL commands to retrieve information"
    #The command below runs the SQL query. This is the important part - if you want to change
    #what data is returned, simply replace the content inside the double quotes with your own SQL query.
    Invoke-Sqlcmd -ServerInstance $ServerName -Database $DataBaseName -Query "With T1 as (
        select * from v_GS_NETWORK_ADAPTER
        where ProductName0 like '%broadband%'
        )
        Select distinct v_R_System.Name0 as 'Name'
            , v_GS_COMPUTER_SYSTEM.Manufacturer0 as 'Manufacturer'
            , v_GS_COMPUTER_SYSTEM.Model0 as 'Model'
            , v_GS_PC_BIOS.SerialNumber0 as 'Serial Number'
            , v_GS_PROCESSOR.Name0 as 'Processor'
            , v_GS_OPERATING_SYSTEM.TotalVisibleMemorySize0 as 'RAM in MB'
            , v_GS_LOGICAL_DISK.DeviceID0 as 'Drive Letter'
            , (v_GS_LOGICAL_DISK.Size0/1000) as 'Drive Size'
        --    , v_R_System.Client_Version0 as 'Client'
            , T1.ProductName0 as 'Cellular'
        from v_R_System
        left outer join v_GS_OPERATING_SYSTEM on v_R_System.ResourceID = v_GS_OPERATING_SYSTEM.ResourceID
        left outer join v_GS_PC_BIOS on v_R_System.ResourceID = v_GS_PC_BIOS.ResourceID
        left outer join v_GS_PROCESSOR on v_R_System.ResourceID = v_GS_PROCESSOR.ResourceID
        left outer join v_GS_COMPUTER_SYSTEM on v_R_System.ResourceID = v_GS_COMPUTER_SYSTEM.ResourceID
        left outer join v_GS_SYSTEM on v_R_System.ResourceID = v_GS_SYSTEM.ResourceID
        left outer join v_GS_LOGICAL_DISK on v_R_System.ResourceID = v_GS_LOGICAL_DISK.ResourceID
        left outer join T1 on v_R_System.ResourceID = T1.ResourceID
        Where v_GS_SYSTEM.SystemRole0 = 'Workstation' and v_GS_COMPUTER_SYSTEM.Model0 not like '%Virtual%' and v_GS_LOGICAL_DISK.DeviceID0 = 'C:'
        " | Export-Csv -NoTypeInformation -Path $DAILYCSVPATH
    #The query results are piped straight to Export-Csv, and completion is logged before returning.
    Write-Log -Message "Completed the data retrieval"
}
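
If you plan to substitute your own query, it can be worth testing it outside the function first to make sure the connection and output look the way you expect. A minimal sketch along those lines, where the server name, database name, and output path are placeholders you would change for your own environment:

#Quick standalone test - swap in your own site server, CM_<SITECODE> database, and query
Import-Module SQLPS -DisableNameChecking
Invoke-Sqlcmd -ServerInstance 'YourSQLServer' -Database 'CM_P01' -Query "select top 10 Name0 from v_R_System" |
    Export-Csv -NoTypeInformation -Path 'C:\SCRIPTS\REPORTS\test.CSV'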

The second function in this region is a way to clean up the reports that get created. This particular function doesn’t have to be used for reports; it could be used to clean up other files as well if you wanted to copy and paste this snippet.

function Remove-OldReports
#This is a function to remove any old reports from a path location.
{
    Param
    (
        [Parameter(Mandatory = $True)]
        [int]$DaysOld,
        [Parameter(Mandatory = $True)]
        [string]$Path
    )
    try
    {
        $FilesToRemove = Get-ChildItem -File -Recurse -Path $Path -Force | Where-Object {$_.LastWriteTime -le (Get-Date).AddDays($DaysOld)} | Select-Object Name,LastWriteTime,FullName
        #Gets the list of files that meet the criteria specified in the parameter set.
        ForEach($File in $FilesToRemove)
        {
            Write-Log -LogLevel 2 -Message "File was found that met the removal criteria - $($File.Name) will be removed"
            #Logs which item is about to be removed
            Remove-Item -Path $File.FullName
            #Removes the item and then logs that it was successfully removed.
            Write-Log -Message "File $($File.Name) was removed successfully"
        }
    }
    catch
    {
        Write-Log -LogLevel 3 -Message 'Failed to run the removal'
        #Logs an event for the failure to remove.
    }
}
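
If you want to see which files the cleanup would touch before letting it delete anything, you can run the same filter the function uses on its own. For example, against the report folder used later in the script:

#Preview which files are older than 7 days without deleting anything
Get-ChildItem -File -Recurse -Path 'C:\Scripts\Reports\' -Force |
    Where-Object { $_.LastWriteTime -le (Get-Date).AddDays(-7) } |
    Select-Object Name, LastWriteTime, FullName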

I’m not going to review the helper function region here, as most of the functions there are pieces that I have used in other scripts and recycled.

Perform Actions Region

The action region is where all of the functions are called together to perform the various actions. This is the actual ‘script’ section of the whole thing; everything else is a function that can be cut apart and moved around as needed for your environment or other scripts, which reflects how I think about writing scripts. Please note that where Invoke-ReportCommand is called you will need to change the ‘DatabaseName’ parameter to your ConfigMgr database, which is named in the format CM_SITECODE.

############################################
#Region PerformActions
$LogPath = "C:\scripts\Logs\" + (Get-Date -UFormat "%m-%d-%Y-%S") + "_" + $MyInvocation.MyCommand.Name
#Sets the log path for where you want to generate logs - modify the C:\scripts\Logs portion to change the directory; it is recommended not to change the format of the file name.
$LogFile = $LogPath + ".log"
#Appends .log to the end of the path for use when referencing the file itself versus the log location.
Start-Log -FilePath $LogFile
#Sets the logging location globally for the duration of the session
Write-Log -Message "Starting Data Gather"
#Logs the start of the process
Write-Log -Message "Evaluating if the SQL PowerShell cmdlets are loaded"
#Begins evaluation to see if we can run this.
if(Test-Module -ModuleName SQLPS -Remediate $true)
{
    Write-Log -Message "All requirements have been met now executing the data gather step"
    Invoke-ReportCommand -ServerName $ENV:COMPUTERNAME -DataBaseName CM_P01
    #This actually performs the data generation - change the database to match the CM database; future development may automatically determine the database name.
    Write-Log -Message "Data Gathering complete"
}
#Removes reports older than 7 days and logs older than 7 days.
Remove-OldReports -DaysOld '-7' -Path "C:\Scripts\logs\"
Remove-OldReports -DaysOld '-7' -Path "C:\Scripts\Reports\"
Write-Log -Message "Cleanup has been completed now exiting"
#Endregion PerformActions
############################################
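
One of the comments above notes that automatically determining the database name is a possible future improvement. A rough way to approach that, sketched below, is to ask the SQL instance for databases that match the CM_SITECODE naming pattern; the assumption that the script runs on the site database server (and that only one CM_ database lives on the instance) is mine, so adjust as needed:

#Sketch: look up the ConfigMgr database (CM_<sitecode>) on the local SQL instance instead of hard-coding it
$CMDatabase = (Invoke-Sqlcmd -ServerInstance $ENV:COMPUTERNAME -Query "SELECT name FROM sys.databases WHERE name LIKE 'CM[_]___'").name | Select-Object -First 1
Invoke-ReportCommand -ServerName $ENV:COMPUTERNAME -DataBaseName $CMDatabase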

This section is what decides where the logs will be written to and calls all of the functions in the correct order. With a few tweaks the script should work just fine in your environment. Once the script is set up and running, it’s very easy to create a scheduled task that calls it every day to save the data. You could of course easily add a line to e-mail the CSV file instead.
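
If you would rather e-mail the file than leave it on a share, a minimal sketch looks something like the lines below. The SMTP server and addresses are placeholders, and the path simply rebuilds the same date-stamped file name that Invoke-ReportCommand exports to:

#Hypothetical example - adjust the SMTP server and addresses for your environment
$DAILYCSVPATH = 'C:\SCRIPTS\REPORTS\' + (Get-Date).ToString('MM-dd-yyyy') + '.CSV'
Send-MailMessage -SmtpServer 'smtp.contoso.com' -From 'configmgr@contoso.com' -To 'cmdb-team@contoso.com' -Subject 'ConfigMgr daily CSV export' -Body 'Daily inventory export attached.' -Attachments $DAILYCSVPATH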

First we need to create some folders where we want our script and our reports to live. I typically create a folder called C:\scripts and then some sub-folders for things like reports, scheduled tasks, or logs. In our example here we will use C:\scripts\Logs and C:\scripts\Reports.
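
You can create these by hand in File Explorer or with a couple of lines of PowerShell, for example:

#Create the log and report folders used by the script
New-Item -ItemType Directory -Path 'C:\scripts\Logs' -Force | Out-Null
New-Item -ItemType Directory -Path 'C:\scripts\Reports' -Force | Out-Null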

Once the script is copied and working, it’s really easy to create a scheduled task. First we create a new basic scheduled task by opening Task Scheduler and then selecting the basic task option from the right-hand side.

Then select basic task, which will open a new wizard to set up the task.

This will then take you to the trigger step, where you’ll select how often you want the script to run. In my example I’ve selected for it to run every day at 7:30 A.M., which gives me a new inventory file daily.

You’ll then need to choose an action. In this case there are three options; select ‘Start a program’, which will bring you to the next screen, where you’ll indicate that you want to start PowerShell with the following arguments:

-ExecutionPolicy Bypass -NoProfile -WindowStyle Hidden -File C:\scripts\get-xTSsqldataPowerShell.ps1

Note that -File should come last, because anything placed after it is passed to the script rather than to PowerShell itself. This will then bring you to the finish prompt. Select finish, and then you’ll need to make one more change so that the SQL command runs as SYSTEM instead of as a user: select ‘Change User or Group…’ in the task’s properties and change the user to ‘SYSTEM’.

Once this is done simply run the task to validate it works.
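
If you’d rather script the task creation instead of clicking through the wizard, the built-in ScheduledTasks cmdlets can register the same task. A rough sketch assuming the script path and the 7:30 A.M. daily schedule from this walkthrough, with a task name I made up:

#Hypothetical scripted equivalent of the wizard steps above
$Action = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-ExecutionPolicy Bypass -NoProfile -WindowStyle Hidden -File C:\scripts\get-xTSsqldataPowerShell.ps1'
$Trigger = New-ScheduledTaskTrigger -Daily -At '7:30AM'
$Principal = New-ScheduledTaskPrincipal -UserId 'SYSTEM' -LogonType ServiceAccount -RunLevel Highest
Register-ScheduledTask -TaskName 'ConfigMgr Daily CSV Export' -Action $Action -Trigger $Trigger -Principal $Principal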

Jordan Benzing

Jordan has been working in the industry since 2009. Since starting he’s worked with Active Directory, Group Policy, SCCM, SCOM, and PowerShell. Jordan most recently worked in the healthcare industry as an SCCM Infrastructure Team lead supporting over 150,000 endpoints. Jordan currently works as a Senior Consultant for TrueSec Inc in the U.S. Most recently his focus has been on SQL reporting for SCCM, creation of PowerShell scripts to automate tasks, and PowerBI.
