(Findit) Locating previously written scripts

Another utility function that I like to use a lot, especially since I have quite a few scripts, is this one that a friend and I worked on together:

function findit
{
    param
    (
        [Parameter(Mandatory=$True,Position=0)][string]$SearchString,
        [Parameter(Mandatory=$False)]$Path = "$env:USERPROFILE\Documents",
        [Parameter(Mandatory=$False)]$Filter = "*.ps1"
    )
    $launcher = "psedit"
    if($host.Name -eq "ConsoleHost")
    { $launcher = "notepad" }
    elseif($host.Name -eq "Visual Studio Code Host")
    { $launcher = 'code' }

    $s = Get-ChildItem -Path $Path -Filter $Filter -Recurse |
        Select-String $SearchString |
        Select-Object Path, @{n="MatchingLines";e={"$($_.LineNumber.ToString("000")): $($_.Line -replace "^[ \t]*",'')"}} |
        Group-Object Path |
        Select-Object Name, @{n="Matches";e={$_.Group.MatchingLines | Out-String}} |
        Out-GridView -PassThru
    foreach ($t in $s) { iex "$launcher $($t.Name)" }
}

What this does is search through the path you pass, looking for files that contain the string you specify. It then pops up Out-GridView and lets you choose which of the matching files to open.
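The heart of findit is the Select-String-then-Group-Object pipeline. This self-contained sketch (the temp file name, its contents, and the pattern are just examples) shows what that core produces before Out-GridView gets involved:

```powershell
# Build a throwaway script file so the example is self-contained.
$tmp = Join-Path ([System.IO.Path]::GetTempPath()) 'findit-demo.ps1'
Set-Content -Path $tmp -Value @('Get-Process', 'Get-Service', 'Get-Process -Name pwsh')

# Same pipeline core as findit: find the matches, then group the hits by file.
$hits = Select-String -Path $tmp -Pattern 'Get-Process' |
    Select-Object Path, @{n='MatchingLines';e={"$($_.LineNumber.ToString('000')): $($_.Line)"}} |
    Group-Object Path

$hits.Count                  # 1 group: one file matched
$hits[0].Group.MatchingLines # two matching lines, each prefixed with its line number
Remove-Item $tmp
```

Each group's Name is a file path, so anything you pick in the grid view can be handed straight to an editor.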

Until then

Keep Scripting


Parsing CCM\Logs

If you’ve ever worked with Configuration Manager, you’ll know there are quite a few logs on the client side. Opening and searching through them for actions that have taken place can be quite a task. I needed to find when an item was logged during the initial startup/build of a VM, so I sought out tools to parse these logs and find the client-side status of Configuration Manager. This post is about the tools/scripts I found and what I added to them to make it easier to discover and parse all the log files.

I started with the need to simply parse the log files. I discovered that Rich Prescott in the community had already done the work of parsing them with this script:

http://blog.richprescott.com/2017/07/sccm-log-parser.html

With that script in hand, I made two changes. The first change allows all the files in a directory to be added to the return object.

if(($Path -isnot [array]) -and (Test-Path $Path -PathType Container))
{
    $Path = Get-ChildItem "$Path\*.log"
}

The second change allows the user to specify a tail amount, so that just a portion of the end of the log is retrieved instead of the entire log. The full script can be found in one of my gists, at the tail end of this article.

if($tail)
{
    $lines = Get-Content -Path $File -Tail $tail
}
else
{
    $lines = Get-Content -Path $File
}
ForEach($l in $lines)

 


function Get-CMLog
{
<#
.SYNOPSIS
Parses logs for System Center Configuration Manager.
.DESCRIPTION
Accepts a single log file or array of log files and parses them into objects. Shows both UTC and local time for troubleshooting across time zones.
.PARAMETER Path
Specifies the path to a log file or files.
.INPUTS
Path/FullName.
.OUTPUTS
PSCustomObject.
.EXAMPLE
C:\PS> Get-CMLog -Path Sample.log
Converts each log line in Sample.log into objects
UTCTime : 7/15/2013 3:28:08 PM
LocalTime : 7/15/2013 2:28:08 PM
FileName : sample.log
Component : TSPxe
Context :
Type : 3
TID : 1040
Reference : libsmsmessaging.cpp:9281
Message : content location request failed
.EXAMPLE
C:\PS> Get-ChildItem -Path C:\Windows\CCM\Logs | Select-String -Pattern 'failed' | Select -Unique Path | Get-CMLog
Find all log files in folder, create a unique list of files containing the phrase 'failed', and convert the logs into objects
UTCTime : 7/15/2013 3:28:08 PM
LocalTime : 7/15/2013 2:28:08 PM
FileName : sample.log
Component : TSPxe
Context :
Type : 3
TID : 1040
Reference : libsmsmessaging.cpp:9281
Message : content location request failed
.LINK
http://blog.richprescott.com
#>
param(
[Parameter(Mandatory=$true,
Position=0,
ValueFromPipelineByPropertyName=$true)]
[Alias("FullName")]
$Path,
$tail =10
)
PROCESS
{
if(($Path -isnot [array]) -and (Test-Path $Path -PathType Container))
{
$Path = Get-ChildItem "$path\*.log"
}
foreach ($File in $Path)
{
if(!( test-path $file))
{
$Path +=(Get-ChildItem "$file*.log").fullname
}
$FileName = Split-Path -Path $File -Leaf
if($tail)
{
$lines = Get-Content -Path $File -Tail $tail
}
else {
$lines = Get-Content -Path $file
}
ForEach($l in $lines ){
$l -match '\<\!\[LOG\[(?<Message>.*)?\]LOG\]\!\>\<time=\"(?<Time>.+)(?<TZAdjust>[+|-])(?<TZOffset>\d{2,3})\"\s+date=\"(?<Date>.+)?\"\s+component=\"(?<Component>.+)?\"\s+context="(?<Context>.*)?\"\s+type=\"(?<Type>\d)?\"\s+thread=\"(?<TID>\d+)?\"\s+file=\"(?<Reference>.+)?\"\>' | Out-Null
if($matches)
{
$UTCTime = [datetime]::ParseExact($("$($matches.date) $($matches.time)$($matches.TZAdjust)$($matches.TZOffset/60)"),"MM-dd-yyyy HH:mm:ss.fffz", $null, "AdjustToUniversal")
$LocalTime = [datetime]::ParseExact($("$($matches.date) $($matches.time)"),"MM-dd-yyyy HH:mm:ss.fff", $null)
}
[pscustomobject]@{
UTCTime = $UTCTime
LocalTime = $LocalTime
FileName = $FileName
Component = $matches.component
Context = $matches.context
Type = $matches.type
TID = $matches.TID
Reference = $matches.reference
Message = $matches.message
}
}
}
}
}
function Get-CCMLog
{
param([Parameter(Mandatory=$true,Position=0)]$ComputerName = "$env:COMPUTERNAME", [Parameter(Mandatory=$true,Position=1)]$path = 'c:\windows\ccm\logs')
DynamicParam
{
$ParameterName = 'Log'
if($path.ToCharArray() -contains ':')
{
$FilePath = "\\$($ComputerName)\$($path -replace ':','$')"
}
else
{
$FilePath = "\\$($ComputerName)\$((get-item $path).FullName -replace ':','$')"
}
$logs = Get-ChildItem "$FilePath\*.log"
$LogNames = $logs.basename
$logAttribute = New-Object System.Management.Automation.ParameterAttribute
$logAttribute.Position = 2
$logAttribute.Mandatory = $true
$logAttribute.HelpMessage = 'Pick A log to parse'
$logCollection = New-Object System.Collections.ObjectModel.Collection[System.Attribute]
$logCollection.add($logAttribute)
$logValidateSet = New-Object System.Management.Automation.ValidateSetAttribute($LogNames)
$logCollection.add($logValidateSet)
$logParam = New-Object System.Management.Automation.RuntimeDefinedParameter($ParameterName,[string],$logCollection)
$logDictionary = New-Object System.Management.Automation.RuntimeDefinedParameterDictionary
$logDictionary.Add($ParameterName,$logParam)
return $logDictionary
}
begin {
# Bind the parameter to a friendly variable
$Log = $PsBoundParameters['Log']
}
process {
$sb2 = "$((Get-ChildItem function:get-cmlog).scriptblock)`r`n"
$sb1 = [scriptblock]::Create($sb2)
$results = Invoke-Command -ComputerName $ComputerName -ScriptBlock $sb1 -ArgumentList "$path\$log.log"
[PSCustomObject]@{"$($log)Log"=$results}
}
}
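To see what the parsing regular expression in Get-CMLog extracts, here it is applied to a single hand-made log line; every field value below is an example, not taken from a real log:

```powershell
# One made-up line in the ConfigMgr client log format.
$l = '<![LOG[content location request failed]LOG]!><time="15:28:08.123-420" date="07-15-2013" component="TSPxe" context="" type="3" thread="1040" file="libsmsmessaging.cpp:9281">'

# Same regex Get-CMLog uses; a successful match populates $Matches.
$l -match '\<\!\[LOG\[(?<Message>.*)?\]LOG\]\!\>\<time=\"(?<Time>.+)(?<TZAdjust>[+|-])(?<TZOffset>\d{2,3})\"\s+date=\"(?<Date>.+)?\"\s+component=\"(?<Component>.+)?\"\s+context="(?<Context>.*)?\"\s+type=\"(?<Type>\d)?\"\s+thread=\"(?<TID>\d+)?\"\s+file=\"(?<Reference>.+)?\"\>' | Out-Null

$Matches.Component   # TSPxe
$Matches.Message     # content location request failed
$Matches.TZOffset    # 420
```

Each named capture group becomes a property on the PSCustomObject the function emits.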


I hope this helps someone.

Until then

Keep scripting

Thom

Uploading files to Azure Applications (kudu)

I needed to copy some content to my Azure application, content that the build and deploy I constructed for it shouldn't have to push on every deploy. So my quest began: how do I upload files to an Azure application? The most common and recognized way of uploading files to Azure applications is through Web Deploy. I didn't think I needed to package everything up just to use Web Deploy, so I sought out a way to do this with PowerShell. This post is about that pursuit.

Thanks to this article, most of the work was already done: Copy files to Azure Web App with PowerShell and Kudu API. All I needed to do was put a loop around my file upload and use Octavie van Haaften's scripts.

So I started with get-childitem -Recurse "$downloadFolder\content". Now that I had my content in a variable called $files, I could put this in a foreach loop and use Octavie van Haaften's Upload-FileToWebApp.

During the upload of the files, I needed to determine whether each item from my local disk was a file or a directory. I used the following classes to determine this:

[System.IO.DirectoryInfo] &  [System.IO.FileInfo]

If the item was a directory, I had to make the upload location match the location on disk. I did this with a little replacement logic, using $kudufolder as the variable passed to Octavie's upload function.


$kudufolder = ((($file.FullName).Replace($uploadfrom,'Content'))`
.replace('\','/')).trimstart('/')
$kudufolder = "$kudufolder/"
Upload-FileToWebApp -resourceGroupName myresourcegroup`
-webAppName mywebapp -kuduPath $kudufolder

The same holds true for the upload of a file. The only difference between a file and a directory is the /: when you are uploading/creating, a trailing / tells Kudu the path is a directory.


$kudufile = ((($file.FullName).Replace($uploadfrom,'Content'))`
.replace('\','/')).trimstart('/')
Upload-FileToWebApp -resourceGroupName myresourcegroup`
-webAppName mywebapp -localPath $file.FullName -kuduPath $kudufile

Here is the full script in the foreach loop with each check for a directory or file.


$downloadfolder = 'c:\temp\myAzureStorage'

$uploadfrom = "$downloadfolder\Content"

$files = get-childitem -Recurse "$downloadfolder\Content"

foreach($file in $files)
{
if($file -is [System.IO.DirectoryInfo])
{
$kudufolder = ((($file.FullName).Replace($uploadfrom,'Content')).replace('\','/')).trimstart('/')
$kudufolder = "$kudufolder/"
Upload-FileToWebApp -resourceGroupName myresourcegroup -webAppName mywebapp -kuduPath $kudufolder
}
elseif($file -is [System.IO.FileInfo])
{
$kudufile = ((($file.FullName).Replace($uploadfrom,'Content')).replace('\','/')).trimstart('/')
Upload-FileToWebApp -resourceGroupName myresourcegroup -webAppName mywebapp -localPath $file.FullName -kuduPath $kudufile
}
}
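The path transformation can be checked in isolation; the folder and file names below are made up purely for the demonstration:

```powershell
# Hypothetical local layout mirroring the script above.
$uploadfrom = 'c:\temp\myAzureStorage\Content'
$fullName   = 'c:\temp\myAzureStorage\Content\css\site.css'

# Swap the local prefix for 'Content', flip the slashes, drop any leading '/'.
$kudufile = (($fullName.Replace($uploadfrom,'Content')).Replace('\','/')).TrimStart('/')
$kudufile   # → Content/css/site.css
```

For a directory the script then appends a trailing /, which is what tells Kudu to create a folder rather than a file.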


I hope this helps someone
Until then keep Scripting
Thom


 

Adding a user to a Group in Dynamics CRM 2016

Recently I’ve had to add a user to Dynamics CRM 2016. This post is about how I did that with a module from the PowerShell Gallery.

The first thing I needed to do was find something available for use against Dynamics CRM 2016. I searched the PowerShell Gallery and found this module: Microsoft.Xrm.Data.Powershell. In addition, I found some handy samples for working with this module here: Microsoft.Xrm.Data.PowerShell.Samples

I was able to take these samples and come up with a usable script to add users to a group in this application. My purpose was to do the unthinkable: add users to an Admin group. While probably not the best thing to do, in my situation it was what I needed. So here is how I began.

I looked at this sample: UpdateCrmUserSettings.ps1

This helped me immensely in figuring out how to connect to my CRM instance:


$adminUserCredentials = get-credential

$organizationName = 'MyOrg'

$serverUrl = 'http://mycrmserver.mycompany.com:80'

$loadedandCorrectVersion = (get-command -module 'Microsoft.Xrm.Data.Powershell' -ErrorAction Ignore).version -eq '2.5'
if(-not $loadedandCorrectVersion)
{
find-module -Name Microsoft.Xrm.Data.Powershell -MinimumVersion 2.5 -MaximumVersion 2.5 | Install-Module -Scope CurrentUser -AllowClobber -Force
Import-Module -Name Microsoft.Xrm.Data.Powershell -MinimumVersion 2.5 -MaximumVersion 2.5 -Force -RequiredVersion 2.5
}

$xcrmConn = Get-CrmConnection -OrganizationName $OrganizationName -ServerUrl $ServerUrl  -Credential $AdminUserCredentials 

I added some "plumbing" to force PowerShell to ensure that only version 2.5 of the module is downloaded and imported into the session where I'm going to run this. $xcrmConn is the connection used in every subsequent call in this update of our user. According to the documentation you can specify this connection as a global variable; I chose not to, so that I could see what is going on in each call I make to this module.

The next task was to figure out how to get all the users. There are a bunch of different cmdlets available, from Get-CrmCurrentUserId to Get-MyCrmUserId, as you can see below:


ps:\crm> get-command get*crm*user*

CommandType Name                     Version Source
----------- ----                     ------- ------
Alias       Get-CrmCurrentUserId     2.5     Microsoft.Xrm.Data.Powershell
Function    Get-CrmUserMailbox       2.5     Microsoft.Xrm.Data.Powershell
Function    Get-CrmUserPrivileges    2.5     Microsoft.Xrm.Data.Powershell
Function    Get-CrmUserSecurityRoles 2.5     Microsoft.Xrm.Data.Powershell
Function    Get-CrmUserSettings      2.5     Microsoft.Xrm.Data.Powershell
Function    Get-MyCrmUserId          2.5     Microsoft.Xrm.Data.Powershell

None of them really seemed to address how to get all users, or one specific user. That is when I turned back to the samples and found this command:

Get-CrmRecords

What I discovered is that you have to understand how to use the filters. The first thing I tried was getting all the users in CRM.


$users = Get-CrmRecords -EntityLogicalName systemuser -conn $xcrmconn -Fields systemuserid,fullname

After several runs (trial and error), I arrived at a workable call to Get-CrmRecords for an individual user.

In order to add a user as an admin, we'll need the user's id. We'll also need the id of the security role we are going to add them to.


$sysAdminrole = 'System Administrator'
#user we are going to filter on must be in the syntax of last, first
$userObject = Get-CrmRecords -conn $xcrmconn -EntityLogicalName systemuser -FilterOperator eq -FilterAttribute domainname -FilterValue "$domainUsername" -Fields domainname,fullname

$userGuid = (($userObject | Where-Object {$_.keys -eq "crmRecords"}).values).systemuserid.guid
$userName = (($userObject | Where-Object {$_.keys -eq "crmRecords"}).values).fullname
if($userGuid)
{$userRoles = (Get-CrmUserSecurityRoles -conn $xcrmconn -UserId $userGuid).roleid.guid}
else
{
Throw "$DomainUsername not found in $ServerUrl and Organization $OrganizationName"
}

$adminObject = Get-CrmRecords -conn $xcrmconn -EntityLogicalName systemuser -FilterOperator eq -FilterAttribute domainname -FilterValue "$($AdminUserCredentials.username)" -Fields domainname,fullname
#get the admins guid and user name
$adminId = (($adminObject | Where-Object {$_.keys -eq "crmRecords"}).values).systemuserid.guid
$AdminUserName = (($adminObject | Where-Object {$_.keys -eq "crmRecords"}).values).fullname
$adminRoleObject = Get-CrmUserSecurityRoles -conn $xcrmConn -UserId $adminId | Where-Object {$_.rolename -eq $sysAdminrole}
$adminroles = ($adminRoleObject).roleid.guid
$adminRoleName = $adminroleobject.rolename

Now that I have the required items for adding the role, all I need to do is make sure the role isn't already there, then add the security role id to the user. The result is a user with the System Administrator role added.

Full Script Follows:

#requires -module PowerShellGet
<#
.SYNOPSIS
A brief description of the updateusers.ps1 file.
.DESCRIPTION
A detailed description of the updateusers.ps1 file.
.PARAMETER ServerUrl
A description of the ServerUrl parameter.
.PARAMETER OrganizationName
Organization name in CRM, for example: yourorg
.PARAMETER DomainUsername
User name to add as an admin to CRM, for example: domain\domainuser
.PARAMETER AdminUserCredentials
Credentials that have admin privileges on the URL passed.
.EXAMPLE
PS C:\> .\updateusers.ps1 -DomainUsername 'domain\user1' -AdminUserCredentials (Get-Credential)
.NOTES
Additional information about the file.
#>
param
(
[string]$ServerUrl = 'http://yourCrminstance.yourname.com:80',
[string]$OrganizationName = 'YourInstance',
[Parameter(Mandatory = $true)]
[string]$DomainUsername = 'domain\domainuser',
[pscredential]$AdminUserCredentials = (Get-Credential)
)
$loadedandCorrectVersion = (get-command -module 'Microsoft.Xrm.Data.Powershell' -ErrorAction Ignore).version -eq '2.5'
if(-not $loadedandCorrectVersion)
{
find-module -Name Microsoft.Xrm.Data.Powershell -MinimumVersion 2.5 -MaximumVersion 2.5 | Install-Module -Scope CurrentUser -AllowClobber -Force
Import-Module -Name Microsoft.Xrm.Data.Powershell -MinimumVersion 2.5 -MaximumVersion 2.5 -Force -RequiredVersion 2.5
}
if(get-command -module 'Microsoft.Xrm.Data.Powershell')
{
$xcrmConn = Get-CrmConnection -OrganizationName $OrganizationName -ServerUrl $ServerUrl -Credential $AdminUserCredentials -Verbose
#https://github.com/seanmcne/Microsoft.Xrm.Data.PowerShell.Samples/blob/master/Microsoft.Xrm.Data.PowerShell.Samples/UpdateCrmUsersSettings/UpdateCrmUsersSettings.ps1

#get the necessary object for the admin
$sysAdminrole = 'System Administrator'
#user we are going to filter on must be in the syntax of last, first
$userObject = Get-CrmRecords -conn $xcrmconn -EntityLogicalName systemuser -FilterOperator eq -FilterAttribute domainname -FilterValue "$domainUsername" -Fields domainname,fullname
$userGuid = (($userObject | Where-Object {$_.keys -eq "crmRecords"}).values).systemuserid.guid
$userName = (($userObject | Where-Object {$_.keys -eq "crmRecords"}).values).fullname
if($userGuid)
{$userRoles = (Get-CrmUserSecurityRoles -conn $xcrmconn -UserId $userGuid).roleid.guid}
else
{
Throw "$DomainUsername not found in $ServerUrl and Organization $OrganizationName"
}
$adminObject = Get-CrmRecords -conn $xcrmconn -EntityLogicalName systemuser -FilterOperator eq -FilterAttribute domainname -FilterValue "$($AdminUserCredentials.username)" -Fields domainname,fullname
#get the admins guid and user name
$adminId = (($adminObject | Where-Object {$_.keys -eq "crmRecords"}).values).systemuserid.guid
$AdminUserName = (($adminObject | Where-Object {$_.keys -eq "crmRecords"}).values).fullname
$adminRoleObject = Get-CrmUserSecurityRoles -conn $xcrmConn -UserId $adminId | Where-Object {$_.rolename -eq $sysAdminrole}
$adminroles = ($adminRoleObject).roleid.guid
$adminRoleName = $adminroleobject.rolename
if($adminRoleName -eq $sysAdminrole)
{
if($userroles -like $adminroles)
{
Write-Output "$DomainUsername is already an admin"
}
else
{
Add-CrmSecurityRoleToUser -conn $xcrmconn -UserId $userGuid -SecurityRoleId $adminroles
Write-Output "$DomainUsername Added to AdminRole $adminRoleName"
}
}
else
{
Write-Warning "The $($AdminUserCredentials.username) doesn't have the role of 'System Administrator'"
}
}
else
{ throw "cannot load the PowerShell module 'Microsoft.Xrm.Data.Powershell'" }

I hope this helps someone

Until then keep Scripting

Thom

Slicing and Dicing Log Files

 


This post is part of the #PSBlogWeek PowerShell blogging series. #PSBlogWeek is a regular event where anyone interested in writing great content about PowerShell is welcome to volunteer for. The purpose is to pool our collective PowerShell knowledge together over a 5-day period and write about a topic that anyone using PowerShell may benefit from. #PSBlogWeek is a Twitter hashtag so feel free to stay up to date on the topic on Twitter at the #PSBlogWeek hashtag. For more information on #PSBlogWeek or if you’d like to volunteer for future sessions, contact Adam Bertram (@adbertram) on Twitter.


To begin to slice or dice a log file, first we need to get it into a variable.

To open a log file or text file, open a PowerShell prompt and run a Get-Content command. For example, this command uses Get-Content to get the content of the License.txt file in the local path.

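A minimal, self-contained version of that command; since the real License.txt isn't reproduced here, the sketch creates a small stand-in first:

```powershell
# Create a two-line stand-in for License.txt; the content is hypothetical.
Set-Content -Path .\License.txt -Value @(
    'Copyright (c) 2010-2013 Keith Dahlby and contributors',
    'Permission is hereby granted, free of charge, to any person'
)

# Read the file back; each line comes out as one string.
Get-Content .\License.txt
```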

Now that we know how to use Get-Content, we can save the results in whatever variable we choose. In my case, I chose $file.


Now that we have the contents of our file in the $file variable, we can begin to chop this file up.

To find out how many lines are in our file, type $file.Count. This expression uses the Count property of the string array in $file to get the number of array elements. By default, Get-Content reads the contents of the file into a System.Array type, with each line placed in a separate array element. To see the first line of our file, type:

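A self-contained sketch, using two stand-in lines in place of the real License.txt content:

```powershell
# Stand-in for the array Get-Content would return.
$file = @(
    'Copyright (c) 2010-2013 Keith Dahlby and contributors',
    'Permission is hereby granted, free of charge, to any person'
)

$file.Count   # → 2 (one element per line)
$file[0]      # the first line of the file
```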

As you can see, this is the first line of our License.txt file.

Now, if we want to search through this line for matches on the word 'and', we can use the -Match operator.

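For example, matching against a hypothetical first line:

```powershell
$line = 'Copyright (c) 2010-2013 Keith Dahlby and contributors'

$line -match 'and'   # → True, and $Matches is populated
$Matches[0]          # → and
```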

The -Match operator automatically populates the $Matches variable with the first match of the pattern specified after it. If we had used -Match on the entire file, our result would be the set of matching lines. For brevity, only the count of the lines matching 'and' is shown:

PS Ps:\> ($file -match 'and').count
4
PS Ps:\> ($file).count
10
 

As you can see, the total count for $file is 10 and the matched number is 4.

If we wanted to replace the word 'and', instead of matching it, we could use the -Replace operator.

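Applied to a single hypothetical line, the replacement looks like this:

```powershell
$line = 'Copyright (c) 2010-2013 Keith Dahlby and contributors'

$line -replace 'and', 'replaced And'
# → Copyright (c) 2010-2013 Keith Dahlby replaced And contributors
```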

As you can see, the 'and' in the string is now replaced with 'replaced And'.

To do the same operation on the entire file, use the -Replace operator:

PS ps:\> $file -replace 'and', 'replaced And'

Copyright (c) 2010-2013 Keith Dahlby replaced And contributors Permission is hereby granted,
free of charge, to any person obtaining a copy of this software replaced And
associated documentation files (the "Software"), to deal in the Software
without restriction, including without limitation the rights to use, copy,
modify, merge, publish, distribute, sublicense, replaced And/or sell
copies of the Software, replaced And to permit persons to whom the
Software is furnished to do so, subject to the following conditions:

The above copyright notice replaced And this permission notice shall be included
in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED,
INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
PARTICULAR PURPOSE replaced And NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN
AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

To save the file, pipe the results to the Set-Content cmdlet with the path of where you want it saved.

To read the file in as one string, so there is no array, specify the Raw dynamic parameter. It will save the contents of the file as a single string in the variable you specify.

$file = get-content -Raw .\LICENSE.txt

To separate this file into two chunks, we'll use the Substring method of String.

$file.substring(0,542) | out-file .\LICENSE1.txt

The first value passed to Substring is the position of the character in our $file string where we want to start (beginning at 0), and the second number is how many characters to take from that point. So we end up with the first half of our original License.txt in LICENSE1.txt.

It’s even easier to get the second half of the file:

$file.substring(542) | out-file .\LICENSE542.txt

Substring starts from position 542 and goes to the end of the string.
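Since the second argument to Substring is a character count rather than an end position, a quick check on a short example string makes the behavior clear:

```powershell
$s = 'PowerShell slices strings'

$s.Substring(0,10)   # → PowerShell  (start at index 0, take 10 characters)
$s.Substring(11)     # → slices strings  (from index 11 to the end)
```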

Using a license file is not very practical, but we did figure out how to read a file into a variable, match items in the result, get a count, and split the file in two. Now let's move on to logs that IIS produces.

These files can be found in C:\inetpub\logs\logfiles\w3svc(x), where (x) is the number of the site that is logging data for the site.

After inspecting the logs for this web server, we discover a great number of small files without much information in them. Since this is just a local desktop, we haven't worked much with IIS here, so the first thing we'll do is consolidate all these log files into a single log, and then begin the process of drilling into the information in the consolidated log file.

PS C:\inetpub\logs\logfiles\w3svc1> dir

    Directory: C:\inetpub\logs\logfiles\w3svc1

Mode          LastWriteTime Length Name
----          ------------- ------ ----
-a----    3/7/2015 10:11 AM    338 u_ex150307.log
-a----   6/14/2015  7:03 PM   5522 u_ex150615.log
-a----   6/15/2015  8:27 PM    472 u_ex150616.log
-a----   8/17/2015  9:19 PM    593 u_ex150818.log
-a----   8/19/2015  6:43 AM   2616 u_ex150819.log
-a----   8/19/2015  7:36 PM   1516 u_ex150820.log
-a----   8/20/2015  6:08 PM    761 u_ex150821.log
-a----   8/23/2015 10:36 AM   3666 u_ex150822.log
-a----   8/24/2015  5:42 PM    761 u_ex150825.log
-a----   8/26/2015  4:04 PM   2117 u_ex150826.log
-a----   8/30/2015  8:59 AM   2574 u_ex150829.log
-a----   9/24/2015  8:27 PM    763 u_ex150925.log
-a----   9/28/2015  8:53 PM   1457 u_ex150929.log
-a----   10/3/2015  8:46 AM    769 u_ex151001.log
-a----   11/9/2015  7:52 PM   1469 u_ex151110.log

As you can see, we have

(dir).count
22

...22 files in our IIS logging folder. Get-Content takes a Path value by property name, so you can pipe Get-ChildItem to Get-Content without the need for a foreach loop.

   dir | get-content | set-content iislog.txt 
 

Now that we have the content of all the files in iislog.txt, we want to filter out anything that isn't really a log entry. This can be done with a regular expression. The one we'll use is this filter:

'(\d+)-(\d+)-(\d+) (\d+):(\d+):(?:\d+) ::'

This will match items like this one: 2015-11-10 02:36:54 ::1

(get-content .\iislog.txt) -match '(\d+)-(\d+)-(\d+) (\d+):(\d+):(?:\d+) ::' | Set-Content filterediislog.txt
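To see the filter in action without touching real IIS logs, here it is applied to two hypothetical lines, one header comment and one real-looking entry:

```powershell
# Two made-up lines: an IIS header comment and a log entry.
$sample = @(
    '#Software: Microsoft Internet Information Services 8.5',
    '2015-11-10 02:36:54 ::1 GET /favicon.ico - 80 - ::1 Mozilla/5.0 - 404 0 2 4'
)

# -Match against an array returns only the elements that match.
$kept = $sample -match '(\d+)-(\d+)-(\d+) (\d+):(\d+):(?:\d+) ::'
@($kept).Count   # → 1: the header line is dropped
```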

Now that our log file is trimmed and the excess items at the top of the log are chopped off, we can begin discovering interesting items about it.

How many 404s has this web server experienced? We can find out by reading the log file back into a variable and using the -Match operator to find the 404s:

PS C:\inetpub\logs\LogFiles\W3SVC1> $iislog = get-content .\filterediislog.txt
PS C:\inetpub\logs\LogFiles\W3SVC1> $404Logs = $iislog -match ' 404 '
PS C:\inetpub\logs\LogFiles\W3SVC1> $404Logs.count
16
PS C:\inetpub\logs\LogFiles\W3SVC1> ($iislog -match ' 404 ').count
16

As you can see from the scripting above, we used Get-Content to read the file, then the -Match operator to get the number of matches. Since we didn't use the -Raw switch on Get-Content, the result is an array of lines, so -Match returns every matching element in the collection. Here are the first two matches in the results:

PS C:\inetpub\logs\LogFiles\W3SVC1> $iislog -match ' 404 ' | select -first 2

2015-06-15 00:05:31 ::1 GET /t/curl_http_..clusterexternalnic_8090.storage.172.25.3.20.nodestat - 80 - ::1 Mozilla/5.0+(Windows+NT+6.3;+WOW64)+AppleWebKit/537.36+(KHTML,+like+Gecko)+Chrome/43.0.2357.124+Safari/537.36 - 404 3 50 496
2015-06-15 00:05:31 ::1 GET /favicon.ico - 80 - ::1 Mozilla/5.0+(Windows+NT+6.3;+WOW64)+AppleWebKit/537.36+(KHTML,+like+Gecko)+Chrome/43.0.2357.124+Safari/537.36 http://localhost/t/curl_http_..clusterexternalnic_8090.storage.172.25.3.20.nodestat 404 0 2 4

Now that we have the number of 404s in our log file, we can see which queries resulted in a 404. To achieve this, we'll use the first result to figure out a regular expression that pulls the query names out of the log file.


PS C:\inetpub\logs\LogFiles\W3SVC1> $iislog -match ' 404 ' | select -first 1

2015-06-15 00:05:31 ::1 GET /t/curl_http_..clusterexternalnic_8090.storage.172.25.3.20.nodestat - 80 - ::1 Mozilla/5.0+(Windows+NT+6.3;+WOW64)+AppleWebKit/537.36+(KHTML,+like+Gecko)+Chrome/43.0.2357.124+Safari/537.36 - 404 3 50 496

The page name in the log starts with a / and ends with a space followed by a -. So we'll reuse the regular expression from earlier, add ::1 GET to it, and strip the prefix off into a $404filtered variable:

$404filtered = ($iislog -match ' 404 ') -replace '(\d+)-(\d+)-(\d+) (\d+):(\d+):(?:\d+) ::1 GET',''

Note: This works for this log file because the address written after the :: is a local loopback address, so you may have to change this based on your scenario.

Below is the first item in the array of 16, based on the -Replace string above.

PS C:\inetpub\logs\LogFiles\W3SVC1> $404filtered | select -first 1

/t/curl_http_..clusterexternalnic_8090.storage.172.25.3.20.nodestat - 80 - ::1 Mozilla/5.0+(Windows+NT+6.3;+WOW64)+AppleWebKit/537.36+(KHTML,+like+Gecko)+Chrome/43.0.2357.124+Safari/537.36 - 404 3 50 496

Now we need to remove the trailing string to leave just the failing page. Again we can do this with a regular expression; this time we'll use the expression '-.*'.

PS C:\inetpub\logs\LogFiles\W3SVC1> $404filtered -replace '-.*',''
/t/curl_http_..clusterexternalnic_8090.storage.172.25.3.20.nodestat
/favicon.ico
/t/console_print.css

For brevity's sake, we've only selected the first three failing pages. Now we can take these files to the developers and ask whether they should be there, or whether some other file in the website is hitting them when it shouldn't.

As you can see, you can do a number of things with very few commands to split, join, or search through logs and drill into the information that will help you the most. PowerShell has many other methods and commands that can be used to search through files and operate on them. In my next blog post I'll go through some of the other methods and ways that will make dealing with logs that much easier.

For more really good examples, see this blog post, which was used as a primary source for the IIS regular expressions:

https://infracloud.wordpress.com/2015/09/28/iis-log-parsing-using-powershell/

For many other regular expressions that can be used with PowerShell, see this website:

http://regexlib.com/Search.aspx?k=iis&c=-1&m=-1&ps=20

Until next time keep on scripting!

Thom