Merging hashtables

Hash tables in PowerShell are very useful and can be used for many things. Recently I had to merge hash tables using some code I found on Stack Overflow. This post is about my experience and the really cool piece of code that iRon posted on Stack Overflow.

First I need to log in to Azure and find my application:


add-azurermaccount

get-azurermresource

I see my resource in the Get-AzureRmResource output, so now I know that I can query for its app settings using this command:


$myapp = Get-AzureRmWebAppSlot -resourcegroupname myresourcegroup -name myresourcename -slot production

This produces an object that contains all my web application settings in Azure for the app in question. The item I want to work on is .SiteConfig.AppSettings.

 

This portion of the object has the properties of the app settings shown in the Azure blade below:

[screenshot: Application settings blade]


PS C:\Users\me> $myApp.siteconfig.AppSettings

Name                         Value
----                         -----
WEBSITE_NODE_DEFAULT_VERSION 6.9.1

Now that I have the current version of what is in my application, I need to work out how to put new settings in place without wiping out any existing ones. The cmdlet to do the addition is Set-AzureRmWebAppSlot. Looking through the help, I can see there is a parameter I can pass for the settings I want, called -AppSettings. Like most of the other settings parameters, it requires a hash table: [[-AppSettings] <Hashtable>]
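
As a quick aside (my own addition, not part of the original workflow), one way to confirm the expected type is to inspect the cmdlet's parameter metadata:

# Confirm that -AppSettings expects a [Hashtable]; assumes the AzureRM module is loaded
(Get-Command Set-AzureRmWebAppSlot).Parameters['AppSettings'].ParameterType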

So $myApp.SiteConfig.AppSettings is a list:


$appSettings = $Myapp.siteconfig.appsettings
$appsettings -is [pscustomobject]
False
$appsettings.gettype()

IsPublic IsSerial Name BaseType
-------- -------- ---- --------
True True List`1 System.Object

$appsettings -is [hashtable]
False

This means I need to convert my object from List`1 to a hashtable so I’ll iterate through it and create a hashtable:


$appSettingsHash = @{}
foreach($k in $appSettings) { $appSettingsHash[$k.name] = $k.value }
$appsettingshash

Name Value
---- -----
WEBSITE_NODE_DEFAULT_VERSION 6.9.1

$appsettingshash -is [hashtable]

True

OK, now that I have my current settings in a hashtable, I need to build a hashtable of the entries that I want to add and then post it.


$appSettings ='{"AppSettings:testkey1": "45test","AppSettings:TestId": "This is a Test Key 28"}'

$newAppSettings = $appSettings | convertfrom-json 

$newAppSettingsHash = @{}
 $newAppSettings.psobject.properties | ForEach-Object { $newAppSettingsHash[$_.Name] = $_.Value }

$newappsettingsHash -is [hashtable]
True

This is where the magic of iRon's script comes into play. Since I need to use this in a deployment from TFS, I create the hashtable in JSON format first and then convert the JSON to a [hashtable]. Then I call iRon's script with $newAppSettingsHash and $appSettingsHash. Now I have a merged hashtable that I can update my application with.


Function Merge-Hashtables([ScriptBlock]$Operator) {
$Output = @{}
ForEach ($Hashtable in $Input) {
If ($Hashtable -is [Hashtable]) {
ForEach ($Key in $Hashtable.Keys) {$Output.$Key = If ($Output.ContainsKey($Key)) {@($Output.$Key) + $Hashtable.$Key} Else {$Hashtable.$Key}}
}
}
If ($Operator) {ForEach ($Key in @($Output.Keys)) {$_ = @($Output.$Key); $Output.$Key = Invoke-Command $Operator}}
$Output
}

$hashtable = $newAppSettingsHash, $appSettingsHash | Merge-Hashtables {$_[0]} 
$results = Set-AzureRmWebAppSlot -AppSettings $hashtable -name $website -ResourceGroupName $resourceGroup -slot $slot
$r = $results.SiteConfig.AppSettings
Write-Output $r

The really cool thing about the Merge-Hashtables function is that you can merge more than two hash tables. See this comment from iRon about how it works:

 

[screenshot: iRon's comment explaining how multiple hashtables are merged]
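
Since the screenshot does not reproduce here, below is a minimal sketch of merging three hashtables with the Merge-Hashtables function shown above. The keys and values are hypothetical; the scriptblock decides which value wins when a key appears more than once:

# Hypothetical settings tables; assumes the Merge-Hashtables function above is loaded
$perApp   = @{ Timeout  = 60 }
$siteWide = @{ Timeout  = 30; LogLevel = 'Warn' }
$defaults = @{ Retries  = 3;  LogLevel = 'Info' }

# Hashtables earlier in the pipeline win, because {$_[0]} picks the first value collected for each key
$perApp, $siteWide, $defaults | Merge-Hashtables { $_[0] }
# Expected result: Timeout = 60, LogLevel = Warn, Retries = 3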

Full code for this merge-hashtable function against an Azure application is below:


param($websitename = 'TEst' ,$resourceGroup = 'SchuTest',$slot = 'production', $appSettings ='{"AppSettings:testkey1": "45test","AppSettings:TestId": "This is a Test Key 28"}')
#https://stackoverflow.com/questions/8800375/merging-hashtables-in-powershell-how
Function Merge-Hashtables([ScriptBlock]$Operator) {
$Output = @{}
ForEach ($Hashtable in $Input) {
If ($Hashtable -is [Hashtable]) {
ForEach ($Key in $Hashtable.Keys) {$Output.$Key = If ($Output.ContainsKey($Key)) {@($Output.$Key) + $Hashtable.$Key} Else {$Hashtable.$Key}}
}
}
If ($Operator) {ForEach ($Key in @($Output.Keys)) {$_ = @($Output.$Key); $Output.$Key = Invoke-Command $Operator}}
$Output
}
try {
foreach($website in $websiteName)
{
ConvertFrom-Json $appSettings -ErrorAction Stop
#it is expected that the app settings is a string representation of a hashtable written in JSON, so that it can be converted to a PowerShell hashtable at runtime
$newAppSettings = $appSettings | convertfrom-json
$newAppSettingsHash = @{}
$newAppSettings.psobject.properties | ForEach-Object { $newAppSettingsHash[$_.Name] = $_.Value }
$Application = get-azurermwebappslot -Name $website -ResourceGroupName $resourceGroup -Slot $slot
$ExistingSettings = $Application.siteconfig.AppSettings
$appSettingsHash = @{}
foreach($k in $ExistingSettings)
{
$appSettingsHash[$k.name] = $k.value
}
#https://stackoverflow.com/questions/8800375/merging-hashtables-in-powershell-how
$hashtable = $newAppSettingsHash, $appSettingsHash | Merge-Hashtables {$_[0]}
$results = Set-AzureRmWebAppSlot -AppSettings $hashtable -name $website -ResourceGroupName $resourceGroup -slot $slot
$r = $results.SiteConfig.AppSettings
Write-Output $r
}
}
catch
{
Write-Error "$appsettings must be in JSON format"
}

I hope that when you need to merge hashtables this article makes it a  bit easier for you.

 

Until then keep scripting

 

thom


Uploading files to Azure Applications (kudu)

I needed to copy some content to my Azure application that the build and deploy I constructed for it wouldn't need to push on every deployment. So my quest began: how do I upload files to an Azure application? The most common and recognized way of uploading files to Azure applications is through Web Deploy. I didn't think I needed to package everything up and use Web Deploy, so I sought out a way to do this with PowerShell. This post is about that pursuit.

Thanks to this article, most of the work was already done: Copy files to Azure Web App with PowerShell and Kudu API. All I needed to do was put a loop around my file upload and use Octavie van Haaften's scripts.

So I started with get-childitem -recurse "$downloadFolder\content". Now that I had my content in a variable called $files, I could put this in a foreach loop and use Octavie van Haaften's Upload-FileToWebApp.

During the upload of the files I need to determine whether the item from my local disk is a file or a directory. I used the following classes to determine this:

[System.IO.DirectoryInfo] &  [System.IO.FileInfo]
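
As a small illustration (my own example; the path is hypothetical), the -is operator does that check:

# Get-ChildItem returns a mix of DirectoryInfo and FileInfo objects
$item = Get-Item 'C:\temp'             # hypothetical path
$item -is [System.IO.DirectoryInfo]    # True when the item is a folder
$item -is [System.IO.FileInfo]         # True when the item is a file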

If the item was a directory, then I had to make the upload location match the location on disk. I did this through a little bit of replacement logic and used $kudufolder as the variable to pass to Octavie's upload function.


$kudufolder = ((($file.FullName).Replace($uploadfrom,'Content'))`
.replace('\','/')).trimstart('/')
$kudufolder = "$kudufolder/"
Upload-FileToWebApp -resourceGroupName myresourcegroup`
-webAppName mywebapp -kuduPath $kudufolder

The same holds true for the upload of a file. The only difference between the file and the directory is the trailing /: when you are uploading, a trailing / tells Kudu to create a directory.


$kudufile = ((($file.FullName).Replace($uploadfrom,'Content'))`
.replace('\','/')).trimstart('/')
Upload-FileToWebApp -resourceGroupName myresourcegroup`
-webAppName mywebapp -localPath $file.FullName -kuduPath $kudufile

Here is the full script in the foreach loop with each check for a directory or file.


$downloadfolder = 'c:\temp\myAzureStorage'

$uploadfrom = "$downloadfolder\Content"

$files = get-childitem -Recurse "$downloadfolder\Content"

foreach($file in $files)
{
if($file -is [System.IO.DirectoryInfo])
{
$kudufolder = ((($file.FullName).Replace($uploadfrom,'Content')).replace('\','/')).trimstart('/')
$kudufolder = "$kudufolder/"
Upload-FileToWebApp -resourceGroupName myresourcegroup -webAppName mywebapp -kuduPath $kudufolder
}
elseif($file -is [System.IO.FileInfo])
{
$kudufile = ((($file.FullName).Replace($uploadfrom,'Content')).replace('\','/')).trimstart('/')
Upload-FileToWebApp -resourceGroupName myresourcegroup -webAppName mywebapp -localPath $file.FullName -kuduPath $kudufile
}
}


I hope this helps someone
Until then keep Scripting
Thom


 

Azure File Storage Download

If you have an Azure account, you may want to download files out of Azure storage, either individually or a whole folder at a time. This post is about how I was able to do this with PowerShell.

First we need to log in to Azure and get a storage context. The storage context requires a key.


Add-AzureRmAccount -credential (get-credential) -tenantid yourid

$key = (get-azurermstorageAccountkey -resourcegroupname myresourcegroup -name mystorageaccountName | where-object{$Psitem.keyname -eq 'key1'}).value

$storageContext = New-AzureStorageContext -StorageAccountName "mystorage" -StorageAccountKey $key

Now that we have the storage context and key, we need to find the files that are in our Azure File storage.


$content = get-azurestoragefile -sharename "myshare" -context $storageContext

If we look at the contents of our $content variable we should see something similar to this:

[screenshot: the files and directories returned in $content]

Now that we have the content in a variable, we can begin figuring out how to download each file. To download a single file we need to use Get-AzureStorageFileContent:


$content = get-azurestoragefile -sharename "myshare" -context $storageContext

get-azurestoragefilecontent -sharename "myshare" `
-path ($content[0].uri.localpath -replace "$($content[0].share.name)/",'') `
-destination "c:\temp\" -context $storageContext

After much trial and error I found that in the object you get back from Azure there are two different Object types that you must check for:

Microsoft.WindowsAzure.Storage.File.FileDirectoryProperties

and the other type is:

Microsoft.WindowsAzure.Storage.File.CloudFile

By the class names you can see that one is a file and the other is a directory. With that in mind, I can put this in a function that recursively calls itself to get all the contents; a quick sketch of the two type checks follows, then the full function.
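
Here is a stripped-down sketch of those checks (the property names are taken from the full function below):

# $c is one item returned by Get-AzureStorageFile
if($c.properties -is [Microsoft.WindowsAzure.Storage.File.FileDirectoryProperties])
{
    # it's a directory -- recurse into it
}
elseif($c -is [Microsoft.WindowsAzure.Storage.File.CloudFile])
{
    # it's a file -- download it with Get-AzureStorageFileContent
}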


function Get-AzureFiles
{
param([string]$shareName = 'mystorage', [object]$storageContext, [string]$downloadFolder, [string]$path)
$content = get-azurestoragefile -sharename $sharename -Context $storagecontext -path $path| Get-AzureStorageFile

foreach($c in $content)
{
$Parentfolder = $c.uri.segments[($c.uri.segments.count -2)] -replace '/','\'

if(!(test-path $destination))
{mkdir $destination}
$p = $c.uri.LocalPath -replace "$($c.share.name)/" ,''
if(Get-AzureStorageFile -ShareName $c.share.name -path $p -Context $storageContext )
{
if($c.properties -is [Microsoft.WindowsAzure.Storage.File.FileDirectoryProperties])
{
$d = [Microsoft.WindowsAzure.Storage.File.CloudFileDirectory]::new($c.uri.AbsoluteUri)
#Get-AzureStorageFileContent -directory $d -ShareName $c.share.name -Destination "$destination$($c.name)" -Context $storageContext #-Path $p
$path = $d.Uri.LocalPath -replace "/$sharename/" , ""
$dest = $path -replace '/','\'
"$($c.name) is a directory -- getting $downloadfolder\$dest files"
if(!(test-path "$downloadfolder\$dest"))
{mkdir "$downloadfolder\$dest"}
Get-AzureFiles -shareName $shareName -storageContext $storageContext -path $path -downloadFolder "$downloadFolder\$dest"
}
elseif($c -is [Microsoft.WindowsAzure.Storage.File.CloudFile])
{
Write-Output "downloading --- $destination$($c.name)"
$destination = (($c.Uri.LocalPath -replace "/$sharename/" , "") -replace '/','\')
$dest = "$downloadFolder\$destination"
$dest
$de = $dest -replace $c.name, ""
if(!(test-path $de))
{
mkdir $de
}
if(!(test-path $dest))
{Get-AzureStorageFileContent -ShareName $c.share.name -Path $p -Destination $dest -Context $storageContext }# -WhatIf}
else
{
Write-Output "already downloaded --- $dest"
}
}
}
}
}

Now if we call the function, we'll get a downloaded copy of all the folders from the location that you specify in Azure to your local host:

get-AzureFiles -sharename "myShare" -storageContext $storageContext -downloadfolder "C:\temp\azurefiles"

There you go: now you have your files from Azure in a local directory. Stay tuned for my next article, where I'll show you how to upload these same files to an Azure application (Kudu).


I hope this helps someone
Until then keep Scripting
Thom


 

Finding 500,401 Errors in IIS logs

If you've ever had to troubleshoot issues with IIS, you'll know that you are often drawn to looking at the IIS logs. This post is about a script that I modified to search IIS logs for specific errors.

To begin with, I found this article on Stack Overflow that got me started down this path.

First we need to import the WebAdministration module so we can find the website whose log file we want:


Import-Module WebAdministration

$site = Get-Item IIS:\Sites\$website
$id = $site.id
$logdir = "$($site.logfile.directory)\w3svc$id"

Now that we have the $logdir (log directory), we can put the rest of the file name together by getting the date:


$File = "$logdir\u_ex$(((get-date).adddays(-$days)).ToString("yyMMdd")).log"

Assuming you are using one log file per day, the name of the log file is:

u_ex(yyMMdd), which should look something like this: u_ex170824.log

Now that we have our log file, we need to strip off unnecessary lines: specifically, the header lines that start with #S, #D, or #V, which are the Software, Version and Date items at the top of the log.

Then we'll build the columns based on the #Fields value in the log. That way each field in the results can be arranged into a column, so we can sort our data on whatever field we like.

The additional column we’ll add is the name of the log file.


$Log = Get-Content $File | where {$_ -notLike "#[D,S-V]*" }

$Columns = (($Log[0].TrimEnd()) -replace "#Fields: ", "" -replace "-","" -replace "\(","" -replace "\)","").Split(" ")

$Columns += 'LogFile'

$Count = $Columns.Length

Now that we have the column titles, it's time to filter the log and keep only the lines that have the value we want. In my case I was searching for the error code 500.


$Rows = $Log | where {$_ -like "*$errorType 0 0*"}

Now that we have all the data we want in the $Rows variable, we can construct a table of columns and rows.


$IISLog = New-Object System.Data.DataTable "IISLog"

foreach ($Column in $Columns) {
$NewColumn = New-Object System.Data.DataColumn $Column, ([string])
$IISLog.Columns.Add($NewColumn)
}
# Loop Through each Row and add the Rows.
foreach ($Row in $Rows) {
$Row = $Row.Split(" ")
$AddRow = $IISLog.newrow()
for($i=0;$i -lt $Count; $i++) {
$ColumnName = $Columns[$i]
if($ColumnName -eq 'LogFile')
{$AddRow.$ColumnName = $file }
else {$AddRow.$ColumnName = $Row[$i]}
}
$IISLog.Rows.Add($AddRow)
}

$IISLog

Full function is below:


  function get-ErrorLogs
  {
  param($website = 'myWebSite', $errorType = '500',[int] $days =0)

    Import-Module WebAdministration

    $site = Get-Item IIS:\Sites\$website
    $id = $site.id
    $logdir = "$($site.logfile.directory)\w3svc$id"

    $File = "$logdir\u_ex$(((get-date).adddays(-$days)).ToString("yyMMdd")).log"
    $Log = Get-Content $File | where {$_ -notLike "#[D,S-V]*" }
    $Columns = (($Log[0].TrimEnd()) -replace "#Fields: ", "" -replace "-","" -replace "\(","" -replace "\)","").Split(" ")
    $Columns += 'LogFile'
    $Count = $Columns.Length
    $Rows = $Log | where {$_ -like "*$errorType 0 0*"}
    $IISLog = New-Object System.Data.DataTable "IISLog"
    foreach ($Column in $Columns) {
      $NewColumn = New-Object System.Data.DataColumn $Column, ([string])
      $IISLog.Columns.Add($NewColumn)
    }
    foreach ($Row in $Rows) {
      $Row = $Row.Split(" ")
      $AddRow = $IISLog.newrow()
      for($i=0;$i -lt $Count; $i++) {
        $ColumnName = $Columns[$i]
        if($ColumnName -eq 'LogFile')
        {$AddRow.$ColumnName = $file }
        else {$AddRow.$ColumnName = $Row[$i]}
      }
      $IISLog.Rows.Add($AddRow)
      }

    $IISLog
  }

Now that I have this in a full function I can just call it like this:

get-ErrorLogs -website "Website2" -errorType 500 -days 4

And get results similar to this:

[screenshot: rows returned for 500 errors]


I hope this helps someone
Until then keep Scripting
Thom


Just Hash It

I have been looking high and low for a good means to compare one variable to another and do it quickly. In my search I found this article on Stack Overflow. It led me to create a function that you can use to compare one variable to another and get a simple $true or $false answer telling you whether they are the same. This article explains that concept.

To start with, I need to create a function block and pass two parameters: the item I'm using as a reference ($Reference) and the item/variable I'll use as the difference ($Difference).


function Compare-Variables
{
param([string]$Reference, [string]$difference)

}

Using the example from the post on Stack Overflow, I need a System.Text.UTF8Encoding object for the text encoding, a System.Security.Cryptography.MD5CryptoServiceProvider to compute the hash, and System.BitConverter to turn the hash into a readable string.

Working from the objects to the comparison, here is what takes place.

Step 1: Take the contents of each variable and turn them into JSON using ConvertTo-Json:

$ref = $reference.CacheValue | ConvertTo-Json -Depth 100
$diff = $difference.CacheValue | ConvertTo-Json -Depth 100 

Now that I have it as JSON (so long as the object isn't nested deeper than 100 levels), I have the entire variable and its child objects in the variables $ref and $diff.

Step 2: Since I have those in variables, I can get the bytes for each by calling the UTF8 GetBytes method. Using the value 'test', you'll see that I get back a byte for each character:


$utf8 = [System.Text.UTF8Encoding]::new()

$utf8.GetBytes('test')
116
101
115
116 

Step 3: Now that my variable is in bytes, I can compute a hash of those bytes with the System.Security.Cryptography.MD5CryptoServiceProvider:


$md5 = [System.Security.Cryptography.MD5CryptoServiceProvider]::new()
$md5.ComputeHash($utf8.GetBytes('test'))
9
143
107
205
70
33
211
115
202
222
78
131
38
39
180
246 

Step 4: Now that I have my computed hash, I can convert it into a readable MD5 sum with System.BitConverter.

[System.BitConverter]::ToString($md5.ComputeHash($utf8.GetBytes('test')))
09-8F-6B-CD-46-21-D3-73-CA-DE-4E-83-26-27-B4-F6
 

Step 5: Now that I have an MD5 sum for both of my variables, I can simply compare the two hashes and get an answer of $true or $false.
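
A minimal sketch of that final comparison (the variable names mirror the completed script below):

# Assumes $ref and $diff hold the JSON from Step 1, and $utf8/$md5 are the objects created above
$hashRef  = [System.BitConverter]::ToString($md5.ComputeHash($utf8.GetBytes($ref)))
$hashDiff = [System.BitConverter]::ToString($md5.ComputeHash($utf8.GetBytes($diff)))
$hashRef -eq $hashDiff    # $true when the two variables serialize to identical JSON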

Completed Script is below:

function Compare-Variables
{
 param([object]$Reference, [object]$difference, [int]$objectDepth='2')
 $utf8 = [System.Text.UTF8Encoding]::new()
 $match = $false
 $md5 = [System.Security.Cryptography.MD5CryptoServiceProvider]::new()
 $ref = $reference | ConvertTo-Json -Depth $objectDepth
 $diff = $difference | ConvertTo-Json -Depth $objectDepth
 $hashref = [System.BitConverter]::ToString($md5.ComputeHash($utf8.GetBytes($ref)))
 $hashdif = [System.BitConverter]::ToString($md5.ComputeHash($utf8.GetBytes($diff)))
 $match = $hashref -eq $hashdif
 $hashref = $diff = $ref = $utf8 = $md5 = $null
 $match
}
 

Testing this Function:

Simple Test with just text: Now I can call this function and get a $true if the variables match and a false if they don’t.

$a = 'test'
$b = 'test'
Compare-Variables -Reference $a -difference $b
True
$a = 'test2'
$b = 'test'
Compare-Variables -Reference $a -difference $b
False

Test with an object from a REST API: Now let's try something that we know will have a fair amount of data in it: GitHub's REST API.

$a = Invoke-RestMethod -uri http://api.github.com
$b = Invoke-RestMethod -Uri http://api.github.com
Compare-Variables -Reference $a -difference $b
True
$a = Invoke-RestMethod -Uri http://api.github.com/emojis
$b = Invoke-RestMethod -uri http://api.github.com
 Compare-Variables -Reference $a -difference $b
False

I hope this helps someone

Until then keep Scripting

Thom

 

 

 

What the Null??

Recently I've been working on some code for querying schedules in SSRS. I discovered that the way PowerShell passes a null to another function isn't what the SSRS method expected.

So this started me wondering: what the heck is PowerShell's null really set to?

Based on this blog article, we can see I'm not the only one who has this question. Cody Konior uncovered other ways to declare null:

If you test each of these against PowerShell’s null you get a false:

[screenshot: each of the null variants compared to $null returns False]
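
The screenshot does not reproduce here, so as a rough reconstruction (my own sketch of two of the variants), each of these "nulls" is a real object, so comparing it to $null returns False:

$null -eq [System.DBNull]::Value    # False -- DBNull.Value is an actual object
$null -eq [NullString]::Value       # False -- NullString.Value is an actual object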

If we use some of the other comparisons maybe we’ll get to what $null is really set to:

PS ps:\> $b = $null
PS ps:\> [string]::IsNullOrEmpty($b)
True
PS ps:\> [string]::IsNullOrWhiteSpace($b)
True
PS ps:\>

These evaluate to the value you'd expect: all $true. I certainly don't know the language as well as Kirk Munro (@Poshoholic), and he pointed me to a class that I compared to PowerShell's null, and it came up true:

[System.Management.Automation.Internal.AutomationNull]::Value

In a blog post about an issue around null, it's explained this way by Jason Shirk (@lzybkr):

[screenshot: Jason Shirk's explanation of AutomationNull]

Now I can test for PowerShell's null, and this explains why $null is not equal to the C# equivalent.

$Null -eq [System.Management.Automation.Internal.AutomationNull]::Value

True

There is yet more detail on why $null is different in PowerShell (a more detailed example).

Moral of the story: if you are calling a method that expects a $null, make certain you pass the right $null for the method you are calling.

 


I hope this helps someone

Until then keep Scripting

Thom

Creating a DataDriven Subscription from a file

In working with SSRS I found that I sometimes wanted to delete an RDL. The problem is that if I delete an RDL, its subscriptions are deleted as well. Since I've put most of my SSRS file management and data source creation into continuous integration, I needed a way to save off subscriptions before I attempted a delete. I talk about how that is done in the post Saving SSRS Subscriptions to File. This post is about how I consume the saved-off files and put the subscriptions in place in another environment.

If you've been following my other blog posts on SSRS, you'll know I've written about creating an SSRS data source and testing an SSRS data source. In each of those scripts I start with a proxy to the ReportService2010.asmx web service to get to any functions that are needed to operate on SSRS. I assume you've read one of those articles and know that you need to get a proxy to this service.

On to the method that we'll call to recreate the data-driven subscription:

CreateDataDrivenSubscription. A call to this method requires the following parameters:

item – Type: System.String. The path to the report.

extensionSettings – Type: ReportService2010.ExtensionSettings. An ExtensionSettings object that contains a list of settings that are specific to the delivery extension.

dataRetrievalPlan – Type: ReportService2010.DataRetrievalPlan. A DataRetrievalPlan object that provides settings that are required to retrieve data from a delivery query. The DataRetrievalPlan object contains a reference to a DataSetDefinition object and a DataSourceDefinitionOrReference object.

description – Type: System.String. A meaningful description that is displayed to users.

eventType – Type: System.String. The type of event that triggers the data-driven subscription. The valid values are TimedSubscription or SnapshotUpdate.

matchData – Type: System.String. The data that is associated with the specified EventType parameter. This parameter is used by an event to match the data-driven subscription with an event that has fired.

parameters – Type: ReportService2010.ParameterValueOrFieldReference[]. An array of ParameterValueOrFieldReference objects that contains a list of parameters for the item.

Since I'm working through a proxy class, one of the first things I must make sure I do is create a proxy to the web service I want to call; everything else will fail if I haven't done that first. Then I'll read in the report files that I want to operate on with Get-ChildItem. I chose to use the XML export method.


$ssrsproxy = New-WebServiceProxy -Uri http://yourwebsite/yourreports/_vti_bin/ReportServer/ReportService2010.asmx -UseDefaultCredential -namespace 'SSRSProxy' -class 'ReportService2010'
$reportExportPath = 'C:\temp\reports3'
$reportFiles = Get-ChildItem $reportExportPath -Filter *.xml

Now that I have the reports that I want to operate on, I can iterate through each of these objects, rebuild the object, and submit it to the new server. So that I can change URLs from one server to another, I use $source and $destination variables to assist.

Since each saved-off report has seven items in it, we have to go through each item (object) in the XML and convert it back to the same type of object that the method call expects. The first object that we need to rebuild, and the one that is a little more complex, is $extensionSettings. Extension settings can contain two other kinds of objects: a ParameterFieldReference and a ParameterValue. So we have to test the property names of each object to know whether to build a ParameterFieldReference or a ParameterValue.


foreach($parameterField in $reportobject.extensionSettings.ParameterValues)
{
if($parameterfield.psobject.Properties.name[0] -eq 'ParameterName') #rebuild the object into an extension setting; this one contains a parameter field reference
{
$a = [SSRSProxy.ParameterFieldReference]::new()
Write-Verbose 'Create a object of type ParameterField reference.'
$a.FieldAlias = $parameterfield.fieldalias
$a.ParameterName = $parameterfield.ParameterName
}
elseif($parameterfield.psobject.Properties.name[0] -eq 'Name') #rebuild the object into an extension settings object this one contains a param value
{
$a = [SSRSProxy.ParameterValue]::New()
Write-Verbose 'Create a object of type ParameterValue reference.'
$a.Label = $parameterField.Label
$a.Name = $parameterField.Name
$a.Value = $parameterField.Value
}
$paramvalues += $a
}
$extensionSettings.ParameterValues = [ssrsproxy.parametervalueorfieldreference[]]$paramvalues

Once we've built the extension settings, we need to rebuild the DataRetrievalPlan. The data retrieval plan includes the reference to the new location for the data source; this is where we use $source and $destination to our advantage. This is done by setting the Item property on the data retrieval plan to the data source reference we wish to use, via a DataSourceReference object.


[SSRSProxy.DataRetrievalPlan]$DataRetrievalPlan = New-Object SSRSProxy.DataRetrievalPlan
 Write-Verbose 'Create a object of type DataRetrievalPlan reference.'

 $DataRetrievalPlan.DataSet = $reportobject.DataRetrievalPlan.DataSet
 [SSRSProxy.DataSourceReference]$dsReference = $reportobject.DataRetrievalPlan.Item
 $src = ([uri]$source).absoluteuri
 $dest = ([uri]$destination).absoluteuri
 $dsReference.Reference = (([uri]$dsReference.Reference).AbsoluteUri) -replace $src,$dest
 Write-Verbose "Datasource Reference $dsreference use the value for the datasource you want this data driven report to consume"
 $DataRetrievalPlan.Item = $dsReference

Now the last few bits of information that need to be added are the ParameterValueOrFieldReference and the report description, event type, and match data.


$description = $reportobject.Description
 $eventtype = $reportobject.eventtype
 $matchdata = $reportobject.matchdata

$b = [Ssrsproxy.parameterfieldreference]::new()
 Write-Verbose 'Create a object of type parameterfieldreference reference.'
 $b.FieldAlias = $reportobject.parameters.fieldalias
 $b.ParameterName = $reportobject.parameters.ParameterName
 [SSRSProxy.ParameterValueOrFieldReference]$ParameterValueOrFieldReference = $b

Now that we have those set, the last thing to do is set the destination path where the report's subscription should go. Then we'll call the method and hope we don't hit an exception.


$itemPath = "$destination/$($reportobject.subscription.report)"
 try
 {
 Write-Verbose "Now that the object is re-constituted we can put this in the SSRS instance we wish to push it to"
 $ssrsproxy.CreateDataDrivenSubscription($itempath , $extensionsettings , $DataRetrievalPlan, $description, $eventtype, $matchdata, $ParameterValueOrFieldReference) 
 }
 Catch
 {
 "Error was $_"
 $line = $_.InvocationInfo.ScriptLineNumber
 "Error was in Line $line"
 }

 

Full script for this follows:

Write-Verbose "Destination for where the saved subscriptions will be pushed to"
$destination = 'http://yourwebsite/sites/datasourcetest/Shared%20Documents'
$source = 'http://yourwebsite/sites/Reports/Shared%20Documents'
$ssrsproxy = New-WebServiceProxy -Uri http://yourwebsite/yourreports/_vti_bin/ReportServer/ReportService2010.asmx -UseDefaultCredential -namespace 'SSRSProxy' -class 'ReportService2010'
$reportExportPath = 'C:\temp\reports3'

$reportFiles = Get-ChildItem $reportExportPath -Filter *.xml
foreach($file in $reportFiles)
{
 $reportobject = Import-Clixml -path ($file.fullname)
 Write-Verbose "Create a object of type ExtensionSettings"
 $extensionSettings = New-Object -typename 'SSRSProxy.ExtensionSettings' 
 $extensionSettings.Extension = $reportobject.extensionSettings.Extension
 $paramvalues = @()
 foreach($parameterField in $reportobject.extensionSettings.ParameterValues)
 {
 if($parameterfield.psobject.Properties.name[0] -eq 'ParameterName') #rebuild the object into an extension setting; this one contains a parameter field reference
 {
 $a = [SSRSProxy.ParameterFieldReference]::new()
 Write-Verbose 'Create a object of type ParameterField reference.'
 $a.FieldAlias = $parameterfield.fieldalias
 $a.ParameterName = $parameterfield.ParameterName
 }
 elseif($parameterfield.psobject.Properties.name[0] -eq 'Name') #rebuild the object into an extension settings object this one contains a param value
 {
 $a = [SSRSProxy.ParameterValue]::New()
 Write-Verbose 'Create a object of type ParameterValue reference.'
 $a.Label = $parameterField.Label
 $a.Name = $parameterField.Name
 $a.Value = $parameterField.Value
 }
 $paramvalues += $a
 }
 $extensionSettings.ParameterValues = [ssrsproxy.parametervalueorfieldreference[]]$paramvalues

 [SSRSProxy.DataRetrievalPlan]$DataRetrievalPlan = New-Object SSRSProxy.DataRetrievalPlan
 Write-Verbose 'Create a object of type DataRetrievalPlan reference.'

 $DataRetrievalPlan.DataSet = $reportobject.DataRetrievalPlan.DataSet
 [SSRSProxy.DataSourceReference]$dsReference = $reportobject.DataRetrievalPlan.Item
 $src = ([uri]$source).absoluteuri
 $dest = ([uri]$destination).absoluteuri
 $dsReference.Reference = (([uri]$dsReference.Reference).AbsoluteUri) -replace $src,$dest
 Write-Verbose "Datasource Reference $dsreference use the value for the datasource you want this data driven report to consume"
 $DataRetrievalPlan.Item = $dsReference
 $description = $reportobject.Description
 $eventtype = $reportobject.eventtype
 $matchdata = $reportobject.matchdata

 $b = [Ssrsproxy.parameterfieldreference]::new()
 Write-Verbose 'Create a object of type parameterfieldreference reference.'
 $b.FieldAlias = $reportobject.parameters.fieldalias
 $b.ParameterName = $reportobject.parameters.ParameterName
 [SSRSProxy.ParameterValueOrFieldReference]$ParameterValueOrFieldReference = $b

 $itemPath = "$destination/$($reportobject.subscription.report)"
 try
 {
 Write-Verbose "Now that the object is re-constituted we can put this in the SSRS instance we wish to push it to"
 $ssrsproxy.CreateDataDrivenSubscription($itempath , $extensionsettings , $DataRetrievalPlan, $description, $eventtype, $matchdata, $ParameterValueOrFieldReference) 
 }
 Catch
 {
 "Error was $_"
 $line = $_.InvocationInfo.ScriptLineNumber
 "Error was in Line $line"
 }
}

I hope this helps someone

Until then keep Scripting

Thom

Saving SSRS Subscriptions to File

In working with SSRS I found that I sometimes wanted to delete an RDL. The problem is that if I delete an RDL, its subscriptions are deleted as well. Since I've put most of my SSRS file management and data source creation into continuous integration, I needed a way to save off subscriptions before I attempted a delete. This article is about the PowerShell I wrote to accomplish this task.

If you've been following my other blog posts on SSRS, you'll know I've written about creating an SSRS data source and testing an SSRS data source. In each of those scripts I start with a proxy to the ReportService2010.asmx web service to get to any functions that are needed to operate on SSRS. I assume you've read one of those articles and know that you need to get a proxy to this service.
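
For reference, the proxy is created the same way as in the data-driven subscription post above (the URL is a placeholder for your own report server):

$ssrsproxy = New-WebServiceProxy -Uri 'http://yourwebsite/yourreports/_vti_bin/ReportServer/ReportService2010.asmx' -UseDefaultCredential -namespace 'SSRSProxy' -class 'ReportService2010'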

On to the methods that we'll call to get the two different types of SSRS subscriptions:

Normal subscriptions & DataDriven Subscriptions

For a normal subscription you'll need to call the method ListSubscriptions. This method expects the path of the report that you wish to get the subscriptions for. To get the report paths, we'll use the ListChildren method to list all the children of the current site, and then find the subscriptions for each report.

function Get-Subscriptions
{
  param([object]$ssrsproxy, [string]$site, [switch]$DataDriven)
  write-verbose 'Path to where the reports are must be specified to get the subscriptions you want.. Root (/) does not seem to get everything'
  $items = $ssrsproxy.ListChildren($site,$true) | Where-Object{$_.typename -eq 'report'}
  $subprops = $ddProps= @()

Now that we have the list of reports in the $items variable, we can ask for the subscriptions for each item:


foreach($item in $items)
 {
 $subs = $ssrsproxy.ListSubscriptions($item.path)

}

Now that we have our subscriptions in the $subs variable, we can check whether anything was returned. If there was a result, we can get the properties of each subscription type. We'll know that we have a data-driven subscription when a subscription in the $subs array has its .isdatadriven property set to $true.


 if($subs)
 {
   foreach($sub in $subs)
   {

    if($sub.isdatadriven -eq 'true')
      {
        $ddProps += Get-DataDrivenSubscriptionProperties -subscription $sub -ssrsproxy $ssrsproxy
      }
      elseif(-not $DataDriven)
      {
        $subProps += Get-SubscriptionProperties -Subscription $sub -ssrsproxy $ssrsproxy
      }

    }
 }
 if($DataDriven)
 {$ddProps}
 else {
 $subprops
 }

Now on to explaining Get-DataDrivenSubscriptionProperties and Get-SubscriptionProperties.

Get-SubscriptionProperties calls the method GetSubscriptionProperties with the ID of the subscription, which returns seven objects through [ref] parameters. In the function below I first set all seven reference variables to null, then I call the method. On successful return from the method I populate my PowerShell object [SSRSObject] (which is a PowerShell class). If I choose not to use a class and instead want a standard object, the [PSCustomObject] code is commented out so that it can be used if needed.


function Get-SubscriptionProperties
{
 param([object]$Subscription,
 [object]$ssrsproxy)
 $subextensionSettings = $subDataRetrievalPlan = $subDescription = $subactive = $substatus = $subeventtype = $submatchdata = $subparameters = $Null

 $subOwner = $ssrsproxy.GetSubscriptionProperties($subscription.SubscriptionID,[ref]$subextensionSettings,[ref]$subDescription,[ref]$subactive,[ref]$substatus,[ref]$subeventtype,[ref]$submatchdata,[ref]$subparameters)
 $ssrsobject = [SSRSObject]::New()
 $ssrsobject.subscription = $Subscription
 $ssrsobject.Owner = $subOwner
 $ssrsobject.ExtensionSettings = $subextensionSettings
 $ssrsobject.Description = $subDescription
 $ssrsobject.DataRetrievalPlan = $subDataRetrievalPlan
 $ssrsobject.Active = $subactive
 $ssrsobject.Status = $substatus
 $ssrsobject.EventType = $subeventtype
 $ssrsobject.MatchData = $submatchdata
 $ssrsobject.Parameters = $subparameters
 $ssrsobject
 <#
 [PSCustomObject]@{
 'Owner' = $subOwner
 'extensionSettings' = $subextensionSettings
 'Description' = $subDescription
 'active' = $subactive
 'status' =$substatus
 'eventtype' =$subeventtype 
 'matchdata' = $submatchdata
 'parameters' = $subparameters
 }
 #>
}

For Get-DataDrivenSubscriptionProperties we do all the same things as with the previous subscription type. We call the method GetDataDrivenSubscriptionProperties, which returns the same reference objects plus a DataRetrievalPlan. On successful return from the method I populate my PowerShell object [SSRSObject]. Again, if I choose not to use a class and instead want a standard object, the [PSCustomObject] code is commented out so that it can be used if needed.


function Get-DataDrivenSubscriptionProperties 
{
 param([object] $Subscription,
 [object]$ssrsproxy)
 $ssrsobject = [SSRSObject]::New()
 $sid = $Subscription.SubscriptionID
 $ddextensionSettings = $ddDataRetrievalPlan = $ddDescription = $ddactive = $ddstatus = $ddeventtype = $ddmatchdata = $ddparameters = $Null
 $ddOwner = $ssrsproxy.GetDataDrivenSubscriptionProperties($sid,[ref]$ddextensionSettings,[ref]$ddDataRetrievalPlan`
 ,[ref]$ddDescription,[ref]$ddactive,[ref]$ddstatus,[ref]$ddeventtype,[ref]$ddmatchdata,[ref]$ddparameters)

 $ssrsobject.subscription = $Subscription
 $ssrsobject.Owner = $ddOwner
 $ssrsobject.ExtensionSettings = $ddextensionSettings
 $ssrsobject.Description = $ddDescription
 $ssrsobject.DataRetrievalPlan = $ddDataRetrievalPlan
 $ssrsobject.Active = $ddactive
 $ssrsobject.Status = $ddstatus
 $ssrsobject.EventType = $ddeventtype
 $ssrsobject.MatchData = $ddmatchdata
 $ssrsobject.Parameters = $ddparameters
 $ssrsobject
 <# [PSCustomObject]@{
 'Owner' = $ddOwner
 'extensionSettings' = $ddextensionSettings
 'DataRetrievalPlan' = $ddDataRetrievalPlan
 'Description' = $ddDescription
 'active' = $ddactive
 'status' =$ddstatus
 'eventtype' =$ddeventtype 
 'matchdata' = $ddmatchdata
 'parameters' = $ddparameters
 } #>
}

Now that I have each subscription in an object, I persist it to disk with either the Export-Clixml or the ConvertTo-Json cmdlet:

function New-XMLSubscriptionfile
{
[CmdletBinding()]
[Alias()]
param([psobject]$subscriptionObject, [string]$path)
if(test-path $path -PathType Leaf)
{
$path = split-path $path

}
if(-not(test-path $path))
{
mkdir $path
}
foreach($sub in $subscriptionObject)
{
$reportName = (($sub.subscription.report).split('.'))[0]
$filename = "$path\$reportName.xml"
$sub | Export-Clixml -Depth 100 -path $filename
}
}

function New-JsonSubscriptionFile
{
[CmdletBinding()]
[Alias()]
param([psobject]$subscriptionObject, [string]$path)
if(test-path $path -PathType Leaf)
{
$path = split-path $path

}
if(-not(test-path $path))
{
mkdir $path
}
foreach($sub in $subscriptionObject)
{
$reportName = (($sub.subscription.report).split('.'))[0]
$filename = "$path\$reportName.json"
$sub | convertto-json -Depth 100 | out-file $filename
}
}

To see the entire script, see this Gist:


#requires -version 5.0
function Get-DataDrivenSubscriptionProperties
{
param([object] $Subscription,
[object]$ssrsproxy)
$ssrsobject = [SSRSObject]::New()
$sid = $Subscription.SubscriptionID
$ddextensionSettings = $ddDataRetrievalPlan = $ddDescription = $ddactive = $ddstatus = $ddeventtype = $ddmatchdata = $ddparameters = $Null
$ddOwner = $ssrsproxy.GetDataDrivenSubscriptionProperties($sid,[ref]$ddextensionSettings,[ref]$ddDataRetrievalPlan`
,[ref]$ddDescription,[ref]$ddactive,[ref]$ddstatus,[ref]$ddeventtype,[ref]$ddmatchdata,[ref]$ddparameters)
$ssrsobject.subscription = $Subscription
$ssrsobject.Owner = $ddOwner
$ssrsobject.ExtensionSettings = $ddextensionSettings
$ssrsobject.Description = $ddDescription
$ssrsobject.DataRetrievalPlan = $ddDataRetrievalPlan
$ssrsobject.Active = $ddactive
$ssrsobject.Status = $ddstatus
$ssrsobject.EventType = $ddeventtype
$ssrsobject.MatchData = $ddmatchdata
$ssrsobject.Parameters = $ddparameters
$ssrsobject
<# [PSCustomObject]@{
'Owner' = $ddOwner
'extensionSettings' = $ddextensionSettings
'DataRetrievalPlan' = $ddDataRetrievalPlan
'Description' = $ddDescription
'active' = $ddactive
'status' =$ddstatus
'eventtype' =$ddeventtype
'matchdata' = $ddmatchdata
'parameters' = $ddparameters
} #>
}
function Get-SubscriptionProperties
{
param([object]$Subscription,
[object]$ssrsproxy)
$subextensionSettings = $subDataRetrievalPlan = $subDescription = $subactive = $substatus = $subeventtype = $submatchdata = $subparameters = $Null
$subOwner = $ssrsproxy.GetSubscriptionProperties($subscription.SubscriptionID,[ref]$subextensionSettings,[ref]$subDescription,[ref]$subactive,[ref]$substatus,[ref]$subeventtype,[ref]$submatchdata,[ref]$subparameters)
$ssrsobject = [SSRSObject]::New()
$ssrsobject.subscription = $Subscription
$ssrsobject.Owner = $subOwner
$ssrsobject.ExtensionSettings = $subextensionSettings
$ssrsobject.Description = $subDescription
$ssrsobject.DataRetrievalPlan = $subDataRetrievalPlan
$ssrsobject.Active = $subactive
$ssrsobject.Status = $substatus
$ssrsobject.EventType = $subeventtype
$ssrsobject.MatchData = $submatchdata
$ssrsobject.Parameters = $subparameters
$ssrsobject
<#
[PSCustomObject]@{
'Owner' = $subOwner
'extensionSettings' = $subextensionSettings
'Description' = $subDescription
'active' = $subactive
'status' =$substatus
'eventtype' =$subeventtype
'matchdata' = $submatchdata
'parameters' = $subparameters
}
#>
}
function Get-Subscriptions
{
#Returns a nested object with each
param([object]$ssrsproxy, [string]$site, [switch]$DataDriven)
#write-verbose 'Path to where the reports are must be specified to get the subscriptions you want.. Root (/) does not seem to get everything'
$items = $ssrsproxy.ListChildren($site,$true) | Where-Object{$_.typename -eq 'report'}
$subprops = $ddProps= @()
foreach($item in $items)
{
$subs = $ssrsproxy.ListSubscriptions($item.path)
write-verbose "found $($subs.count) subscriptions for $($item.Name)"
if($subs)
{
foreach($sub in $subs)
{
if($sub.isdatadriven -eq 'true')
{
$ddProps += Get-DataDrivenSubscriptionProperties -subscription $sub -ssrsproxy $ssrsproxy
}
elseif(-not $DataDriven)
{
$subProps += Get-SubscriptionProperties -Subscription $sub -ssrsproxy $ssrsproxy
}
}
}
}
if($DataDriven)
{$ddProps}
else {
$subprops
}
}
function New-XMLSubscriptionfile
{
[CmdletBinding()]
[Alias()]
param([psobject]$subscriptionObject, [string]$path)
if(test-path $path -PathType Leaf)
{
$path = split-path $path
}
if(-not(test-path $path))
{
mkdir $path
}
foreach($sub in $subscriptionObject)
{
$reportName = (($sub.subscription.report).split('.'))[0]
$filename = "$path\$reportName.xml"
$sub | Export-Clixml -Depth 100 -path $filename
}
}
function New-JsonSubscriptionFile
{
[CmdletBinding()]
[Alias()]
param([psobject]$subscriptionObject, [string]$path)
if(test-path $path -PathType Leaf)
{
$path = split-path $path
}
if(-not(test-path $path))
{
mkdir $path
}
foreach($sub in $subscriptionObject)
{
$reportName = (($sub.subscription.report).split('.'))[0]
$filename = "$path\$reportName.json"
$sub | convertto-json -Depth 100 | out-file $filename
}
}
Function Export-DataDrivenSubscriptions
{
param([object]$ssrsproxy, [string]$site, [string]$path, [switch]$json)
$subs = get-subscriptions -ssrsproxy $ssrsproxy -site $site -DataDriven
if($json)
{New-JsonSubscriptionFile -subscriptionObject $subs -path $path}
else
{New-XMLSubscriptionfile -subscriptionObject $subs -path $Path}
}
class SSRSObject
{
[SSRSProxy.Subscription[]]$subscription
[string]$Owner
[SSRSProxy.ExtensionSettings]$ExtensionSettings
[SSRSProxy.DataRetrievalPlan]$DataRetrievalPlan
[string]$Description
[SSRSProxy.ActiveState]$Active
[string]$Status
[string]$EventType
[string]$MatchData
[SSRSProxy.ParameterValueOrFieldReference[]]$Parameters
}

I hope this helps someone

Until then keep Scripting

Thom

Profile creation with PowerShell and the community

I was asked to see if I could create a script to create a user's profile without the user being logged in.

So I searched (Bing'd and Goog'd) and couldn't find a PowerShell module that could do it. I then began the process of searching for folks that could get me started. Once I got the "starting" information, I was able to put a script together. This led me to a base script to create a user profile with P/Invoke. When I first put this code together it caused ISE / PowerShell to crash, so I was again perplexed as to what to do next. I posted a question about it, and thankfully someone else had started working on the same thing. He gave me a working way to get around the crashes I was experiencing with his script. Now on to what it does and how it works.

The main task was to create a profile so I’ll explain that first.

In order to use P/Invoke I had to bring in System.Runtime.InteropServices.

Thankfully, Adam Driscoll did all the heavy lifting with his script that creates this type:


 Add-Type -TypeDefinition '
 using System;
 using System.Runtime.InteropServices;
 public static class PInvoke {
 [DllImport("userenv.dll", SetLastError = true, CharSet = CharSet.Auto)]
 public static extern int CreateProfile( [MarshalAs(UnmanagedType.LPWStr)] String pszUserSid, [MarshalAs(UnmanagedType.LPWStr)] String pszUserName, [Out][MarshalAs(UnmanagedType.LPWStr)] System.Text.StringBuilder pszProfilePath, uint cchProfilePath);
 }
 '

The next step was how to call that added type with the proper information.


$pszProfilePath = new-object -typename System.Text.StringBuilder
[int]$results = [PInvoke]::CreateProfile($UserSid, $UserName, $pszProfilePath, $ProfilePath)

$stringbuff = new-object system.text.stringbuilder(260)
[system.uint32]$a =$stringbuff.capacity
$sid = ((get-aduser -id 'brtestlocaluser').sid.value)
CreateProfile -usersid $sid -username 'brtestlocaluser' -ProfilePath $a

Here is the code that caused my ISE and PowerShell process to crash:

function CreateProfile
{
param([String]$UserSid, [String]$UserName, [system.uint32]$ProfilePath)
Add-Type -TypeDefinition '
using System;
using System.Runtime.InteropServices;
public static class PInvoke {
[DllImport("userenv.dll", SetLastError = true, CharSet = CharSet.Auto)]
public static extern int CreateProfile( [MarshalAs(UnmanagedType.LPWStr)] String pszUserSid, [MarshalAs(UnmanagedType.LPWStr)] String pszUserName, [Out][MarshalAs(UnmanagedType.LPWStr)] System.Text.StringBuilder pszProfilePath, uint cchProfilePath);
}
'
$pszProfilePath = new-object -typename System.Text.StringBuilder
[int]$results = [PInvoke]::CreateProfile($UserSid, $UserName, $pszProfilePath, $ProfilePath)
}
$stringbuff = new-object system.text.stringbuilder(260)
[system.uint32]$a =$stringbuff.capacity
$sid = ((get-aduser -id 'brtestlocaluser').sid.value)
CreateProfile -usersid $sid -username 'brtestlocaluser' -ProfilePath $a

So with that in mind I sent out some tweets to find out why this was crashing, and I came across someone who had already done some of the work to allow a P/Invoke to be called. What he did differently from what I was doing was to "wrap" the P/Invoke in script scope. So the code I'm showing above ended up being two functions: one to register the native method and the other to add the native method.

function Register-NativeMethod
{
[CmdletBinding()]
[Alias()]
[OutputType([int])]
Param
(
# Param1 help description
[Parameter(Mandatory=$true,
ValueFromPipelineByPropertyName=$true,
Position=0)]
[string]$dll,

# Param2 help description
[Parameter(Mandatory=$true,
ValueFromPipelineByPropertyName=$true,
Position=1)]
[string]
$methodSignature
)

$script:nativeMethods += [PSCustomObject]@{ Dll = $dll; Signature = $methodSignature; }
}

Adding the Native Method:

function Add-NativeMethods
{
    [CmdletBinding()]
    [Alias()]
    [OutputType([int])]
    Param($typeName = 'NativeMethods')

    $nativeMethodsCode = $script:nativeMethods | ForEach-Object { "
        [DllImport(`"$($_.Dll)`")]
        public static extern $($_.Signature);
    " }

    Add-Type @"
        using System;
        using System.Text;
        using System.Runtime.InteropServices;
        public static class $typeName {
            $nativeMethodsCode
        }
"@
}

Now to show how they are called in the new function that creates a user profile. The first thing we do is create the local user that we need a profile for on the machine:

New-LocalUser -username $UserName -password $Password;

This calls the New-LocalUser function in the same script. Once that completes, we are on to the P/Invoke code. First we declare a name for the type that our P/Invoke method will live in; in this case it's going to be UserEnvCP. Then we check whether it is already declared with the if statement:

$methodName = 'UserEnvCP'
    $script:nativeMethods = @();

    if (-not ([System.Management.Automation.PSTypeName]$MethodName).Type)
    {

If it's not in our session, this is where we use the functions described above to get our P/Invoke registered. We call Register-NativeMethod with our DLL and the method signature, and immediately after that we add the native method so we can call it.

Register-NativeMethod "userenv.dll" "int CreateProfile([MarshalAs(UnmanagedType.LPWStr)] string pszUserSid,`
         [MarshalAs(UnmanagedType.LPWStr)] string pszUserName,`
         [Out][MarshalAs(UnmanagedType.LPWStr)] StringBuilder pszProfilePath, uint cchProfilePath)";

        Add-NativeMethods -typeName $MethodName;

With the $methodname added now we can call it and create our profile:

    try
    {
        [UserEnvCP]::CreateProfile($userSID.Value, $Username, $sb, $pathLen) | Out-Null;
    }

Full code for the explained function is below:

function Create-NewProfile {

    [CmdletBinding()]
    [Alias()]
    [OutputType([int])]
    Param
    (
        # Param1 help description
        [Parameter(Mandatory=$true,
                   ValueFromPipelineByPropertyName=$true,
                   Position=0)]
        [string]$UserName,

        # Param2 help description
        [Parameter(Mandatory=$true,
                   ValueFromPipelineByPropertyName=$true,
                   Position=1)]
        [string]
        $Password
    )

    Write-Verbose "Creating local user $Username";

    try
    {
        New-LocalUser -username $UserName -password $Password;
    }
    catch
    {
        Write-Error $_.Exception.Message;
        break;
    }
    $methodName = 'UserEnvCP'
    $script:nativeMethods = @();

    if (-not ([System.Management.Automation.PSTypeName]$MethodName).Type)
    {
        Register-NativeMethod "userenv.dll" "int CreateProfile([MarshalAs(UnmanagedType.LPWStr)] string pszUserSid,`
         [MarshalAs(UnmanagedType.LPWStr)] string pszUserName,`
         [Out][MarshalAs(UnmanagedType.LPWStr)] StringBuilder pszProfilePath, uint cchProfilePath)";

        Add-NativeMethods -typeName $MethodName;
    }

    $localUser = New-Object System.Security.Principal.NTAccount("$UserName");
    $userSID = $localUser.Translate([System.Security.Principal.SecurityIdentifier]);
    $sb = new-object System.Text.StringBuilder(260);
    $pathLen = $sb.Capacity;

    Write-Verbose "Creating user profile for $Username";

    try
    {
        [UserEnvCP]::CreateProfile($userSID.Value, $Username, $sb, $pathLen) | Out-Null;
    }
    catch
    {
        Write-Error $_.Exception.Message;
        break;
    }
}

Many thanks to the members of the community that helped me get this script built and working (@MS_dministrator, @adamdriscoll). The entire script can be found in my gist here:


<#
.Synopsis
Rough PS functions to create new user profiles
.DESCRIPTION
Call the Create-NewProfile function directly to create a new profile
.EXAMPLE
Create-NewProfile -Username 'testUser1' -Password 'testUser1'
.NOTES
Created by: Josh Rickard (@MS_dministrator) and Thom Schumacher (@driberif)
Date: 24MAR2017
Location: https://gist.github.com/crshnbrn66/7e81bf20408c05ddb2b4fdf4498477d8
Contact: https://github.com/MSAdministrator
MSAdministrator.com
https://github.com/crshnbrn66
powershellposse.com
#>
#Function to create the new local user first
function New-LocalUser
{
[CmdletBinding()]
[Alias()]
[OutputType([int])]
Param
(
# Param1 help description
[Parameter(Mandatory=$true,
ValueFromPipelineByPropertyName=$true,
Position=0)]
$userName,
# Param2 help description
[string]
$password
)
$system = [ADSI]"WinNT://$env:COMPUTERNAME";
$user = $system.Create("user",$userName);
$user.SetPassword($password);
$user.SetInfo();
$flag=$user.UserFlags.value -bor 0x10000;
$user.put("userflags",$flag);
$user.SetInfo();
$group = [ADSI]("WinNT://$env:COMPUTERNAME/Users");
$group.PSBase.Invoke("Add", $user.PSBase.Path);
}
#function to register a native method
function Register-NativeMethod
{
[CmdletBinding()]
[Alias()]
[OutputType([int])]
Param
(
# Param1 help description
[Parameter(Mandatory=$true,
ValueFromPipelineByPropertyName=$true,
Position=0)]
[string]$dll,
# Param2 help description
[Parameter(Mandatory=$true,
ValueFromPipelineByPropertyName=$true,
Position=1)]
[string]
$methodSignature
)
$script:nativeMethods += [PSCustomObject]@{ Dll = $dll; Signature = $methodSignature; }
}
function Get-Win32LastError
{
[CmdletBinding()]
[Alias()]
[OutputType([int])]
Param($typeName = 'LastError')
if (-not ([System.Management.Automation.PSTypeName]$typeName).Type)
{
$lasterrorCode = $script:lasterror | ForEach-Object{
'[DllImport("kernel32.dll", SetLastError = true)]
public static extern uint GetLastError();'
}
Add-Type @"
using System;
using System.Text;
using System.Runtime.InteropServices;
public static class $typeName {
$lasterrorCode
}
"@
}
}
#function to add native method
function Add-NativeMethods
{
[CmdletBinding()]
[Alias()]
[OutputType([int])]
Param($typeName = 'NativeMethods')
$nativeMethodsCode = $script:nativeMethods | ForEach-Object { "
[DllImport(`"$($_.Dll)`")]
public static extern $($_.Signature);
" }
Add-Type @"
using System;
using System.Text;
using System.Runtime.InteropServices;
public static class $typeName {
$nativeMethodsCode
}
"@
}
#Main function to create the new user profile
function Create-NewProfile {
[CmdletBinding()]
[Alias()]
[OutputType([int])]
Param
(
# Param1 help description
[Parameter(Mandatory=$true,
ValueFromPipelineByPropertyName=$true,
Position=0)]
[string]$UserName,
# Param2 help description
[Parameter(Mandatory=$true,
ValueFromPipelineByPropertyName=$true,
Position=1)]
[string]
$Password
)
Write-Verbose "Creating local user $Username";
try
{
New-LocalUser -username $UserName -password $Password;
}
catch
{
Write-Error $_.Exception.Message;
break;
}
$methodName = 'UserEnvCP'
$script:nativeMethods = @();
if (-not ([System.Management.Automation.PSTypeName]$MethodName).Type)
{
Register-NativeMethod "userenv.dll" "int CreateProfile([MarshalAs(UnmanagedType.LPWStr)] string pszUserSid,`
[MarshalAs(UnmanagedType.LPWStr)] string pszUserName,`
[Out][MarshalAs(UnmanagedType.LPWStr)] StringBuilder pszProfilePath, uint cchProfilePath)";
Add-NativeMethods -typeName $MethodName;
}
$localUser = New-Object System.Security.Principal.NTAccount("$UserName");
$userSID = $localUser.Translate([System.Security.Principal.SecurityIdentifier]);
$sb = new-object System.Text.StringBuilder(260);
$pathLen = $sb.Capacity;
Write-Verbose "Creating user profile for $Username";
try
{
[UserEnvCP]::CreateProfile($userSID.Value, $Username, $sb, $pathLen) | Out-Null;
}
catch
{
Write-Error $_.Exception.Message;
break;
}
}
function New-ProfileFromSID {
[CmdletBinding()]
[Alias()]
[OutputType([int])]
Param
(
# Param1 help description
[Parameter(Mandatory=$true,
ValueFromPipelineByPropertyName=$true,
Position=0)]
[string]$UserName,
[string]$domain = 'PHCORP'
)
$methodname = 'UserEnvCP2'
$script:nativeMethods = @();
if (-not ([System.Management.Automation.PSTypeName]$methodname).Type)
{
Register-NativeMethod "userenv.dll" "int CreateProfile([MarshalAs(UnmanagedType.LPWStr)] string pszUserSid,`
[MarshalAs(UnmanagedType.LPWStr)] string pszUserName,`
[Out][MarshalAs(UnmanagedType.LPWStr)] StringBuilder pszProfilePath, uint cchProfilePath)";
Add-NativeMethods -typeName $methodname;
}
$sb = new-object System.Text.StringBuilder(260);
$pathLen = $sb.Capacity;
Write-Verbose "Creating user profile for $Username";
#$SID= ((get-aduser -id $UserName -ErrorAction Stop).sid.value)
if($domain)
{
$objUser = New-Object System.Security.Principal.NTAccount($domain, $UserName)
$strSID = $objUser.Translate([System.Security.Principal.SecurityIdentifier])
$SID = $strSID.Value
}
else
{
$objUser = New-Object System.Security.Principal.NTAccount($UserName)
$strSID = $objUser.Translate([System.Security.Principal.SecurityIdentifier])
$SID = $strSID.Value
}
Write-Verbose "$UserName SID: $SID"
try
{
$result = [UserEnvCP2]::CreateProfile($SID, $Username, $sb, $pathLen)
if($result -eq '-2147024713')
{
$status = "$userName already exists"
write-verbose "$username Creation Result: $result"
}
elseif($result -eq '-2147024809')
{
$status = "$username Not Found"
write-verbose "$username creation result: $result"
}
elseif($result -eq 0)
{
$status = "$username Profile has been created"
write-verbose "$username Creation Result: $result"
}
else
{
$status = "$UserName unknown return result: $result"
}
}
catch
{
Write-Error $_.Exception.Message;
break;
}
$status
}
Function Remove-Profile {
[CmdletBinding()]
[Alias()]
[OutputType([int])]
Param
(
# Param1 help description
[Parameter(Mandatory=$true,
ValueFromPipelineByPropertyName=$true,
Position=0)]
[string]$UserName,
[string]$ProfilePath,
[string]$domain = 'PHCORP'
)
$methodname = 'userenvDP'
$script:nativeMethods = @();
if (-not ([System.Management.Automation.PSTypeName]"$methodname.profile").Type)
{
add-type @"
using System.Runtime.InteropServices;
namespace $methodname
{
public static class UserEnv
{
[DllImport("userenv.dll", CharSet = CharSet.Unicode, ExactSpelling = false, SetLastError = true)]
public static extern bool DeleteProfile(string sidString, string profilePath, string computerName);
[DllImport("kernel32.dll")]
public static extern uint GetLastError();
}
public static class Profile
{
public static uint Delete(string sidString)
{ //Profile path and computer name are optional
if (!UserEnv.DeleteProfile(sidString, null, null))
{
return UserEnv.GetLastError();
}
return 0;
}
}
}
"@
}
#$SID= ((get-aduser -id $UserName -ErrorAction Stop).sid.value)
if($domain)
{
$objUser = New-Object System.Security.Principal.NTAccount($domain, $UserName)
$strSID = $objUser.Translate([System.Security.Principal.SecurityIdentifier])
$SID = $strSID.Value
}
else
{
$objUser = New-Object System.Security.Principal.NTAccount($UserName)
$strSID = $objUser.Translate([System.Security.Principal.SecurityIdentifier])
$SID = $strSID.Value
}
Write-Verbose "$UserName SID: $SID"
try
{
#http://stackoverflow.com/questions/31949002/c-sharp-delete-user-profile
$result = [userenvDP.Profile]::Delete($SID)
}
catch
{
Write-Error $_.Exception.Message;
break;
}
#return the result of DeleteProfile: 0 on success, otherwise the Win32 error code from GetLastError
$result
}
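
If you dot-source the script, calling these functions looks something like the sketch below. The file name, account names, and password are placeholders, not values from my environment:


#dot-source the script so the functions are available in the session (hypothetical file name)
. .\ProfileFunctions.ps1
#create a local user and its profile
Create-NewProfile -UserName 'TestUser01' -Password 'P@ssw0rd!' -Verbose
#create a profile for an existing domain account
New-ProfileFromSID -UserName 'svc.testaccount' -domain 'PHCORP' -Verbose
#remove that profile again
Remove-Profile -UserName 'svc.testaccount' -domain 'PHCORP' -Verbose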

I hope this helps someone

Until then keep Scripting

Thom

Adding a user to a Group in Dynamics CRM 2016

Recently I’ve had to add a user to Dynamics CRM 2016. This post is about how I did that with a module from the PowerShell Gallery.

The first thing I needed to do was find something that was available for use against Dynamics CRM 2016.  So I searched the PowerShell Gallery and found this module: Microsoft.Xrm.Data.Powershell.  In addition to the module I found some handy samples for working with it here:  Microsoft.Xrm.Data.PowerShell.Samples

I was able to take these samples and come up with a usable script to add users to a group in this application.  My purpose was to do the unthinkable: add users to an Admin group.  While probably not the best practice, in my situation it was what I needed to do.  So here is how I began.

I looked at this sample: UpdateCrmUserSettings.ps1

This helped me immensely in figuring out how to go about connecting to my CRM instance:


$adminUserCredentials = get-credential

$organizationName = 'MyOrg'

$serverUrl = 'http://mycrmserver.mycompany.com:80'

$loadedandCorrectVersion = (get-command -module 'Microsoft.Xrm.Data.Powershell' -ErrorAction Ignore).version -eq '2.5'
if(-not $loadedandCorrectVersion)
{
find-module -Name Microsoft.Xrm.Data.Powershell -MinimumVersion 2.5 -MaximumVersion 2.5 | Install-Module -Scope CurrentUser -AllowClobber -Force
Import-Module -Name Microsoft.Xrm.Data.Powershell -MinimumVersion 2.5 -MaximumVersion 2.5 -Force -RequiredVersion 2.5
}

$xcrmConn = Get-CrmConnection -OrganizationName $OrganizationName -ServerUrl $ServerUrl  -Credential $AdminUserCredentials 

I added some “plumbing” to ensure that I only have version 2.5 of the module downloaded and imported into the session where I’m going to run this. The $xcrmConn will be the connection that we use for every subsequent call in this update of our user.  According to the documentation you can specify this connection as a global variable; I chose not to do that so I could see exactly what is going on in each call I make to this module.
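
For completeness, here is a minimal sketch of the global-variable approach, assuming (per the module documentation) that the cmdlets fall back to a global $conn variable when -conn is not supplied. I did not use this in my script:


#assumption: Microsoft.Xrm.Data.Powershell cmdlets look for a global $conn when -conn is omitted
$global:conn = Get-CrmConnection -OrganizationName $OrganizationName -ServerUrl $ServerUrl -Credential $AdminUserCredentials
#no -conn parameter needed here if that fallback applies
Get-MyCrmUserId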

The next task was to figure out how to get all the users.  There are a bunch of different cmdlets available, from Get-CrmCurrentUserId to Get-MyCrmUserId, as you can see below:


ps:\crm> get-command get*crm*user*

CommandType Name Version Source
----------- ---- ------- ------
Alias Get-CrmCurrentUserId 2.5 Microsoft.Xrm.Data.Powershell
Function Get-CrmUserMailbox 2.5 Microsoft.Xrm.Data.Powershell
Function Get-CrmUserPrivileges 2.5 Microsoft.Xrm.Data.Powershell
Function Get-CrmUserSecurityRoles 2.5 Microsoft.Xrm.Data.Powershell
Function Get-CrmUserSettings 2.5 Microsoft.Xrm.Data.Powershell
Function Get-MyCrmUserId 2.5 Microsoft.Xrm.Data.Powershell

None of them really addressed how to get all users, or a specific user.  That is when I turned back to the samples and found this command:

Get-CrmRecords

What I discovered is that you have to understand how to use the filters. The first thing  I tried was to get all the users in CRM.


$users = Get-CrmRecords -EntityLogicalName systemuser -conn $xcrmconn -Fields systemuserid,fullname

After several runs (trial and error) I was able to get to a workable call to Get-CrmRecords for an individual user.
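
Before filtering, it helps to peek at what Get-CrmRecords hands back. A quick sketch, assuming (as the later snippets rely on) that the result is a hashtable-like object whose CrmRecords entry holds the matching records:


#list the keys on the result object and preview the first few records
$users = Get-CrmRecords -EntityLogicalName systemuser -conn $xcrmconn -Fields systemuserid,fullname
$users.Keys
$users.CrmRecords | Select-Object -Property fullname, systemuserid -First 5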

In order to add a user as an admin we’ll need to get the user’s ID.  We’ll also need the ID of the Security Role that we are going to add them to.


$sysAdminrole = 'System Administrator'
#user we are going to filter on must be in the syntax of last, first
$userObject = Get-CrmRecords -conn $xcrmconn -EntityLogicalName systemuser -FilterOperator eq -FilterAttribute domainname -FilterValue "$domainUsername" -Fields domainname,fullname

$userGuid = (($userObject | Where-Object {$_.keys -eq "crmRecords"}).values).systemuserid.guid
$userName = (($userObject | Where-Object {$_.keys -eq "crmRecords"}).values).fullname
if($userGuid)
{$userRoles = (Get-CrmUserSecurityRoles -conn $xcrmconn -UserId $userGuid).roleid.guid}
else
{
Throw "$DomainUsername not found in $ServerUrl and Organization $OrganizationName"
}

$adminObject = Get-CrmRecords -conn $xcrmconn -EntityLogicalName systemuser -FilterOperator eq -FilterAttribute domainname -FilterValue "$($AdminUserCredentials.username)" -Fields domainname,fullname
#get the admins guid and user name
$adminId = (($adminObject | Where-Object {$_.keys -eq "crmRecords"}).values).systemuserid.guid
$AdminUserName = (($adminObject | Where-Object {$_.keys -eq "crmRecords"}).values).fullname
$adminRoleObject = Get-CrmUserSecurityRoles -conn $xcrmConn -UserId $adminId | Where-Object {$_.rolename -eq $sysAdminrole}
$adminroles = ($adminRoleObject).roleid.guid
$adminRoleName = $adminroleobject.rolename

Now that I have the required items for adding the role, all I need to do is make sure that the role isn’t already there and then add the Security Role ID to the user, as shown in the snippet below and in the full script.  The result is a user with the System Administrator role added to it.
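
Pulled out of the full script for clarity, the check-and-add step looks like this; note that the role GUID ($adminroles), not the admin user’s GUID, is what gets passed to -SecurityRoleId:


if($userRoles -like $adminroles)
{
Write-Output "$DomainUsername is already an admin"
}
else
{
#add the System Administrator security role (by its role GUID) to the target user
Add-CrmSecurityRoleToUser -conn $xcrmconn -UserId $userGuid -SecurityRoleId $adminroles
Write-Output "$DomainUsername Added to AdminRole $adminRoleName"
}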

Full Script Follows:

#requires -module PowerShellGet
<#
.SYNOPSIS
A brief description of the updateusers.ps1 file.
.DESCRIPTION
A detailed description of the updateusers.ps1 file.
.PARAMETER ServerUrl
A description of the ServerUrl parameter.
.PARAMETER OrganizationName
Organization name in CRM. For example: yourorg
.PARAMETER DomainUsername
User name (domain\username) to add as an admin to CRM.
.PARAMETER AdminUserCredentials
Credentials that have admin privileges on the URL passed.
.EXAMPLE
PS C:\> .\updateusers.ps1 -DomainUsername 'domain\domainuser' -AdminUserCredentials (Get-Credential)
.NOTES
Additional information about the file.
#>
param
(
[string]$ServerUrl = 'http://yourCrminstance.yourname.com:80',
[string]$OrganizationName = 'YourInstance',
[Parameter(Mandatory = $true)]
[string]$DomainUsername = 'domain\domainuser',
[pscredential]$AdminUserCredentials = (Get-Credential)
)
$loadedandCorrectVersion = (get-command -module 'Microsoft.Xrm.Data.Powershell' -ErrorAction Ignore).version -eq '2.5'
if(-not $loadedandCorrectVersion)
{
find-module -Name Microsoft.Xrm.Data.Powershell -MinimumVersion 2.5 -MaximumVersion 2.5 | Install-Module -Scope CurrentUser -AllowClobber -Force
Import-Module -Name Microsoft.Xrm.Data.Powershell -MinimumVersion 2.5 -MaximumVersion 2.5 -Force -RequiredVersion 2.5
}
if(get-command -module 'Microsoft.Xrm.Data.Powershell')
{
$xcrmConn = Get-CrmConnection -OrganizationName $OrganizationName -ServerUrl $ServerUrl -Credential $AdminUserCredentials -Verbose
#https://github.com/seanmcne/Microsoft.Xrm.Data.PowerShell.Samples/blob/master/Microsoft.Xrm.Data.PowerShell.Samples/UpdateCrmUsersSettings/UpdateCrmUsersSettings.ps1

#get the necessary object for the admin
$sysAdminrole = 'System Administrator'
#user we are going to filter on must be in the syntax of last, first
$userObject = Get-CrmRecords -conn $xcrmconn -EntityLogicalName systemuser -FilterOperator eq -FilterAttribute domainname -FilterValue "$domainUsername" -Fields domainname,fullname
$userGuid = (($userObject | Where-Object {$_.keys -eq "crmRecords"}).values).systemuserid.guid
$userName = (($userObject | Where-Object {$_.keys -eq "crmRecords"}).values).fullname
if($userGuid)
{$userRoles = (Get-CrmUserSecurityRoles -conn $xcrmconn -UserId $userGuid).roleid.guid}
else
{
Throw "$DomainUsername not found in $ServerUrl and Organization $OrganizationName"
}
$adminObject = Get-CrmRecords -conn $xcrmconn -EntityLogicalName systemuser -FilterOperator eq -FilterAttribute domainname -FilterValue "$($AdminUserCredentials.username)" -Fields domainname,fullname
#get the admins guid and user name
$adminId = (($adminObject | Where-Object {$_.keys -eq "crmRecords"}).values).systemuserid.guid
$AdminUserName = (($adminObject | Where-Object {$_.keys -eq "crmRecords"}).values).fullname
$adminRoleObject = Get-CrmUserSecurityRoles -conn $xcrmConn -UserId $adminId | Where-Object {$_.rolename -eq $sysAdminrole}
$adminroles = ($adminRoleObject).roleid.guid
$adminRoleName = $adminroleobject.rolename
if($adminRoleName -eq $sysAdminrole)
{
if($userroles -like $adminroles)
{
Write-Output "$DomainUsername is already an admin"
}
else
{
#pass the security role GUID ($adminroles), not the admin user's GUID, to -SecurityRoleId
Add-CrmSecurityRoleToUser -conn $xcrmconn -UserId $userGuid -SecurityRoleId $adminroles
Write-Output "$DomainUsername Added to AdminRole $adminRoleName"
}
}
else
{
Write-Warning "The $($AdminUserCredentials.username) doesn’t have the Role of ‘System Administrator’"
}
}
else
{ throw "cannot load the powershell module ‘Microsoft.Xrm.Data.Powershell’"}

I hope this helps someone

Until then keep Scripting

Thom