Changing Modified and Created details in SharePoint

Sometimes you need to lie to SharePoint. In this post I’ll show you how to change who created an item, who last modified it, and when.

Whether you’re doing bulk uploads, working with lists that use the advanced setting of only allowing users to edit their own items, or just testing some behaviour, eventually you’ll wish you could change the values that SharePoint doesn’t let you change.

The first thing is, as always, to find the value we want to change:

#Add the SharePoint snapin
Add-PSSnapin Microsoft.SharePoint.Powershell -ea SilentlyContinue

#set the web url and the list name to work upon
$url = "http://sharepoint/sites/cthub"
$listName = "Shared Documents"
$fileName = "FileName.xlsx"

#Get the appropriate list from the web
$web = get-SPWeb $url
$list = $web.lists[$listName]

#Get the file using the filename
$item = $list.Items | ? {$_.Name -eq $fileName}

#Print out current Created by and Created date
Write-Output ("item created by {0} on {1}" -f $item["Author"].tostring(), $item["Created"] )

#Print out current Modified by and Modified date
Write-Output ("item last modified by {0} on {1}" -f $item["Editor"].ToString(), ([datetime]$item["Modified"]).ToString("dd-MM-yyyy"))

As you can see, we access the item’s properties by treating $item as a hashtable, using the internal field name as the key.

#Set the created by values
$userLogin = "ALEXB\AlexB"
$dateToStore = Get-Date "10/02/1984"

$user = Get-SPUser -Web $web | ? {$_.userlogin -eq $userLogin}
$userString = "{0};#{1}" -f $user.ID, $user.UserLogin.Tostring()

#Sets the created by field
$item["Author"] = $userString
$item["Created"] = $dateToStore

#Set the modified by values
$item["Editor"] = $userString
$item["Modified"] = $dateToStore


#Store the changes without SharePoint overwriting the Created/Modified details we just set
$item.UpdateOverwriteVersion()

Setting the value is a bit more complicated: you have to build the appropriate user string yourself. In my example the user is already part of the site; if they haven’t previously been added to the user information list you’ll need an extra step here to insert them.
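If the account might not yet be in the user information list, SPWeb.EnsureUser will add it for you. A minimal sketch, reusing the $web and $userLogin variables from above:

```powershell
#Make sure the account is present in the site's user information list before
#building the "ID;#Login" string. EnsureUser adds the account if it's missing.
$user = $web.EnsureUser($userLogin)
$userString = "{0};#{1}" -f $user.ID, $user.UserLogin.ToString()
```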

The second thing that differs from your usual PowerShell update is the use of the UpdateOverwriteVersion() method. There are several update methods in SharePoint, but only this one will preserve your changes to the modified by and created by fields.
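For reference, here are the three update methods and how they behave, based on the behaviour described above (this is a comparison listing, not something to run as-is):

```powershell
#How the SPListItem update methods compare:
$item.Update()                 #Normal update: stamps Modified/Editor with the current user and time
$item.SystemUpdate()           #Doesn't stamp new Modified details, but the Author/Editor changes set above are lost
$item.UpdateOverwriteVersion() #Writes without creating a new version and keeps our changes
```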

And now, the full script:

<#
Title: Set modified and created by details
Author: Alex Brassington
Category: Proof of Concept Script
Description
This script is to show how to read, modify and otherwise manipulate the created by and modified by details on documents.
This is to enable correction of incorrect data as part of migrations. It is also useful to enable testing of retention policies.
#>

#Add the SharePoint snapin
Add-PSSnapin Microsoft.SharePoint.Powershell -ea SilentlyContinue

#set the web url and the list name to work upon
$url = "http://sharepoint/sites/cthub"
$listName = "Shared Documents"
$fileName = "FileName.xlsx"

#Get the appropriate list from the web
$web = get-SPWeb $url
$list = $web.lists[$listName]

#Get the file using the filename
$item = $list.Items | ? {$_.Name -eq $fileName}

#Print out current Created by and Created date
Write-Output ("item created by {0} on {1}" -f $item["Author"].tostring(), $item["Created"] )

#Print out current Modified by and Modified date
Write-Output ("item last modified by {0} on {1}" -f $item["Editor"].ToString(), ([datetime]$item["Modified"]).ToString("dd-MM-yyyy"))

#Set the created by values
$userLogin = "ALEXB\AlexB"
$dateToStore = Get-Date "10/02/1984"

$user = Get-SPUser -Web $web | ? {$_.userlogin -eq $userLogin}
$userString = "{0};#{1}" -f $user.ID, $user.UserLogin.Tostring()


#Sets the created by field
$item["Author"] = $userString
$item["Created"] = $dateToStore

#Set the modified by values
$item["Editor"] = $userString
$item["Modified"] = $dateToStore


#Store the changes without SharePoint overwriting the Created/Modified details we just set
$item.UpdateOverwriteVersion()

Deleting Versions

Someone asked for a script that could delete previous versions in SharePoint 3.0. I don’t have a 3.0 dev environment, but I do have a 2010 build and the problem interested me.

Function Delete-SPVersions ()
{
    [CmdletBinding(SupportsShouldProcess=$true)]
    param(
        [Parameter(Mandatory=$True)][string]$webUrl,
        [Parameter(Mandatory=$True)][string]$listName,
        [Parameter(Mandatory=$False)][int]$numberOfMajorVersions
    )

    #Get the web
    $web = Get-SPWeb $webUrl


    #Get the list
    $list = $web.Lists[$listName]

    $list.Items | % {
        #Get the item in a slightly more usable variable
        $item = $_

        Write-Output ("Deleting versions for item {0}" -f $item.Name)
        
        #Get all major versions
        #Note, the version ID goes up by 512 for each major version.
        $majorVersions = $item.Versions | ? { $_.VersionID % 512 -eq 0}

        #get the largest version number
        $latestMajorID = $majorVersions | select VersionID -ExpandProperty VersionID | sort -Descending | select -First 1

        #Slightly lazy way to get the latest major version and format it as a single point decimal
        Write-Output ("   Latest major version to retain is {0:0.0}" -f ($latestMajorID /512))
        
        #Filter the major versions to only those which are lower than the highest number - 512 * $numberOfMajorVersions
        $majorVersionsToDelete = $majorVersions | ? {$_.VersionID -le ($latestMajorID - 512 * $numberOfMajorVersions)}
        if ($majorVersionsToDelete)
        {
            $majorVersionsToDelete | % {
                Write-Verbose ("  Deleting major version {0}" -f $_.VersionLabel)
                if ($pscmdlet.ShouldProcess($_.VersionLabel,"Deleting major version"))
                {
                    $_.Delete()
                }
            }
        }
        else
        {
            Write-Verbose "No major versions to delete"
        }
        
        #Re-fetch the item to ensure that the versions are still valid
        $item = $list.GetItemByUniqueId($item.UniqueId)
        
        #Get all the minor versions
        $minorVersions = $item.Versions | ? { $_.VersionID % 512 -ne 0}

        #Delete Minor versions greater than the last major version kept
        $minorVersionsToDelete = $minorVersions | ? {$_.VersionID -lt $latestMajorID}
        if ($minorVersionsToDelete)
        {
            $minorVersionsToDelete | % {
                Write-Verbose ("Deleting minor version {0}" -f $_.VersionLabel)
                if ($pscmdlet.ShouldProcess($_.VersionLabel,"Deleting minor version"))
                {
                    $_.Delete()
                }
            }
        }
        else
        {
            Write-Verbose "No minor versions to delete"
        }
    }
    $web.Dispose()
}
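The VersionID arithmetic the function leans on can be sketched on its own: major versions sit at multiples of 512 and the minor number is the remainder. A small illustration (the function name here is my own, not a SharePoint API):

```powershell
#Convert a SharePoint VersionID into its "major.minor" label.
#Major versions are multiples of 512; minor versions fill the gaps between them.
function Get-VersionLabel ([int]$versionId)
{
    $major = [math]::Floor($versionId / 512)
    $minor = $versionId % 512
    "{0}.{1}" -f $major, $minor
}

Get-VersionLabel 1536   #3.0
Get-VersionLabel 1538   #3.2
```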

Failed Search Scripting

Sometimes knowing what doesn’t work is as useful as knowing what does. In that vein, here’s how I spent my journey home…

A post on TechNet asked how to deal with long-running search crawls that impact users when they overrun into business hours. In large SharePoint environments that shouldn’t really happen, but it’s a fairly common concern in smaller shops.

Ideally you’ll tune your search so that it always completes in time, but that doesn’t always work. For those edge cases there are two options:

  1. Pause a specific crawl (or all crawls) during working hours.
  2. Reduce the impact of the crawls during working hours.

Pausing a crawl is easy, and it’s also been covered very well by other people such as Ed Wilson.

I wanted to drop the performance of the crawl down so that it can still keep going but not impact the end users.

The first step was to find out how to create a crawl rule to reduce the impact of the search:

$shRule = Get-SPEnterpriseSearchSiteHitRule -Identity "SharePoint"

#Cripple search
$shRule.HitRate = 1000
$shRule.Behavior = 'DelayBetweenRequests'
$shRule.Update()

#Revive search
$shRule.HitRate = 8
$shRule.Behavior = 'SimultaneousRequests'
$shRule.Update()

It turns out that in the API a crawl rule is known as a hit rule. Hit rules have two important values: the rate and the behaviour.

The script above was enough to let me create a rule and set it to either run at a normal pace or with a 16 minute delay between requests. And it worked!

Well, it created a rule and that rule worked. Sadly I’d forgotten that crawler rules are only checked when you start a crawl. If you start a crawl and then punch the delay between items up to 1000, it won’t make a blind bit of difference.

It turns out that pausing the crawl doesn’t make the search engine re-check the crawl rate.

So, a failure. The only thing I can think of is using reflection to find out what is happening in the background and then doing something deeply unsupported to modify the values in flight. Maybe another time.

Check Blob Cache settings across a farm

A routine but really annoying task came up today: checking the status of BLOB caching for all web applications. Normally I’d hop into the config files and check, but these were spread over six farms.
To make things worse, each farm had four or five web apps and between one and three WFEs. In total that meant checking in the region of 50 config files.

Sod that.

Getting data from the config file is easy; after all, it’s just an XML document. Once we’ve got the document it’s an easy task to pull out the actual values:

$xml = [xml](Get-Content $webConfig)
$blobNode = $xml.configuration.SharePoint.BlobCache

$props = @{
    'Location' = $blobNode.location;
    'MaxSize' = $blobNode.maxsize;
    'Enabled' = $blobNode.enabled
}
New-Object -TypeName PSObject -Property $props

That on its own isn't that helpful. After all, we don't just want the values for ONE config file, we want them for something like 50. I also don't want to have to list each one out, I just want them all. Through some rather excessive PowerShelling I know that we can get the config file's physical path from the IIS settings on a web application:
$WebApp = Get-SPWebApplication "http://sharepoint.domain.com"
$settings = $WebApp.IISSettings
#This is horrible code but I haven't found a better way to get the value in a usable format.
$physicalPath = ($settings.Values | select path -Expand Path).ToString()

That gives us the physical path to the folder containing the web.config file. It's only a small leap to make it loop through all the locations…

#Script to check the blob caching configuration in all webconfig files in your farm on all servers.

Add-PSSnapin Microsoft.SharePoint.PowerShell -ea SilentlyContinue

$outputFile = "C:\Folder\TestFile.txt"

#This is a rubbish way of getting the server names where the Foundation Web Application is running,
#but I can't find a better one at the moment.
$serversWithFWA = Get-SPServiceInstance `
    | ? { $_.TypeName -eq "Microsoft SharePoint Foundation Web Application" -AND $_.Status -eq "Online"} `
    | Select Server -ExpandProperty Server | Select Address

Get-SPWebApplication | % {

    $configResults = @()
    $webAppName = $_.Name
    Write-Host "Webapp $webAppName"

    #Get the physical path from the IIS settings
    $settings = $_.IISSettings
    $physicalPath = ($settings.Values | select path -Expand Path).ToString()
     
    #foreach server running the Foundation Web Application.
    foreach ($server in $serversWithFWA)
    {
        #Build the UNC path to the file Note that this relies on knowing more or less where your config files are
        #This could be improved using regex.
        $serverUNCPath = $physicalPath.Replace("C:\",("\\" + $server.Address + "\C$\"))
        $webConfig = $serverUNCPath +"\web.config"
 
        #If the file exists then try to read the values
        If(Test-Path ($webConfig))
        {
            $xml = [xml](Get-Content $webConfig)
            $blobNode = $xml.configuration.SharePoint.BlobCache

            $props = @{
                'Server' = $server.Address;
                'Location' = $blobNode.location;
                'MaxSize' = $blobNode.maxsize;
                'Enabled' = $blobNode.enabled
            }
            $configResults += New-Object -TypeName PSObject -Property $props
        }
    }
    #Print the results out for the GUI
    $webAppName
    $configResults | ft

    #Output the data in a useful format - start with the web application name
    $webAppName >> $outputFile
    #CSV because the data is immeasurably easier to load into a table etc. later. HTML would be a good alternative
    $configResults | ConvertTo-Csv >> $outputFile
}

Shared Folder Logs In SharePoint

First of all: do not try this on anything other than a disposable test farm.

I’m 99% certain Microsoft will laugh in your face if you do this to your production server and then ask for support. Doing this on a client site would be reckless and irresponsible; this is posted purely for interest’s sake.

I was down the pub one evening and it occurred to me: what happens if you want to put all your logs on a mapped drive? It quickly occurred to me that I should drink more.

… Six months passed …

In the world of SANs my question doesn’t make much sense. You build your aggregate, parcel it up into LUNs and attach them to the relevant hosts. We’re not talking about a significant improvement in performance, scalability or maintenance.

On the other hand, if you’ve got a big farm, the idea of having 20 x 10GB disks with one attached to each server has to add to the complexity. From an admin perspective it’s also a pain, as your ULS files are all over the shop. But the main reason I want to do this is to find out whether it can be done…

To assist me I have a friendly shared drive:

\\SPINTDEV\TestFolder

By default my install was using the C:\Program Files path; not best practice, but this is a throwaway VM, so it was good enough to start:

Default Log Directory


So, let’s try putting in my shared folder:

Attempting to set a shared folder as the log file path

Unsurprisingly, SharePoint objected to my choice of path. Normally this would be a good time to accept that this is a bad idea, but let’s try something else:

Adding a shared folder as a mapped drive

Perhaps a mapped drive will do it (note I actually used H:\ instead). Once that’s created we can try using it instead:

Failing to add a mapped drive as a log folder path


I guess not. It seems SharePoint doesn’t like my idea as much as I do. Time to see if PowerShell also dislikes my clever plan …
Setting the log folder path using PowerShell


*manic cackle*
It turns out that PowerShell will let us set it, even when the GUI complains! Score one for PowerShell! Note the wider lesson here: PowerShell will sometimes let you do things you can’t through the GUI. That can be a good thing or a bad thing.

Of course, when I went back to check my folder I saw an ocean of emptiness: no log files. Event viewer tells us the folly of our ways…

Errors in Windows event viewer

Error 2163 Unable to write log

Error 5402 Unable to create log
So, a resounding failure. On the other hand, the error message includes the registry location of the log path.
Regedit.exe open at the location of the log file location value

There we can see our non-working mapped drive. I wonder what happens if we try for the full house…

Updated log file location to point to the shared folder

And if we force SharePoint to create a new log file using ‘New-SPLogFile’ then…

Log files being created in the correct directory

*Manic cackle*

It works. There’s nothing new in the event viewer and log files are being created correctly. The GUI still shows the old folder path, and for any new servers you’ll get the same error message we saw above.
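For reference, the registry edit can be scripted too. A sketch, with the usual caveat that this is unsupported; the key path below is the one from my SP2010 test VM, so check the path shown in your own event-log error before touching anything:

```powershell
#UNSUPPORTED: point the tracing service straight at the shared folder via the
#registry, then force a new log file. Verify the key path on your own farm first.
$wssKey = "HKLM:\SOFTWARE\Microsoft\Shared Tools\Web Server Extensions\14.0\WSS"
Set-ItemProperty -Path $wssKey -Name LogDir -Value "\\SPINTDEV\TestFolder"
New-SPLogFile   #Start a new ULS log file in the new location
```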

So it turns out that yes, you can use network drives for SharePoint log files, but doing so has some serious drawbacks:

  • Probably out of support
  • Inability to edit the diagnostic logs through the GUI afterwards
  • Not thoroughly tested
  • Adds to DR / new server build complexity

There are probably more, but any one of the first three should stop you deploying this to a live system.

On the other hand I’d quite like it if MS did open this up. Disallowing UNC paths might help stop some admins using a consumer-grade NAS as if it were a flash-cached fibre SAN, but it also restricts everyone else, and I’m not sure there’s a better reason.

Automating SharePoint Search Testing

I was browsing TechNet, as you do, when I found this comment on search best practices:

We recommend that you test the crawling and querying functionality of the server farm after you make configuration changes or apply updates

http://technet.microsoft.com/en-us/library/cc850696(v=office.14).aspx

This chimed with me, as a client I worked with had failed to do this and paid the price when their production search service went down for a day.

The article continues:

An easy way to do this is to create a temporary content source that is used only for this purpose. To test, we recommend that you crawl ten items — for example .txt files on a file share — and then perform search queries for those files. Make sure that the test items are currently not in the index. It is helpful if they contain unique words that will be displayed at the top of the search results page when queried. After the test is complete, we recommend that you delete the content source that you created for this test. Doing this removes the items that you crawled from the index and those test items will not appear in search results after you are finished testing

To put that in bullet point format:

  1. Test the search system doesn’t already have test content
  2. Create some test content to search
  3. Create a new content source
  4. Crawl the test content
  5. Search for the test content
  6. Check that the test content is there
  7. Remove the test content by blowing away the content source
  8. Confirm it’s no longer there

It’s a good test script, and it breaks down nicely. With a bit of thought we can consolidate it into some simpler tasks:

  1. Run a search and test the results
  2. Create some files
  3. Create a new content source
  4. Crawl the test content
  5. Run a search and test the results
  6. Remove the test content by blowing away the content source
  7. Run a search and test the results

The common aspect is that we run a search three times and test the results. The expected results will, hopefully, differ depending on when we run the test, but we can manage that.

Let’s go with the big one. Running a search:

Function Check-TestPairValue ()
{
<#
.DESCRIPTION
Takes a pipeline bound collection of test values and search terms and searches for
them using the searchPageURL.
Returns either 'Present' or 'Not Found' depending on the result.
Not currently production grade.
#>
    [CmdletBinding()]
    Param (
    [Parameter(Mandatory=$true,ValueFromPipeline=$true)]$testPair,
    [Parameter(Mandatory=$true)]$searchPageURL
     )
    BEGIN{
        #Create the IE window in the begin block so if the input is pipelined we don't have to re-open it each time.
        $ie = New-Object -com "InternetExplorer.Application"
        $ie.visible = $true
    }
    PROCESS
    { 
        #Get the test value and the search term from the pair
        $testValue = $testPair[0]
        $searchTerm = $testPair[1]
        
        #Open the navigation page
        $ie.navigate($searchPageURL)

        #Wait for the page to finish loading
        while ($ie.readystate -ne 4)
        {
            start-sleep -Milliseconds 100
        }
        Write-Verbose "Page loaded"
        
        #Get the search box
        $searchTextBoxID = "ctl00_m_g_2f1edfa4_ab03_461a_8ef7_c30adf4ba4ed_SD2794C0C_InputKeywords"
        $document = $ie.Document
        $searchBox = $document.getElementByID($searchTextBoxID)

        #enter the search terms
        $searchBox.innerText = $searchTerm
        Write-Verbose "Searching for: $searchTerm - Expected result: $testValue"    
        
        #Get the search button
        $searchButtonID = "ctl00_m_g_2f1edfa4_ab03_461a_8ef7_c30adf4ba4ed_SD2794C0C_go"
        
        #Run the search
        $btn = $document.getElementByID($searchButtonID)
        $btn.click()
        
        #Wait for the results to be loaded
        while ($ie.locationurl -eq $searchPageURL)
        {
           start-sleep -Milliseconds 100
        }
        
        Write-Verbose "Left the page, waiting for results page to load"
        #Wait for the results page to load
        while ($ie.readystate -ne 4)
        {
            start-sleep -Milliseconds 100
        }
        Write-Verbose "Results page loaded"
        #Once loaded check that the results are correct
        $document = $ie.document

        #Check that the search term results contain the test results:
        $firstSearchTermID = "SRB_g_9acbfa38_98a6_4be5_b860_65ed452b3b09_1_Title"
        $firstSearchResult = $document.getElementByID($firstSearchTermID)
        
        $result =""
        
        #test that the title of the file is equal to the search result
        If ($firstSearchResult.innerHTML -match $testValue)
        {
            $result ="Present"
        }
        else
        {
            $result ="Not Found"
        }
        
        Write-Verbose "Test $result"
        
        #Create a new PS Object for our result and let PowerShell pass it out.
        New-Object PSObject -Property @{
            TestCase = $searchTerm
            Result = $result
        }    
    }
    END {
        #Close the IE window after us
        $ie.Quit()
    }
}

Well, to be honest, that’s the only tricky bit in the process. From there on in it’s plumbing.
We create some test files:


Function Create-SPTestFiles ()
{
[CmdletBinding()]
Param(
    $filesToCreate,
    [string]$folderPath
    )
    
    If (!(Test-Path $folderPath))
    {
        #Folder doesn't exist.
        Write-Verbose "Folder does not exist - attempting to create"
        New-Item $folderPath -type directory
    }

    #If the files don't already exist, create them
    Foreach ($file in $filesToCreate)
    {
        $fileName = $file[0]
        $filePath = $folderPath + "\" + $fileName
        If (Test-Path $filePath)
        {
            Write-Verbose "File $fileName already exists. Skipping"
            Write-EventLog -LogName "Windows PowerShell" -Source "PowerShell" -EventId 103 -EntryType Error -Message "Test content already present."
        }
        else
        {
            Write-Verbose "Creating $fileName"
            $file[1] >> $filePath
        }
    }
    Write-Verbose "All files created"
}

We create a content source (this function isn’t perfect here, but I’m stealing it from another script):


Function Ensure-TestContentSourceExists ()
{
    [CmdletBinding()]
    Param(
    $sa,
    [string]$contentSourceName,
    [string]$filePath
    )
    $testCS = $sa | Get-SPEnterpriseSearchCrawlContentSource | ? {$_.Name -eq $contentSourceName}
    if ($testCS)
    {
        Write-Verbose "Content Source $contentSourceName already exists"
    }
    else
    {
        Write-Verbose "Content Source $contentSourceName does not exist, creating"
        New-SPEnterpriseSearchCrawlContentSource -SearchApplication $sa -Type file -name $contentSourceName -StartAddresses $filePath | Out-Null
        $testCS = $sa | Get-SPEnterpriseSearchCrawlContentSource | ? {$_.Name -eq $contentSourceName}
    }
    #Output the content source
    $testCS
}

Run the crawl and wait for it to finish.


Function Run-TestCrawl ()
{
    [CmdletBinding()]
    Param ($contentSource)
    #Run a crawl for that content source
    $contentSource.StartFullCrawl()

    #Set a flag to allow us to abort if the duration is excessive
    $stillNotStupidDuration = $true
    $startTime = Get-Date
    $crawlTimeout = 5

    Write-Verbose "Starting crawl. Waiting for 2 minutes (default SharePoint minimum crawl duration)"
    Sleep -Seconds 120
    #Wait for it to finish
    while ($contentSource.CrawlStatus -ne "Idle" -AND $stillNotStupidDuration -eq $true)
    {
        $timeDifference = (Get-Date) - $startTime
        Write-Verbose "Crawl still running at $timeDifference, waiting 10 seconds"
        Sleep -Seconds 10
        if ($timeDifference.Minutes -gt $crawlTimeout)
        {
            $stillNotStupidDuration = $false
        }
    }
    Write-Verbose "Crawl complete"
}

Then we’re back to searching and clean up! Easy.
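The plumbing is just calling the pieces in order. A rough sketch using the functions above ($sa, $testContent and the parameter values here are placeholders I’d set myself):

```powershell
#Hypothetical driver wiring the functions above together.
$sa = Get-SPEnterpriseSearchServiceApplication "Search Service Application"

#Pairs of file name and the unique content we'll later search for
$testContent = @( ("TestDocOne.txt", "xyzzyFirstTerm"), ("TestDocTwo.txt", "xyzzySecondTerm") )

$testContent | Check-TestPairValue -searchPageURL $searchSiteURL   #Expect 'Not Found'
Create-SPTestFiles -filesToCreate $testContent -folderPath $fileSharePath
$cs = Ensure-TestContentSourceExists -sa $sa -contentSourceName "AutomatedSearchTest" -filePath $fileSharePath
Run-TestCrawl -contentSource $cs
$testContent | Check-TestPairValue -searchPageURL $searchSiteURL   #Expect 'Present'
$cs.Delete()   #Clean up; this also removes the crawled items from the index
```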

Of course there’s a little bit more plumbing to be done to stick it all together, so here’s a fully functioning script:

Param (

    #Name of the search service application to test
    $searchAppName = "Search Service Application",

    #Path to the shared folder
    #NOTE: THIS HAS TO BE SET UP MANUALLY BEFORE RUNNING THE SCRIPT (it can be scripted but I haven't)
    $fileSharePath = "\\spintdev\TestFolder",


    #The search page
    $searchSiteURL = "http://sharepoint/sites/search/Pages/default.aspx",

    #Folder to write the report to
    $reportFolder = "C:\AutomatedTest",
   
    #Flag to enable or suppress verbose output
    $printVerbose = $false
)


Add-PSSnapin Microsoft.SharePoint.PowerShell -ea SilentlyContinue


Function Process-ASTPassFail ()
{
<#Internal helper function. Will be used for reporting#>
Param($collectionThatShouldBeEmpty,
    $failText,
    $passText
    )

    if ($collectionThatShouldBeEmpty -ne $null)
    {
        Write-Warning $failText
        Write-EventLog -LogName "Windows PowerShell" -Source "PowerShell" -EventId 102 -EntryType Error -Message $failText
        $thisTestText = $failText + "`n"
    }
    else
    {
        $successText = $passText
        Write-Host $successText
        Write-EventLog -LogName "Windows PowerShell" -Source "PowerShell" -EventId 102 -EntryType Information -Message $passText
        $thisTestText = $passText + "`n"
    }
    $thisTestText
}


Function Create-ASTFiles ()
{
<#Creates some test files for us to search later#>
[CmdletBinding()]
Param(
    $filesToCreate,
    [string]$folderPath
    )
    
    If (!(Test-Path $folderPath))
    {
        #Folder doesn't exist.
        Write-Verbose "Folder does not exist - attempting to create"
        New-Item $folderPath -type directory
    }

    #If the files don't already exist, create them
    Foreach ($file in $filesToCreate)
    {
        $fileName = $file[0]
        $filePath = $folderPath + "\" + $fileName
        If (Test-Path $filePath)
        {
            Write-Verbose "File $fileName already exists. Skipping"
            Write-EventLog -LogName "Windows PowerShell" -Source "PowerShell" -EventId 103 -EntryType Error -Message "Test content already present."
        }
        else
        {
            Write-Verbose "Creating $fileName"
            $file[1] >> $filePath
        }
    }
    Write-Verbose "All files created"
}

Function Test-ContentSourceCountAcceptable()
{
[CmdletBinding()]
Param($searchServiceApplication)

    #Check the maximum number of content sources allowed
    #http://technet.microsoft.com/en-us/library/cc262787(v=office.14).aspx
    $maxContentSources = 50

    $ContentSources = $searchServiceApplication | Get-SPEnterpriseSearchCrawlContentSource

    #Lazy way to check if there is only one item (note, also works for none)
    if ($ContentSources.Count -ne $null)
    {
        $CTSourceCount = $ContentSources.Count
    }
    else
    {
        #Note that this might be wrong if there are no content sources. Not a problem here but it's not a rigorous number
        $CTSourceCount = 1
    }

    #If we're at or over the limit, return false
    if ($CTSourceCount -ge $maxContentSources)
    {
        #Throw error and let slip the dogs of war
        Write-Verbose "Warning: content source count is higher than Microsoft's boundaries"
        $false
    }
    else
    {
        #If we're under the MS limit then return true
        $true
    }
}

Function Ensure-ASTContentSourceExists ()
{
<#Ensures a fresh test content source exists: if one with the same name is already present it is deleted and re-created#>
    [CmdletBinding()]
    Param(
    $sa,
    [string]$contentSourceName,
    [string]$filePath
    )
    $testCS = $sa | Get-SPEnterpriseSearchCrawlContentSource | ? {$_.Name -eq $contentSourceName}
    if ($testCS)
    {
        Write-Verbose "Content Source $contentSourceName already exists. Deleting it."
        Write-EventLog -LogName "Windows PowerShell" -Source "PowerShell" -EventId 100 -EntryType Warning -Message "A content source with the test name already existed and was deleted before re-creation"
        $testCS.Delete()
    }
    #Create the content source (either it never existed or the stale copy has just been deleted)
    Write-Verbose "Creating Content Source $contentSourceName"
    New-SPEnterpriseSearchCrawlContentSource -SearchApplication $sa -Type file -name $contentSourceName -StartAddresses $filePath | Out-Null
    $testCS = $sa | Get-SPEnterpriseSearchCrawlContentSource | ? {$_.Name -eq $contentSourceName}

    #Output the content source
    $testCS
}

Function Run-ASTCrawl ()
{
<#
.SYNOPSIS
Runs a crawl for a content source and waits until it is complete.
.DESCRIPTION
Runs a crawl for a content source and waits for it to complete; features an abort option that will exit the function if the crawl takes too long.
#>
    [CmdletBinding()]
    Param (
    [Parameter(Mandatory=$true,ValueFromPipeline=$true)]$contentSource,
    [Parameter(Mandatory=$false,ValueFromPipeline=$false)]$crawlTimeOut = 5
    )
    #Run a crawl for that content source
    $contentSource.StartFullCrawl()

    
    #Note the start time (a Stopwatch object would be neater)
    $startTime = Get-Date

    #Initial pause during which there is no point checking whether the crawl is complete
    $crawlInitialTime = 120

    #Set a flag to allow us to abort if the duration is excessive
    $stillNotStupidDuration = $true

    Write-Verbose "Starting crawl. Waiting for $crawlInitialTime seconds (default SharePoint minimum crawl duration)"
    Sleep -Seconds $crawlInitialTime
    #Wait for it to finish
    while ($contentSource.CrawlStatus -ne "Idle" -AND $stillNotStupidDuration -eq $true)
    {
        $timeDifference = (Get-Date) - $startTime
        Write-Verbose "Crawl still running at $timeDifference, waiting 10 seconds"
        Sleep -Seconds 10
        if ($timeDifference.Minutes -gt $crawlTimeOut)
        {
            $stillNotStupidDuration = $false
        }
    }
    if ($stillNotStupidDuration)
    {
        Write-Verbose "Crawl complete"
    }
    else
    {
        Write-Warning "No longer waiting for process to complete. Search not finished, results will be unpredictable"
        Write-EventLog -LogName "Windows PowerShell" -Source "PowerShell" -EventId 103 -EntryType Critical -Message "Crawler took longer than the timeout value of $crawlTimeOut so the function exited early."
    }
}


Function Check-ASTPairValue ()
{
<#
.SYNOPSIS
Tests that a search term returns a file with the appropriate name
.DESCRIPTION
Takes a pipeline bound pair of test values and search terms and searches for
them using the searchPageURL page.
Returns either 'Present' or 'Not Found' depending on the result.
.EXAMPLE
$testContent | Check-ASTPairValue -searchPageURL $searchSiteURL 
#> 
    [CmdletBinding()]
    Param (
    [Parameter(Mandatory=$true,ValueFromPipeline=$true)]$testPair,
    [Parameter(Mandatory=$true)]$searchPageURL
     )
    BEGIN{
        #Create the IE window in the begin block so if the input is pipelined we don't have to re-open it each time.
        $ie = New-Object -com "InternetExplorer.Application"
        $ie.visible = $true
    }
    PROCESS
    { 
        #Get the test value and the search term from the pair
        $testValue = $testPair[0]
        $searchTerm = $testPair[1]
        
        #Open the navigation page
        $ie.navigate($searchPageURL)

        #Wait for the page to finish loading
        while ($ie.readystate -ne 4)
        {
            start-sleep -Milliseconds 100
        }
        Write-Verbose "Page loaded"
        
        #Get the search box
        $searchTextBoxID = "ctl00_m_g_2f1edfa4_ab03_461a_8ef7_c30adf4ba4ed_SD2794C0C_InputKeywords"
        $document = $ie.Document
        $searchBox = $document.getElementByID($searchTextBoxID)

        #enter the search terms
        $searchBox.innerText = $searchTerm
        Write-Verbose "Searching for: $searchTerm - Expected result: $testValue"    
        
        #Get the search button
        $searchButtonID = "ctl00_m_g_2f1edfa4_ab03_461a_8ef7_c30adf4ba4ed_SD2794C0C_go"
        
        #Run the search
        $btn = $document.getElementByID($searchButtonID)
        $btn.click()
        
        #Wait for the results to be loaded
        while ($ie.locationurl -eq $searchPageURL)
        {
           start-sleep -Milliseconds 100
        }
        
        Write-Verbose "Left the page, waiting for results page to load"
        #Wait for the results page to load
        while ($ie.readystate -ne 4)
        {
            start-sleep -Milliseconds 100
        }
        Write-Verbose "Results page loaded"
        #Once loaded check that the results are correct
        $document = $ie.document

        #Check that the search term results contain the test results:
        $firstSearchTermID = "SRB_g_9acbfa38_98a6_4be5_b860_65ed452b3b09_1_Title"
        $firstSearchResult = $document.getElementByID($firstSearchTermID)
        
        $result =""
        
        #test that the title of the file is equal to the search result
        If ($firstSearchResult.innerHTML -match $testValue)
        {
            $result ="Present"
        }
        else
        {
            $result ="Not Found"
        }
        
        Write-Verbose "Test $result"
        
        #Create a new PS Object for our result and let PowerShell pass it out.
        New-Object PSObject -Property @{
            TestCase = $searchTerm
            Result = $result
        }    
    }
    END {
        #Close the IE window after us
        $ie.Quit()
    }
}

######################################################################################
#Execution script begins here
######################################################################################



#Name of the search service application to test
$searchAppName = "Search Service Application"

#Path to the shared folder
#NOTE: THIS HAS TO BE SET UP MANUALLY BEFORE RUNNING THE SCRIPT (it can be scripted but I haven't)
$fileSharePath = "\\spintdev\TestFolder"


#The search page
$searchSiteURL = "http://sharepoint/sites/search/Pages/default.aspx"

#Start generating the report
$reportFilePath = "C:\AutomatedTest\SearchTest_Results_" + (Get-Date -Format "dd_MM_yyyy") + ".txt"



#All items from here on in are internal and do not have to be specified or modified unless you wish it.

#Test content - deliberately junk and nonsensical rubbish to trim down search results and avoid false negatives.
#Note: I have no particular insight into, or interest in, the dietary foibles of the politicians listed below.
$testContent = @(
    ("FileA.txt","Miliband loves pie"),
    ("FileB.txt","Osbourne despises soup"),
    ("FileC.txt","Cameron tolerates beans"),
    ("FileD.txt","Clegg loathes eggs which is ironic"),
    ("FileE.txt","Benn likes red meat"),
    ("FileF.txt","Balls desires flan"),
    ("FileG.txt","Cable adores sandwiches"),
    ("FileH.txt","Hunt regrets cake")
)

#Junk content for an additional test to exclude false positive results
#Note: these must be pairs, like $testContent above, so Check-ASTPairValue can split them
$itemToConfirmFailure = @(
    ("sdkfslskjladsflkj", "lflkfdskjlfdskjfdslkjf"),
    ("sdkfslsfdjklfkjladsflkj", "lflskjfdslkjf")
)

#Only used internally.
$testCTName = "TestSearchContentType"

$startDateTime = Get-Date
$currentComputerName = $env:computername

#Header info
"Automated SharePoint Search Testing Results`n" >> $reportFilePath
"Test started at $startDateTime on Computer $currentComputerName" >> $reportFilePath
        
#Write the first test to the report
"Test 1 - Confirm search terms do not retrieve values `n" >> $reportFilePath
"Confirms that there are no files that can generate a false positive in the system.`n" >> $reportFilePath

Write-Host "Starting tests, checking that there is no pre-existing content that might cause false positives"
 
#Run a search for the testcontent
$deliberatelyFailedResults = @()
$deliberatelyFailedResults +=  $testContent | Check-ASTPairValue -searchPageURL $searchSiteURL -Verbose:$printVerbose
$falsePositives = $deliberatelyFailedResults | ? {$_.Result -eq "Present"}

$errorText = "Test failed, files found by search engine. Results not reliable"
$successText = "Test passed, moving to next stage"

$testText = (Process-ASTPassFail -collectionThatShuldBeEmpty $falsePositives -passText $successText -failText $errorText)
$testText >> $reportFilePath 
#Create the test files based on the array above
Create-ASTFiles -filesToCreate $testContent -folderPath $fileSharePath

#Get the search app
$sa = Get-SPEnterpriseSearchServiceApplication -Identity $searchAppName
if ($sa -eq $null)
{
    Write-EventLog -LogName "Windows PowerShell" -Source "PowerShell" -EventId 101 -EntryType Error -Message "Could not find search application $searchAppName"
}


Write-Host "Checking that we are within guidelines for number of Content Sources"
#Test the number of content sources already in place
$numberOfContentSourcesBelowThreshold = Test-ContentSourceCountAcceptable -searchServiceApplication $sa -Verbose:$printVerbose

#Only progress if we're not going to breach the content source limit.
if ($numberOfContentSourcesBelowThreshold)
{
    Write-Host "Within the acceptable number of Content Sources"
    #Get the content source.
    $testCS = Ensure-ASTContentSourceExists -sa $sa -contentSourceName $testCTName -filePath $fileSharePath -Verbose:$printVerbose
    
    Write-Host "Running the crawl - estimated completion in approximately 2 minutes"
    #Run the crawl and wait for it to complete
    Run-ASTCrawl -contentSource $testCS -Verbose:$printVerbose

    $searchResults = @()
    Write-Host "Crawl Complete, testing links"
    
    $searchResults += $testContent | Check-ASTPairValue -searchPageURL $searchSiteURL -Verbose:$printVerbose
    $failures = $searchResults | ? {$_.Result -ne "Present"}
    
    #Write the  test to the report
    "Test 2 - Test new content`n" >> $reportFilePath
    "Confirms that search works for our new content.`n" >> $reportFilePath
    
    $errorText = "Test failed, files were not found"
    $successText = "Passed main test."
    $testText = (Process-ASTPassFail -collectionThatShuldBeEmpty $failures -passText $successText -failText $errorText)
    $testText >> $reportFilePath

    #Confirm that the test will fail given junk input.
    $deliberatelyFailedResults = @()
    $deliberatelyFailedResults +=  $itemToConfirmFailure | Check-ASTPairValue -searchPageURL $searchSiteURL -Verbose:$printVerbose
    $falsePositives = $deliberatelyFailedResults | ? {$_.Result -eq "Present"}
    
    #Write the  test to the report
    "Test 3 - Check for junk terms `n" >> $reportFilePath
    "Confirms that search doesn't find some junk values.`n" >> $reportFilePath
    
    $errorText = "Test failed, files found by search engine when given junk data"
    $successText = "Passed confirmation test - junk values not found"
    $testText = (Process-ASTPassFail -collectionThatShuldBeEmpty $falsePositives -passText $successText -failText $errorText)
    $testText >> $reportFilePath 
    
    
    #Clean up the content source 
    $CSToDelete = $sa | Get-SPEnterpriseSearchCrawlContentSource | ? {$_.Name -eq $testCTName}
    $CSToDelete.Delete()
    
    #Delete the files
    foreach ($combo in $testContent)
    {
        $fileName = $combo[0]
        $file = Get-ChildItem -Path $fileSharePath | ? {$_.name -eq $fileName}
        $file.Delete()
    }
    #Note that the content source may take a minute to be deleted
    Write-Host "Pausing for 1 minute to allow the index to update"
    Sleep -Seconds 60   

    #Run a search for the testcontent
    $deliberatelyFailedResults = @()
    $deliberatelyFailedResults +=  $testContent | Check-ASTPairValue -searchPageURL $searchSiteURL -Verbose:$printVerbose
    $falsePositives = $deliberatelyFailedResults | ? {$_.Result -eq "Present"}

    #Write the  test to the report
    "Test 4 - Confirm search terms are removed`n" >> $reportFilePath
    "Confirms that the test search content is removed from the system.`n" >> $reportFilePath
    
    $errorText = "Test failed, test files still found by search engine after deletion"
    $successText = "Passed confirmation test. Test files are not present"
    $testText = (Process-ASTPassFail -collectionThatShuldBeEmpty $falsePositives -passText $successText -failText $errorText)
    $testText >> $reportFilePath 
}
else
{
    $errorText = "Error - Unable to create a Content Source as the total number of Content Sources is greater than the Microsoft boundary"
    Write-EventLog -LogName "Windows PowerShell" -Source "PowerShell" -EventId 100 -EntryType Warning -Message $errorText
    $errorText >> $reportFilePath 
}

"Automated SharePoint Search Testing Completed at $(Get-Date) `n" >> $reportFilePath 

So there we have it. A fully functioning automated testing process for SharePoint Search. It would be nice if it sent an email, but I'm planning on rolling this into some SCOM work I'm playing with.

I haven't tested this on 2013 yet; it'd need at least some tweaks to field IDs and maybe more structural work to get the Search API right for 2013. If anyone is interested I'll knock up a new version.

Gooey SharePoint Scripting

Today I'm going to show you how to script SharePoint through the GUI. Whilst in this example we'll be running the code on the server, the same concepts and approach can be used to script it from any machine that can hit the relevant website.

Our example might seem a little forced but it's based on a real world experience. We had a client with a fairly complicated Content Type scenario: over 150 Content Types spread over 8 levels of inheritance with untold columns. Then we discovered an issue and needed to publish every single one of those content types. This is the classic example of where PowerShell should be used, but awkwardly they'd been burnt by PowerShell publishing before.
As such we had a flat edict: no PowerShell publishing of content types. It must go through the GUI.

A post I'd seen recently by Dr James McCaffrey popped into my head, about using PowerShell to automate testing of web applications through the UI.
Why not use the same process to automate the publishing of the content types?

The first thing to do is to get ourselves an IE window:

$ie = New-Object -com "InternetExplorer.Application"
#This starts by default in a hidden mode so let's show it
$ie.Visible = $true

This isn't much use on its own so let's send it to a page. In our case we want to go to the page to publish one of our content types. We know that the publish page itself is an application page that is referenced from a site collection root web with the following URL syntax:

siteCollectionRoot/_layouts/managectpublishing.aspx?ctype=ContentTypeID

Glossing over how to get the ContentTypeID for now we have this:

$pageUrl= "http://sharepoint/sites/cthub/_layouts/managectpublishing.aspx?ctype=0x0100A4CF347707AC054EA9C3735EBDAC1A7C"
$ie.Navigate($pageUrl)
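As a brief detour, the lookup itself is only a couple of lines. This is a sketch that assumes the SharePoint snap-in is loaded and that a content type with that name exists at the root web of the hub; the full publishing script further down does exactly this:

```powershell
#Sketch: fetch a content type ID by name - assumes the snap-in is loaded
#and that "Document Type 1" exists at the root web of the content type hub
$site = Get-SPSite "http://sharepoint/sites/cthub"
$ct = $site.RootWeb.ContentTypes["Document Type 1"]
$ctypeID = $ct.ID.ToString()
$site.Dispose()
```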

Now PowerShell moves fast, so we’ll need to wait for Javascript to catch up.

While ($ie.ReadyState -ne 4)
{
	Sleep -Milliseconds 100
}

Now we’re there, let’s get the publish button. Thankfully this button has a consistent ID that we can get using the trusty F12 button in IE.

[Image: Identifying an element's ID using F12]

The catchily titled "ctl00_PlaceHolderMain_ctl00_RptControls_okButton" button? Depressingly, I think I'm starting to see the naming convention behind these IDs…

$buttonID = "ctl00_PlaceHolderMain_ctl00_RptControls_okButton"
#You have to assign the Document to its own variable, otherwise the call will fail
$document = $ie.Document
$button = $document.getElementByID($buttonID)

And now all we need to do is to click that button:

$button.Click()

Now you might think that we've done all we need to do here: slap it into a foreach loop and be done with it. Of course you can't do that, as you need to give IE time to send that request using our good old friend Javascript.
So we wait for the page to re-direct us:

 
While ($ie.locationurl -eq $url)
{
    start-sleep -Milliseconds 100
}

Now we can slap it into a foreach loop and with a little bit of work we can come up with something like the code below:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ea SilentlyContinue

#URL for the content type hub to use
$CTHubURL= "https://sharepoint/sites/cthub"

#Get the Content Type hub
$site = Get-SPSite $CTHubURL

#Content Types to publish
$ContentTypes = @(
    ("Document Type 1"),
    ("Document Type 2"),
    ("Document Type 3")
)


#Open a new IE window
$ie = New-Object -com "InternetExplorer.Application"

#Make the window visible
$ie.visible = $true

#Loop through the content types and publish them
foreach ($contentTypeName in $ContentTypes)
{
    
    Write-Verbose  "Processing $ContentTypeName"
    
    #Content types live at the root web
    $web = $site.rootWeb
    #Get the content type using its name
    $ct =   $web.ContentTypes[$ContentTypeName]
    #Get the GUID for the CT
    $GUID = $ct.ID.ToString()
    #Get the URL for the page based on the content type hub url, the application page that does publishing and the GUID
    $url = $CTHubURL+ "/_layouts/managectpublishing.aspx?ctype=" + $GUID  
    #Go to the page
    $ie.navigate($url)
    #Wait for the page to finish loading
    while ($ie.ReadyState -ne 4)
     {
        start-sleep -Milliseconds 100
     }
     #The ID of the button to press
    $buttonID = "ctl00_PlaceHolderMain_ctl00_RptControls_okButton"
    $document = $ie.Document
    $btn = $document.getElementByID($buttonID)
    #Push the button
    $btn.click()
    #Wait for the page to be re-directed
     while ($ie.locationurl -eq $url)
     {
        start-sleep -Milliseconds 100
     }
     Write-Verbose "Content Type $contentTypeName published"
}

I don't know about you, but there is something deeply neat about sitting at your desk watching IE do the dull task that you were convinced was going to bring your RSI back with a vengeance, and in half the time you could do it.

This example might not be useful for that many people but the concept is intriguing. There's no reason most of this can't be done without any code on the server at all; the only time we use it is to get the GUIDs, and those can be pre-fetched if needs be. Nor does it need any significant rights: as long as the account you use has permission to get into that site collection and publish content types then that's all they need.

The logical destination of this is Office 365, where the scripts and rules for running them are limited and limiting; they have to be. But the beauty of scripting is that we don't have to be limited by the detail of code; we can use higher level components and tools to worry about that for us. In this case, the GUI that Microsoft were kind enough to provide for when it's too awkward to find the PowerShell console.

Managed Metadata columns fail to sync between SharePoint and client applications

This issue seems to be cropping up a lot at the moment, one possible fix is below.

Symptoms:

When you set a Managed Metadata Service (MMS) column in SharePoint, the values are pushed down to the Office document and are visible in the Document Information Panel (DIP). When those values are changed in the Office document, however, the MMS column changes are not updated in the SharePoint item. Non-MMS fields (i.e. Single Line of Text, Choice, Number etc.) are correctly synced. If you close and re-open the Office document, even from another computer, any changes made in Office to the MMS values will still remain as you set them in the DIP. However, as normal, any changes to the values in SharePoint will be pushed down to the Office document, overwriting any values in the DIP.

In summary: SharePoint can write to the Office document, but MMS values in the document cannot be written back to SharePoint by Office.

Note: If text, choice or other non-MMS fields are not being synced when you save the document then this is probably unrelated to your issue.

Where has this been seen:

We've seen it in at least two SharePoint 2010 SP1 environments in the last week, in farms on varying CUs. No obvious cause has been identified.
The main example is in Office, at least Word and Excel. It has also been seen with Harmon.ie, where it is impossible to set the MMS value; it is probable that other systems may be affected.

Solution:

Add and remove an MMS column from each list. You can confirm that this fixes your issue by performing a manual update on a single list, then run a bulk correction using PowerShell. Note that you will need to test and re-create any faulty Site Templates.

Cause

Not known at this time; it appears to be related to the document parser, which in some cases seems to fail on MMS values. The value in Word is maintained in the document's XML fields but is not correctly updated (at least in our tests) with the correct namespace for the term or term ID.
It seems that adding a new MMS column corrects the issues with the other columns; we believe this might be due to some version or synchronisation process, but we have not tracked down the root cause.

Manual steps

In your list or library, open the list settings.

[Image: Library ribbon with Library Settings highlighted]

Click on ‘Create Column’

[Image: Create Column highlighted within Library Settings]

Enter a name, here we will use ‘DummyColumn’ and select ‘Managed Metadata’

[Image: Column creation form with Name and Type highlighted]

Select a value in the MMS

[Image: Managed Metadata value selected during column creation]

Click OK.

At this point you should be able to confirm that the MMS field is now synchronised between Office and SharePoint. You can then delete the column.

Note: If the process fails then delete the column anyway; unless you're selling children's accessories it will probably be of little use.

Programmatic

This can be scripted in several ways, but the primary method will be server-side PowerShell. An example script is shown below:

<#
Author: Alex Brassington (Trinity Expert Systems)
Date: 26/04/2013
Description:
Adds and removes an MMS column to every library in the white list for all sites in a web application. This is to
fix the Office => SharePoint Managed Metadata Service sync field issues.
This can be run with either a white list of lists/libraries to update or without, in which case all document libraries will be updated. It is possible that this only needs to run on one document library per site but I have not yet been able to confirm or refute that.
#>

Add-PSSnapin Microsoft.SharePoint.PowerShell -ea SilentlyContinue

#Reference to the content type hub to be used for the MMS Column    
$CTHubURL= "http://sharepoint/sites/cthub"

#Site Collection to modify
$SCURL = "http://sharepoint/sites/cthub"

#Name of the MMS instance to use
$MMSInstance = "Managed Metadata Service"


#A 'white list' of libraries to process. Note that this currently contains 'Documents', which should be handled as a special case.
$librariesToCheck = @(
    "Documents",
    "Entertainment",
    "Project Documents",
    "Management Information"
)

#Setup the termstore objects
$contentTypeHub = Get-SPSite $CTHubURL
$session = New-Object Microsoft.SharePoint.Taxonomy.TaxonomySession($contentTypeHub)
$termStore = $session.TermStores | ? {$_.Name -eq $MMSInstance}
$group = $termStore.Groups["Demo Terms"]
$termSet = $group.Termsets["Condiments"]


Function Update-LibrariesInSiteCollection ()
{
[CmdletBinding()]
    Param (
    [Parameter(Position=0,Mandatory=$true,ValueFromPipeLine=$true)][string]$siteURL, 
    [Parameter(Position=1,Mandatory=$true)][Microsoft.SharePoint.Taxonomy.TermSet]$termSet,
    [Parameter(Position=2,Mandatory=$true)][Microsoft.SharePoint.Taxonomy.TermStore]$termStore,
    [Parameter(Position=3,Mandatory=$false)][string]$errorFile,
    [Parameter(Position=4,Mandatory=$false)][string[]]$librariesToCheck
    )
    
    
    #No change required, only used internally
    $columnName = "TempColumn"
    
    #Get the SharePoint Site Collection to process
    $site = Get-SPSite $siteURL
    Write-Verbose "Updating Site Collection $($site.URL)"
    foreach ($web in $site.AllWebs)
    {
        Write-Verbose "Updating Web $($web.URL)"
        
        #If there's a list of folders to use as a whitelist then use them
        if ($librariesToCheck)
        {
            Write-Verbose "Updating libraries based on provided White list"
            $lists = $web.Lists | ? {$librariesToCheck -contains $_.Title}
        }
        else
        {
            #If not then process all libraries.
            Write-Verbose "Updating all document libraries only"
            $lists = $web.Lists | ? {$_.BaseType -eq "DocumentLibrary"}
        }
        
        foreach ($list in $lists)
        {
            Write-Verbose "Updating list $($list.Title)"
            try
            {
                #Create a new taxonomy field
                $taxField = $list.fields.CreateNewField("TaxonomyFieldType", $columnName)
                
                #set the term store ID and the termset ID 
                $taxField.SspId = $termStore.Id
                $taxField.TermSetId = $termSet.Id
                
                #Add the column to the list
                $list.Fields.Add($taxField) | Out-Null
                $list.Update()
                
                #Remove the column
                $column = $list.fields[$columnName]
                $column.Delete()
                Write-Verbose "List Complete $($list.Title)"
            }
            catch
            {
                Write-Error "Error encountered on List: $($list.Title)"
            }
        }
        $web.Dispose()
    }
    
    #If a file path was given then write out the error log.
    if ($errorFile)
    {
        $error >> $errorFile
    }
    #Dispose of the site collection
    $site.Dispose()
}

Update-LibrariesInSiteCollection -siteURL $SCURL -termSet $termSet -termStore $termStore -Verbose

My thanks to my colleague Paul Hunt (aka Cimares) who found the fix that we scripted above.

Tweaking exercise

One of my colleagues needed a PowerShell script to report on the SharePoint Site Collection quotas in use on all sites, as well as how much of each site was being used.
Not being a PowerShell or SharePoint expert, they asked for a second opinion; since I had an hour and a half of train journey, they got a bit more than they expected.

The original Script:

$t = [Microsoft.SharePoint.Administration.SPWebService]::ContentService.quotatemplates
$tFound = $false
$webApp = Get-SPWebApplication | %{$_.Sites} | Get-SPSite -Limit ALL
$webApp | fl Url, @{n="Storage Used/1MB";e={[int]($_.Usage.Storage/1MB)}},
@{n="Storage Available Warning/1MB"; e={[int](($_.Quota).StorageWarningLevel/1MB)}},
@{n="Storage Available Maximum/1MB"; e={[int](($_.Quota).StorageMaximumLevel/1MB)}},
@{n="Sandboxed Resource Points Warning";e={[int](($_.Quota).UserCodeWarningLevel)}},
@{n="Sandboxed Resource Points Maximum";e={[int](($_.Quota).UserCodeMaximumLevel)}},
@{n="Quota Name"; e={ foreach($qt in $t){if($qt.QuotaId -eq [int](($_.Quota).QuotaID)){$qt.Name; $tFound = $true}} if($tFound -eq $false){"No Template Applied"}$tFound=$false;}} >> c:\quotaoutput.txt
if($parent) {$webApp.Dispose(); $t.Dispose()}

First. Scripts are written by humans, for humans. Computers might use them but they are meant for us. There’s also a direct correlation between consistency of indenting and code quality. That monolithic block has to go.

$t = [Microsoft.SharePoint.Administration.SPWebService]::ContentService.quotatemplates
$tFound = $false
 

$webApp = Get-SPWebApplication | %{$_.Sites} | Get-SPSite -Limit ALL
$webApp | fl Url, 
    @{n="Storage Used/1MB";e={[int]($_.Usage.Storage/1MB)}},
    @{n="Storage Available Warning/1MB"; e={[int](($_.Quota).StorageWarningLevel/1MB)}},
    @{n="Storage Available Maximum/1MB"; e={[int](($_.Quota).StorageMaximumLevel/1MB)}},
    @{n="Sandboxed Resource Points Warning";e={[int](($_.Quota).UserCodeWarningLevel)}},
    @{n="Sandboxed Resource Points Maximum";e={[int](($_.Quota).UserCodeMaximumLevel)}},
    @{n="Quota Name"; e={ 
            foreach($qt in $t)
            {
                if($qt.QuotaId -eq [int](($_.Quota).QuotaID))
                {
                    $qt.Name
                    $tFound = $true
                }
            }
            if($tFound -eq $false)
            {
                "No Template Applied"
            }
            $tFound = $false
        }
    } >> c:\PSoutput.txt
 
if($parent)
 {
	$webApp.Dispose(); 
	$t.Dispose()
}

At this point we can actually work out what happens. A collection of site collections is fetched, then we iterate through each of them, capturing bits of information, and then try to work out if the site has a quota and, if so, what it is called.

You might already have spotted the second item in there. If not, here's a hint: we're getting a collection of Site Collections.

Not a WebApplication, nor even a collection of them.

Note to self and others: Always use meaningful names.

I was slightly confused when I first read this as I assumed the names were meaningful. It took me a second, and a run through in debug mode, to convince myself otherwise.

So, let's correct that name to something more meaningful. We're in a simple scenario, so we can use something short but descriptive like 'AllSites'. While we're there, let's also tidy up that $t to $templates.

$templates = [Microsoft.SharePoint.Administration.SPWebService]::ContentService.quotatemplates
$tFound = $false
 
$AllSites = Get-SPWebApplication | %{$_.Sites} | Get-SPSite -Limit ALL
$AllSites | fl Url, 
    @{n="Storage Used/1MB";e={[int]($_.Usage.Storage/1MB)}},
    @{n="Storage Available Warning/1MB"; e={[int](($_.Quota).StorageWarningLevel/1MB)}},
    @{n="Storage Available Maximum/1MB"; e={[int](($_.Quota).StorageMaximumLevel/1MB)}},
    @{n="Sandboxed Resource Points Warning";e={[int](($_.Quota).UserCodeWarningLevel)}},
    @{n="Sandboxed Resource Points Maximum";e={[int](($_.Quota).UserCodeMaximumLevel)}},
    @{n="Quota Name"; e={ 
            foreach($qt in $templates)
            {
                if($qt.QuotaId -eq [int](($_.Quota).QuotaID))
                {
                    $qt.Name
                    $tFound = $true
                }
            }
            if($tFound -eq $false)
            {
                "No Template Applied"
            }
            $tFound = $false
        }
    } >> c:\PSoutput.txt
 
if($parent)
 {
	$AllSites.Dispose(); 
	$template.Dispose()
}

Now, that makes it a bit nicer to read. The mislabelled variable is a big hint to our next item: return values from cmdlets. Let's look at this one line:

$AllSites= Get-SPWebApplication | %{$_.Sites} | Get-SPSite -Limit ALL

Let's work through what this does. First we get all the web applications in the farm, then for each of those we get their sites, then for each of those sites we run the Get-SPSite -Limit All command for that single site.
Wait, what?
Yup, we get a collection of all the sites and then we step through each one and fetch it again. It's almost surprising it works until you realise just how clever PowerShell's type conversion is.
In fact, all three lines following are equivalent:

	$sites = Get-SPWebApplication | % { $_.Sites} | Get-SPSite -Limit All
	$sites = Get-SPWebApplication | % { $_.Sites}
	$sites = Get-SPSite -Limit All

Why make things more complicated than they need to be? Let’s go with the last one.

$templates = [Microsoft.SharePoint.Administration.SPWebService]::ContentService.quotatemplates
$tFound = $false

$AllSites = Get-SPSite -Limit ALL
$AllSites | fl Url, 
    @{n="Storage Used/1MB";e={[int]($_.Usage.Storage/1MB)}},
    @{n="Storage Available Warning/1MB"; e={[int](($_.Quota).StorageWarningLevel/1MB)}},
    @{n="Storage Available Maximum/1MB"; e={[int](($_.Quota).StorageMaximumLevel/1MB)}},
    @{n="Sandboxed Resource Points Warning";e={[int](($_.Quota).UserCodeWarningLevel)}},
    @{n="Sandboxed Resource Points Maximum";e={[int](($_.Quota).UserCodeMaximumLevel)}},
    @{n="Quota Name"; e={ 
            foreach($qt in $templates)
            {
                if($qt.QuotaId -eq [int](($_.Quota).QuotaID))
                {
                    $qt.Name
                    $tFound = $true
                }
            }
            if($tFound -eq $false)
            {
                "No Template Applied"
            }
            $tFound = $false
        }
    } >> c:\PSoutput.txt
 
if($parent)
 {
	$AllSites.Dispose(); 
	$template.Dispose()
}

That's better, but looking down the script there's another item that has probably grabbed your notice. What the heck is $parent?
I have my suspicions it is originally from a 2007 PowerShell script, back when we were still using WSS 3, STSADM and PowerShell v1.0, and dinosaur attacks were listed on the risk register.
Either way this has no place here; if we're executing in strict mode (which we should be) then the script will throw an error the moment that undefined variable is read. If we're not, then it'll never fire, as $null evaluates to $false.
That's probably for the best really, as $AllSites, being a collection, doesn't have a .Dispose() method and nor does $template.
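As an aside, strict mode's behaviour here is easy to demonstrate in a scratch console; reading a variable that was never assigned only throws once strict mode is switched on:

```powershell
#With strict mode off an unset variable silently evaluates to $null, so this never fires
if ($parent) { "never reached" }

#With strict mode on the same read throws a runtime error instead
Set-StrictMode -Version Latest
if ($parent) { "never reached" }   #error: the variable '$parent' has not been set
```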
Let’s just blow that away completely.

$templates = [Microsoft.SharePoint.Administration.SPWebService]::ContentService.quotatemplates
$tFound = $false

$AllSites = Get-SPSite -Limit ALL
$AllSites | fl Url, 
    @{n="Storage Used/1MB";e={[int]($_.Usage.Storage/1MB)}},
    @{n="Storage Available Warning/1MB"; e={[int](($_.Quota).StorageWarningLevel/1MB)}},
    @{n="Storage Available Maximum/1MB"; e={[int](($_.Quota).StorageMaximumLevel/1MB)}},
    @{n="Sandboxed Resource Points Warning";e={[int](($_.Quota).UserCodeWarningLevel)}},
    @{n="Sandboxed Resource Points Maximum";e={[int](($_.Quota).UserCodeMaximumLevel)}},
    @{n="Quota Name"; e={ 
            foreach($qt in $templates)
            {
                if($qt.QuotaId -eq [int](($_.Quota).QuotaID))
                {
                    $qt.Name
                    $tFound = $true
                }
            }
            if($tFound -eq $false)
            {
                "No Template Applied"
            }
            $tFound = $false
        }
    } >> c:\PSoutput.txt

That's better still: sleeker and more readable. On the other hand, that .Dispose method should be ringing some bells; as you all know, SharePoint is infamous for not properly releasing memory for its key objects. Without the .Dispose method the objects will sit in memory until the PowerShell session ends.

In C# we'd have 'using' blocks, but they don't really exist in PowerShell. Here we use the pipeline: anything that's run in a pipeline is disposed of at the end by default.

It just so happens that our $AllSites object is only used once after being declared; by rolling that into the pipeline we can make use of this wonderful feature and streamline our code further!

$templates = [Microsoft.SharePoint.Administration.SPWebService]::ContentService.quotatemplates
$tFound = $false
 
Get-SPSite -Limit ALL | fl Url, 
    @{n="Storage Used/1MB";e={[int]($_.Usage.Storage/1MB)}},
    @{n="Storage Available Warning/1MB"; e={[int](($_.Quota).StorageWarningLevel/1MB)}},
    @{n="Storage Available Maximum/1MB"; e={[int](($_.Quota).StorageMaximumLevel/1MB)}},
    @{n="Sandboxed Resource Points Warning";e={[int](($_.Quota).UserCodeWarningLevel)}},
    @{n="Sandboxed Resource Points Maximum";e={[int](($_.Quota).UserCodeMaximumLevel)}},
    @{n="Quota Name"; e={ 
            foreach($qt in $templates)
            {
                if($qt.QuotaId -eq [int](($_.Quota).QuotaID))
                {
                    $qt.Name
                    $tFound = $true
                }
            }
            if($tFound -eq $false)
            {
                "No Template Applied"
            }
            $tFound = $false
        }
    } >> c:\PSoutput.txt

Of course, that doesn’t work because of the aforementioned crapness of SharePoint and its memory handling. I’m working on a longer post on how to deal with it, but for now just remember to kill your sessions as soon as you can.

So, if you run this you get a nice text file with a rubbish name dumped out at the end. The format might look something like this:

Url : http://mysites:8080
Storage Used/1MB : 2
Storage Available Warning/1MB : 0
Storage Available Maximum/1MB : 0
Sandboxed Resource Points Warning : 100
Sandboxed Resource Points Maximum : 300
Quota Name : No Template Applied

Url : http://sharepoint
Storage Used/1MB : 7
Storage Available Warning/1MB : 0
Storage Available Maximum/1MB : 0
Sandboxed Resource Points Warning : 100
Sandboxed Resource Points Maximum : 300
Quota Name : No Template Applied

I’m liking the data but if you’ve got hundreds of sites that’s going to be a nightmare to go through. It just so happens we can make use of one of the lesser known, but highly awesome, PowerShell features to help us here.

As we all know the world floats on Excel, and if we’re honest that’s where this data is going anyway for us to sort. Let’s dump it out into a CSV file. Now, we could re-write the Format-List statement to spit everything out as strings and concatenate our hearts out.
Or we can change two things: swap fl out for Select and insert ConvertTo-CSV.
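The reason the swap matters: Format-List emits formatting objects that ConvertTo-CSV can’t do anything useful with, whereas Select-Object emits real objects whose properties become CSV columns. A quick sketch of the same pattern, using plain process objects so it runs anywhere (file name is my own choice):

```powershell
# Select-Object produces objects with properties; ConvertTo-Csv turns each
# property into a CSV column. Format-List here would produce garbage.
Get-Process | Select-Object Name, Id,
    @{n="WorkingSet/1MB"; e={[int]($_.WorkingSet/1MB)}} |
    ConvertTo-Csv -NoTypeInformation |
    Out-File C:\processes.csv
```

(-NoTypeInformation suppresses the `#TYPE` header line you’ll see in the output further down.)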

$templates = [Microsoft.SharePoint.Administration.SPWebService]::ContentService.quotatemplates
$tFound = $false
 
Get-SPSite -Limit ALL | Select Url, 
	@{n="Storage Used/1MB";e={[int]($_.Usage.Storage/1MB)}},
	@{n="Storage Available Warning/1MB"; e={[int](($_.Quota).StorageWarningLevel/1MB)}},
	@{n="Storage Available Maximum/1MB"; e={[int](($_.Quota).StorageMaximumLevel/1MB)}},
	@{n="Sandboxed Resource Points Warning";e={[int](($_.Quota).UserCodeWarningLevel)}},
	@{n="Sandboxed Resource Points Maximum";e={[int](($_.Quota).UserCodeMaximumLevel)}},
	@{n="Quota Name"; e={ 
	        foreach($qt in $templates)
	        {
	            if($qt.QuotaId -eq [int](($_.Quota).QuotaID))
	            {
	                 $qt.Name
	                 $tFound = $true
	            }
	        } 
	        if($tFound -eq $false)
	        {
	            "No Template Applied"
	        }
	    $tFound = $false
	    }
	} | ConvertTo-CSV >> C:\PSoutput-CSV.CSV

That turns our text output into something like this:

#TYPE Selected.Microsoft.SharePoint.SPSite
"Url","Storage Used/1MB","Storage Available Warning/1MB","Storage Available Maximum/1MB","Sandboxed Resource Points Warning","Sandboxed Resource Points Maximum","Quota Name"
"http://mysites:8080","2","0","0","100","300","No Template Applied"
"http://sharepoint","7","0","0","100","300","No Template Applied"
"http://sharepoint/sites/CTHub","3","0","0","100","300","No Template Applied"
"http://sharepoint/sites/sync","3","0","0","100","300","No Template Applied"
"http://sharepoint/sites/TechNet","2","0","0","100","300","No Template Applied"

A hell of a lot uglier but with a little Excel care and attention it’s sortable, filterable and fit for use in a report.

What if we’re not going to be using Excel but we are going to be inspecting by eye — isn’t there a better format? Well yes there is: you can use ConvertTo-HTML and that’ll turn the entire lot into a fully formed HTML file for you. With a modicum of genius and/or a particularly epic book by Don Jones you can add your own CSS and jQuery.
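As a rough sketch of that variant (the title, heading and file name are my own choices, and I’ve used plain process objects so it runs without a farm):

```powershell
# Same pattern as the CSV version, but emitting a ready-made HTML table.
Get-Process | Select-Object Name, Id |
    ConvertTo-Html -Title "Report" -PreContent "<h1>Process report</h1>" |
    Out-File C:\report.html
```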

That works, but if this is going to be run more than once I don’t want my files overwriting the old ones, or even worse appending (as the script above will do, talk about confusing!)

Let’s slap a date stamp onto our output file:

$outputFolder = "C:\"
$path = $outputFolder + "Output-" + (Get-Date -Format "dd-MM-yyyy") + ".txt"


Yes, I’m a Brit, we will use the only sensible date format in this blog.
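One caveat with dd-MM-yyyy: the files won’t sort chronologically by name. If that matters to you, an ISO-style stamp sorts lexicographically, and Join-Path saves worrying about trailing backslashes:

```powershell
# Produces a name like "Output-2012-07-16.csv"; sorts correctly in a listing
$path = Join-Path "C:\" ("Output-" + (Get-Date -Format "yyyy-MM-dd") + ".csv")
```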

With a slight modification we’re now here:


$outputFolder = "C:\"
$path = $outputFolder + "Output-" + (Get-Date -Format "dd-MM-yyyy") + ".csv"

$templates = [Microsoft.SharePoint.Administration.SPWebService]::ContentService.quotatemplates
$tFound = $false
 
Get-SPSite -Limit ALL | Select Url, 
	@{n="Storage Used/1MB";e={[int]($_.Usage.Storage/1MB)}},
	@{n="Storage Available Warning/1MB"; e={[int](($_.Quota).StorageWarningLevel/1MB)}},
	@{n="Storage Available Maximum/1MB"; e={[int](($_.Quota).StorageMaximumLevel/1MB)}},
	@{n="Sandboxed Resource Points Warning";e={[int](($_.Quota).UserCodeWarningLevel)}},
	@{n="Sandboxed Resource Points Maximum";e={[int](($_.Quota).UserCodeMaximumLevel)}},
	@{n="Quota Name"; e={ 
	        foreach($qt in $templates)
	        {
	            if($qt.QuotaId -eq [int](($_.Quota).QuotaID))
	            {
	                 $qt.Name
	                 $tFound = $true
	            }
	        } 
	        if($tFound -eq $false)
	        {
	            "No Template Applied"
	        }
	    $tFound = $false
	    }
	} | ConvertTo-CSV >> $path

We’ve turned a script that shouldn’t even run into something that’s more legible, probably faster (more to come on this I hope), more efficient and giving more useful results.

What haven’t we done? We haven’t touched on the, frankly brutal, RAM leaks which are the massive elephant in the room. This script will make your server cry. If it’s a really large farm it might even impact the stability or performance of your CA box, or wherever you run it. If you’ve got thousands of site collections I recommend running this out of hours with Task Manager open and a hand hovering over Ctrl + C.

What next? Elephant hunting and SPAssignments

Adding Content Types to the New button on a document library with PowerShell

Background
I was at a customer site and they wanted to remove a load of document types from the “New” button on their document libraries. I tried setting the SPContentType.Hidden property to $true but realised that wasn’t the one. I spent some more time banging my head against it, then just did it by hand and moved on.

Another person asked how to do something similar on PowerShell.org (here: http://powershell.org/discuss/viewtopic.php?f=12&t=1407). I had some time, and was irked by my earlier failure, so I gave it another go. I met some success, but since it’s something that annoyed me, and since there are no easily found PowerShell-specific posts about this, I thought it worth doing properly and blogging.

It turns out that the new button is determined by the SPList.RootFolder.UniqueContentTypeOrder property. This is an ordered list of the content types to display; every item in it must be one of the list’s content types, but not vice versa. Modify this and you modify the same property you set in the GUI. Happy days.
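Before changing anything it’s worth eyeballing the current order. A minimal sketch (the URL and list name are from my test environment):

```powershell
# Dump the content types currently shown on the New button, in order
$web = Get-SPWeb "http://sharepoint/sites/cthub"
$list = $web.Lists["Shared Documents"]
$list.RootFolder.UniqueContentTypeOrder | Select-Object Name, Id
$web.Dispose()
```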

The first step is to see if a content type is available in the new button or not:

Is-ContentTypeInNewButton

Function Is-ContentTypeInNewButton {

[CmdletBinding()]
Param ([parameter(Mandatory=$true)][string] $ContentTypeName,
       [parameter(Mandatory=$true)][Microsoft.SharePoint.SPList] $SPList)
BEGIN   {  Write-Verbose "Beginning Is-ContentTypeInNewButton" }
PROCESS{
            #get the uniquecontenttypes from the list root folder
            $rootFolder = $SPList.RootFolder
            $contentTypesInPlace = [Microsoft.SharePoint.SPContentType[]] $rootFolder.UniqueContentTypeOrder
            
            #Check if any of them are the same as the test content type
            $results = $contentTypesInPlace | where { $_.Name -eq $ContentTypeName} 
            if ($results -ne $null)
            {
                Write-Verbose "$ContentTypeName Found"
                return $true
            }
            else
            {
                Write-Verbose "$ContentTypeName Not Found"
                return $false
            }
    }
    
END   {  Write-Verbose "Exiting Is-ContentTypeInNewButton" }
}

Of course there’s a possible gotcha. What if the content type isn’t even added to the list at all?

Ensure-ContentTypeInList

Function Ensure-ContentTypeInList{

[CmdletBinding()]
Param ( [parameter(Mandatory=$true,ValueFromPipeline=$true)][string] $ContentTypeName,
       [parameter(Mandatory=$true)][Microsoft.SharePoint.SPList] $SPList)

BEGIN   {  Write-Verbose "Beginning Ensure-ContentTypeInList" }
PROCESS { 

     #Check to see if the content type is already in the list
     $contentType = $SPList.ContentTypes[$ContentTypeName]
     if ($ContentType -ne $null)
     {
        #Content type already present
        Write-Verbose "$ContentTypeName already present in list"
        Return $true
     }
     else
     {
        Write-Verbose "$ContentTypeName not in list. Attempting to add"
        if (!$SPList.ContentTypesEnabled)
        {
            Write-Verbose "Content Types disabled in list $SPList, Enabling"
            $SPList.ContentTypesEnabled = $true
            $SPList.Update()
        }
         #Add site content types to the list from the site collection root
         $ctToAdd = $SPList.ParentWeb.Site.RootWeb.ContentTypes[$ContentTypeName]
         if($ctToAdd -eq $null)
         {
            Write-Error "Error - Content Type could not be found in the Site Collection"
            #I don't believe this will be called.
            return $false
         }
         $SPList.ContentTypes.Add($ctToAdd) | Out-Null
         $SPList.Update()
         Write-Verbose "$ContentTypeName added to list"
         return $true
     }
    }
END {
     Write-Verbose "Exiting Ensure-ContentTypeInList"
    }
}

Well that’s a start. Now we can tell if the content type already exists, and can add it to the list if it doesn’t. Let’s put that into something useful:

Ensure-ContentTypeInNewButton

Function Ensure-ContentTypeInNewButton{

[CmdletBinding()]
Param ( [parameter(Mandatory=$true,ValueFromPipeline=$true)][string] $ContentTypeName,
        [parameter(Mandatory=$true)][Microsoft.SharePoint.SPList] $SPList)
    BEGIN   { 
Write-Verbose "Beginning Ensure-ContentTypeInNewButton"
                #get the uniquecontenttypes from the list root folder
                $contentTypesInPlace = $SPList.RootFolder.UniqueContentTypeOrder
                $dirtyFlag = $false
            }
    PROCESS { 
                
        #Check the content type isn't already present in the new button
        $AlreadyPresent = Is-ContentTypeInNewButton -ContentTypeName $ContentTypeName -SPList $SPList
        if ($AlreadyPresent)
        {
            Write-Verbose "$ContentTypeName is already present in the new button"
        }
        else
        {
            #Check that there really is such a content type
            $ContentTypePresent = Ensure-ContentTypeInList $ContentTypeName $SPList
            #Catch error events
            if ($ContentTypePresent)
            {
                #We now know that the content type is not in the new button and is present in the list. Carry on adding the content type
                
                $ctToAdd = $SPList.ContentTypes[$ContentTypeName]
                
                #add our content type to the unique content type list
                $contentTypesInPlace  =  $contentTypesInPlace + $ctToAdd
                $dirtyFlag = $true
                Write-Verbose "$ContentTypeName queued to add to the new button"
            }
            else
            {
                Write-Error -Message "Content type could not be added to the list."
            }
        }
    }
    End{
        #Set the UniqueContentTypeOrder to the collection we made above
        if ($dirtyFlag)
        {
           $SPList = $SPList.ParentWeb.Lists[$SPList.ID]
            $rootFolder = $SPList.RootFolder
            $rootFolder.UniqueContentTypeOrder = [Microsoft.SharePoint.SPContentType[]]  $contentTypesInPlace
        
             #Update the root folder
             $rootFolder.Update()
             Write-Verbose "ContentType(s) added to the new button in list $($SPList.Name)"
        }
        else
        {
                Write-Verbose "No changes"
        }
         Write-Verbose "Exiting  Ensure-ContentTypeInNewButton"
                
    }
}

Awesome. On the other hand, the stuff above didn’t lend itself to testing; I had to go into the GUI each time to remove my content types. So let’s have something to help unwind our changes:

Remove-ContentTypeFromNewButton

Function Remove-ContentTypeFromNewButton{

[CmdletBinding()]
Param ( [parameter(Mandatory=$true,ValueFromPipeline=$true)][string] $ContentTypeName,
        [parameter(Mandatory=$true)][Microsoft.SharePoint.SPList] $SPList)
    
BEGIN   { Write-Verbose "Beginning Remove-ContentTypeFromNewButton" }
PROCESS { 
   
            #Check the content type isn't already present in the new button
            $AlreadyPresent = Is-ContentTypeInNewButton -ContentTypeName $ContentTypeName -SPList $SPList
            if ($AlreadyPresent)
            {
                Write-Verbose "$ContentTypeName is present in the new button - removing"
                
                #get the uniquecontenttypes from the list root folder
                $rootFolder = $SPList.RootFolder
                
                #Get the content types where the names are different to our content type
                $contentTypesInPlace = [System.Collections.ArrayList] $rootFolder.UniqueContentTypeOrder
                $contentTypesInPlace = $contentTypesInPlace | where {$_.Name -ne $contentTypeName}
                
                #Set the UniqueContentTypeOrder to the collection we made above
                $rootFolder.UniqueContentTypeOrder = [Microsoft.SharePoint.SPContentType[]]  $contentTypesInPlace
                
                #Update the root folder
                $rootFolder.Update()
                Write-Verbose "$ContentTypeName removed from the new button in list $($SPList.Name)"
            }
            else
            {
                Write-Verbose "$ContentTypeName is not present in the new button. No further action required."
            }
        }
END     { Write-Verbose "Exiting Remove-ContentTypeFromNewButton" }

}

Done.

So we now have the functions to take a list and a content type, and run a single command which will add the content type, ensuring it’s added to the new button. Further to that we’ve got some basic help (which WordPress has stripped out), error handling, and it’ll take pipelines and multiple content types. I love PowerShell.

Tests and examples of code

$CTHubSiteCollectionURL = "http://sharepoint/sites/cthub"
$singleContentType = "AlexB_Document"
$contentTypesToAddToNewButton = @("AlexB_Document1b","AlexB_Docudddment2")

$SPWeb = Get-SPWeb $CTHubSiteCollectionURL
$docLib = $spweb.Lists["TestDocLib"]


Write-Host "Is Content Type $singleContentType in the new button already? $(Is-ContentTypeInNewButton $singleContentType $doclib )"
Write-Host "Adding the content type to the new button (using the wonderful Ensure method which won't throw errors if already present)"
Ensure-ContentTypeInNewButton -ContentTypeName $singleContentType -SPList $doclib
Write-Host "Is Content Type $singleContentType in the new button already? $(Is-ContentTypeInNewButton $singleContentType $doclib )"
#Victory!

"Removing"
$singleContentType | Remove-ContentTypeFromNewButton -SPList $doclib
Write-Host "Is Content Type in the new button already? $(Is-ContentTypeInNewButton $singleContentType $doclib )"
#Also Victory!

#Let's try a more interesting example
foreach ($contentTypeName in $contentTypesToAddToNewButton)
{
Write-Host "Is Content Type: $ContentTypeName in the new button already? $(Is-ContentTypeInNewButton $contentTypename $doclib)"
}
Write-Host "Adding the content types to the new button (using the wonderful Ensure method which won't throw errors if already present)"
$contentTypesToAddToNewButton | Ensure-ContentTypeInNewButton -SPList $doclib
foreach ($contentTypeName in $contentTypesToAddToNewButton)
{
Write-Host "Is Content Type: $ContentTypeName in the new button already? $(Is-ContentTypeInNewButton $contentTypename $doclib)"
}
#Victory!

And now I can rest. Any critiques of the PowerShell welcomed.

Just for those that are interested, here’s the bit that makes this all possible, with error handling etc. stripped out:

#Get the Web that holds the list
$SPWeb = Get-SPWeb "http://sharepoint/sites/cthub"
#get the library
$list = $SPWeb.Lists["Shared Documents"]

#Get a content type
$contentType = $list.ContentTypes | where { $_.Name -eq "AlexB_Document"}

#Get the root folder object
$rootFolder = $list.RootFolder

#Get the current list of content types available
$contentTypesInPlace = [Microsoft.SharePoint.SPContentType[]] $rootFolder.UniqueContentTypeOrder

#add our content type
$contentTypesInPlace  =  $contentTypesInPlace + $ContentType

#set the list to our new list
$rootFolder.UniqueContentTypeOrder = [Microsoft.SharePoint.SPContentType[]]  $contentTypesInPlace

#Update the folder
$rootFolder.Update()

References:
Thanks to Praveen Battula, whose blog post pointed me in the right direction and has some nice C# for doing a similar task.

http://praveenbattula.blogspot.co.uk/2011/01/change-content-type-order-in-new-button.html

Link to the MSDN article on UniqueContentTypeOrder: http://msdn.microsoft.com/en-us/library/microsoft.sharepoint.spfolder.uniquecontenttypeorder(v=office.14).aspx.

Thoughts for the future:
It’d be nice to be able to order the items. Not difficult technically, but what would be the best way to use such a process?
It seems you can change the new button for different folders in the hierarchy. That’d be handy too.