The SharePoint 2013 installer doesn’t like .NET Framework 4.6.x

One of the prerequisites of SharePoint 2013 is the .NET Framework 4.5. I have installed countless SharePoint 2013 environments without issues, but recently I started noticing that the SharePoint installer fails with the following error:

Setup is unable to proceed due to the following error(s):
This product requires Microsoft .NET Framework 4.5.



This might seem strange since the prerequisite installer installed everything. When this happens, have a look at the versions of the .NET Framework that are actually installed; you will probably find that .NET Framework 4.6 or higher is present. To check, you can download the .NET Framework Setup Verification Utility from Microsoft, which shows an overview of all installed versions when you run it.


When .NET Framework 4.6 or 4.6.1 is listed, the SharePoint installer cannot detect the presence of version 4.5, because 4.6 is an in-place upgrade that replaces 4.5.

Since the release of Service Pack 1 for SharePoint 2013, SharePoint itself is compatible with version 4.6, but the installer isn't. You can have version 4.6, but only AFTER you have installed SharePoint 2013.

So, in order to get SharePoint installed on that machine, you need to uninstall version 4.6 or higher before you can install SharePoint 2013. And this is where it gets a bit annoying. The verification tool also has the ability to uninstall a version. Except… this doesn't work! You can try it, but nothing happens. When you recheck, it's still there. Even after a reboot.

This version of the .NET Framework can be uninstalled by uninstalling Windows update KB3102467. But this is just the first action… if you are installing SharePoint in an organization which pushes Windows Updates, you probably want to block this version of the framework from installing on that machine for the time being.

To do that, you can execute the following PowerShell script. It adds a registry value named ‘BlockNetFramework461’ and sets it to 1 to block the installation of that version. Once SharePoint is installed, you can remove this value if you want.
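As a minimal sketch, the blocking registry change can look like this. The key path follows Microsoft's documented blocking mechanism for the .NET Framework 4.6.1; verify it against the official article before relying on it.

```powershell
# Block the .NET Framework 4.6.1 from being installed through Windows Update.
# Key path per Microsoft's blocking guidance; remove the value once SharePoint is installed.
$path = "HKLM:\SOFTWARE\Microsoft\NET Framework Setup\NDP\WU"

if (-not (Test-Path $path)) {
    New-Item -Path $path -Force | Out-Null
}

Set-ItemProperty -Path $path -Name "BlockNetFramework461" -Value 1 -Type DWord
```

Run this in an elevated PowerShell session, since it writes under HKEY_LOCAL_MACHINE.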

To see the official article for the blocking of the installation of the .NET Framework 4.6.1, see the following link:

Once that version is uninstalled, you can proceed with the installation of SharePoint 2013 on that machine.

Update (14/09/2016)

Microsoft released a fix for the installer issue. You can find this fix in KB3087184.

Cleaning up large content databases and reclaiming unused disk space

Dealing with large SharePoint content databases can be a daunting task. Microsoft recommends keeping your content databases below the 200GB mark for databases which are used daily. But when you are dealing with backup/restore, migrations and general operations which involve moving databases of that size around, even 200GB can be a huge pain in the ass and will cost you time spent staring at a progress bar or watching a percentage creep slowly to 100%.

A solution for this is to split up the content database into smaller databases provided that you have multiple site collections in that database which can be moved out.
Relocating a site collection to a different content database is very easy. You can do this with the Move-SPSite cmdlet.
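A hedged example of the move, with a placeholder site URL and database name:

```powershell
# Move a site collection out of the large database into a dedicated one.
# URL and database name are examples for your own environment.
Move-SPSite "http://intranet/sites/projects" -DestinationDatabase "WSS_Content_Projects"

# An IIS reset on the SharePoint servers is required before the move takes effect.
iisreset
```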

Once you have moved a site collection to a different database, you will notice that your huge database is not getting any smaller. That's because the site collection which has been moved is still in that database, but marked for deletion.
The actual removal of that site collection is done by the Gradual Site Delete timer job, which runs once a day. Once it has run, the site collection is completely removed from the database.
But still, if you look at the database, it will not be any smaller than before. When you look at the used space in the database, you will see that this has decreased and the unused space has grown. The unused space is not released.
To release unused space, … *ducks for cover* … you can shrink the database. There, I said it.

Generally speaking, shrinking content databases is not done. You can do it, but it has no advantages and it has a negative effect on performance, because as soon as someone adds something to a site in that database, the file has to expand again to be able to store anything.
So, shrinking is definitely something you should avoid at any cost… except for the case where you have such a huge content database that you’ve split up into smaller content databases. The reason for splitting up the database in the first place was to make it smaller, right? To have a size which is much more manageable. But in order to get it back to a reasonable size, you need to shrink it. There’s no way around it.

During a migration from SharePoint 2007 to SharePoint 2013, I had to migrate a content database of 220GB. All things considered, this is not huge. It's large, but not huge. This content database contained around 20 site collections. Backing up this database was not an issue… 20 minutes. Copying this backup to a SharePoint 2010 migration server was frustrating: it took over an hour. Yeah, it SHOULD go faster, but if you pass through a 10Mbit router, you are copying at USB 2.0 speed! But this was nothing compared to the time the Mount-SPContentDatabase cmdlet needed to attach this database to the web application and do the actual upgrade from SP2007 to SP2010. This attach/upgrade took almost 3 hours and then just aborted due to lack of disk space. The migration server had a data disk of 600GB, and it filled up completely with the transaction log that was created as part of the attach/upgrade process. So, I lost 3 hours, had to wait until extra disk space was added and restart the whole thing. By the time the database had attached and upgraded, its size had actually increased to 330GB.

When everything was attached and upgraded, I decided that I was not going through this again for the migration from SP2010 to SP2013. I needed databases which are easier to handle. So, I split this database up into 5 databases, of which the largest was still 115GB. But ok, nothing I could do about that in the short term.

Running the Gradual Site Delete job however proved to be a pain as well… it took almost 6 hours to complete! It started around 2PM. I went home at 5PM. The next day, I noticed it had finished around 8PM. So, I started the shrink operation of the database… and lost half a day with that. I wasn't able to do anything with that database for the larger part of the day.

Since this was only a “test” migration, I realised that history was going to repeat itself during the final migration and that I needed a way to make use of those lost hours between the end of the gradual site delete and the start of the shrink: as soon as the gradual site delete is done, kick off the shrink so that it's finished in the morning.

Enter… PowerShell!

The script below kicks off the Gradual Site Delete timer job for a specific web application and then monitors this job every 5 minutes to see if it has completed. Once it has completed, it continues with the shrinking of the database. The shrinking happens in 2 stages. The first stage is done with NOTRUNCATE. This means that SQL Server moves all pages to the front of the file but does not reclaim unused space; the size stays the same. The next step is a shrink operation with TRUNCATEONLY, which just removes all unused space from the end of the file. It's basically the same thing that happens when you do a Shrink Database from Management Studio.
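The core of the approach can be sketched as follows. This is a simplified outline, not the full script: the web application URL, database and server names are placeholders, and the internal timer job name ("job-site-deletion") should be verified in your own farm.

```powershell
# Kick off the Gradual Site Delete job for one web application and wait for it.
# URL, job name, database and SQL server names are placeholders.
$job = Get-SPTimerJob "job-site-deletion" -WebApplication "http://intranet"
$lastRun = $job.LastRunTime
$job.RunNow()

# Poll every 5 minutes until the job reports a newer completion time.
while ($job.LastRunTime -le $lastRun) {
    Start-Sleep -Seconds 300
    $job = Get-SPTimerJob "job-site-deletion" -WebApplication "http://intranet"
}

# Stage 1: compact pages to the front of the file without releasing space (NOTRUNCATE),
# keeping 5% free space. Stage 2: release the unused space at the end (TRUNCATEONLY).
Invoke-Sqlcmd -ServerInstance "SQL01" -Query "DBCC SHRINKDATABASE (N'WSS_Content_Big', 5, NOTRUNCATE);"
Invoke-Sqlcmd -ServerInstance "SQL01" -Query "DBCC SHRINKDATABASE (N'WSS_Content_Big', TRUNCATEONLY);"
```

Invoke-Sqlcmd assumes the SQL Server PowerShell module is available on the machine running the script.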

Again, don’t do this as part of a weekly maintenance routine, because the first step of the shrink will introduce index fragmentation in your database. Seriously! For me, this was a necessary cleanup I had to do as part of a migration project to reorganize the content database and minimize the migration time! The environment I was doing this in was an intermediate SharePoint 2010 environment, not a live environment.

Also, the shrink operation in the script allows you to specify a percentage of the file that should be kept as free space.

I used 5%. This way, for a content database of 100GB, 5GB of free space is retained. You can change this if you want, or you can add an additional parameter which allows you to specify the amount of free space it should keep.

You can find this script in my PowerShell repository on GitHub.

An eye for details… changing the ImageUrl for migrated lists

Migrating from SharePoint 2007 to SharePoint 2013 can cause all kinds of headaches, but this post is not about those headaches. It’s about details, or better… having an eye for details. Ever noticed that after you migrate a site to SharePoint 2013 and complete the visual upgrade to SharePoint 2013 mode, the list icons are not the fancy icons you get when you create a new list or library in SharePoint 2013? The icons for migrated lists and libraries are still the old icons from the early days of SharePoint 2007.


The icon for a list or library is stored in the ImageUrl property of the list. For migrated lists, this property points to a gif or png in the “/_layouts/images/” folder. When you create a new list in SharePoint 2013, the property points to “/_layouts/15/images”. Furthermore, if you compare, for instance, a migrated document library with a new document library, you notice that the value differs not only in the location the icon is loaded from, but also in the type of file. Take a simple document library:

  • Migrated document library : /_layouts/images/itdl.gif
  • New document library : /_layouts/15/images/itdl.png?rev=23
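The fix for a single library can be sketched like this; the web URL, library title and the single gif-to-png mapping are examples, while the real script maps a whole table of icons.

```powershell
# Point a migrated document library at the SharePoint 2013 icon.
# Web URL and the one-entry mapping are examples.
$web  = Get-SPWeb "http://intranet"
$list = $web.Lists["Documents"]

if ($list.ImageUrl -eq "/_layouts/images/itdl.gif") {
    $list.ImageUrl = "/_layouts/15/images/itdl.png?rev=23"
    $list.Update()
}

$web.Dispose()
```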

While I can imagine that a lot of people really don’t see any issue with this and don’t care what those icons look like, I don’t like loose ends. If you migrate an environment, you might as well get it done completely, replace the list icons with new versions as well, and get the following result in the end.


Admit it, this looks much better than those old school icons. It’s a small detail, but it just makes more sense. If you have a smart user who actually cares about the environment, the question why new lists have different icons than existing lists will eventually pop up anyway, and if you tell them this is the result of the migration, the next question will be whether you can change them to resemble the new lists. Show your customers or users you have an eye for detail and do it proactively.

Changing these icons can be done very easily using PowerShell. The only thing you need is a mapping between the old and new icon.

I created a script which replaces all icons for lists and libraries. In this script, a mapping is done for most of the icons which are used. It might not be the complete list, but feel free to add missing icons. There are some scripts out there which don’t use a mapping but just replace all .gif icons with .pngs. However, some icons don’t have a .png counterpart, so if you replace those, your list icon will be broken.

You can find this script in my PowerShell repository on GitHub.

Replacing event receivers in SharePoint

I’m currently migrating a SharePoint 2007 environment to SharePoint 2013. For this particular environment, a custom solution was made which involves a number of event receivers. The customer wanted to retain this functionality, so I had to port this solution to SharePoint 2013. One problem though… the source code was not available. We had to resort to reverse engineering the solution using ILSpy to recreate the source code and build a new solution. We made sure that all feature IDs were the same and that our namespaces and class names were also the same. After deploying and testing the solution, it worked.

During the migration, we attached the content database to the SharePoint 2013 web application and that’s when we noticed something.
When you add an event receiver to a SharePoint list, the “Assembly” property of the event receiver contains the assembly signature of the DLL which holds the event receiver class. When we attached the database, SharePoint complained it was missing an assembly: the assembly of the old solution. When we compared the assembly signature of the old solution with the signature of our new solution, we saw it had a different PublicKeyToken. We had completely overlooked this. This was one of those “Doh!” moments.

It seems that it’s not that straightforward to change the PublicKeyToken. I did find a way to extract the public key from a DLL and generate a strong name key (SNK) file:

sn.exe -e myassembly.dll mykey.snk

But this strong name key is missing one crucial piece of information: the private key. If you want to sign your solution with this strong name, you need to use delay signing. Your solution will build and the signature matches the one from the old assembly, but when you try to deploy it to SharePoint, it fails because the assembly can’t be added to the GAC due to the missing private key.

I figured that instead of looking for workarounds, the easiest way to solve this was to replace the old event receivers with new ones which have the correct signature. This proved to be an easy solution. I created 2 scripts which helped me with this.

Get all event receivers with a specific signature

This script returns all event receivers which have a specific signature.

You can export this output to a CSV file, which can be used in the next script. All information which is needed to replace these event receivers is included in the output.
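The core of that lookup can be sketched as below; the signature is a placeholder, and the property names exported here are the ones an SPEventReceiverDefinition exposes.

```powershell
# Find every list event receiver bound to a given assembly signature and
# export the details needed to recreate it later. The signature is an example.
$signature = "OldSolution, Version=1.0.0.0, Culture=neutral, PublicKeyToken=0123456789abcdef"

Get-SPSite -Limit All | Get-SPWeb -Limit All | ForEach-Object {
    $web = $_
    foreach ($list in $web.Lists) {
        foreach ($er in @($list.EventReceivers | Where-Object { $_.Assembly -eq $signature })) {
            [PSCustomObject]@{
                WebUrl         = $web.Url
                ListTitle      = $list.Title
                Name           = $er.Name
                Assembly       = $er.Assembly
                Class          = $er.Class
                Type           = $er.Type
                SequenceNumber = $er.SequenceNumber
            }
        }
    }
    $web.Dispose()
} | Export-Csv -Path .\EventReceivers.csv -NoTypeInformation
```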

Delete and recreate event receivers

Using the CSV file created by the previous script, the script below deletes the old event receivers and replaces them with new ones. It reuses the information from the old event receivers included in the CSV, and uses the signature which is passed in as a parameter as the assembly signature for the new event receivers.
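A sketch of that replacement, assuming the CSV layout from the lookup sketch above and a placeholder new signature:

```powershell
# Replace the receivers listed in the CSV with copies bound to the new assembly.
# $newSignature is a placeholder for the signature of the rebuilt solution.
$newSignature = "NewSolution, Version=1.0.0.0, Culture=neutral, PublicKeyToken=fedcba9876543210"

Import-Csv .\EventReceivers.csv | ForEach-Object {
    $row  = $_
    $web  = Get-SPWeb $row.WebUrl
    $list = $web.Lists[$row.ListTitle]

    # Delete the old receiver(s) matching this CSV row...
    @($list.EventReceivers | Where-Object {
        $_.Class -eq $row.Class -and "$($_.Type)" -eq $row.Type
    }) | ForEach-Object { $_.Delete() }

    # ...and recreate them with the same class, type and sequence number,
    # but bound to the new assembly signature.
    $new = $list.EventReceivers.Add()
    $new.Name           = $row.Name
    $new.Assembly       = $newSignature
    $new.Class          = $row.Class
    $new.Type           = [Microsoft.SharePoint.SPEventReceiverType]$row.Type
    $new.SequenceNumber = [int]$row.SequenceNumber
    $new.Update()

    $web.Dispose()
}
```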

You can find these scripts in my PowerShell repository on GitHub.

Add a SQL Alias using PowerShell

Setting a SQL alias on every SharePoint server is a common task when you are installing SharePoint. You use the SQL Server Client Network Utility (cliconfg.exe) for this. This tool is available on every SharePoint server because it’s part of the SQL Server Native Client prerequisite.

Setting a SQL alias is a best practice because it makes your life a whole lot easier when you want to change the actual database server at some point in time. If you use an alias, the only thing you need to do at that moment is change the target of your alias and you’re good to go. If you install SharePoint and reference the database server directly, your only way of pointing SharePoint to the new database server painlessly is to create an alias at that time, give the alias the name of the old database server and have it point to the new server. Definitely not cool, because when someone looks at the Servers page in Central Administration, it will list the old name and it’s not clear that this is no longer a server but an alias.

When you want to set an alias, you run cliconfg.exe on each SharePoint server. In this tool, you have an “Alias” tab, where you can set it. You provide a name, the type of connection (Named Pipes, TCP/IP) and a server name. You can also select a custom port if you use TCP/IP or keep the default.


If you don’t want to do this manually, there’s also a way of doing this with PowerShell. The only thing this tool does, is create a string value in the registry under HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MSSQLServer\Client\ConnectTo


So, doing this using PowerShell is easy.
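A sketch of the registry write; the alias name, server and port are placeholders, and "DBMSSOCN" is the value prefix cliconfg.exe uses for TCP/IP connections.

```powershell
# Create a TCP/IP SQL alias "SPSQL" pointing at SQL01 on port 1433.
# Alias name, server name and port are examples.
$path = "HKLM:\SOFTWARE\Microsoft\MSSQLServer\Client\ConnectTo"

if (-not (Test-Path $path)) {
    New-Item -Path $path -Force | Out-Null
}

New-ItemProperty -Path $path -Name "SPSQL" -Value "DBMSSOCN,SQL01,1433" `
    -PropertyType String -Force | Out-Null
```

On a 64-bit server, 32-bit clients read the same value under HKLM:\SOFTWARE\Wow6432Node\Microsoft\MSSQLServer\Client\ConnectTo, so you may want to create it there as well.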

You can find this script in my PowerShell Repository on GitHub.

TFS Build agent running but still displayed as offline

A colleague of mine notified me today that there was a problem with the TFS build agent on our TFS server. When he tried to queue a build, the following message popped up:

There are issues with the request or definition that may prevent the build from running:
There are agents that are capable of running the build, but they are not online. If the agent is configured to run as a service, ensure that the “VSO Agent ({agent name})” service is running.


This looked like a no-brainer. Probably the service wasn’t started. But when I looked at the service, I noticed that it was running.


I checked the Agent pools in the TFS Control Panel, and it was showing a red status where it should be green. Not good.


To see what was going on, I went to the folder where the agent was installed. In that folder, a _diag folder exists with logging. After opening one of the log files, I noticed this error:

System.Net.Http.HttpRequestException: An error occurred while sending the request. ---> System.Net.WebException: Unable to connect to the remote server ---> System.Net.Sockets.SocketException: No connection could be made because the target machine actively refused it

(IP is replaced with xxx in the message above).

The IP which was listed was the IP of the TFS server. What got me thinking in the right direction for a solution is the part of the logfile above the message, where the settings are loaded:

VsoAgent.exe was run with the following command line: "C:\BuildAgent\agent\vsoagent.exe" /runningAsService
12:04:06.109768 SettingsFileHelper.Load - settings[AutoUpdate]=True
12:04:06.109768 SettingsFileHelper.Load - settings[RootFolder]=C:\BuildAgent
12:04:06.109768 SettingsFileHelper.Load - settings[WorkFolder]=c:\BuildAgent\_work
12:04:06.109768 SettingsFileHelper.Load - settings[ServerUrl]=
12:04:06.109768 SettingsFileHelper.Load - settings[AgentName]=xxxxx
12:04:06.109768 SettingsFileHelper.Load - settings[PoolId]=1
12:04:06.109768 SettingsFileHelper.Load - settings[PoolName]=default
12:04:06.109768 SettingsFileHelper.Load - settings[AgentId]=2
12:04:06.109768 SettingsFileHelper.Load - settings[RunAsWindowsService]=True
12:04:06.109768 SettingsFileHelper.Load - settings[WindowsServiceName]=vsoagent.tfs.xxxxx
12:04:06.109768 SettingsFileHelper.Load - settings[WindowsServiceDisplayName]=VSO Agent (xxxxx)

The “ServerUrl” setting listed the default URL for TFS, which is HTTP on port 8080. But I remembered that we had changed our configuration to run on HTTPS. So, our TFS was not on port 8080 but on 443, and then it hit me… I had configured the build agent before we changed to HTTPS.

The solution to this issue was pretty clear: I needed to update the “settings.json” file in the build agent folder and replace the old URL with the new one.
After changing this file, I restarted the build agent service, went back to the Agent Pools, refreshed the page and saw the status change to green.
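For illustration, the relevant part of settings.json after the change might look like this; the URL and agent name are placeholders for your own environment, and the file contains more settings than shown here.

```json
{
  "ServerUrl": "https://tfs.contoso.com/tfs",
  "AgentName": "BUILD01",
  "PoolName": "default",
  "RunAsWindowsService": true
}
```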


Good to go!

Toggle audience targeting using PowerShell

Audience targeting in SharePoint is a great way of hiding irrelevant information from users and likewise, only show relevant information. You could create a list with a lot of items and target those items to specific groups of users. Users who are targeted will see the items while other users will not see them… audience targeting. This has nothing to do with permissions. It’s just a filter. Users who are not targeted can still access those items if they want.

To be able to use audience targeting on a list, you have to enable it first. It’s disabled by default. To enable this, you can go to the list settings of your list and open the Audience Targeting settings. There you will find a checkbox to enable this.

Once it’s enabled, you will see an extra column in your list “Target Audiences” of the type “Audience Targeting”.


To disable it, go back into the audience targeting settings and uncheck the checkbox. The extra column will be removed.

That’s the UI way… now, how about PowerShell? Can you enable/disable audience targeting there? Of course!

We only need to add a field with a specific ID and attributes to the list and it’s done.
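A sketch of that, assuming placeholder web and list names. The field ID and schema below follow the commonly documented “Target Audiences” field definition; verify them against a list where targeting was enabled through the UI before using this.

```powershell
# Enable audience targeting on a list by adding the "Target Audiences" field.
# Web URL and list title are examples; the field ID/schema should be verified.
$web  = Get-SPWeb "http://intranet"
$list = $web.Lists["Announcements"]

$fieldXml = '<Field ID="{61cbb965-1e04-4273-b658-eedaa662f48d}" Type="TargetTo" ' +
            'Name="Target_x0020_Audiences" DisplayName="Target Audiences" Required="FALSE" />'

$list.Fields.AddFieldAsXml($fieldXml) | Out-Null
$list.Update()

# Disabling targeting again comes down to deleting that same field from the list.
$web.Dispose()
```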

Read more

Get Web Template Usage in SharePoint

Having a SharePoint platform which is widely adopted and used is cool. But as a SharePoint administrator, I’m also interested in how this platform is used and what it’s used for. One of the things I like to do is find out which kinds of sites are used and how many of them exist.

This can also be quite useful when you are considering a migration to a new SharePoint version. Not all site templates are still supported, for example the document workspaces, meeting workspaces and group work templates. These are only partially supported when migrating from SharePoint 2010 to SharePoint 2013, or are just kept for backward compatibility. At the same time, there are custom solutions out there which add site templates, and when preparing for a migration it’s good to know if you have sites which are based on those kinds of templates.

The tricky thing is… how to get that information. There’s a cmdlet called Get-SPWebTemplate. This gives you a list of all web templates which have been installed in the environment.


This doesn’t tell you which templates are actually used, though. For that, you need to combine some information. The full name of a web template consists of 2 parts:

  • Web Template
  • Configuration ID

When you look at the properties of an SPWeb object, you find these 2 specific properties in the list. You need to combine them with a “#” between them to get the web template name.

This allows you to identify the template for each web which exists in your farm. But this is only part of the info we need. We also need the usage count of each template, to get something like this:


To get this output, I wrote a small script.

If you would like to sort the output of the script to have the most used templates first, you can execute it like this:

which gives you the following:



SharePoint 2016 Fast Site Creation vs. Traditional site creation

Creating site collections in SharePoint has always been a task which requires some patience. It can take a lot of time to create a site. With the release of SharePoint 2016, a new feature was introduced which goes by the name “Fast Site Creation”. I’m not going to explain what it does because there are enough places out there which do an excellent job of explaining it. You can find some here, here, and here.

The basic idea is that the Fast Site Creation feature gives an administrator the ability to create site collections rapidly by offloading the actual creation completely to the database server where a site is created by copying an existing “master” site.

That’s the theory. In this post, I’m going to show you the actual difference between both methods in terms of execution time. I have been searching for this on the internet and apparently, everybody who talks about this feature is just repeating what Microsoft says. It seems that nobody is actually checking whether this feature is indeed faster.

So, I did it myself. I ran some tests, comparing both methods.

Read more

User Profile Sync DB is read-only in SharePoint 2016

When you provision a new User Profile Service application in SharePoint 2016, you will notice that the Sync DB name and server is read-only and that you cannot change it.


I don’t know exactly why Microsoft made it impossible to change the name of the Sync database in the UI. Does it have something to do with the absence of the synchronization service? Either way, this DB is created, and despite all good efforts to have a consistent naming convention for your SharePoint databases, you end up with this rebelling database.

But then again, who uses the UI to get things done in SharePoint anyway, right? We do everything with PowerShell, and in PowerShell you have the New-SPProfileServiceApplication cmdlet, which allows you to specify the names of the databases:
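For example, with placeholder names and application pool:

```powershell
# Create the User Profile Service application with explicit database names,
# including the Sync DB that the UI won't let you rename.
# Name, application pool and database names are examples.
New-SPProfileServiceApplication -Name "User Profile Service Application" `
    -ApplicationPool "SharePoint Web Services" `
    -ProfileDBName "SP_UPS_Profile" `
    -SocialDBName "SP_UPS_Social" `
    -ProfileSyncDBName "SP_UPS_Sync"
```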

Solved it! When you open the properties of the created User Profile Service application, you see that this small rebellion was suppressed swiftly and without a lot of fuss.