The SharePoint 2013 installer doesn’t like .NET Framework 4.6.x

One of the prerequisites of SharePoint 2013 is the .NET Framework 4.5. I have installed countless SharePoint 2013 environments without issues, but recently I started noticing that the SharePoint installer fails with the following error:

Setup is unable to proceed due to the following error(s):
This product requires Microsoft .NET Framework 4.5.



This might seem strange since the prerequisite installer installed everything. When this happens, have a look at the actual installed versions of the .NET Framework: you will probably notice that .NET Framework 4.6 or higher is present. To check, you can download the .NET Framework Setup Verification Utility from Microsoft. When you run this tool, it shows an overview of all installed versions.


When .NET Framework 4.6 or 4.6.1 is listed, the SharePoint installer cannot detect the presence of version 4.5.

With the release of Service Pack 1 for SharePoint 2013, SharePoint itself is compatible with version 4.6, but the installer isn’t. You can have version 4.6, but only AFTER you have installed SharePoint 2013.

So, in order to get SharePoint installed on that machine, you need to uninstall version 4.6 or higher first. And this is where it gets a bit annoying. The verification tool also has the ability to uninstall a version. Except… this doesn’t work! You can try it, but nothing happens. When you recheck, it’s still there, even after a reboot.

This version of the .NET Framework can be removed by uninstalling Windows update KB3102467. But this is just the first step: if you are installing SharePoint in an organization which pushes Windows Updates, you probably want to block this framework from being installed on that machine for the time being.

To do that, you can execute the following PowerShell script. It adds a value ‘BlockNetFramework461’ to the registry and sets it to 1 to block the installation of that version. Once SharePoint is installed, you can remove this value if you want.
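A minimal sketch of that script, assuming the registry location Microsoft documents for blocking the 4.6.1 update (verify the key path against the official article before relying on it):

```powershell
# Sketch: block the .NET Framework 4.6.1 from being offered via Windows Update.
# Key path is the one documented by Microsoft for blocking 4.6.1 - verify it.
$regPath = "HKLM:\SOFTWARE\Microsoft\NET Framework Setup\NDP\WU"

# Create the key if it doesn't exist yet
if (-not (Test-Path $regPath)) {
    New-Item -Path $regPath -Force | Out-Null
}

# Setting BlockNetFramework461 to 1 prevents the 4.6.1 installer from running
New-ItemProperty -Path $regPath -Name "BlockNetFramework461" `
    -PropertyType DWord -Value 1 -Force | Out-Null
```

Once SharePoint is installed, `Remove-ItemProperty -Path $regPath -Name "BlockNetFramework461"` undoes the block.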

The official procedure for blocking the installation of the .NET Framework 4.6.1 is described in a Microsoft support article.

Once that version is uninstalled, you can proceed with the installation of SharePoint 2013 on that machine.

Update (14/09/2016)

Microsoft released a fix for the installer issue. You can find this fix in KB3087184.

Cleaning up large content databases and reclaiming unused disk space

Dealing with large SharePoint content databases can be a daunting task. Microsoft recommends keeping content databases below the 200GB mark for databases which are used daily. But when you are dealing with backup/restore, migrations and general operations which involve moving those kinds of databases around, even 200GB can be a huge pain in the ass and will cost you hours looking at a progress bar or watching a percentage creep slowly to 100%.

A solution for this is to split up the content database into smaller databases provided that you have multiple site collections in that database which can be moved out.
Relocating a site collection to a different content database is very easy. You can do this with the Move-SPSite cmdlet.
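For example (URL and database name are placeholders for your own environment):

```powershell
# Move a site collection to a different content database.
# Both databases must be attached to the same web application.
Move-SPSite "https://intranet.contoso.com/sites/hr" -DestinationDatabase "WSS_Content_HR"
```

Note that an IISRESET is required afterwards before the moved site collection is served correctly.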

Once you have moved a site collection to a different database, you will notice that your huge database is not getting any smaller. That’s because the moved site collection is still in the original database, marked for deletion.
The actual removal of that site collection is done by the Gradual Site Delete timer job. This job runs once a day. Once it has run, the site collection is completely removed from the database.
But still, the database file will not be any smaller than before. When you look at the used space in the database, you will see that it has decreased and the unused space has grown. That unused space is not released.
To release unused space, … *ducks for cover* … you can shrink the database. There, I said it.

Generally speaking, you don’t shrink content databases. You can, but it has no advantages and it hurts performance: as soon as someone adds content to a site in that database, the file has to grow again before anything can be stored.
So, shrinking is definitely something you should avoid at any cost… except for the case where you have such a huge content database that you’ve split it up into smaller content databases. The reason for splitting up the database in the first place was to make it smaller, right? To have a size which is much more manageable. But in order to get it back to a reasonable size, you need to shrink it. There’s no way around it.

During a migration from SharePoint 2007 to SharePoint 2013, I had to migrate a content database of 220GB. All things considered, this is not huge. It’s large, but not huge. This content database contained around 20 site collections. Backing up this database was not an issue… 20 minutes. Copying this backup to a SharePoint 2010 migration server was frustrating: it took over an hour. Yeah, it SHOULD go faster, but if your copy passes through a 10Mbit router, you are effectively copying at USB 1.1 speed! But this was nothing compared to the time the Mount-SPContentDatabase cmdlet needed to attach this database to the web application and do the actual upgrade from SP2007 to SP2010. This attach/upgrade took almost 3 hours and then just aborted due to lack of disk space. The migration server had a data disk of 600GB and it filled completely with the transaction log that was created as part of the attach/upgrade process. So, I lost 3 hours, had to wait until extra disk space was added and restart the whole thing. By the time it had attached and upgraded, the size of the database had actually increased to 330GB.

When everything was attached and upgraded, I decided I was not going through this again for the migration from SP2010 to SP2013. I needed databases which are easier to handle. So, I split this database into 5 databases, of which the largest was still 115GB. But OK, nothing I could do about that in the short term.

Running the Gradual Site Delete job, however, proved to be a pain as well… it took almost 6 hours to complete! It started around 2PM; I went home at 5PM. The next day, I noticed it had finished around 8PM. So, I started the shrink operation of the database… and lost half a day with that. I wasn’t able to do anything with that database for the larger part of the day.

Since this was only a “test” migration, I realised that history was going to repeat itself during the final migration and that I needed a way to use those lost hours between the end of the gradual site delete and the start of the shrink: when the gradual site delete is done, start shrinking immediately, so it’s finished in the morning.

Enter… PowerShell!

The script below kicks off the Gradual Site Delete timer job for a specific web application and then monitors the job every 5 minutes to see if it has completed. Once it has, the script continues with shrinking the database. The shrinking happens in 2 stages. The first stage is done with NOTRUNCATE: SQL Server moves all pages to the front of the file but does not reclaim unused space, so the size stays the same. The second stage is a shrink with TRUNCATEONLY, which removes all unused space from the end of the file. Together, this is basically what happens when you do a Shrink Database from Management Studio.
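The timer-job part of that approach can be sketched like this (a simplified outline, not the full script; the URL is an example, and it assumes the job’s display name is “Gradual Site Delete”):

```powershell
# Sketch: start the Gradual Site Delete job for a web application and poll
# every 5 minutes until it has completed a new run.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$webApp = Get-SPWebApplication "https://intranet.contoso.com"
$job = Get-SPTimerJob -WebApplication $webApp |
    Where-Object { $_.DisplayName -eq "Gradual Site Delete" }

$lastRun = $job.LastRunTime
$job.RunNow()

# Wait until LastRunTime changes, meaning the job finished a new run
while ($job.LastRunTime -eq $lastRun) {
    Start-Sleep -Seconds 300
    $job = Get-SPTimerJob -Identity $job.Id
}
Write-Host "Gradual Site Delete finished, starting shrink..."
```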

Again, don’t do this as part of a weekly maintenance routine, because the first step of the shrink will introduce index fragmentation in your database. Seriously! For me, this was a necessary cleanup I had to do as part of a migration project to reorganize the content database and minimize the migration time. The environment I was doing this in was an intermediate SharePoint 2010 environment, not a live environment.

Also, the shrink operation in the script allows you to specify a percentage of free space to reserve as unused space.

I used 5%. This way, for a content database of 100GB, 5GB of free space is retained. You can change this if you want, or add an additional parameter which lets you specify the amount of free space to keep.
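The two-stage shrink itself boils down to two DBCC commands, which can be issued from PowerShell, for instance via Invoke-Sqlcmd (requires the SQL Server PowerShell module; server and database names below are examples):

```powershell
# Two-stage shrink as described above.
$db = "WSS_Content_Migration"
$freePercent = 5   # percentage of free space to keep in the file

# Stage 1: compact pages to the front of the file, keep the file size (NOTRUNCATE)
Invoke-Sqlcmd -ServerInstance "SQL01" `
    -Query "DBCC SHRINKDATABASE ([$db], $freePercent, NOTRUNCATE)"

# Stage 2: release the unused space at the end of the file back to the OS
Invoke-Sqlcmd -ServerInstance "SQL01" `
    -Query "DBCC SHRINKDATABASE ([$db], TRUNCATEONLY)"
```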

You can find this script in my PowerShell repository on GitHub.

An eye for details… changing the ImageUrl for migrated lists

Migrating from SharePoint 2007 to SharePoint 2013 can cause all kinds of headaches, but this post is not about those headaches. It’s about details, or better… having an eye for details. Ever noticed that after you migrate a site to SharePoint 2013 and complete the visual upgrade to SharePoint 2013 mode, the list icons are not the fancy icons you get when you create a new list or library in SharePoint 2013? The icons for migrated lists and libraries are still the old icons from the early days of SharePoint 2007.


The icon for a list or library is stored in the ImageUrl property of the list. For migrated lists, this property points to a gif or png in the “/_layouts/images/” folder; when you create a new list in SharePoint 2013, it points to “/_layouts/15/images”. Furthermore, if you compare a migrated document library with a new one, you notice that the value differs not only in the location the icon is served from, but also in the type of file. Take a simple document library:

  • Migrated document library : /_layouts/images/itdl.gif
  • New document library : /_layouts/15/images/itdl.png?rev=23

While I can imagine that a lot of people really don’t see any issue with this and don’t care what those icons look like, I don’t like loose ends. If you migrate an environment, you might as well do it completely and replace the list icons with the new versions as well, ending up with the following result.


Admit it, this looks much better than those old-school icons. It’s a small detail, but it just makes more sense. If you have a smart user who actually cares about the environment, the question why new lists have different icons than existing ones will eventually pop up anyway. And if you tell them this is the result of the migration, the next question will be whether you can change them to resemble the new lists. Show your customers or users you have an eye for detail and do it proactively.

Changing these icons can be done very easily using PowerShell. The only thing you need is a mapping between the old and new icon.

I created a script which replaces all icons for lists and libraries. In this script, a mapping is defined for most of the icons which are used. It might not be the complete list, but feel free to add missing icons. There are scripts out there which don’t use a mapping and just replace all .gif icons with .png’s. However, some icons don’t have a .png counterpart; if you replace those, your list icon will be broken.
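The core of the approach can be sketched as follows (a stripped-down version of the idea, not my full script; the URL and the two mapping entries are examples):

```powershell
# Sketch: replace old list icons based on an old-to-new mapping.
# Only a couple of mapping entries shown - extend as needed.
$iconMap = @{
    "/_layouts/images/itdl.gif"  = "/_layouts/15/images/itdl.png?rev=23"
    "/_layouts/images/itgen.gif" = "/_layouts/15/images/itgen.png?rev=23"
}

$web = Get-SPWeb "https://intranet.contoso.com"
foreach ($list in $web.Lists) {
    if ($iconMap.ContainsKey($list.ImageUrl)) {
        $list.ImageUrl = $iconMap[$list.ImageUrl]
        $list.Update()
    }
}
$web.Dispose()
```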

You can find this script in my PowerShell repository on GitHub.

Replacing event receivers in SharePoint

I’m currently migrating a SharePoint 2007 environment to SharePoint 2013. For this particular environment, a custom solution was made which involves a number of event receivers. The customer wanted to retain this functionality, so I had to port this solution to SharePoint 2013. One problem though… the source code was not available. We had to reverse engineer the solution using ILSpy to recreate the source code and build a new solution. We made sure that all feature IDs were the same and that our namespaces and class names matched. After deploying and testing the solution, it worked.

During the migration, we attached the content database to the SharePoint 2013 web application and that’s when we noticed something.
When you add an event receiver to a SharePoint list, the “Assembly” property of the event receiver contains the assembly signature of the DLL which contains the event receiver class. When we attached the database, SharePoint complained it was missing an assembly: the assembly of the old solution. When we compared the assembly signature of the old solution with that of our new solution, we saw it had a different PublicKeyToken. We had completely overlooked this. One of those “Doh!” moments.

It turns out that it’s not straightforward to reproduce a PublicKeyToken. There is a way to extract the public key from a DLL and generate a strong name key (SNK) file:

sn.exe -e myassembly.dll mykey.snk

But this strong name key is missing one crucial piece of information: the private key. If you want to sign your solution with this key, you have to use delay signing. Your solution will build and the signature will match the one from the old assembly, but when you try to deploy it to SharePoint, deployment fails because the assembly can’t be added to the GAC due to the missing private key.

I figured that instead of looking for workarounds, the easiest way to solve this was to replace the old event receivers with new ones which have the correct signature. This proved to be an easy solution. I created 2 scripts which helped me with this.

Get all event receivers with a specific signature

This script returns all event receivers which have a specific signature.

You can export this output to a CSV file, which can be used in the next script. All information needed to replace these event receivers is included in the output.
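In essence, the script walks all lists in a web application and filters on the Assembly property (a simplified sketch; the URL and assembly signature are examples):

```powershell
# Sketch: find all list event receivers whose Assembly matches the old
# signature and export them to CSV. (SPWeb disposal omitted for brevity.)
$oldAssembly = "MyCompany.EventReceivers, Version=1.0.0.0, Culture=neutral, PublicKeyToken=abc123def4567890"

Get-SPWebApplication "https://intranet.contoso.com" | Get-SPSite -Limit All |
    Get-SPWeb -Limit All | ForEach-Object {
        $web = $_
        foreach ($list in $web.Lists) {
            foreach ($receiver in $list.EventReceivers) {
                if ($receiver.Assembly -eq $oldAssembly) {
                    [PSCustomObject]@{
                        WebUrl   = $web.Url
                        ListId   = $list.ID
                        Id       = $receiver.Id
                        Type     = $receiver.Type
                        Class    = $receiver.Class
                        Sequence = $receiver.SequenceNumber
                    }
                }
            }
        }
    } | Export-Csv -Path .\receivers.csv -NoTypeInformation
```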

Delete and recreate event receivers

Using the CSV file created by the previous script, the script below deletes the old event receivers and replaces them with new ones. It reuses the information from the old event receivers in the CSV and uses the signature passed in as a parameter as the new assembly signature for the new event receivers.
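The replace step can be sketched like this (simplified; assumes the CSV columns from the previous step, and the new assembly signature is an example):

```powershell
# Sketch: delete each old receiver from the CSV and recreate it with the new signature.
$newAssembly = "MyCompany.EventReceivers, Version=1.0.0.0, Culture=neutral, PublicKeyToken=1122334455667788"

Import-Csv .\receivers.csv | ForEach-Object {
    $row  = $_
    $web  = Get-SPWeb $row.WebUrl
    $list = $web.Lists[[Guid]$row.ListId]

    # Delete the old receiver by its Id
    ($list.EventReceivers | Where-Object { $_.Id -eq [Guid]$row.Id }).Delete()

    # Add a new receiver of the same type, pointing to the new assembly
    $new = $list.EventReceivers.Add()
    $new.Type           = [Microsoft.SharePoint.SPEventReceiverType]$row.Type
    $new.Assembly       = $newAssembly
    $new.Class          = $row.Class
    $new.SequenceNumber = [int]$row.Sequence
    $new.Update()

    $web.Dispose()
}
```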

You can find these scripts in my PowerShell repository on GitHub.

Toggle audience targeting using PowerShell

Audience targeting in SharePoint is a great way of hiding irrelevant information from users and, likewise, showing only relevant information. You could create a list with a lot of items and target those items to specific groups of users. Targeted users will see the items; other users will not… audience targeting. This has nothing to do with permissions. It’s just a filter: users who are not targeted can still access those items if they want.

To be able to use audience targeting on a list, you have to enable it first; it’s disabled by default. Go to the list settings of your list and open the Audience Targeting settings. There you will find a checkbox to enable it.

Once it’s enabled, you will see an extra column “Target Audiences” of the type “Audience Targeting” in your list.


To disable it, go back into the audience targeting settings and uncheck the checkbox. The extra column will be removed.

That’s the UI way… now, how about PowerShell? Can you enable/disable audience targeting? Of course!

We only need to add a field with a specific ID and attributes to the list, and it’s done.
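A minimal sketch of that idea follows. The field ID below is the well-known audience targeting field ID, but verify it against an existing targeted list in your own environment before using it; the URL and list name are examples.

```powershell
# Sketch: enable audience targeting by adding the "Target Audiences" field.
$web  = Get-SPWeb "https://intranet.contoso.com"
$list = $web.Lists["Announcements"]

$fieldXml = '<Field ID="{61cbb965-1e04-4273-b658-eedaa662f48d}" Type="TargetTo" ' +
            'Name="Target_x0020_Audiences" StaticName="Target_x0020_Audiences" ' +
            'DisplayName="Target Audiences" Required="FALSE" />'

$list.Fields.AddFieldAsXml($fieldXml) | Out-Null
$list.Update()

# To disable it again, delete the field:
# $list.Fields["Target Audiences"].Delete(); $list.Update()
$web.Dispose()
```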


Get Web Template Usage in SharePoint

Having a SharePoint platform which is widely adopted and used is cool. But as a SharePoint administrator, I’m also interested in how the platform is used and what it’s used for. One of the things I like to find out is which kinds of sites are used and how many of them exist.

This can also be quite useful when you are considering a migration to a new SharePoint version. Not all site templates are still supported. For example, the document workspace, meeting workspace, and group work templates are only partially supported when migrating from SharePoint 2010 to SharePoint 2013, or are kept purely for backward compatibility. At the same time, there are custom solutions out there which add site templates, and when preparing for a migration it’s good to know whether you have sites based on those templates.

The tricky thing is… how to get that information. There’s a cmdlet called Get-SPWebTemplate, which gives you a list of all web templates installed in the environment.


This doesn’t tell you which templates are actually used. For that, you need to combine some information. The name of a web template consists of 2 parts:

  • Web Template
  • Configuration ID

When you look at the properties of an SPWeb object, you will find these 2 properties. Combine them with a “#” in between to get the web template name.

This allows you to identify the template of each web in your farm. But this is only part of the info we need: we also want the usage count of each template, to get something like this:


To get this output, I wrote a small script.
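The gist of such a script can be sketched in a few lines (a simplified version, not my full script; web disposal is omitted for brevity):

```powershell
# Sketch: count webs per "WebTemplate#Configuration" across the farm.
Get-SPSite -Limit All | ForEach-Object { $_.AllWebs } |
    Group-Object { "$($_.WebTemplate)#$($_.Configuration)" } |
    Select-Object @{n = "Template"; e = { $_.Name }}, Count
```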

If you would like to sort the output of the script to have the most used templates first, you can pipe it through Sort-Object on the usage count, in descending order.



SharePoint 2016 Fast Site Creation vs. Traditional site creation

Creating site collections in SharePoint has always been a task which requires some patience; it can take a lot of time. With the release of SharePoint 2016, a new feature was introduced which goes by the name “Fast Site Creation”. I’m not going to explain what it does, because there are enough places out there which do an excellent job of explaining it.

The basic idea is that Fast Site Creation gives an administrator the ability to create site collections rapidly by offloading the actual creation completely to the database server, where a site is created by copying an existing “master” site.
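For reference, the flow looks roughly like this (a sketch; the template, database, URL and owner are examples):

```powershell
# Sketch of the Fast Site Creation flow:
# 1) enable the mechanism for a web template,
# 2) create a master site in a content database,
# 3) create new site collections as copies of that master.
Enable-SPWebTemplateForSiteMaster -Template "STS#0" -CompatibilityLevel 15
New-SPSiteMaster -ContentDatabase "WSS_Content" -Template "STS#0"

New-SPSite "https://intranet.contoso.com/sites/fast" -Template "STS#0" `
    -CreateFromSiteMaster -OwnerAlias "CONTOSO\admin" -ContentDatabase "WSS_Content"
```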

That’s the theory. In this post, I’m going to show you the actual difference between both methods in terms of execution time. I have been searching the internet for this and apparently everybody who talks about this feature just repeats what Microsoft says. It seems nobody actually checked whether the feature is faster.

So, I did it myself. I ran some tests, comparing both methods.


User Profile Sync DB is read-only in SharePoint 2016

When you provision a new User Profile Service application in SharePoint 2016, you will notice that the Sync DB name and server are read-only and cannot be changed.


I don’t know exactly why Microsoft made it impossible to change the name of the Sync database in the UI. Does it have something to do with the absence of the synchronization service? Either way, the DB gets created and, despite all good efforts to have a consistent naming convention for your SharePoint databases, you end up with this one rebelling database.

But then again, who uses the UI to get things done in SharePoint anyway, right? We do everything with PowerShell, and in PowerShell there is a cmdlet New-SPProfileServiceApplication which allows you to specify the names of the databases.
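For example (application pool and database names are placeholders for your own naming convention):

```powershell
# Sketch: create the User Profile Service application with explicit database
# names, including the Sync DB the UI won't let you change.
$pool = Get-SPServiceApplicationPool "SharePoint Service Applications"

New-SPProfileServiceApplication -Name "User Profile Service Application" `
    -ApplicationPool $pool `
    -ProfileDBName "SP_UPS_Profile" `
    -SocialDBName "SP_UPS_Social" `
    -ProfileSyncDBName "SP_UPS_Sync"
```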

Solved it! When you open the properties of the newly created User Profile Service application, you see that this small rebellion was suppressed swiftly and without much fuss.


Get a document from an unattached SharePoint content database

Retrieving a lost document in SharePoint is easy: you go to the recycle bin and restore it. That is, if it wasn’t deleted more than 60 days ago. But what do you do when you need to retrieve a document which has been overwritten with an older version and, yes, Murphy is in town… versioning is not enabled?

In that case, you had better have backups of the content database containing the site where the document is stored, because you will need to retrieve the document directly from such a backup.

Now, restoring a backup of a content database over the original is probably not something you want to do to recover one document, because you would restore all sites in that database to a previous state. 1 happy user vs. hundreds of angry ones… ouch! But if you don’t restore it, how are you going to get a good copy of the document?

Well, you restore the content database under a different name. Once it has been restored, you can use PowerShell to access it as an “unattached content database”, where you can access the content as if it were attached to your web application.

The Get-SPContentDatabase cmdlet has a -ConnectAsUnattachedDatabase parameter which gives you a reference to an unattached content database.

Once you have the database reference, you can use the Get-SPSite cmdlet to get your site reference from that database.

From there on, you need the web and the list where the document is stored and, of course, the document name. With all of this information, you can get a reference to the document item in the database. The only thing left to do is extract this item as a file.
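The steps above can be sketched as follows (server, database, URLs, list and file names are all examples):

```powershell
# Sketch: pull a single document out of a restored, unattached content database.
$db = Get-SPContentDatabase -ConnectAsUnattachedDatabase `
        -DatabaseName "WSS_Content_Restored" -DatabaseServer "SQL01"

$site = Get-SPSite -ContentDatabase $db -Limit All |
        Where-Object { $_.Url -eq "https://intranet.contoso.com/sites/hr" }

$web  = $site.OpenWeb("")   # root web; pass a relative URL for a subweb
$item = $web.Lists["Documents"].Items | Where-Object { $_.Name -eq "contract.docx" }

# Extract the file's bytes and write them to disk
[System.IO.File]::WriteAllBytes("C:\Restore\contract.docx", $item.File.OpenBinary())

$web.Dispose(); $site.Dispose()
```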

I combined everything into one convenient script which takes a number of parameters and gets you the item from the database.


SharePoint 2016 farm configuration using PowerShell

With the arrival of SharePoint 2016, farm configuration changed a bit. A new feature called MinRole allows you to choose the role of a server when you create a new SharePoint farm or join a server to an existing one. In PowerShell, 2 cmdlets are involved: New-SPConfigurationDatabase to create a farm and Connect-SPConfigurationDatabase to join an existing one.

These cmdlets have a new parameter -LocalServerRole which accepts a Microsoft.SharePoint.Administration.SPServerRole value.
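For example, creating a new SharePoint 2016 farm with a MinRole role could look like this (database names, passphrase and role are examples):

```powershell
# Sketch: create a new SharePoint 2016 farm with the Application MinRole.
New-SPConfigurationDatabase -DatabaseName "SP_Config" -DatabaseServer "SQL01" `
    -AdministrationContentDatabaseName "SP_AdminContent" `
    -Passphrase (ConvertTo-SecureString "P@ssphrase1" -AsPlainText -Force) `
    -FarmCredentials (Get-Credential) `
    -LocalServerRole Application

# Joining an existing farm uses the same parameter:
# Connect-SPConfigurationDatabase -DatabaseName "SP_Config" -DatabaseServer "SQL01" `
#     -Passphrase (ConvertTo-SecureString "P@ssphrase1" -AsPlainText -Force) `
#     -LocalServerRole WebFrontEnd
```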

I created 2 complete scripts which use these cmdlets. They have a parameter -SPVersion which accepts 3 values: 2010, 2013, and 2016, making the scripts usable for all three versions. When 2016 is specified, the user is prompted to select a role.

Update 04/10/2016: The Join-Farm script has been updated. After adding a server to a farm, there are 2 services which are not started immediately:

  • SharePoint Timer Service
  • App Fabric Caching Service

The script has been updated to check for these services and start them if needed. The App Fabric Caching Service is obviously only started if the server has been chosen to be a distributed cache host.

Update 10/11/2016: I have updated the scripts below somewhat.

  • They now include the extra 2 server roles which came with Feature Pack 1.
  • The parameters have been changed to be a bit more dynamic. For example, there is now a switch parameter to indicate whether it’s a 2010, 2013 or 2016 farm. Depending on that switch, a different set of parameters applies.
