Monday, 10 February 2020

Associate Custom Claims Provider with Specific SharePoint Zone


Anyone familiar with claims-based authentication in SharePoint will likely have hit issues when validating user accounts through custom claims providers.


I have worked with many custom claims providers over the years, some better than others, and this post covers one particular issue I faced on a recent project.


I was implementing a custom claims provider, which the development team had provided to me as a WSP solution file.


After deploying the solution and activating the required feature as specified by the developers, the claims provider was present and active, but authentication was failing because no identity was being picked up.


The first thing to check is that your claims provider has been created and its attributes look correct:


Get-SPClaimProvider


Check that both "IsEnabled" and "IsVisible" are set to true.
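If several providers are registered, you can trim the output down to just those attributes; a minimal one-liner along these lines:

Get-SPClaimProvider | Select-Object DisplayName, IsEnabled, IsVisible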


After some more head scratching, I wanted to ensure that the claims provider was actually associated with the zone I was using to access the site; in this case, the intranet zone.


The script below builds a dummy HttpContext (SPContext needs one when running outside of IIS) and then outputs the claims providers associated with the zone for a given URL:


$site = Get-SPSite "https://test.com"
$web = $site.OpenWeb("/")

# Build a dummy HttpContext so that SPContext resolves against this web
$request = New-Object System.Web.HttpRequest("", $web.Url, "")
$response = New-Object System.Web.HttpResponse(New-Object System.IO.StreamWriter(New-Object System.IO.MemoryStream))
$dummyContext = New-Object System.Web.HttpContext($request, $response)
$dummyContext.Items["HttpHandlerSPWeb"] = [Microsoft.SharePoint.SPWeb]$web
[System.Web.HttpContext]::Current = $dummyContext

# Work out which zone the URL maps to, then list that zone's claims providers
$zone = [Microsoft.SharePoint.SPContext]::Current.Web.Site.Zone
[Microsoft.SharePoint.SPContext]::Current.Web.Site.WebApplication.IisSettings[$zone].ClaimsProviders

The feature really should have handled the association of the claims provider, but in this case I had to do it manually using the script below, which associates the claims provider you specify with the zone of the web application you specify.

$url = "https://test.com"
$zone = "Intranet"
$webApp = Get-SPWebApplication $url

# Only attempt the association if the web application is extended to that zone
if ($webApp.IisSettings.ContainsKey($zone)) {
    $providers = @()
    $providers += "CustomClaimProvider"
    Set-SPWebApplication -Identity $webApp -Zone $zone -AdditionalClaimProvider $providers
}
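Once that has run, a quick way to confirm the association took effect (reusing the URL and zone from above):

# List the claims providers now bound to the Intranet zone
$zoneEnum = [Microsoft.SharePoint.Administration.SPUrlZone]::Intranet
(Get-SPWebApplication "https://test.com").IisSettings[$zoneEnum].ClaimsProviders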

Hope this comes in handy,

Matt

Thursday, 6 February 2020

SharePoint 2016 API Request getfolderbyserverrelativeurl 500 server error

I was recently asked to help troubleshoot a custom application that uses the SharePoint API to first check SharePoint for an item and, if it does not exist, upload the file.

This was part of a migration from SharePoint 2013 to 2016, and the results differed for the same API request.

After trying the request in multiple SharePoint 2016 farms, I concluded that this was not down to a farm-level issue or difference, but likely a change in the way SharePoint handles this particular event.

This may come in handy for those without a direct line to Microsoft: they have confirmed that SharePoint 2016 handles the event ID ay1r6 NullReferenceException as an unknown error, so the end user receives a 500 error rather than the 404 that SharePoint 2013 would have returned.

So in my scenario, the code was trying to retrieve an item using the following API request: https://test.com/_api/web/getfolderbyserverrelativeurl('/Documents/2020 ')/

The logic is poor, but in the case of a 404 for the location the code would create that location and carry on; as 2016 throws a 500 instead, the code no longer functioned.

SharePoint 2013 response to a non-existent library location: a 404 Not Found.

SharePoint 2016 response to a non-existent library location: a 500 Internal Server Error.


Luckily there is an easy fix for this problem, as the developers should have been using the 'Exists' property, which returns true or false rather than an error.
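As a rough sketch of that approach in PowerShell (the site URL and folder path are placeholders, and I am assuming the application can be changed to query the folder's Exists property rather than requesting the folder itself):

# Ask for the folder's Exists property, which returns true or false
# instead of an error status for a missing folder
$siteUrl   = "https://test.com"
$folderUrl = "/Documents/2020"
$uri = "$siteUrl/_api/web/GetFolderByServerRelativeUrl('$folderUrl')/Exists"

$result = Invoke-RestMethod -Uri $uri -UseDefaultCredentials -Headers @{ Accept = "application/json;odata=verbose" }
if (-not $result.d.Exists) {
    # Folder is missing: create it (POST to /_api/web/folders with a request digest), then upload
    Write-Host "Folder does not exist - create it before uploading"
}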


Thursday, 25 February 2016

Security Hardening your SharePoint Environment

This post focuses on some SharePoint/IIS-specific security hardening points for internet-facing web sites.

I'm not going to cover the more infrastructure-focussed security aspects you should also be looking at, such as SSL, Kerberos, firewalls, SQL encryption etc.

When exposing your SharePoint site to the internet, or working on a project where security is a key concern, you may need to look at security hardening your SharePoint implementation.

With this post I am aiming to create a combined guide to two key points that penetration testers will usually flag regarding the IIS/SharePoint configuration:

1. Removing IIS headers for SharePoint and ASP.NET
2. Using custom error pages to mask the underlying technology being used

Removing the page HTTP headers:

The tool I used to check and trim the headers was Fiddler, but there are various tools you could use for this purpose.

Within the headers tab you will be able to spot vital pieces of information; these are the first things hackers will look for.

In my case the response clearly showed the IIS and SharePoint versions, which can then be used to research security vulnerabilities in those particular versions.

1. Open IIS Manager.

2. At the server level and the site level you will find 'HTTP Response Headers'.

3. On the HTTP Response Headers page, select the header and click Remove.

To remove the SharePoint version number and the ASP.NET version number, you need to modify the web.config file for the web application.

Remove SharePoint version number

    <httpProtocol>
      <customHeaders>
        <add name="X-Content-Type-Options" value="nosniff" />
        <add name="X-MS-InvokeApp" value="1; RequireReadOnly" />
        <remove name="MicrosoftSharePointTeamServices" />
        <remove name="X-Powered-By" />
      </customHeaders>
    </httpProtocol>


Remove ASP.NET version

    <system.web>
      <httpRuntime maxRequestLength="2097151" executionTimeout="3600" enableVersionHeader="False" />
    </system.web>


After these changes you should no longer see the SharePoint version or .NET version.
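A quick way to verify the changes from PowerShell (the URL is a placeholder) is to request a page and inspect the response headers directly:

# The version headers should no longer appear in this output
(Invoke-WebRequest -Uri "https://test.com" -UseDefaultCredentials).Headers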

When it comes to error pages, attackers can sometimes manipulate URLs to expose default IIS error pages, which directly hint at the technology in use (IIS) and can reveal more detailed information such as IIS or ASP.NET version numbers. These version numbers can be used to research security vulnerabilities.

I was able to reach these default error pages simply by manipulating the URL.

To start with we can change the custom error pages in IIS.

When it comes to SharePoint itself, error pages are managed in two ways: at the web application level and, for publishing sites, at the site collection level.

First, create your custom error page; this can be a simple HTML page to start with, or something more customised to your needs.

Save your custom error pages to the below location on your SharePoint servers:

C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\TEMPLATE\LAYOUTS\1033

Set the SharePoint error pages on non-publishing sites:

# Set the 404 (file not found) page at the web application level
$webApp = Get-SPWebApplication "https://testsite.com"
$webApp.FileNotFoundPage = "CustomError_Page.html"
$webApp.Update()

# Map the generic error page to the custom page in the layouts folder
$webApp = Get-SPWebApplication "https://testsite.com"
$webApp.UpdateMappedPage([Microsoft.SharePoint.Administration.SPWebApplication+SPCustomPage]::Error, "/_layouts/1033/CustomError_Page.html")
$webApp.Update()


A publishing site has its own FileNotFoundUrl attribute.

If you set the site-level error page to a blank value (""), the web application level setting we just configured will apply instead, keeping the site consistent with the other non-publishing sites.

# Clear the publishing site's own 404 page so the web application setting applies
$site = Get-SPSite "https://testsite.com"
$site.FileNotFoundUrl = ""

I hope these tips help.

Thanks for reading,
Matt

Thursday, 8 October 2015

SharePoint 2013 MMS Performance Issues

Loading the SharePoint term store management page is, for the most part, a pain-free process that takes no longer than many other SharePoint pages.

I have however recently been investigating poor performance when loading the term store management page (_layouts/15/termstoremanager.aspx).

When loading the page, the following message appears: 'Term store operation in progress, this may take a few moments.'

To narrow down what was causing the delay, I enabled the developer tools in my IE browser and captured the network traffic. The capture showed that the GetGroups POST request was taking up to 30 seconds in some cases.

As this issue was only being reported by term store administrators, I also tested with a general user who had read-only access to the term store; performance was then as expected, with load times of a second or two.

So by simply adding myself to the term store administrators group I could recreate the issue.

Digging into the ULS logs I could see that SharePoint was in fact trying to resolve all of the users in the term store administrators group.

What I found was a large number of term store administrators in the farm (the entire support and development team), all listed individually in the term store administrators box on this page.

So before the page loads for a term store administrator, SharePoint first performs multiple name resolutions; any users listed as term store administrators or group managers who no longer exist in AD will add further delay to this process.

SPRequest.GetNTFullNamefromLoginEx: UserPrincipalName=, AppPrincipalName= ,bstrLogin=domain\testuser

Microsoft.SharePoint.Taxonomy.WebServices.TermStoreGenericObject..ctor(Group group) at Microsoft.SharePoint.Taxonomy.WebServices.TermStoreGenericObject..ctor(Group group)     at Microsoft.SharePoint.Taxonomy.WebServices.TaxonomyInternalService.GetGroups

SPAce encoded user claim i:0#.w|domain\testuser cannot be resolved. Microsoft.SharePoint.SPException: Cannot complete this action.  Please try again. ---> System.Runtime.InteropServices.COMException: Cannot complete this action.  Please try again.     at Microsoft.SharePoint.Library.SPRequestInternalClass.GetNTFullNamefromLoginEx(String bstrLogin, Boolean& pbIsDL)

Part of the call stack is shown below:

Microsoft_SharePoint!Microsoft.SharePoint.Library.SPRequest.GetNTFullNamefromLoginEx(System.String, Boolean ByRef)
Microsoft_SharePoint!Microsoft.SharePoint.Utilities.SPUtility.GetFullNameFromLoginEx(System.String, Boolean ByRef)
Microsoft_SharePoint!Microsoft.SharePoint.Administration.SPAce`1[[Microsoft.SharePoint.Taxonomy.TaxonomyRights, Microsoft.SharePoint.Taxonomy]].get_DisplayName()
Microsoft_SharePoint_Taxonomy!Microsoft.SharePoint.Taxonomy.WebServices.TermStoreGenericObject..ctor(Microsoft.SharePoint.Taxonomy.Group)
Microsoft_SharePoint_Taxonomy!Microsoft.SharePoint.Taxonomy.WebServices.TaxonomyInternalService.GetGroups(System.Guid, System.Guid, System.Guid, Boolean, Int32)

To improve performance for term store administrators, I recommend using an AD security group in place of individual user accounts.

On page load only the AD group itself will be resolved against AD to ensure it exists; none of the group members will be resolved, which should reduce the number of name resolutions the page has to complete before loading its content.
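As a rough sketch of making that change in PowerShell (the site URL, service application name, group and user names below are all placeholders):

# Add an AD security group as a term store administrator, then remove the individuals
$session = Get-SPTaxonomySession -Site "https://test.com"
$termStore = $session.TermStores["Managed Metadata Service"]
$termStore.AddTermStoreAdministrator("domain\SP-TermStoreAdmins")
$termStore.DeleteTermStoreAdministrator("domain\testuser")
$termStore.CommitAll()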

I hope this post helps you understand more about this page and what happens on loading, as it took me long hours of investigation to get to the bottom of the performance issues I faced.

Thanks for reading,

Matt


Thursday, 24 September 2015

Deleting Orphaned Sites From SharePoint Content Database


Recently I was carrying out a cumulative update release on a client SharePoint farm; when running the Test-SPContentDatabase cmdlet I was faced with an orphaned site which could cause upgrade failures.

If you have orphaned sites in your content databases, get them cleaned up and tested in another environment before starting your production upgrade, or it could be a very stressful time when sites within that content database will not load!

Test each content database before the upgrade:

Test-SPContentDatabase -Name ContentDB_Matt -WebApplication https://matt.test.com
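If you have several databases to check, a small sketch like this will loop over every content database attached to the web application:

# Test every content database in the web application in one go
Get-SPContentDatabase -WebApplication https://matt.test.com | ForEach-Object { Test-SPContentDatabase -Identity $_ }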

In my case I was seeing the below errors for one database in particular.

Category: SiteOrphan
Error: True
UpgradeBlocking: False
Message: Database [ContentDB_Matt] contains a site (Id = [********************], Url = [matt.test.com]) that is not found in the site map. Consider detach and reattach the database which contains the orphaned sites.
Restart upgrade if necessary.


Category: SiteOrphan
Error: True
UpgradeBlocking: False
Message: Database [ContentDB_Matt] contains a site (Id = [*********************], Url = [matt.test.com]) whose url is already used by a different site, in database (Id = [***********************], name = [ContentDB_Matt]), in the same web application. Consider deleting one of the sites which have conflicting urls.
Remedy: The orphaned sites could cause upgrade failures. Try detach and reattach the database which contains the orphaned sites.
Restart upgrade if necessary.


You will need to use the Id in the above error message and run Get-SPSite for the web application. If a site with that Id is not returned via PowerShell, then the orphaned site needs to be deleted.

Be careful to check that the site Id in the error is not returned by the Get-SPSite commands, or you will be deleting a live site by mistake; take extra time to validate the site Ids before running any delete.

Using a non-production environment will give you a safety net here.

Using trusty old stsadm, enumerate all the sites in the content database to get a full list of sites and their subsites.

stsadm -o enumallwebs -database ContentDB_Matt

If you have orphaned sites you will see a mismatch in the number returned by enumallwebs and the number returned using Get-SPSite in PowerShell.

Get-SPSite -ContentDatabase "ContentDB_Matt" | select ID

Once you have compared the site IDs of those returned by enumallwebs and Get-SPSite you will have the IDs of the orphaned sites to be deleted.
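A quick way to do that comparison in PowerShell (the Ids in $allIds are placeholders for those copied from the enumallwebs output):

# Site Ids that PowerShell knows about for this content database
$knownIds = Get-SPSite -ContentDatabase "ContentDB_Matt" -Limit All | ForEach-Object { $_.ID.ToString() }

# Site Ids reported by stsadm enumallwebs - paste them in here
$allIds = @("<site-id-1>", "<site-id-2>")

# Anything left over is an orphan candidate for deletion
$allIds | Where-Object { $knownIds -notcontains $_ }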

Again stsadm comes to the rescue here, as within PowerShell I could not find the orphaned sites by ID.

stsadm -o deletesite -force -SiteID ********************* -DatabaseName ContentDB_Matt -DatabaseServer Matt_SQL_Instance

Run enumallwebs again after the delete and that orphaned site should now be gone; once all orphaned sites are deleted, enumallwebs and Get-SPSite should return a matching number of sites!

stsadm -o enumallwebs -database ContentDB_Matt

Get-SPSite -ContentDatabase "ContentDB_Matt" | select ID

Now when you run Test-SPContentDatabase it should not return errors about orphaned sites.

Good luck with your upgrades.

Matt


Friday, 12 June 2015

PowerShell to Output SharePoint Site Collection Database Information


Thanks to everyone who visits my blog; I have just reached 100,000 page views! I hope some of my posts have been helpful to others in the SharePoint community.

I was asked by a member of the development team to extract some key details for a large list of site collections in one of our farms.

I knocked up a quick script, and saved it to my library of handy reusable scripts.

I thought I would share this one, as it shows some basic fundamentals of using PowerShell to extract information from SharePoint:
  • Using an input file to pick up a variable (in this case the site URL)
  • Using a foreach loop to iterate through the variables in the input file
  • Writing out the variables picked up during the foreach loop (you could also use 'Out-File' instead of 'Write-Host' if you would prefer to send the output to a document)


Copy of the script below:

#Input file for your list of site collection URLs
$InputFile = Get-Content "D:\APPS\Scripts\input.txt"
foreach ($URL in $InputFile)
{
    #Set your variables
    $Site = Get-SPSite -Identity "$URL"
    $SiteOwner = $Site.Owner
    $SiteID = $Site.ID
    $ContentDBID = $Site.ContentDatabase.ID
    $ContentDBName = $Site.ContentDatabase.Name
    #DiskSizeRequired is reported in bytes, so convert it to MB
    $ContentDBSize = [math]::Round($Site.ContentDatabase.DiskSizeRequired / 1MB, 2)
    #Write the output to the PowerShell screen
    Write-Host "Output for site:" -ForegroundColor Cyan
    Write-Host $URL -ForegroundColor Green
    Write-Host "Site Collection Owner: " $SiteOwner -ForegroundColor Green
    Write-Host "Site Collection ID: " $SiteID -ForegroundColor Green
    Write-Host "Content DB ID: " $ContentDBID -ForegroundColor Green
    Write-Host "Content DB Name: " $ContentDBName -ForegroundColor Green
    Write-Host "Content DB Size (MB): " $ContentDBSize -ForegroundColor Green
}
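If you would prefer the output in a document rather than on screen, a variation on the same loop can build objects and export them to CSV (the output path is just an example):

#Collect the same details as objects and export them to a CSV file
Get-Content "D:\APPS\Scripts\input.txt" | ForEach-Object {
    $Site = Get-SPSite -Identity $_
    [PSCustomObject]@{
        URL       = $_
        Owner     = $Site.Owner
        SiteID    = $Site.ID
        ContentDB = $Site.ContentDatabase.Name
        SizeMB    = [math]::Round($Site.ContentDatabase.DiskSizeRequired / 1MB, 2)
    }
} | Export-Csv "D:\APPS\Scripts\SiteCollectionInfo.csv" -NoTypeInformation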


The format of the input file is simply a list of URLs

Example:

https://testsite1.net
https://testsite2.net


Thanks for reading,
Matt

Friday, 22 May 2015

HRESULT: 0x80131904 SharePoint Database Full


As most companies who have adopted SharePoint will know, a huge part of any SharePoint project is migration of content from other legacy systems and file shares.

Even with the most controlled migration plan, there is always the possibility of a proactive user manually dumping huge amounts of content on their SharePoint site (especially once they discover Windows Explorer view!).

When this occurs you will see sudden SQL growth, especially within the DocStreams table of the content database.

When a file is uploaded to SharePoint 2013, shredded storage breaks the individual BLOB into shredded BLOBs which are stored in the DocStreams table.

For some more detailed information regarding shredded storage, check out this TechNet blog.

Back to the point, which may or may not be related to a migration: if a large amount of content is uploaded to your content database, this can result in SQL storage issues.

You may see errors on the SharePoint pages, including HRESULT: 0x80131904

In the ULS log I could see that various tables were growing rapidly and filling their primary filegroup, namely the EventCache and AuditData tables, due to high usage.

ULS log errors:

Database full error on SQL Server instance 'SQL-SharePoint-Prd' in database 'ContentDB_Intranet'. Additional error information from SQL Server is included below. Could not allocate space for object 'dbo.EventCache'.'EventCache_Id' in database 'ContentDB_Intranet' because the 'PRIMARY' filegroup is full. Create disk space by deleting unneeded files, dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup.

Database full error on SQL Server instance 'SQL-SharePoint-Prd' in database 'ContentDB_Intranet'. Additional error information from SQL Server is included below. Could not allocate space for object 'dbo.AuditData'.'AuditData_OnSiteOccurred' in database 'ContentDB_Intranet' because the 'PRIMARY' filegroup is full. Create disk space by deleting unneeded files, dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup.

You will also see errors in the event viewer:

Database full error on SQL Server instance 'SQL-SharePoint-Prd' in database 'ContentDB_Intranet'. Additional error information from SQL Server is included below. Could not allocate space for object 'dbo.DocStreams'.'DocStreams_CI' in database 'ContentDB_Intranet' because the 'PRIMARY' filegroup is full. Create disk space by deleting unneeded files, dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup.

You could carry out maintenance to shrink these databases, but the likely fix for most people will be to add the disk space required to allow the databases to grow.
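Before adding space, it is worth checking how full the files actually are. A quick sketch using Invoke-Sqlcmd (assuming the SQL Server PowerShell module is available; the instance and database names are taken from the errors above):

# Report size and free space (in MB) for each file in the content database
Invoke-Sqlcmd -ServerInstance "SQL-SharePoint-Prd" -Database "ContentDB_Intranet" -Query @"
SELECT name,
       size/128 AS SizeMB,
       size/128 - CAST(FILEPROPERTY(name, 'SpaceUsed') AS int)/128 AS FreeMB
FROM sys.database_files;
"@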

You should also consider your autogrowth settings; have a read of this for more details.

To avoid issues like this in the future, monitoring the free space on your SQL disks is essential for a production environment, and there are various tools out there to help you with this.

Thanks for reading,
Matt