Remote SharePoint site connections are not supported.

When I do SharePoint development, I usually work with multiple web applications: one for My Site, one for publishing, one for search, etc. I also use host headers to identify each web application so that I don’t have to remember different port numbers.

On my developer box, I add a number of entries to the hosts file that map each host name to 127.0.0.1. This has worked on all previous versions of SharePoint and Visual Studio, but stopped working with the Beta version of the Visual Studio extensions for SharePoint 2013.

When trying to use http://col.sp15.dev as the Site Url, you end up with an error message stating that remote SharePoint site connections are not supported.

If I enter http://machinename, everything works as expected.

Why do I get this error message? Everything is running on my local machine. I only have one machine and all host names resolve to the loopback address.
Refusing to accept this limitation, I started exploring the extension code written by Microsoft.

When setting this property, the code makes a lot of checks. One of them is IsLocalByIp(). The url being passed is the one you type in the property window.
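
The decompiled method is not reproduced here, but its core check can be approximated in a few lines of PowerShell (a rough sketch; the variable names follow the description below, and col.sp15.dev stands in for whatever you typed):

# approximation of the IsLocalByIp() logic
$hostAddresses = [System.Net.Dns]::GetHostAddresses([System.Net.Dns]::GetHostName())
$urlAddresses  = [System.Net.Dns]::GetHostAddresses("col.sp15.dev")
# the url only counts as local if one of its addresses is also a machine address
$urlAddresses | Where-Object { $hostAddresses -contains $_ }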

If you reproduce this check in a console application (or run the equivalent PowerShell above), it’s easy to discover why things are not working. The hostAddresses variable contains all the addresses of the computer except the 127.0.0.1 address. The second variable contains the 127.0.0.1 address specified in the hosts file. Since this is not considered a local address, the test fails.

Hopefully Microsoft will add an exception rule for 127.0.0.1 in this code. Until then, the solution is to use the IP address of your machine in the hosts file, not 127.0.0.1. If you have a dynamic IP address, remember to update the hosts file.
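
For example, an entry can be added from an elevated PowerShell prompt (a sketch; 192.168.1.42 is a placeholder for your machine’s actual address):

# map the host header to the machine's real address instead of the loopback address
Add-Content -Path "$env:windir\System32\drivers\etc\hosts" -Value "192.168.1.42 col.sp15.dev"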

The road to RTM is littered with dead features…

Now that the Office 15 bandwagon has started its journey, information is bound to crop up here and there. Can you trust this information? Maybe. I would not be making any life-altering decisions based on it until just before the SharePoint Conference in November. At that point the release should hopefully be so close that they have locked down the feature set. Until then, anything is up for grabs, or in this case, ends up on the cutting room floor. The only thing we can be fairly sure about is that the stuff that worked in SharePoint 2010 will most likely be around in SharePoint 15.

One of the central features of SharePoint is the user profile system; this allows you to import information from Active Directory and present it through the My Site feature. In SharePoint 2007 it was trivial to add new user profile properties. When SharePoint 2010 was launched, the APIs had changed considerably. Now it looks like an overengineered grapefruit. The reason for this can be found in Central Administration: Organization profiles. It looks like the idea was to introduce profiles for organizations that leveraged the same property structure as user profiles. The only reference to Organization profiles is in Central Administration, nowhere else. By the look of things, this was something they were working on and had to cut in order to ship in time. Whether this will be completed in SharePoint 15 or removed completely is anybody’s guess.

The fact that something exists in one version doesn’t guarantee that it will continue in the next. The first version of Windows Home Server had a feature called Drive Extender; this allowed you to merge a lot of physical disks into one big logical one. In the latest version of WHS, this feature is no longer present. Apparently it was causing issues with other functionality, so they decided to cut it. A lot of people were not happy.

Late last year I attended a course on Duet Enterprise, an integration product between SharePoint and SAP. One of the core pieces this product depends on is claims-based authentication. The instructors told us that at one point during the development cycle, the SharePoint team was considering cutting claims authentication. Since the SAP integration was an important strategic choice, they managed to keep it. Just imagine what we wouldn’t be able to do had the decision fallen the other way.

It’s rather appropriate that the next conference is in Las Vegas. Until then it’s pretty much a betting game whether features will make it or not. Since SharePoint 15 will cover everything from on-premises to cloud solutions, it’s not unthinkable that something gets cut if it prevents one or the other from working. If the community shouts loud enough, any missing pieces might make it into a feature pack or service pack. At this point in time, who the hell knows. I doubt even Microsoft knows.

SharePoint 15; a peek under the covers

First of all, at the time of writing this post, I do not have access to any of the SharePoint 15 TAP bits, nor have I received any presentations related to SharePoint 15. All the information is based on public information released by Microsoft.

On January 30th, they released the SharePoint 15 Technical Preview Managed Object Model Software Development Kit. This contains a CHM file that outlines the changes to the SharePoint APIs. Looking at the API changes, we can get some idea of what is planned.

Since they have started to document the APIs already, maybe SharePoint 15 will be fully documented when released 🙂

NOTE: As it is a long time until it ships and Betas become available, Microsoft could choose to cut or change things completely.

Support for Apps

There are several API changes that indicate that SharePoint will be getting App support.

  • SPApp: Represents an app loaded onto Microsoft SharePoint Server and ready to be installed.
  • SPAppInstance: Represents an SPApp object installed to a specific SPWeb site.
  • SPAppCatalog: Represents all of the SPAppInstance objects installed on an instance of Microsoft SharePoint Server. It provides querying capabilities for discovering installations.
  • SPWeb.LoadAndInstallApp(): Uploads and installs an app package.
  • PackageSource enum: Specifies the source of the package that is associated with a provisioned database. Some of the valid values are StoreFront, CorporateCatalog, and DeveloperSite.
  • IDatabaseProvider interface: Provides methods to manage the lifecycle of a database.
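
Taking those descriptions at face value, installing and inspecting an app from PowerShell might look something like the sketch below (heavily hedged: these are preview APIs whose exact signatures and namespaces may change before release, and the url and package path are made up):

$web = Get-SPWeb "http://col.sp15.dev"
try {
    # upload and install an app package; LoadAndInstallApp is described
    # as taking the package being uploaded
    $package = [System.IO.File]::OpenRead("C:\Packages\DemoApp.app")
    try {
        $instance = $web.LoadAndInstallApp($package)
        $instance.Id
    }
    finally {
        $package.Close()
    }
    # use the catalog to discover what is installed on this web
    # (the Microsoft.SharePoint.Administration namespace is an assumption)
    [Microsoft.SharePoint.Administration.SPAppCatalog]::GetAppInstances($web) | ForEach-Object { $_.Title }
}
finally {
    $web.Dispose()
}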

What do these changes mean for Sandboxed Solutions? When do you choose which approach to use? What are the capabilities of an App? PackageSource.StoreFront is described as MarketPlace, so it looks like Microsoft will set up some central repository for apps.

Since we already have Web Applications (web apps), I’m just looking forward to the confusion when people start talking about apps. What type of app do they mean?

Multiple compatibility levels

Several APIs have been introduced to handle compatibility levels.

  • SPFarm.GetPersistedFeatureDefinition(Guid featureId, int compatibilityLevel): Returns the SPFeatureDefinition object for the given compatibility level based on the featureId parameter value.
  • SPUtility.GetLayoutsFolder(): Returns the versioned layouts folder for the specified site collection/site.
  • SPUtility.ContextLayoutsFolder: Gets the versioned layouts folder for the context site.
  • SPSite.CompatibilityLevel: Gets the major version of this site collection for purposes of major version-level compatibility checks.
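
A minimal sketch of what these could look like in use (hedged: preview APIs, and the url is made up):

$site = Get-SPSite "http://col.sp15.dev"
try {
    # the major version used for compatibility checks (e.g. 14 or 15)
    $site.CompatibilityLevel
    # the versioned layouts folder for this site collection, e.g. "_layouts/15"
    [Microsoft.SharePoint.Utilities.SPUtility]::GetLayoutsFolder($site)
}
finally {
    $site.Dispose()
}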

Since the system supports multiple compatibility levels of features and layouts folders, it will be interesting to see what the file structure will look like for the rest of the system. Are they moving to a single file structure where each release has separate folders? It wouldn’t surprise me if this functionality was driven by Office 365; that way they can use the same farm to host multiple versions and let customers upgrade when they are ready.

Licensing framework?

SPWebApplication.IsUserLicensedForEntity(): Checks if the currently logged in user has the proper license to access the specified entity.

Could this indicate a generic licensing framework that we can leverage in our solutions?

Conclusion

With the amount of changes Microsoft implemented between 2007 and 2010, the information currently available in the API documentation is probably just a thimbleful out of the vast ocean of changes planned for the next release. It will be very interesting to see what else will be coming down the road.

At least it looks like us ‘boring’ SharePoint developers can become ‘cool’ by putting ‘app developer’ on our resume.

Site Collection Provisioning: meta-data

For some types of sites it makes sense to have additional meta-data. If you have a project site, the presence of a project number or charge code would allow the system to fetch information from other systems and display it to the user. It’s also important to enforce this no matter how the site was created. If the site was created from a third party tool, the first time a user accesses the site, they should be prompted to fill in the required information.

Our way of solving this involves a number of components.

The first step is to create a meta-data page where users can edit whatever is relevant for their site. Different types of sites will have different types of data, so a flexible approach is needed.

With the Mystery Foundation you get a meta-data page. This is a simple application page with space for a delegate control.

The information about the owner is generic for all types of sites, whilst the project information is specific for a type of site. You can use the delegate control to add additional information to this page.

The control that you provide must implement the interface Mystery.SharePoint.ISiteMetadataControl. This interface handles sending information between the page and the back-end storage.

The next step for handling meta-data is to capture it somewhere.

We have a feature, SPM.MetadataRequired, that sets the provider used to capture the data. This feature will also add a site action link under site settings that takes you to the meta-data page. The object used as the provider must inherit from Mystery.SharePoint.SiteMetadata. Your custom type will be passed to the ISiteMetadataControl methods.

Access to your meta-data can be achieved using the following method. Specify the type you want returned; it can either be your inherited type or a base type.
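
A purely hypothetical sketch of such a call; GetMetadata and ProjectSiteMetadata are stand-in names for illustration, not the real Mystery API:

# hypothetical names; the actual method on Mystery.SharePoint.SiteMetadata may differ
$metadata = [Mystery.SharePoint.SiteMetadata]::GetMetadata($site, [MyCompany.ProjectSiteMetadata])
$metadata.ProjectNumber   # a property defined on your inherited type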

Now that you have wired up the meta-data handling, you have to ensure that any required data is filled in. Remember that the site may have been created in many different ways: Central Administration, PowerShell, a custom site collection provisioning UI, a third party tool, etc.

The only reliable way to handle all these scenarios is when the user accesses the site for the first time. The most elegant way is to use a delegate control that redirects you to the meta-data page. Ensure that the SPM.MetadataRedirect feature is activated as part of your web template.

When the user has filled in the required information, the feature will be deactivated, so on subsequent visits things work as normal. Using a delegate control only adds overhead until the information has been added. If you add some logic to the master page instead, the overhead would be incurred on every page visit to the site.

Site collection provisioning

When you have reached the decision to use multiple site collections, you need to think about a number of things.

  • Who should be able to provision site collections?
  • Should only a predefined set of templates be available in the web application?
  • From where will provisioning take place? Web UI, PowerShell, third party tools?
  • How do you manage users across all these site collections?
  • Who should be the site collection administrators of the new site collection?
  • What type of approval system should be in place?
  • How does one determine when a site collection is no longer in use so that the space can be reclaimed?
  • Does the site collection require any additional meta-data?
  • What Quota should be applied to the site collection?

The Mystery Foundation contains functionality that helps solve a number of these issues. I haven’t solved all of them yet, but I’m working on it.

The current solution consists of the following components:

  • A custom application page that allows users to provision site collections.
  • An object to store various configuration data: Mystery.SharePoint.SiteProvisioningSettings.
  • A timer job that handles various site administration tasks: Mystery.SharePoint.SiteManagementJob.

Custom page:

Standard page:

Deciding what data to capture

Comparing these two pages, they are very similar in what data they capture; this is by design. The values captured here are basically the information needed to create a site no matter how you do it: Web UI, PowerShell, or the object model. When you operate in a UI world, it’s easy to go overboard and ask for all sorts of additional information. That works great as long as you control the UI, but the day you need to create 100 sites from a third party tool, you run into problems. Better to plan for it from the start, because you never know what is coming around the next corner.

Limiting the number of templates

One of the major differences between the standard and custom page is the number of available templates. You probably don’t want to allow everybody to create any type of site. The standard page has no way of limiting the selection, so you will have to roll your own or use the one we provide.

One of the values stored in the SiteProvisioningSettings object is a list of the templates that should be made available. A custom-developed web template picker uses this information to present the reduced selection. The SiteProvisioningSettings object can be retrieved and updated using the Get-SPMSiteProvisioningSettings PowerShell Cmdlet.
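
For example (a sketch; the property name and the Update() call are assumptions about how the settings object works):

$settings = Get-SPMSiteProvisioningSettings
$settings.AvailableWebTemplates += "PROJECT#0"   # hypothetical property name
$settings.Update()                               # assumes SPPersistedObject-style persistence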

Limiting the users that may provision sites

One of the other settings in the SiteProvisioningSettings object is a list of authentication providers and the active authentication provider. This is an extensible framework, so you can create your own: just implement Mystery.SharePoint.ISiteProvisioningAuthenticationProvider and register it using PowerShell. The package contains two authentication providers; one allows anybody to create site collections, the other only allows members of the root site collection to provision them.

User management

When you have a system with multiple site collections, handling users becomes an issue. Most companies are all about sharing information. The moment you create a new site collection, you create a new security container that by default doesn’t allow everybody access. This goes against the information sharing aspect. In order to overcome this, we have included a site-scoped feature, SPM.SynchronizeVisitors. If this feature is activated, all the members of the visitor group at the root site collection will be copied to the visitor group of the provisioned site collection. The timer job will make sure that it’s up to date every time it runs. If you have certain types of sites that shouldn’t be available to everyone, or want to turn it off, just deactivate the feature and you can control your visitor group manually.

Another area that can cause issues is site collection administrators. These have to be added individually to a site collection; you cannot specify an Active Directory group. Certain people should probably have access to all site collections. Keeping this updated across hundreds of site collections is a challenge. The site collection feature SPM.SynchronizeAdmin behaves in the same way as the visitor synchronization. Activate it based on your needs.

When a user requests a new site collection, do you want them to become a site collection administrator? With this comes a lot of power; are they ready for it? If you want to give people this level of power, make sure that the site collection feature SPM.OwnerRemainsAdmin remains activated. If it is not activated, the user that requested the site collection will not remain an administrator, but will instead become a normal owner.
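
Activating these features follows the standard pattern (a sketch; the site collection url is made up, and the feature identities are assumed to match the names above):

$url = "http://portal.example.com/sites/project42"
# copy the root site collection's visitors into this site's visitor group
Enable-SPFeature -Identity "SPM.SynchronizeVisitors" -Url $url
# keep site collection administrators in sync with the root site collection
Enable-SPFeature -Identity "SPM.SynchronizeAdmin" -Url $url
# let the requesting user remain a site collection administrator
Enable-SPFeature -Identity "SPM.OwnerRemainsAdmin" -Url $url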

Quota assignment

Major corporations will require a set of different site types, e.g. projects and departments. In order to prevent information overflow, they may want to limit the amount of data in each site collection. The storage needs of a project are probably very different from those of a department, so it’s not unlikely that they will use different quotas. How does one ensure a different quota is applied for different site collections?

One of the things to be aware of is that you can only change the quota a site collection uses if you are running as a farm administrator. This means that we cannot update our site collection immediately. We have to tell the system what quota we should be using, and then have the timer job set the actual quota. This allows the site to run without an enforced quota for a period of time, but the chances of exceeding it before it is enforced are rather slim.
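
What the timer job effectively does boils down to one line, run under farm administrator rights (a sketch; the url and quota template name are made up):

# apply the quota template recorded for the site collection
Set-SPSite -Identity "http://portal.example.com/sites/project42" -QuotaTemplate "Project Quota"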

In order to leverage the quota management, make sure that you activate the SPM.QuotaAdmin feature as part of your web template.

The value specified as the default template will be assigned by the timer job. This feature will also give owners an overview of how much space their site is consuming. A notification bar will also be presented when the storage space approaches critical limits.

(in this sample I adjusted the quota after adding content, hence the 112% usage)

Additional meta-data

Some types of site collections may require additional meta-data, e.g. a project site may need a project number so that it can fetch and display data from additional systems. SharePoint doesn’t natively support site collection meta-data, so we have to create our own system. A more detailed description of how one can create such a system will be left for a future blog post.

Building an automated SharePoint 2010 deployment system

When building a deployment system for SharePoint 2010, your only practical option is to use PowerShell. STSADM.EXE could in theory be used if you are only interested in SharePoint Foundation, but the moment you need SharePoint Server, many things are only available through PowerShell.

SharePoint is a mature product and has its share of quirks. Some of these come to the surface when working through PowerShell. SharePoint can at times be very fond of caching information, so if you keep your PowerShell session running, you may not get the result you were expecting. For instance, if you deploy a custom site definition and try to use it to create a new site in the same PowerShell session, you will be informed that SharePoint cannot find the definition. You will also experience issues when deactivating and activating features; if you have deployed updated assemblies to the GAC, the feature may actually be using an old version.

When using PowerShell and SharePoint, be prepared to restart PowerShell on a regular basis. This fact is part of the driving force behind the Mystery Foundation deployment system. The deployment system is driven from a normal Command Shell; this makes it easy for you to construct new cmd files to handle your scenarios.

A copy of the deployment system and all required files can be found in the SharePoint Mystery Package.

All our deployment commands follow this pattern:

ps.cmd <Environment File> <Application> <Command> 

Environment File: the name of the ps1 file that contains environment information.
Application: a string that identifies your application; you have control over this value.
Command: the command you want to issue on the application; you have control over this value.
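
For example, deploying a hypothetical Intranet application to the environment described by TheServer.ps1 could look like this (Intranet and Deploy are names you define yourself):

ps.cmd TheServer.ps1 Intranet Deploy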

In the package you will also find BobTheBuilder.cmd that creates my demo system and ConanTheDestroyer.cmd that tears down the demo system.

File structure

Files starting with MGL are sample files and are meant to be customized by the project. Name them however you want. Files starting with SPM are part of the core deployment system and are not meant to be customized by the project.

TheServer.ps1

This file contains information for a specific environment. It contains information relevant for the farm and the various applications and services we will be using.

In my case it’s named after the server I deploy from. Create one file per environment (development, testing, staging, production).

MGL.Settings.ps1

This file contains information that is relevant across all your environments.

The prefix value that is set at the farm level will be used to create the actual database name. That is why we can specify the name in a common location.

MGL.Commands.ps1

This file is where you map the various commands against the applications. It translates the Application and Command parameters from ps.cmd into actual actions. Which commands you want and what they do is up to you. Copy and customize this file based on your own requirements.

The information found in TheServer.ps1 is forwarded to this file.

SPM.Types.ps1

This is mainly a container for all the various SPM.Type.*.ps1 files. This is also where the SharePoint PowerShell module gets loaded.

SPM.Functions.ps1

This file contains a set of useful functions that are not bound to a specific type.

SPM.Type.*.ps1

There is one file per custom type. Each file contains a function that returns a custom PowerShell object; it’s this object that provides all the properties for setting values and any methods that can be called.
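
The pattern is easy to sketch in plain PowerShell (illustrative names only; the real files are more involved):

function New-SPMSampleType {
    # properties are filled in by the environment files
    $obj = New-Object PSObject -Property @{ Name = ""; Url = "" }
    # methods carry the actual provisioning logic
    $obj | Add-Member -MemberType ScriptMethod -Name Provision -Value {
        Write-Host "Provisioning $($this.Name) at $($this.Url)"
    } -PassThru
}

$app = New-SPMSampleType
$app.Name = "Intranet"
$app.Url = "http://intranet.example.com"
$app.Provision()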

SPM.DefaultTopology.ps1

This file creates all the objects for a default topology: all the service applications and a set of web applications. You can use this or create your own. Even if you only need a subset of this topology, you can use it. Which services and applications actually get created depends on what you do in your command file. If you create your own, stick with the same names; there are dependencies between services and applications, e.g. My Site and the User Profile Service.

Points of interest

Service Principal Names

The deployment system can automatically configure the SPNs that are needed for Kerberos authentication. This is controlled by a flag set on the farm object so that it can be turned on for a development deployment, but not in a staging or production deployment.
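
Under the covers this amounts to registering an SPN for each host header against the right service account, e.g. (hypothetical host and account):

setspn -S HTTP/col.sp15.dev DOMAIN\sp_apppool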

Databases

The system can automatically create databases at a specific location if desired. This allows you to create the databases in a different location than the standard SQL Server location. The database folder creation is only relevant if everything is running on the same machine, e.g. a developer box.

In general, SharePoint will accept an empty database that it can configure, but there are some services that just refuse to configure an empty database. These will be created in the default SQL Server location.

When creating the database, we impersonate the SharePoint farm account. Some services don’t like it if dbo != SharePoint farm account, e.g. the User Profile Service.

Transcript

In the command file we create a transcript file for each command.  When errors occur, the files can be sent to whoever can aid in troubleshooting.

Passwords

No passwords are stored in the deployment files. During the farm configuration process, all the required accounts are configured as Managed Accounts. When needed we read information from the Managed Account system to impersonate users.

Pending

Installing binaries and creating the farm

Currently we don’t support installing the SharePoint binaries or creating the actual farm. This deployment system assumes that the environment has been configured as far as Central Administration, but no further. The initial focus is on developers. Personally I maintain a blank SharePoint 2010 image that I use as a basis for my development; this has Central Administration already configured. There are already other scripts out there that will install the binaries and create the initial farm.

With time I plan to include this functionality as well. How soon will depend on demand.

Other stuff

There is probably a whole bunch of stuff that would be useful to include. If you have suggestions, please let me know.

Automating SharePoint deployments, an introduction

Whether you are working on a 10,000-user or a 100-user project, you will need to have some form of deployment system in place. It doesn’t take long before the number of settings you need to configure spirals out of control. Some people like Word documents that detail all the settings with tons of wonderful screen shots. I personally hate these because they take a long time to set up and to actually perform. You can also be fairly certain that at some point somebody will forget something, and you will spend a lot of time debugging the problem. I personally favor an automated approach, at least as much as possible. The downside is that setting up an automated deployment system takes time and money. If you are supporting 10,000 users, you may have the budget for it, but probably not for a ‘cheap’ system handling only 100 users.

The Mystery Foundation provides a lot of building blocks and a sample to help you along the way; it should drastically cut down the time to set up an automated deployment system.

Some people don’t understand the value of an automated deployment system; the guilty shall not be named. The foremost value is that it provides an indisputable, documented copy of the collective knowledge the project has when it comes to deployment. Lengthy prose leaves room for interpretation and unclear descriptions; a deployment system developed in PowerShell is very clear about what it does. Deployment is also something that often gets delegated to one or two people; if something should happen to them, any ‘magical’ undocumented things they may be doing get lost as well.

Automated deployment systems are also very valuable from a development perspective. If developers know that they can rebuild their system in a matter of hours, they are not afraid of scrapping it and starting afresh. This uncovers any undocumented configuration changes. If a developer keeps their environment for two months prior to moving into test, you can be assured that some settings haven’t been documented and thus won’t be configured correctly.

Another aspect of automated deployments is that they should work across multiple environments; the information relevant to a specific environment should be separated from the rest. This ensures that you also debug your deployment system when moving between environments. By the time you reach staging and production, most of the bugs should have been removed, and deployment into these environments shouldn’t cause many surprises.

So what are some of the attributes of a good deployment system?

Run again, and again, and again, and again...

It should be possible to run the deployment system multiple times without incurring any errors. If the database or web application already exists, it should use them and just move on.
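
In PowerShell terms this mostly comes down to checking before creating (a minimal sketch; names and accounts are placeholders):

$url = "http://intranet.example.com"
$webApp = Get-SPWebApplication -Identity $url -ErrorAction SilentlyContinue
if ($webApp -eq $null) {
    # first run: create the web application
    $webApp = New-SPWebApplication -Name "Intranet" -Url $url -Port 80 `
        -ApplicationPool "IntranetAppPool" `
        -ApplicationPoolAccount (Get-SPManagedAccount "DOMAIN\sp_apppool")
}
# later runs fall through here and simply reuse the existing web application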

Scoped configuration values

In any configuration you will have values that are specific to one environment and values that are relevant for all environments. Examples of environment-specific values include URLs, service accounts, and database sizes. Values that can remain the same across multiple environments include timeout values, service application names, and the logical architecture.

Having a deployment system that allows you to set values for all environments or for a single environment clearly documents what is relevant for the various scopes. If you need to add a new environment, you know what values you have to come up with.
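
As a sketch of the split (illustrative values only):

# MGL.Settings.ps1 - shared by every environment
$cacheTimeoutSeconds = 300

# TheServer.ps1 - specific to this environment
$portalUrl      = "http://portal.dev.example.com"
$databaseServer = "SQLDEV01"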

The Mystery Foundation deployment system

This deployment system is built entirely in PowerShell using the functionality provided by SharePoint 2010. Some custom Cmdlets have been developed to provide additional functionality (sample invocations follow the list):

  • Add entries to the hosts file on all servers in the farm
  • Add SPNs so that Kerberos works
  • Create directories on all servers in the farm
  • Configure BLOB cache settings
  • Configure super user accounts for the publishing infrastructure
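
The cmdlet names below are hypothetical stand-ins, shown only to illustrate the kind of operations provided; the actual names ship with the package:

# hypothetical cmdlet names and parameters, for illustration only
Add-SPMHostsFileEntry -HostName "col.sp15.dev" -IpAddress "192.168.1.42"
Add-SPMServicePrincipalName -Spn "HTTP/col.sp15.dev" -Account "DOMAIN\sp_apppool"
New-SPMFarmDirectory -Path "D:\BlobCache"
Set-SPMBlobCache -WebApplication "http://col.sp15.dev" -Path "D:\BlobCache" -Enabled $true
Set-SPMObjectCacheAccounts -WebApplication "http://col.sp15.dev" -SuperUser "DOMAIN\sp_superuser" -SuperReader "DOMAIN\sp_superreader"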