Building My SharePoint 2016 Disaster Recovery Farm Lab on Azure

I have set out to build a SharePoint 2016 disaster recovery farm extending my home-based on-premises SharePoint 2016 farm.

My objectives

  1. Continue to build my networking, Windows Server and other infrastructure-related skills; I come from an application development background.
  2. Build my hands-on skills and knowledge with Azure IaaS:
    • Azure virtual networking and site-to-site VPN
    • Azure virtual machine management
  3. Gain in-depth architecture and system administration knowledge of all the pieces that make up a disaster recovery farm using the SQL Server AlwaysOn (asynchronous commit) approach.
    • Understand the performance/latency implications of asynchronous commit to the secondary database replicas.

I used the following article as my primary source:
Plan for SQL Server AlwaysOn and Microsoft Azure for SharePoint Server 2013 Disaster Recovery

I tried my best to follow all the steps, but I approached them in a different order to suit my own DR design.

As a result, the following link contains my raw notes and screenshots of some of the detailed steps in building the disaster recovery farm.
https://onedrive.live.com/redir?resid=D50B33B813A3693B!13901&authkey=!ANaDqU9cBkj36s0&ithint=file%2cdocx

My naming conventions are not perfectly consistent since I was building on the go. With these notes, it is my hope you can come away with some steps to a working solution.

The following is a summary of key steps in building my disaster recovery lab in Azure.

On-premises Home Network and Azure Network

Azire-SPDR-1

My personal home network consists of a set of Hyper-V virtual machines, with a Windows 10 desktop PC as the physical host. The specifications are an Intel Core i5 with 4 processors, 16 GB RAM, an Intel solid-state drive for the virtual machine disks, and a D-Link DIR-826L router.

My on-premises environment:

  • homedc virtual machine
    domain controller and DNS
    domain: rkhome.com
    Also serves as a general file server, since I don’t have enough RAM and CPU for a dedicated file and backup server. This is not the ideal server topology.
  • homesp virtual machine
    SharePoint 2016 single-server farm with a SQL Server 2014 SP1 database; SharePoint is already installed.
    A single-server farm instead of the desired two-server topology because I don’t have enough CPU and RAM.
  • homerras virtual machine
    Routing and Remote Access Server (RRAS)
    Used to establish site-to-site VPN connectivity with an Azure virtual network. There are other options such as using a hardware VPN router. This server is not domain joined.
  • D-Link router
    Port forwarding feature is leveraged to support site-to-site VPN connectivity.

Azure Disaster Recovery Site

The Microsoft cloud-based disaster recovery site.

  • Virtual Network
    Configured with two subnets: one for the SharePoint farm and a gateway subnet for the site-to-site VPN.
  • rkdc virtual machine
    domain controller and DNS (no domain controller promotion just yet)

Note: At a minimum, assign this server a static IP address rather than a dynamic one in the Azure portal.

  • rksp virtual machine
    SharePoint 2016 single server (not installed yet)
  • rksql virtual machine
    SQL Server 2014 SP1 database server

Site-to-site VPN and DC Replica

Azire-SPDR-2.png

Enable cross-network connectivity between the on-premises home network and the Azure virtual network. The other option is ExpressRoute, which is better suited for production scenarios because of its private connection, higher bandwidth, and better performance and reliability.

Port forwarding is configured in the D-Link home router to allow internet connectivity to the homerras server for the VPN connection.

Virtual Network Gateway

Serves as the cross-premises gateway connecting your workloads in the Azure Virtual Network to on-premises sites. This gateway has a public IP address accessible from the internet.

Local Network Gateway

Represents the on-premises VPN device, so it needs to be configured with the home router’s public WAN IP address. The port forwarding setup then forwards VPN traffic to the RRAS server, which acts as the VPN device.

Connection

Represents a connection between two gateways – the virtual network gateway and the local network gateway.

homerras RRAS Server

Configured an interface named “Remote Router” with the public IP address of the virtual network gateway (40.114.x.x).

Domain Controller replica on the Azure virtual network

Prerequisite: site-to-site VPN connection needs to be active.

Install a replica Active Directory domain controller (i.e. for the rkhome.com domain) in the Azure virtual network.

Domain-join the rksp and rksql servers to rkhome.com.

Any added DNS records and AD accounts will be synchronized between the two domain controllers.

In testing the VPN connection, any machine on the on-premises network was able to ping or RDP (with a domain account) into any server in the Azure virtual network, and vice versa.

SharePoint 2016, WSFC, and SQL Server AlwaysOn

Azire-SPDR-3

SharePoint 2016

Installed on the Azure rksp virtual machine as a single-server farm with a My Sites host and a portal site collection. SharePoint 2016 was already installed on the on-premises farm before the start of this lab.

Windows Server Failover Cluster

Installed the Windows Server Failover Clustering feature on homesp and rksql, as they host the database server role.

Name: SPSQLCluster
IP Address: 192.168.0.102

The file share witness for the cluster quorum is hosted on homedc. This witness should be on a dedicated file server, but I do not have enough memory for another VM.

Set node weight = 1 on the primary homesp node.

SQL Server AlwaysOn

Enabled SQL Server AlwaysOn with asynchronous commit. Asynchronous commit is recommended here because of the higher network latency introduced by the VPN connection and the geographic distance between the two sites; synchronous commit is only recommended for SharePoint when network latency is under 1 ms. When I ping servers across the two environments (Toronto and North Central US), I get an average of about 75 ms, ranging from 30 ms to 110 ms.
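
As a rough illustration of how that latency could be measured programmatically (a minimal sketch, not part of the lab setup; the host name is a placeholder for any server reachable across the site-to-site VPN), a small .NET console snippet using System.Net.NetworkInformation.Ping averages the round-trip times:

    using System;
    using System.Linq;
    using System.Net.NetworkInformation;

    class LatencyCheck
    {
        static void Main()
        {
            // Placeholder host name: any server in the remote site, e.g. rksql
            const string remoteHost = "rksql.rkhome.com";
            var ping = new Ping();

            // Send ten echo requests and keep the successful round-trip times (in ms)
            var samples = Enumerable.Range(0, 10)
                .Select(i => ping.Send(remoteHost, 2000)) // 2-second timeout per probe
                .Where(reply => reply.Status == IPStatus.Success)
                .Select(reply => reply.RoundtripTime)
                .ToList();

            if (samples.Count == 0)
            {
                Console.WriteLine("No replies received.");
                return;
            }

            Console.WriteLine("Min " + samples.Min() + " ms, Avg " + samples.Average() + " ms, Max " + samples.Max() + " ms");
        }
    }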

The supported databases for asynchronous commit are listed in the article Supported high availability and disaster recovery options for SharePoint databases (SharePoint 2013):

https://technet.microsoft.com/en-us/library/jj841106.aspx

The databases below were deleted on the rksql secondary before replication from the homesp primary database instance.

Availability groups

  • AG_SPContent
    • MySites
    • PortalContent
  • AG_SPServicesAppsDB
    • App Management
    • Managed Metadata
    • Subscription Settings
    • User Profile
    • User Social
    • Secure Store

Configuration databases are farm-specific and are not replicated. Search databases can be rebuilt with a full crawl upon failover.

Availability Listener configuration for each availability group

  • agl_spcontent1 for AG_SPContent
    0.0.8 (on-premises)
    192.168.0.103 (Azure DR)
  • agl_spservice for AG_SPServicesAppsDB
    0.0.9 (on-premises)
    192.168.0.107 (Azure DR)
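
As a side note on how a client uses these listeners: a .NET client connects through the listener name rather than a specific replica, and the MultiSubnetFailover connection string keyword lets the driver try both listener IP addresses (on-premises and Azure DR) so a failover across subnets is detected quickly. This is a minimal sketch, with the database name taken from the content availability group above:

    using System;
    using System.Data.SqlClient;

    class ListenerConnectionDemo
    {
        static void Main()
        {
            // Connect via the availability group listener, not an individual SQL Server node.
            var connectionString =
                "Server=agl_spcontent1;Database=PortalContent;" +
                "Integrated Security=SSPI;MultiSubnetFailover=True";

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand("SELECT @@SERVERNAME", connection))
            {
                connection.Open();
                // Shows which replica actually served the connection
                Console.WriteLine("Connected to replica: " + command.ExecuteScalar());
            }
        }
    }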

 

Evaluating AlwaysOn Availability Group in Asynchronous Commit Mode

 

Failover Test

Azire-SPDR-4.png

  1. Manually shut down the IIS web sites of SharePoint to simulate a failure event such as an IIS outage.
  2. For each availability group, fail over to the secondary replica and resume database movement.
  3. Adjust the WSFC node voting rights.
  4. Update the DNS records of the SharePoint sites to point to the DR site, then start IIS on the original on-premises primary site.

 

These steps can be repeated to fail back to the on-premises site, making it the primary once again.

Comments on Azure costs

Virtual Machines

  • Domain controller and DNS – Basic A1 1 cpu 1.75GB RAM
    • Left running
  • SQL Server database server – Basic A1 2cpu 3.5GB RAM
    • Left running
  • SharePoint 2016 single-server – Basic A4 4CPU 7GB RAM
    • Turned off in cold standby
  • VPN Gateway
    • ~$31CAD/month
    • Pricing is based on time; however, I didn’t find a way to stop or pause usage to save on costs.

I estimate the cost of running the above resources at about $130 CAD/month, provided the SharePoint VM is stopped per the cold standby approach.

Final Remarks

This has been a great learning experience, as I now understand how all the little pieces work together. Out in the enterprise world, disaster recovery tends to be low in priority on a project roadmap, or not on it at all. However, as the business criticality of a technology solution increases, so does the need for a DR solution. Hosting in Azure is a cost-effective option since you pay only for what you use, especially in cold standby scenarios. Leveraging Azure regions in geographically remote areas is appropriate for mitigating widespread disaster situations such as hurricanes, mass power outages, earthquakes, floods or even outbreaks that can affect a data centre’s operability.

In technology, there are some things you do not really know until you build them with your own hands – learning is by doing.

Windows Server 2012 R2 Web Application Proxy and ADFS 3.0 Azure Lab

The following diagrams are based on a lab I built on Microsoft Azure IaaS leveraging Web Application Proxy and ADFS 3.0 to demonstrate single sign-on with claims-based applications.

As I come from an application development and architecture background, I learned a great deal about Azure IaaS and system administration with respect to Azure virtual networks, virtual machines, IP addressing, Azure PowerShell and the Azure management portal, domain controllers, DNS, subnets, certificates and other relevant Windows Server roles and features. As of May 2016, I thought I would share my notes to help others who may find this approach helpful. Note that I built this lab in March 2015, based on Azure’s features and capabilities at that time.

Lab Architectural Overview

Hosting Infrastructure

  • Microsoft Azure Infrastructure-as-a-Service

Virtual Network

  • One Virtual Network with three subnets
  • Subnet-DC for the domain controller and ADFS server
  • Subnet-Web for web applications and other applications such as SharePoint Server.
  • Subnet-DMZ for the Web Application Proxy

Network Security Groups

  • I haven’t implemented any NSGs yet, but for proper network security you would place an NSG around each subnet to allow/deny traffic based on a set of access control list rules.

Windows Domain

  • All servers are on the same rk.com domain except for the Web Application Proxy server, which is not domain-joined because it sits in the DMZ and acts as a proxy to the internet.

Public domain name

  • I purchased the rowo.ca domain name to be used in the public URLs for internal applications.

Certificates

  • There are a great many certificate dependencies between WAP, ADFS, the relying parties (web apps) and token signing. Setting things up appropriately and troubleshooting them was a challenging learning point for me. The detailed topics involved public/private keys, exporting/importing certificates, the authority chain, thumbprints, certificate subject names, SSL, server authentication, expiry, revocation, browser certificate errors, etc.
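
To illustrate the kind of certificate details that the WAP and ADFS configuration keeps referring to, here is a minimal sketch that simply lists what is in the local machine’s Personal store; it is a generic snippet using System.Security.Cryptography.X509Certificates, not configuration from my lab:

    using System;
    using System.Security.Cryptography.X509Certificates;

    class CertificateInspector
    {
        static void Main()
        {
            // The Personal (My) store of the local machine is where the SSL and
            // service communication certificates used by ADFS and WAP are typically installed.
            var store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
            store.Open(OpenFlags.ReadOnly);

            foreach (X509Certificate2 cert in store.Certificates)
            {
                // Subject name and thumbprint are what the WAP/ADFS configuration references.
                Console.WriteLine("Subject:    " + cert.Subject);
                Console.WriteLine("Thumbprint: " + cert.Thumbprint);
                Console.WriteLine("Expires:    " + cert.NotAfter);
                Console.WriteLine();
            }

            store.Close();
        }
    }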

screenshot1464024458932

Azure Virtual Network configuration involving address spaces and subnets

screenshot1464024922211.png

I set up ADFS and added my simple .NET claims-aware web application as a relying party trust.

screenshot1464025034973.png

I conducted the following test. Logged into the rkweb1 web server (i.e. internal to the network), I opened the browser and:

  1. Entered the URL: https://rkweb1.rk.com/ClaimApp
  2. Was redirected to ADFS and then authenticated
  3. Was redirected back to ClaimApp with access

screenshot1464025058988.png

Testing within the internal network:

screenshot1464025175345.png

I configured the Web Application Proxy to publish the following applications to the internet.

Internet-facing external URLs start with https://rowo.ca/ and are mapped to backend URLs starting with https://rkweb1.rk.com for the following applications.

ClaimApp

  • .NET claims-based application using Windows Identity Foundation
  • WAP pre-authentication is ADFS

HTMLApp

  • HTML web application with no authentication
  • WAP pre-authentication is pass-through (no authentication)

TodoListService

  • REST API with Windows authentication
  • WAP pre-authentication is ADFS
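
For context, a claims-aware application such as ClaimApp can simply enumerate the claims that ADFS issued for the authenticated user. The following is a minimal sketch using System.Security.Claims in .NET 4.5, not my actual ClaimApp code; the page name is illustrative only:

    using System;
    using System.Security.Claims;
    using System.Text;
    using System.Web.UI;

    // A minimal sketch: an ASP.NET page in a claims-aware app dumping the claims
    // issued by ADFS for the authenticated user.
    public partial class ClaimsPage : Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            var identity = (ClaimsIdentity)User.Identity;
            var builder = new StringBuilder();

            foreach (Claim claim in identity.Claims)
            {
                // e.g. http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name : roy
                builder.AppendLine(claim.Type + " : " + claim.Value);
            }

            Response.Write(Server.HtmlEncode(builder.ToString()));
        }
    }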

Capture.JPG

Accessing ClaimApp from the internet:

screenshot1464025578290.png

Accessing a REST API via a .NET WPF desktop application from the internet. The user is prompted for credentials in a separate dialog as part of the OAuth flow.
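
For reference, the desktop client typically acquires a token from ADFS before calling the published API. The following is a minimal sketch using ADAL (the Microsoft.IdentityModel.Clients.ActiveDirectory 2.x NuGet package, which supports ADFS 3.0); the ADFS authority, client ID, redirect URI and resource URI are hypothetical stand-ins, not my lab’s actual configuration:

    using System;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using Microsoft.IdentityModel.Clients.ActiveDirectory; // ADAL 2.x

    class TodoListClientSketch
    {
        static void Main()
        {
            // ADFS 3.0 authority; validateAuthority must be false for ADFS.
            var authContext = new AuthenticationContext("https://adfs.rowo.ca/adfs", validateAuthority: false);

            // Hypothetical values: the client id registered in ADFS and the identifier
            // of the TodoListService relying party published through Web Application Proxy.
            AuthenticationResult result = authContext.AcquireToken(
                "https://rowo.ca/TodoListService",              // resource
                "00000000-1111-2222-3333-444444444444",         // client id
                new Uri("https://TodoListClient"));             // redirect URI

            using (var http = new HttpClient())
            {
                // Attach the access token as a bearer token and call the published REST API.
                http.DefaultRequestHeaders.Authorization =
                    new AuthenticationHeaderValue("Bearer", result.AccessToken);

                var response = http.GetAsync("https://rowo.ca/TodoListService/api/todolist").Result;
                Console.WriteLine(response.StatusCode);
            }
        }
    }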

screenshot1464025704524.png

Accessing ClaimApp through the iOS Safari browser with device registration.

screenshot1464025974358.png

In Active Directory, my iPhone mobile device has been registered for added authentication and conditional access rules to applications.

screenshot1464030919794 (1).png

In conclusion, I love the fact that Azure has become my IT sandbox to learn and build solutions such as this remote access solution. Also, the Web Application Proxy is one of many options in the market for publishing internal on-premises applications using ADFS to support single sign-on.


SharePoint 2013 Workflow Integration with the WaitForCustomEvent Activity

Implementing an integration scenario with SharePoint 2013 workflows using the WaitForCustomEvent activity in Visual Studio 2012.

Technical requirement: integrate a SharePoint workflow with another application, calling into the application and waiting for a response with data.

Applicable business scenarios

  • A document management approval workflow notifies a CRM system of a customer engagement and provides a reference number back to the workflow.
  • A SharePoint workflow assigns work to an end user and emails them to do the work in another application. The end user goes to the other application to do this work, and that application notifies the workflow of completion along with other application data. The SharePoint workflow then continues.

The following is an implementation flow that applies to the above business scenarios.

  1. The workflow instance calls an external application through a RESTful service, passing correlating information and the custom wait event name.
  2. The workflow enters a wait state at the WaitForCustomEvent activity.
  3. The external application executes its relevant business logic and then notifies the workflow through the SharePoint API by passing the correlating information, the event name and any event args (e.g. data).
  4. The workflow’s custom wait event handles the call and continues execution with the given event args. At this point, the workflow status can be set.

WaitForCustomEvent 2

High level development

  1. In Visual Studio 2012
    • Create a Workflow Custom Activity project item.
    • Deploy solution to a site.
  2. In SharePoint Designer 2013
    • Create a SharePoint 2013 workflow. The deployed activity will appear in the Actions menu.
    • Add the custom action to the workflow design surface.
    • Publish workflow

Updating the workflow custom activity and redeployment steps

  1. Create the workflow in SharePoint Designer 2013
    • Clear the SharePoint Designer website cache
  2. Create the workflow activity in Visual Studio 2012
  3. Create the external application to receive the call and publish the event back to the running workflow instance.

The WaitForCustomEvent Activity

This activity is part of the toolbox when you want to create a Custom Workflow Activity in Visual Studio 2012.

WaitForCustomEvent 3

When deployed to the SharePoint server, it will show up as a custom action in SharePoint Designer 2013. This activity has an input of EventName and an output of Result.

WaitForCustomEvent 12

SharePoint Client Side Object Model: WorkflowInstanceService.PublishCustomEvent

In the external application, leveraging the SharePoint client-side object model (CSOM), use the WorkflowInstanceService.PublishCustomEvent method to signal the waiting workflow instance to continue. The WaitForCustomEvent activity handles the event raised by this CSOM call.

PublishCustomEvent parameters:

  • instance (WorkflowInstance) – The instance of a workflow that is running.
  • eventName (String) – The event name declared in the WaitForCustomEvent activity.
  • payload (String) – The data passed to the workflow.

As you can see, the EventName input is associated with the eventName parameter, and the Result output with the payload parameter.
http://msdn.microsoft.com/en-us/library/microsoft.sharepoint.workflowservices.workflowinstanceservice.publishcustomevent.aspx

// Signal the waiting workflow instance with the event name and a string payload
WorkflowServicesManager workflowServiceManager = new WorkflowServicesManager(web);
var workflowInstanceService = workflowServiceManager.GetWorkflowInstanceService();
workflowInstanceService.PublishCustomEvent(workflowInstance, "CustomEventName", "Eventpayload:value;key:value");

Sample code on the use of this method is in Sohel’s blog post.

Creating the Workflow Custom Activity in Visual Studio 2012

    1. Create a new project: SharePoint 2013 – Empty Project.
    2. Right-click on the project > Add New Item > select Workflow Custom Activity. WaitForCustomEvent 5
    3. Click on the .xaml file to see the designer surface.
    4. Open the Toolbox pane.
    5. Drag and drop the WaitForCustomEvent activity onto the designer surface. I have also added a WriteToHistory activity for debugging/tracing purposes. WaitForCustomEvent 6
    6. Create arguments to make the design of this custom workflow activity dynamic and reusable in SharePoint Designer workflows. At the bottom of the designer surface, click on the Arguments tab. WaitForCustomEvent 7

      Set “EventName” as an input argument and “EventOutput” as the output argument. To give you an idea of what we are trying to achieve by “dynamic”, here is a peek at how it will be used in a SharePoint Designer workflow. The blue text below represents placeholders for literal values or variables.

      WaitForCustomEvent 11 WaitForCustomEvent 9

    7. Now let’s get back to setting up these arguments in Visual Studio 2012. Click on the WaitForCustomEvent activity’s Properties pane. Enter the “EventName” argument for the EventName input and the “EventOutput” argument for the Result output. WaitForCustomEvent 10
    8. Let’s surface these arguments to SharePoint Designer to look like WaitForCustomEvent 8. Click on the .actions4 file and set it up as follows:
      <Action Name="WaitEventActivity" ClassName="WaitEvent.WaitEventActivity" Category="Custom" AppliesTo="all">
        <RuleDesigner Sentence="Wait Event Name %1 ( Event Args %2 )">
          <FieldBind Field="EventName" Text="Event Name" Id="1"
          DesignerType="TextBox"
          DisplayName="Event Name triggered from an external system" />
          <FieldBind Field="EventOutput" Text="Event Output" Id="2"
          DesignerType="TextBox" DisplayName="Event output from an external system" />
        </RuleDesigner>
        <Parameters>
          <Parameter Name="EventName" Type="System.String, mscorlib" Direction="Optional"
          DesignerType="TextBox"
          Description="Event Name" />
          <Parameter Name="EventOutput" Type="System.String, mscorlib" Direction="Out"
          DesignerType="TextBox"
          Description="Event Output" />
        </Parameters>
      </Action>
      

Note: you can find more examples in the workflow15.actions4 file, which contains all the out-of-the-box actions. It is located at C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\TEMPLATE\1033\Workflow

    1. Deploy the solution to your SharePoint site.
      This is deployed as a farm solution; activate the feature at the web scope.
    2. Open SharePoint Designer 2013 and open the site where you activated the feature with the WaitForCustomEvent activity.
    3. Create a new workflow on the SharePoint 2013 workflow platform.
    4. In the ribbon, click Action > add the Call HTTP Web Service action to call the external application through a RESTful service so that the necessary business logic is executed.
      Pass correlating information and the custom event name as query string parameters. These will be used by the external application to call back to the workflow.
    5. In the ribbon, click Action and you should see the custom action in the Custom group, as defined in the .actions4 file.
      WaitForCustomEvent 12
    6. Create the following workflow using the actions Set Workflow Status and Log to History List. WaitForCustomEvent 13

Creating a Mock External Application

This application will serve as a mock external system, noted as “3” in the diagram above. It serves two purposes:

  • A self-hosted RESTful service that wraps the business logic that the workflow can call into. You may choose to host it in an IIS web server instead.
  • This business logic will make a call to publish a custom event notification to the running workflow instance using the SharePoint .NET Client Side Object Model.

Reference:
How to Self-Host a Web API
http://www.asp.net/web-api/overview/hosting-aspnet-web-api/self-host-a-web-api

  1. Add New Project
  2. Select Console Application
  3. Right-click the Project > Manage NuGet Packages
  4. Install Microsoft ASP.NET Web API Self Host
  5. Add assembly references > browse to C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\
    • Microsoft.SharePoint.Client
    • Microsoft.SharePoint.Client.WorkflowServices
    • Microsoft.SharePoint.Client.Workflow
    • Microsoft.SharePoint.Client.Runtime
  6. Create a business class with the following method (the code uses the Microsoft.SharePoint.Client, Microsoft.SharePoint.Client.WorkflowServices and System.Net namespaces):
    public void PublishCustomWorkflowEvent(string url, string listTitle, string documentTitle, string eventName, string eventArgs)
    {
      using (ClientContext ctx = new ClientContext(url))
      {
        // Best practice: retrieve credentials from a secure credential store.
        // The values below are placeholders.
        ctx.Credentials = new NetworkCredential(@"domain\user", "password");

        Microsoft.SharePoint.Client.WorkflowServices.WorkflowServicesManager workflowServiceManager = new Microsoft.SharePoint.Client.WorkflowServices.WorkflowServicesManager(ctx, ctx.Web);
        var workflowInstanceService = workflowServiceManager.GetWorkflowInstanceService();

        List list = ctx.Web.Lists.GetByTitle(listTitle);
        int itemId = 1;

        // Query the list for the item (document) by its Title
        CamlQuery query = new CamlQuery();
        query.ViewXml = @"<View><Query><Where><Eq><FieldRef Name='Title'/><Value Type='Text'>"
        + documentTitle
        + "</Value></Eq></Where></Query></View>";

        ctx.Load(list);
        ctx.ExecuteQuery();

        ListItemCollection listItems = list.GetItems(query);
        ctx.Load(listItems);
        ctx.ExecuteQuery();
        if (listItems.Count > 0)
          itemId = listItems[0].Id;

        // Get the workflow instances running on this list item
        var workflowInstances = workflowInstanceService.EnumerateInstancesForListItem(list.Id, itemId);

        ctx.Load(workflowInstances);
        ctx.ExecuteQuery();

        // Once we get the workflow instance, we can get the instance properties as shown below.
        // Any properties in the workflow initiation form will also be available.
        if (workflowInstances.Count > 0)
        {
          foreach (WorkflowInstance instance in workflowInstances)
          {
            WorkflowStatus status = instance.Status;
            if (instance.Properties.Count > 0)
            {
              var itemUrl = instance.Properties["Microsoft.SharePoint.ActivationProperties.CurrentItemUrl"];

              Console.WriteLine("Internal Status: " + instance.Status);
              Console.WriteLine(" Item Url: " + itemUrl);
              Console.WriteLine(" Workflow User Status: " + instance.UserStatus);
              string userStatus = instance.UserStatus;
              var propertyValue = instance.Properties["Microsoft.SharePoint.ActivationProperties.ItemId"];

              // Publish the custom event so the waiting WaitForCustomEvent activity resumes
              workflowInstanceService.PublishCustomEvent(instance, eventName, eventArgs);
              Console.ForegroundColor = ConsoleColor.Magenta;
              Console.WriteLine("Event Args: " + eventArgs);
            }
          } // end foreach workflowInstances
        }
      }
    }
    
  7. Create a RESTful method in a controller class that inherits from ApiController.
    [HttpGet]
    public string DoSomeWork(string url, string listTitle, string documentTitle, string eventName)
    {
      string output = string.Empty;
      // DO SOME WORK

      // Event payload produced by the business logic (placeholder value)
      string eventArgs = "key:value";

      // Publish the custom event to the running workflow instance
      PublishCustomWorkflowEvent(url, listTitle, documentTitle, eventName, eventArgs);
      return output;
    }
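
To tie the pieces together, here is a minimal sketch of the self-host bootstrap for the console application, following the self-hosting article referenced above. The port number and route template are arbitrary choices, and the controller name in the comment is hypothetical:

    using System;
    using System.Web.Http;
    using System.Web.Http.SelfHost; // from the Microsoft ASP.NET Web API Self Host package

    class Program
    {
        static void Main()
        {
            // Listen on an arbitrary port. The workflow's Call HTTP Web Service action points
            // at this address, e.g. http://<host>:8080/api/<YourController>/DoSomeWork?url=...&eventName=...
            var config = new HttpSelfHostConfiguration("http://localhost:8080");

            config.Routes.MapHttpRoute(
                name: "ActionApi",
                routeTemplate: "api/{controller}/{action}");

            using (var server = new HttpSelfHostServer(config))
            {
                server.OpenAsync().Wait();
                Console.WriteLine("Web API self-host listening. Press Enter to stop.");
                Console.ReadLine();
            }
        }
    }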
    

My favourite SharePoint 2013 resource of the week

Out of all the resources posted on the many blogs and on TechNet, I find the following link the best for kicking off the learning curve:

SharePoint 2013 training for IT pros
http://technet.microsoft.com/en-us/sharepoint/fp123606

These videos are great because they:

  • Cover mostly net-new features and functionality
  • Are a great starter for experienced SharePoint professionals
  • Give a good depth of overview of new features and functionality
  • Are beneficial to site administrators/owners/contributors, not just IT pros
  • Span 14 modules with about 2–7 videos each

My Presentation at Toronto SharePoint Camp 2011

For my second year in a row, I delivered a presentation at Toronto SharePoint Camp 2011 this past weekend. It was another smashing community event. Thanks to the organizers and all who came.
tspug.sharepointspace.com

My presentation this year was on http://www.calgary.ca as a SharePoint 2010 internet site. I have been on this project for a year and a half, through the build, test, go-live, knowledge transfer and maintenance phases.

SP Camp 2011 - Speaking

Presenting http://www.calgary.ca as a SharePoint Internet Site at Toronto SharePoint Camp 2011

In my presentation I explored the high-level architecture of the many components, such as the home page, site structure, calendar, mobile, Google Search Appliance integration, interactive map, user forms and URL redirection rules. I also explained the extent of integration with other technologies, how performance was achieved, customizations and extensibility.

I plan on speaking at more events in the future at local technology groups around Toronto and in Calgary.

I delivered this presentation to the Calgary SharePoint User Group this past October and will be presenting to the Calgary .NET User Group on Dec 14.
dotnetcalgary.com

Roy Kim

Networking for Application Developers

This post is about IT networking. Having been in web application development all of my career, I spent a couple of months learning about computer systems networking. My belief is that for a well-rounded technology architect, basic networking knowledge is fundamental. I wanted to know more than just how to use the ‘ping’ and ‘ipconfig’ command-line tools.

I wondered about the following:

  • Networking hardware in-depth
  • Network protocols
  • How to troubleshoot extensively
  • Other command line networking tools
  • DNS, DHCP, Firewall

My Sample Network Diagram

So for those of you in any type of application development, especially SharePoint, I highly encourage understanding the many basic elements of networking and server software.
 
The key benefits of doing so are:
  • Interface with IT administrators and IT architects
    • Know how to ask the right questions
    • Understand the designs and implementations that they provide
  • Understand the infrastructure of your development machine, testing environments and production environments
    • As a result, troubleshoot problems and escalate to IT
    • Fix your own networking problems
  • Become a well-rounded Technology Architect
    • Design architecture with IT touch points in mind.
 
Understanding networking and system administration involves a different mindset, and I can appreciate the skill and breadth of knowledge of a competent IT administrator. For example, IT must understand the relationships and dependencies of all layers of hardware, server software, networking protocols and configuration, whereas a software developer’s troubleshooting process is simpler with the use of debuggers and custom error/exception logging. IT must rely on multiple tools and a methodical mindset when troubleshooting issues.
 
So invest in learning networking and troubleshooting, and you’ll impress the IT folks when you hand them a root cause analysis and perhaps a recommended solution 😉
 
To wrap up my knowledge and notes, I have assembled a PowerPoint presentation deck. There are notes and resource links in the Notes section of each slide. I have presented it to my colleagues and plan to speak at technology user groups and conferences.
 

Content Organizer Feature for Large Site Hierarchies

The Situation

The typical content management model that most end users are familiar with is hierarchical in nature. For example, Windows Explorer manages files hierarchically. A hierarchical structure has a simple and natural feel for creating a classification scheme through folders, finding content and moving content.

However, the challenges and limitations of this simple hierarchical model are as follows:

  • Difficult to organize content when it is not clear where documents should go.
  • Files get lost or buried in deep file hierarchies.
  • Duplicated files across different folders.
  • Misplaced files.
  • Files in flux tend to be stored in many “temp” folders.

Basic attributes of a content management system

  • Browsing/Navigation
  • Search
  • Maintainability
  • Content Metadata properties
  • Taxonomy Structure (Classification)

The SharePoint 2010 Content Organizer Feature

This feature helps alleviate the limitations, challenges and gaps of a simple hierarchical content management model. It provides the following value:

  • Automated placement of content based on content metadata.
  • Avoid duplicate content by use of the versioning capability
  • Avoid misplaced content based on rules.
  • A centralized source drop-off location of content
  • Support Governance policies and processes to guide and control how the organization uses the technologies to accomplish content organization goals
  • Increase ease of use of content management for content authors
  • Contribute to overall information architecture effectiveness.
  • Scalable to Enterprise structure with many libraries.

The Content Organizer Feature use in large site hierarchies

A significant attribute of the content organizer feature is its scalability to accommodate a large site hierarchy with great breadth and depth of sites and document libraries.

The following diagram depicts a scenario where a Word document is routed to a target document library.

  1. A user uploads a document to the drop off library of the root site and populates the document’s metadata properties.
  2. A content organizer rule in this site matches against the metadata property values and routes the document to the drop off library of a sub-site.
  3. A content organizer rule in this sub-site matches against the metadata property values and routes the document to the drop off library of a further sub-site.
  4. A content organizer rule in this sub-site matches against the metadata property values and routes the document to a document library within the sub-site.

Note: Routing is not limited to sub-sites as shown in the diagram below, but can go directly to any site within a SharePoint farm.

All in all, based on content organizer rule matching, documents can be routed to the drop off library of another site or to a document library within the same site. Note that a content organizer rule cannot route documents from the drop off library of a given site directly to a document library of another site.

Documents can be routed from one site to any site in the site hierarchy – that is, to child sites, sibling sites or parent sites.

  • Analogous to a postal service – just drop it in the “mail box”.

Routing Rules from one site to another

  • Routing rules are configurable
  • Based on metadata properties, including managed metadata (taxonomy structures)
  • Configure versioning, library folder creation and alerts

Trade-off in automated routing: Managing routing rules

The content organizer routing rules manager must understand the site hierarchy and the overall information architecture, and must maintain communication and processes with each organizational unit behind a site.

Example scenarios

  • Submit documents to HR regarding certain policies.
    Submit by uploading to the sender’s site Drop Off Library with a routing rule targeted at the HR site’s drop off library. The HR team would maintain a rule to send documents to the appropriate document library within their HR site.

One Rules Manager

  • Manages all routing rules for all sites, both from one site’s drop off library to another site’s drop off library and from a site’s drop off library to a document library within that site
  • Must have permissions to all sites.
  • Knowledge and skill is centralized to one person
  • A single point of failure (the “hit by a bus” problem)
  • A layer of process between site team and central rules manager

Many Site Rules Managers

  • Self-service model
  • More power and control
  • Faster cycle of updating routing rules
  • Broader training and knowledge for the designated site rules manager
  • This person may be a business analyst on the team.

Roles in maintaining the information architecture

Content Author – The many content authors who create and edit documents in their respective document libraries.

Content Organizer Rules Administrator – The overseer responsible for keeping the content organizer rules up to date within the site hierarchy.

Site Owner – Needs to engage with content authors and group leads to stay aware of changes in the overall information architecture. Maintenance responsibilities are transferred to this role. In smaller site hierarchies, a content author can take on this role as a champion for the other content authors.

Governance Body – Business stakeholders, content supervisors/leads, IT and architects who take part in the overall governance of operational and system effectiveness. Content organization and management should definitely be on the governance agenda.

Conclusion

The new SharePoint 2010 content organizer feature supports a more versatile and scalable information architecture, including the capability of automated routing driven by content organizer rules. Content organization can scale to large site hierarchies with great depth and breadth. Because each site is abstracted behind one and only one drop off library, scalability comes from rules that route from one site’s drop off library to another’s. While content routing automation relieves the many content authors, note that manual administration is centralized to a content rules administrator for all sites or a group of sites. The overall net effect is that manual labour is reduced and a higher level of information architecture effectiveness is achieved.

By Roy Kim
roykimtoronto@gmail.com
SharePoint Consultant