How to: Use the SharePoint 2013 Content Enrichment Web Service

The Content Enrichment Web Service (CEWS) allows you to extend the functionality of SharePoint 2013 Search.  Using CEWS, a developer can send the values of managed properties to an external web service and return new or modified managed properties to include in the index.  The process involves implementing a custom WCF service and then registering it with PowerShell.  The PowerShell cmdlet specifies which properties go into and come out of the service.

This post has been cross-posted to MSDN Code where you can download a working sample and deploy it.

This example will take the values of the Author and LastModifiedTime managed properties and write a new string such as "Modified by <author> on <LastModifiedTime>." to the managed property TestProperty.  This property needs to be created before you try to use your Content Enrichment Web Service.  The property should be configured as type Text with the following attributes: Query, Search, Retrieve, and Refine.

[Screenshot: creating the TestProperty managed property in the Search Schema]
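If you prefer to script the schema change instead of using the Search Schema page, the managed property can also be created with PowerShell.  The sketch below is just one way to do it and assumes a default Search service application; type 1 corresponds to Text, and the attribute flags are set on the returned object.

# Create the TestProperty managed property (Type 1 = Text) and turn on the
# query, search, retrieve, and refine attributes. Sketch only - adjust to your farm.
$ssa = Get-SPEnterpriseSearchServiceApplication
$testProperty = New-SPEnterpriseSearchMetadataManagedProperty -SearchApplication $ssa -Name "TestProperty" -Type 1
$testProperty.Queryable = $true
$testProperty.Searchable = $true
$testProperty.Retrievable = $true
$testProperty.Refinable = $true
$testProperty.Update()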

To get started, create a new WCF Service Project called ContentEnrichmentExampleService.

[Screenshot: creating the ContentEnrichmentExampleService WCF service project in Visual Studio]

Once the project is created, you can delete the default service Service1.svc and IService1.cs as they won't be needed.

Next, you will need to add a reference to the following assembly. 

  • microsoft.office.server.search.contentprocessingenrichment.dll

This assembly can be found in the folder Installation Path\Microsoft Office Servers\15.0\Search\Applications\External.

Now, we need to create the service to do the content enrichment processing.  Create a new service called ContentEnrichmentExampleService.svc. 

[Screenshot: adding the ContentEnrichmentExampleService.svc service to the project]

Delete the file IContentEnrichmentExampleService.cs as it will not be needed.  The custom service instead inherits from IContentProcessingEnrichmentService.

Now we can start adding our code to ContentEnrichmentExampleService.svc.cs.  This code will retrieve the values from the input properties, create our new output property TestProperty, and send it back to the search index.

Start by adding using statements for the namespaces in the assembly we referenced.

using Microsoft.Office.Server.Search.ContentProcessingEnrichment;

using Microsoft.Office.Server.Search.ContentProcessingEnrichment.PropertyTypes;

The interface the class implements will show as broken since you deleted it.  Change the class to implement IContentProcessingEnrichmentService instead.

public class ContentEnrichmentExampleService : IContentProcessingEnrichmentService

Add a ProcessedItem field to hold the output managed property values from the service.

private readonly ProcessedItem processedItemHolder = new ProcessedItem

{

    ItemProperties = new List<AbstractProperty>()

};

Then, implement the ProcessItem method.  This method receives the input managed properties and is where you write the code that generates the output managed properties.

public ProcessedItem ProcessItem(Item item)

{

 

}

Inside the ProcessItem method, initialize the ErrorCode and ItemProperties.

processedItemHolder.ErrorCode = 0;

processedItemHolder.ItemProperties.Clear();

We then need to create a new output managed property named TestProperty.  The Property object is typed to match the type of the managed property you defined.

var testProperty = new Property<string>();

testProperty.Name = "TestProperty";

Now we are going to retrieve the input managed properties using a simple lambda expression.  Remember that property names are case sensitive and need to match exactly how they appear on the Search Schema page.  You also need to cast each object to the appropriate type.  Since the Author managed property is multi-valued, we use List<string>.  LastModifiedTime is a date, so we use a DateTime.

var authorProperty = item.ItemProperties.FirstOrDefault(i => i.Name == "Author") as Property<List<string>>;

var writeProperty = item.ItemProperties.FirstOrDefault(i => i.Name == "LastModifiedTime") as Property<DateTime>;

Now, we need to verify that the properties aren't null.

if ((authorProperty != null) && (writeProperty != null))

{

 

}

We are then going to write a new string to TestProperty in the format Modified by {Author} on {LastModifiedTime}.  Since Author supports multiple values, only the first value is used.  This string goes in the Value property.  Once we set the value, we add the property to processedItemHolder so that it can be sent back to the search index.

testProperty.Value = string.Format("Modified by {0} on {1}.", authorProperty.Value.First(), writeProperty.Value);

processedItemHolder.ItemProperties.Add(testProperty);

Finally, return the processedItemHolder.

return processedItemHolder;
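For reference, here is roughly what ContentEnrichmentExampleService.svc.cs looks like once all of the snippets above are assembled.  This is just a sketch of the finished file; the namespace name is assumed and error handling is kept to a minimum.

using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.Office.Server.Search.ContentProcessingEnrichment;
using Microsoft.Office.Server.Search.ContentProcessingEnrichment.PropertyTypes;

namespace ContentEnrichmentExampleService
{
    public class ContentEnrichmentExampleService : IContentProcessingEnrichmentService
    {
        // Holds the output managed property values sent back to the search index.
        private readonly ProcessedItem processedItemHolder = new ProcessedItem
        {
            ItemProperties = new List<AbstractProperty>()
        };

        public ProcessedItem ProcessItem(Item item)
        {
            processedItemHolder.ErrorCode = 0;
            processedItemHolder.ItemProperties.Clear();

            // The output managed property. The type parameter must match the managed property's type.
            var testProperty = new Property<string> { Name = "TestProperty" };

            // Input property names are case sensitive. Author is multi-valued (List<string>);
            // LastModifiedTime is a date (DateTime).
            var authorProperty = item.ItemProperties.FirstOrDefault(i => i.Name == "Author") as Property<List<string>>;
            var writeProperty = item.ItemProperties.FirstOrDefault(i => i.Name == "LastModifiedTime") as Property<DateTime>;

            if ((authorProperty != null) && (writeProperty != null))
            {
                testProperty.Value = string.Format("Modified by {0} on {1}.", authorProperty.Value.First(), writeProperty.Value);
                processedItemHolder.ItemProperties.Add(testProperty);
            }

            return processedItemHolder;
        }
    }
}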

At this point, we can run and debug our service using F5.  Leave the service running as it will be called when doing a full crawl.

To register the service with SharePoint, we use the New-SPEnterpriseSearchContentEnrichmentConfiguration cmdlet.  Use the following PowerShell script to register the Content Enrichment Web Service.  Verify that the Endpoint parameter contains the correct URL to your service.  The example below uses the location from the source code I provided.  If you start from scratch or have deployed your service to a remote server, you will need to update the address.

$ssa = Get-SPEnterpriseSearchServiceApplication
$config = New-SPEnterpriseSearchContentEnrichmentConfiguration
$config.Endpoint = "http://localhost:54641/ContentEnrichmentExampleService.svc"
$config.InputProperties = "Author", "LastModifiedTime"
$config.OutputProperties = "TestProperty"
$config.SendRawData = $false
$config.Timeout = 30000
$config
Set-SPEnterpriseSearchContentEnrichmentConfiguration -SearchApplication $ssa -ContentEnrichmentConfiguration $config

The InputProperties parameter specifies the managed properties sent to the service.  The OutputProperties parameter specifies the managed properties returned by the service.  Note that both are case sensitive.  All managed properties referenced need to be created in advance.  Set the Timeout property higher to give yourself sufficient time to debug.  For a complete reference on the parameters, see this MSDN reference.
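While you iterate on the service, it can also help to check what is currently registered or to clear the registration and start over.  A quick sketch using the companion cmdlets (assuming a single Search service application):

$ssa = Get-SPEnterpriseSearchServiceApplication

# Show the content enrichment configuration currently registered with the Search service application.
Get-SPEnterpriseSearchContentEnrichmentConfiguration -SearchApplication $ssa

# Remove the registration entirely if you want to start over.
Remove-SPEnterpriseSearchContentEnrichmentConfiguration -SearchApplication $ssa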

After registering your content enrichment service, start a full crawl.  Again, ensure that your Content Enrichment Web Service is running in the debugger.  While it is crawling, you can set breakpoints as desired. 
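If you would rather kick off the crawl from PowerShell than Central Administration, something like the following should work.  The content source name "Local SharePoint sites" is the default and may be different in your farm.

$ssa = Get-SPEnterpriseSearchServiceApplication
$contentSource = Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa -Identity "Local SharePoint sites"

# Kick off a full crawl of the content source.
$contentSource.StartFullCrawl()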

To verify the functionality after the crawl is complete, issue a query using REST in the browser like the one below.

http://server/_api/search/query?querytext='*'&selectproperties='title,path,author,testproperty'
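If you prefer PowerShell to the browser, a quick sketch using Invoke-RestMethod is below.  Replace server with your web application URL; without an Accept header the results come back as XML.

# Issue the same query from PowerShell using the current Windows identity.
$url = "http://server/_api/search/query?querytext='*'&selectproperties='title,path,author,testproperty'"
Invoke-RestMethod -Uri $url -UseDefaultCredentials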

This query will return every item in the index and include the new TestProperty field.  You can verify that the new property was included and has the expected result as shown in the example below.

[Screenshot: REST query results showing the new TestProperty value]

I hope this gets you started with Content Enrichment Web Services.  I have a few follow-up posts planned covering more of the PowerShell parameters, but I hope this helps.

Again, you can find the complete source code and PowerShell script on MSDN Code.  Feel free to leave me a comment if you run into an issue or have a question.


More Stories By Corey Roth

Corey Roth, a SharePoint Server MVP, is a consultant at Hitachi Consulting specializing in SharePoint and Office 365 for clients in the energy sector. He has more than ten years of experience delivering solutions in the energy, travel, advertising and consumer electronics verticals.

Corey specializes in delivering ECM and search solutions to clients using SharePoint. Corey has always focused on rapid adoption of new Microsoft technologies including Visual Studio 2013, Office 365, and SharePoint.

He is a member of the .NET Mafia (www.dotnetmafia.com) where he blogs about the latest technology and SharePoint. He is dedicated to the community and speaks regularly at user groups and SharePoint Saturdays.
