
How to: Use the SharePoint 2013 Content Enrichment Web Service

The Content Enrichment Web Service (CEWS) allows you to extend the functionality of SharePoint 2013 Search.  Using CEWS, a developer can send the values of managed properties to an external web service and return new or modified managed properties to include in the index.  The process involves implementing a custom WCF service and then registering it with PowerShell.  The PowerShell cmdlet specifies which properties go into and come out of the service.

This post has been cross-posted to MSDN Code where you can download a working sample and deploy it.

This example will take the values of the Author and LastModifiedTime managed properties and write a new string such as "Modified by <author> on <LastModifiedTime>." to the managed property TestProperty.  This property needs to be created before you try to use your Content Enrichment Web Service.  The property should be configured as type Text with the following attributes: Query, Search, Retrieve, and Refine.

[Screenshot: the new TestProperty managed property in the Search Schema]

To get started, create a new WCF Service Project called ContentEnrichmentExampleService.

[Screenshot: creating the new WCF Service project in Visual Studio]

Once the project is created, you can delete the default service Service1.svc and IService1.cs as they won't be needed.

Next, you will need to add a reference to the following assembly. 

  • microsoft.office.server.search.contentprocessingenrichment.dll

This assembly can be found in the folder Installation Path\Microsoft Office Servers\15.0\Search\Applications\External.

Now, we need to create the service to do the content enrichment processing.  Create a new service called ContentEnrichmentExampleService.svc. 

[Screenshot: adding the ContentEnrichmentExampleService.svc service in Visual Studio]

Delete the file IContentEnrichmentExampleService.cs as it will not be needed.  The custom service instead inherits from IContentProcessingEnrichmentService.

Now we can start adding our code to ContentEnrichmentExampleService.svc.cs.  This code will retrieve the values from the input properties, create our new output property TestProperty, and send it back to the search index.

Start by adding using statements for the namespaces in the assembly we referenced.

using Microsoft.Office.Server.Search.ContentProcessingEnrichment;

using Microsoft.Office.Server.Search.ContentProcessingEnrichment.PropertyTypes;

The interface that the class inherits from will show as broken since you deleted it.  Change the class to inherit from IContentProcessingEnrichmentService instead.

public class ContentEnrichmentExampleService : IContentProcessingEnrichmentService

Add a ProcessedItem field to hold the output managed property values returned by the service.

private readonly ProcessedItem processedItemHolder = new ProcessedItem

{

    ItemProperties = new List<AbstractProperty>()

};

Then, implement the ProcessItem method.  This method receives the input managed properties and allows you to write code to generate the output managed properties.

public ProcessedItem ProcessItem(Item item)

{

 

}

Inside the ProcessItem method, initialize the ErrorCode and ItemProperties.

processedItemHolder.ErrorCode = 0;

processedItemHolder.ItemProperties.Clear();

We then need to create a new output managed property named TestProperty.  The Property object is typed according to the type of the managed property you defined.

var testProperty = new Property<string>();

testProperty.Name = "TestProperty";

Now we are going to retrieve the input managed properties using a simple lambda expression.  Remember that property names are case sensitive and need to match exactly how they appear on the Search Schema page.  You also need to cast the object to the appropriate type.  Since the Author managed property is a multi-valued property, we need to use List<string>.  LastModifiedTime is a date, so we use a DateTime type.

var authorProperty = item.ItemProperties.FirstOrDefault(i => i.Name == "Author") as Property<List<string>>;

var writeProperty = item.ItemProperties.FirstOrDefault(i => i.Name == "LastModifiedTime") as Property<DateTime>;

Now, we need to verify that the properties aren't null.

if ((authorProperty != null) && (writeProperty != null))

{

 

}

We are then going to write out a new string to TestProperty in the format Modified by {Author} on {LastModifiedTime}.  Since Author supports multiple values, only the first value is used.  This string goes in the Value property.  Once we set the value, we have to add it to processedItemHolder so that it can send the values back to the search index.

testProperty.Value = string.Format("Modified by {0} on {1}.", authorProperty.Value.First(), writeProperty.Value);

processedItemHolder.ItemProperties.Add(testProperty);

Return the processedItemHolder.

return processedItemHolder;
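
For reference, here is the complete ProcessItem implementation assembled from the snippets above.  This is a minimal sketch that mirrors the sample; it assumes System.Linq and System.Collections.Generic are among the file's using statements.

public ProcessedItem ProcessItem(Item item)
{
    // Reset the error code and clear any properties left over from the previous item
    processedItemHolder.ErrorCode = 0;
    processedItemHolder.ItemProperties.Clear();

    // The output managed property; the type parameter matches the Text managed property
    var testProperty = new Property<string>();
    testProperty.Name = "TestProperty";

    // Retrieve the input managed properties (names are case sensitive)
    var authorProperty = item.ItemProperties.FirstOrDefault(i => i.Name == "Author") as Property<List<string>>;
    var writeProperty = item.ItemProperties.FirstOrDefault(i => i.Name == "LastModifiedTime") as Property<DateTime>;

    if ((authorProperty != null) && (writeProperty != null))
    {
        // Author is multi-valued, so only the first value is used
        testProperty.Value = string.Format("Modified by {0} on {1}.", authorProperty.Value.First(), writeProperty.Value);
        processedItemHolder.ItemProperties.Add(testProperty);
    }

    return processedItemHolder;
}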

At this point, we can run and debug our service using F5.  Leave the service running as it will be called when doing a full crawl.

To register the service with SharePoint, we use the New-SPEnterpriseSearchContentEnrichmentConfiguration cmdlet.  Use the following PowerShell script to register the Content Enrichment Web Service.  Verify that the Endpoint parameter contains the correct URL to your service.  The example below uses the location from the source code I provided.  If you start from scratch or have deployed your service to a remote server, you will need to update the address.

$ssa = Get-SPEnterpriseSearchServiceApplication
$config = New-SPEnterpriseSearchContentEnrichmentConfiguration
$config.Endpoint = "http://localhost:54641/ContentEnrichmentExampleService.svc"
$config.InputProperties = "Author", "LastModifiedTime"
$config.OutputProperties = "TestProperty"
$config.SendRawData = $false
$config.Timeout = 30000
$config
Set-SPEnterpriseSearchContentEnrichmentConfiguration -SearchApplication $ssa -ContentEnrichmentConfiguration $config

The InputProperties parameter specifies the managed properties sent to the service.  The OutputProperties parameter specifies the managed properties returned by the service.  Note that both are case sensitive.  All managed properties referenced need to be created in advance.  Set the Timeout property higher to give yourself sufficient time to debug.  For a complete reference on parameters, see this MSDN reference.
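
If you need to adjust these settings later (for example, to add properties or change the timeout), you can inspect or clear the existing registration before running the script again.  A quick sketch, assuming the standard SharePoint 2013 search cmdlets are available in your Management Shell:

$ssa = Get-SPEnterpriseSearchServiceApplication
# Show the current content enrichment configuration
Get-SPEnterpriseSearchContentEnrichmentConfiguration -SearchApplication $ssa
# Remove the registration entirely if you want to start over
Remove-SPEnterpriseSearchContentEnrichmentConfiguration -SearchApplication $ssa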

After registering your content enrichment service, start a full crawl.  Again, ensure that your Content Enrichment Web Service is running in the debugger.  While it is crawling, you can set breakpoints as desired. 
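
If you prefer to start the crawl from PowerShell instead of Central Administration, something like the following should work.  This is a sketch that assumes the default "Local SharePoint sites" content source name; adjust it to match your farm.

$ssa = Get-SPEnterpriseSearchServiceApplication
# Start a full crawl of the content source (default name assumed; change to match your farm)
$contentSource = Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa -Identity "Local SharePoint sites"
$contentSource.StartFullCrawl()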

To verify the functionality after the crawl is complete, issue a query using REST in the browser like the one below.

http://server/_api/search/query?querytext='*'&selectproperties='title,path,author,testproperty'

This query will return every item in the index and include the new TestProperty field.  You can verify that the new property was included and has the expected result as shown in the example below.

[Screenshot: REST query results showing the TestProperty value]

I hope this gets you started with Content Enrichment Web Services.  I have a few follow-up posts planned covering more of the PowerShell parameters, but I hope this helps.

Again, you can find the complete source code and PowerShell script on MSDN Code.  Feel free to leave me a comment if you run into an issue or have a question.


More Stories By Corey Roth

Corey Roth, a SharePoint Server MVP, is a consultant at Hitachi Consulting specializing in SharePoint and Office 365 for clients in the energy sector. He has more than ten years of experience delivering solutions in the energy, travel, advertising and consumer electronics verticals.

Corey specializes in delivering ECM and search solutions to clients using SharePoint. Corey has always focused on rapid adoption of new Microsoft technologies including Visual Studio 2013, Office 365, and SharePoint.

He is a member of the .NET Mafia (www.dotnetmafia.com) where he blogs about the latest technology and SharePoint. He is dedicated to the community and speaks regularly at user groups and SharePoint Saturdays.
