How to: Use the SharePoint 2013 Content Enrichment Web Service

The Content Enrichment Web Service (CEWS) allows you to extend the functionality of SharePoint 2013 Search. Using CEWS, a developer can send the values of managed properties to an external web service and return new or modified managed properties to include in the index. The process involves implementing a custom WCF service and then registering it with PowerShell. The PowerShell cmdlet specifies which properties go into and come out of the service.

This post has been cross-posted to MSDN Code where you can download a working sample and deploy it.

This example will take the values of the Author and LastModifiedTime managed properties and write a new string such as "Modified by <author> on <LastModifiedTime>." to the managed property TestProperty. This property needs to be created before you try to use your Content Enrichment Web Service. The property should be configured as type Text with the following attributes: Query, Search, Retrieve, and Refine.

[Screenshot: the new TestProperty managed property in the Search Schema]

To get started, create a new WCF Service Project called ContentEnrichmentExampleService.

[Screenshot: creating the new WCF Service project in Visual Studio]

Once the project is created, you can delete the default service Service1.svc and IService1.cs as they won't be needed.

Next, you will need to add a reference to the following assembly. 

  • microsoft.office.server.search.contentprocessingenrichment.dll

This assembly can be found in the folder Installation Path\Microsoft Office Servers\15.0\Search\Applications\External.

Now, we need to create the service to do the content enrichment processing.  Create a new service called ContentEnrichmentExampleService.svc. 

[Screenshot: adding the ContentEnrichmentExampleService.svc service in Visual Studio]

Delete the file IContentEnrichmentExampleService.cs as it will not be needed. The custom service will implement IContentProcessingEnrichmentService instead.

Now we can start adding our code to ContentEnrichmentExampleService.svc.cs. This code will retrieve the values from the input properties, create our new output property TestProperty, and send it back to the search index.

Start by adding using statements for the namespaces in the assembly we referenced.

using Microsoft.Office.Server.Search.ContentProcessingEnrichment;

using Microsoft.Office.Server.Search.ContentProcessingEnrichment.PropertyTypes;

The interface the class currently implements will show as broken since you deleted it. Change the class to implement IContentProcessingEnrichmentService instead.

public class ContentEnrichmentExampleService : IContentProcessingEnrichmentService

Add a ProcessedItem field to hold the output managed property values from the service.

private readonly ProcessedItem processedItemHolder = new ProcessedItem

{

    ItemProperties = new List<AbstractProperty>()

};

Then, implement the ProcessItem method. This method receives the input managed properties and allows you to write code to generate the output managed properties.

public ProcessedItem ProcessItem(Item item)

{

 

}

Inside the ProcessItem method, initialize the ErrorCode and ItemProperties.

processedItemHolder.ErrorCode = 0;

processedItemHolder.ItemProperties.Clear();

We then need to create a new output managed property named TestProperty. The Property object is typed according to the type of the managed property you defined.

var testProperty = new Property<string>();

testProperty.Name = "TestProperty";

Now we are going to retrieve the input managed properties using a simple lambda expression. Remember that property names are case sensitive and need to match exactly how they appear on the Search Schema page. You also need to cast the object to the appropriate type. Since the Author managed property is a multi-valued property, we need to use List<string>. LastModifiedTime is a date, so we use a DateTime type.

var authorProperty = item.ItemProperties.FirstOrDefault(i => i.Name == "Author") as Property<List<string>>;

var writeProperty = item.ItemProperties.FirstOrDefault(i => i.Name == "LastModifiedTime") as Property<DateTime>;

Now, we need to verify that the properties aren't null.

if ((authorProperty != null) && (writeProperty != null))

{

 

}

We are then going to write out a new string to TestProperty in the format Modified by {Author} on {LastModifiedTime}. Since Author supports multiple values, only the first value is used. This value goes in the Value property. Once we set the value, we have to add it to the processedItemHolder so that the values can be sent back to the search index.

testProperty.Value = string.Format("Modified by {0} on {1}.", authorProperty.Value.First(), writeProperty.Value);

processedItemHolder.ItemProperties.Add(testProperty);

Return the processedItemHolder.

return processedItemHolder;
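
For reference, here is the whole service class with the snippets above assembled in one place. Treat it as a sketch: the namespace is assumed to match the project name, so adjust it to your own project.

using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.Office.Server.Search.ContentProcessingEnrichment;
using Microsoft.Office.Server.Search.ContentProcessingEnrichment.PropertyTypes;

namespace ContentEnrichmentExampleService
{
    public class ContentEnrichmentExampleService : IContentProcessingEnrichmentService
    {
        // Holds the output managed property values sent back to the search index.
        private readonly ProcessedItem processedItemHolder = new ProcessedItem
        {
            ItemProperties = new List<AbstractProperty>()
        };

        public ProcessedItem ProcessItem(Item item)
        {
            // Reset the error code and output properties for each crawled item.
            processedItemHolder.ErrorCode = 0;
            processedItemHolder.ItemProperties.Clear();

            // The generic type must match the managed property type (Text -> string).
            var testProperty = new Property<string> { Name = "TestProperty" };

            // Input property names are case sensitive. Author is multi-valued
            // (Property<List<string>>); LastModifiedTime is a Property<DateTime>.
            var authorProperty = item.ItemProperties.FirstOrDefault(i => i.Name == "Author") as Property<List<string>>;
            var writeProperty = item.ItemProperties.FirstOrDefault(i => i.Name == "LastModifiedTime") as Property<DateTime>;

            if ((authorProperty != null) && (writeProperty != null))
            {
                // Only the first Author value is used.
                testProperty.Value = string.Format("Modified by {0} on {1}.", authorProperty.Value.First(), writeProperty.Value);
                processedItemHolder.ItemProperties.Add(testProperty);
            }

            return processedItemHolder;
        }
    }
}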

At this point, we can run and debug our service using F5.  Leave the service running as it will be called when doing a full crawl.

To register the service with SharePoint, we use the New-SPEnterpriseSearchContentEnrichmentConfiguration cmdlet. Use the following PowerShell script to register the Content Enrichment Web Service. Verify that the Endpoint parameter contains the correct URL to your service. The example below has the location used in the source code I provided. If you start from scratch or you have deployed your service to a remote server, then you will need to update the address.

$ssa = Get-SPEnterpriseSearchServiceApplication
$config = New-SPEnterpriseSearchContentEnrichmentConfiguration
$config.Endpoint = "http://localhost:54641/ContentEnrichmentExampleService.svc"
$config.InputProperties = "Author", "LastModifiedTime"
$config.OutputProperties = "TestProperty"
$config.SendRawData = $false
$config.Timeout = 30000
$config
Set-SPEnterpriseSearchContentEnrichmentConfiguration -SearchApplication $ssa -ContentEnrichmentConfiguration $config

The InputProperties parameter specifies the managed properties sent to the service. The OutputProperties parameter specifies the managed properties returned by the service. Note that both are case sensitive. All managed properties referenced need to be created in advance. Set the Timeout property higher to give yourself sufficient time to debug. For a complete reference on the parameters, see this MSDN reference.

After registering your content enrichment service, start a full crawl.  Again, ensure that your Content Enrichment Web Service is running in the debugger.  While it is crawling, you can set breakpoints as desired. 

To verify the functionality after the crawl is complete, issue a query using REST in the browser like the one below.

http://server/_api/search/query?querytext='*'&selectproperties='title,path,author,testproperty'

This query will return every item in the index and include the new TestProperty field.  You can verify that the new property was included and has the expected result as shown in the example below.

[Screenshot: REST API query results showing the TestProperty value]
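
If you would rather check from code than the browser, a small C# sketch along these lines issues the same query and dumps the JSON so you can look for the testproperty values. It assumes Windows authentication and a placeholder server URL, so adjust both to your environment.

using System;
using System.Net;

class VerifyTestProperty
{
    static void Main()
    {
        // Placeholder URL; point this at your own SharePoint site.
        string url = "http://server/_api/search/query?querytext='*'" +
                     "&selectproperties='title,path,author,testproperty'";

        // Assumes your account can query the site via Windows authentication.
        var client = new WebClient { UseDefaultCredentials = true };
        client.Headers[HttpRequestHeader.Accept] = "application/json;odata=verbose";

        // Inspect the output for values such as "Modified by <author> on <date>."
        Console.WriteLine(client.DownloadString(url));
    }
}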

I hope this gets you started with Content Enrichment Web Services. I have a few follow-up posts planned covering more of the PowerShell parameters.

Again, you can find the complete source code and PowerShell script on MSDN Code.  Feel free to leave me a comment if you run into an issue or have a question.


More Stories By Corey Roth

Corey Roth, a SharePoint Server MVP, is a consultant at Hitachi Consulting specializing in SharePoint and Office 365 for clients in the energy sector. He has more than ten years of experience delivering solutions in the energy, travel, advertising and consumer electronics verticals.

Corey specializes in delivering ECM and search solutions to clients using SharePoint. Corey has always focused on rapid adoption of new Microsoft technologies including Visual Studio 2013, Office 365, and SharePoint.

He is a member of the .NET Mafia (www.dotnetmafia.com) where he blogs about the latest technology and SharePoint. He is dedicated to the community and speaks regularly at user groups and SharePoint Saturdays.
