By Laurent Simoneau
May 4, 2011 08:15 AM EDT
IDC analysts predict that by 2020 there will be 15 quintillion files in existence. That kind of volume has pushed us to terms like "exabyte" (and "quintillion") just to describe how much data exists. How much of that data will your organization own, and how will you access it?
Dispersed across multiple systems, silos, and geographies, data on its own provides no real inherent value. The native search box within an application shows only a narrow slice of your data - the slice contained in that one application - which is rarely enough to make fast, informed business decisions. What about data contained in customer and employee communities? How can you access the valuable information locked in these rapidly growing, knowledge-intensive communities?
Let's face it - data will never be contained; it will continue to proliferate, particularly with the growing popularity of social networks and communities. Trying to move it all into a single knowledge base or other system of record is a losing battle. Instead, the key to unleashing your data's potential lies in the ability to access it anywhere, anytime, across any and all systems.
We call this "Enterprise Search 2.0," which enables today's organizations to transform vast stores of raw data into actionable knowledge. Not only can enterprises now index and search enormous quantities of information, but they also can put that data to use and monetize it. In a world of quintillions of files and exabytes of data, Enterprise Search 2.0 makes great business sense.
A Unified Index Provides a New Alternative
With generations X and Y comprising the majority of today's workforce, these "digital natives" expect to find whatever they need - immediately - inside the enterprise, just as they can outside of it. Blame it on Google if you'd like. Because these digital natives want more freedom than systems such as SAP or SharePoint generally offer, the consumerization of search is bound to explode within the enterprise. Information relevancy, moreover, is both personal and contextual, and requires a "learning engine" that understands both. Combine these factors and you find an overwhelming need for Enterprise Search 2.0 among most knowledge-intensive organizations.
The characteristics of Enterprise Search 2.0 can be categorized as follows:
- Content federation: the consolidation and correlation of structured and unstructured information, regardless of source or format. Enterprise Search 2.0 pushes beyond repository-centric information retrieval to leverage information through composite information mash-ups and dashboards. Compiled from multiple data sources, these dashboards display search results in tables, charts, and other easy-to-digest views, with the ability to dive deeper into the details.
- Quality information access: comprehensive, relevant, and just-in-time. Control over relevance is placed in the hands of the user through self-service, dynamic interfaces (dashboards, search interfaces, analytics, graphics).
- A unified data index: a single layer containing all the content the enterprise will search, both unstructured and structured.
The central, unified index is perhaps the defining characteristic of Enterprise Search 2.0; it increases the value of both the content and the systems in which the content is housed, because it decouples access from the content source, making access source-agnostic. This is a significant breakthrough in enterprise search technology.
Data is pulled into the unified index from virtually any enterprise system, from systems that are behind the firewall to those in the cloud, and includes access to social networking and community content, and web content as well. Following the initial index, data is re-indexed as changes and updates are made, enabling searches that always provide the most recent, relevant content.
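To make the mechanics concrete, here is a minimal sketch of a unified index, written as a toy Python inverted index. All names and the whitespace tokenizer are illustrative assumptions, not any vendor's actual implementation; a real engine would add connectors, access control, and relevance ranking. The key ideas from the text are there: documents from any source share one index keyed by (source, doc_id), and re-indexing an updated document replaces its stale entries.

```python
from collections import defaultdict

class UnifiedIndex:
    """Toy unified index: documents from many sources share one inverted
    index, keyed by (source, doc_id) so updates replace stale postings."""

    def __init__(self):
        self.postings = defaultdict(set)  # term -> set of (source, doc_id)
        self.docs = {}                    # (source, doc_id) -> latest text

    def index(self, source, doc_id, text):
        key = (source, doc_id)
        # Re-indexing step: drop postings for the old version first.
        if key in self.docs:
            for term in self.docs[key].lower().split():
                self.postings[term].discard(key)
        self.docs[key] = text
        for term in text.lower().split():
            self.postings[term].add(key)

    def search(self, query):
        terms = query.lower().split()
        if not terms:
            return []
        hits = set.intersection(*(self.postings.get(t, set()) for t in terms))
        return sorted(hits)

idx = UnifiedIndex()
idx.index("crm", "42", "customer renewal contract")
idx.index("wiki", "7", "renewal process for contracts")
idx.index("crm", "42", "customer churned contract cancelled")  # update re-indexes
print(idx.search("renewal"))  # -> [('wiki', '7')] : the stale CRM text no longer matches
```

The point of the sketch is the update path: because the index, not the source system, answers queries, a change in any connected repository only requires re-indexing that one document for searches to stay current.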
Enterprise Search 2.0, unlike basic 1.0 search, integrates with the full knowledge ecosystem. Whether the information is structured or unstructured, in text or voice, Enterprise Search 2.0 brings it into a single, unified index from which companies can provide self-service information access to various constituencies, from specific groups of employees and customers to prospects and partners.
By combining structured and unstructured data from virtually any enterprise system into this central index, companies not only gain superior insight into composite information; they also deliver greater value by leveraging existing technologies, avoiding the significant costs of system integrations and data migration projects, and sparing themselves from forcing their processes into a one-size-fits-all, cookie-cutter framework.
Rather than searching within individual systems, and then amalgamating the search results from multiple search tools, Enterprise Search 2.0 federates the content. For example, Enterprise Search 2.0 can correlate information from disparate sources and formats and assemble a new, consolidated, dashboard view of information for the user, which would not be available from each independent content source, or which would require significant amounts of time to research and correlate manually. Compiled from multiple data sources, search results appear in tables, charts, and other easy-to-digest dashboard views; the user can also dive deeper into the details by clicking on the charts and other elements, effectively conversing with the information.
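The federation pattern described above can be sketched in a few lines. The connector functions below are hypothetical stand-ins for real source adapters (a CRM, a file share, a community forum), and the flat 0-to-1 score scale is an assumption made for illustration; real systems must also normalize scores across heterogeneous sources before merging.

```python
# Hypothetical connectors: each wraps one content source's native search.
def search_crm(query):
    return [{"source": "crm", "title": "Acme account record", "score": 0.9}]

def search_files(query):
    return [{"source": "files", "title": "Acme contract.pdf", "score": 0.7}]

def search_forum(query):
    return [{"source": "forum", "title": "Acme install tips", "score": 0.8}]

def federated_search(query, connectors):
    """Fan the query out to every connector, then merge the hits into a
    single relevance-ranked view, so the user never visits each system."""
    merged = []
    for connector in connectors:
        merged.extend(connector(query))
    return sorted(merged, key=lambda hit: hit["score"], reverse=True)

results = federated_search("acme", [search_crm, search_files, search_forum])
for hit in results:  # one consolidated list: crm, then forum, then files
    print(f'{hit["score"]:.1f}  [{hit["source"]}]  {hit["title"]}')
```

In a dashboard, the merged list would feed tables and charts rather than a flat print-out, but the correlation step is the same: one query, many sources, one consolidated result set.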
Dashboards help companies become more agile and facilitate better and faster decision-making at the top of the organization, as well as providing cross-functional awareness and collaboration. In this way, Enterprise Search 2.0 makes real-time, customized business analytics available to all employees.
Stop Moving Data
My advice to organizations is to stop moving data. In an effort to better manage and access data, companies have spent countless resources moving information to centralized "systems of record," only to have the data continue to proliferate outside of those systems. The software industry has promised for decades a single, integrated solution to handle all enterprise information needs, but not only has that solution not materialized, the number of systems and content sources has continued to proliferate.
Moving data is a losing game. With a unified index, though, organizations can eliminate this costly process and leverage their existing IT infrastructure while gaining actionable insight into information and knowledge.
Enterprise Search 2.0 is an important, transformational technology that is changing the way employees work. Enterprise Search 2.0 helps connect people to people through information, and provides the relevant content and context that helps organizations unleash their data's potential into on-demand, actionable knowledge that better - and more quickly - informs critical business decisions.