Simplified Data Retention on a Massive Scale Speeds Access to Big Data

Organizations can gain competitive advantages when able to rely on data retention for improved decision making & trend analysis

There are numerous applications for cost-effective data retention. Organizations that can rely on retained data gain substantial competitive advantages through improved decision making and trend analysis, and research enterprises can work with large-scale data sets that let them study information more completely than ever before.

Simplified data retention on a massive scale speeds up access to Big Data. Big Data refers to data sets too large to analyze and manage with ordinary methods. This data, in both structured and unstructured form, is valuable and comes from sources such as trading systems.

In many cases existing systems cannot process data of this variety and volume. Some organizations store such data in file systems so as not to overburden their databases, but with Big Data growing at an exponential rate that stopgap will not suffice in the long run. Machine-generated data alone is likely to exceed the processing capability of conventional systems, and the cost of extracting value from it can be so high that many organizations simply shy away from it.

Today technology is just beginning to address Big Data issues. Many organizations try to manage this data with existing strategies, applying standard methods that range from relational database queries to complex analysis tools, and data retention software is also being used to extract relevant information from Big Data sources.

Big Data retention technology is now available that is scalable and easy to implement. With it, Big Data can be accessed online using SQL along with business intelligence software. Such a system combines storage platforms running specialized software with a massive-scale data repository built specifically for online data retention. It is designed to process machine-generated data at compression ratios of around 40:1 while keeping the data available online.
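
To make the compression claim concrete, a 40:1 ratio means the archive occupies roughly one-fortieth of the raw footprint. The figures below are illustrative only, not drawn from any vendor benchmark:

    compressed size  ≈  raw size / 40
    e.g. 10 TB of raw machine-generated data  ->  roughly 250 GB in the online archive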

Organizations that need to process Big Data may benefit from databases designed specifically for this purpose. Such databases are cost-effective, are already in use at numerous organizations internationally, and work in parallel, allowing tens of billions of records to be processed each day while retention capacity remains practically limitless. They can run on content-addressable storage (CAS), direct-attached storage (DAS) or a storage area network (SAN). The benefits include a smaller infrastructure footprint, through reduced physical storage demand, and effective, configurable record management.

One Big Data retention solution has three components. The first is a pair of server-level service managers that share metadata and provide import and query capability. The second is a data archive, residing on a cluster services node plus storage nodes, designed to scale to billions of objects. The third is shared storage, which can be local direct-attached storage, a network file system or a clustered file system.

This type of system was recently tested on 508 GB of artificially generated stock trading data modeled after NASDAQ. Data import ran at a rate of close to 12 billion records loaded within an hour. Compression reduced the data by 476.1 GB, leaving an archive only about 6.3% of the original size. A SQL query was then executed against 11.6 billion records to select the three largest-volume stocks, each with well over 4 million trades per day; it took approximately 5.5 seconds to complete.
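
As a rough illustration, the query described above might look something like the following, assuming a hypothetical archive table named trades with symbol, trade_date and shares columns (all identifiers are illustrative, not taken from the tested system):

    -- Three largest-volume stocks on a single trading day, restricted to symbols
    -- with well over 4 million trades that day. Table, column names and the date
    -- are hypothetical placeholders.
    SELECT   symbol,
             COUNT(*)    AS trade_count,
             SUM(shares) AS total_volume
    FROM     trades
    WHERE    trade_date = DATE '2012-01-03'   -- any single trading day
    GROUP BY symbol
    HAVING   COUNT(*) > 4000000               -- "well over 4 million trades per day"
    ORDER BY total_volume DESC
    LIMIT    3;                               -- the three largest-volume stocks

As a consistency check on the figures above, 508 GB reduced by 476.1 GB leaves roughly 31.9 GB, which is about 6.3% of the original data set.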

Big Data is high-volume, high-velocity and often highly variable as well. Big Data retention solutions can lead to better decision making, new discoveries and even process optimization. Science is a major area that stands to benefit: meteorology is just one field that can reap rewards from new advances in massive-scale data retention, and the ability to do research and analysis on extremely large data sets gives greater understanding to those modeling weather, oceanographic conditions, the economy or social trends. With cost-effective technology now available, many more organizations will consider the possibilities of Big Data retention in their enterprise.

More Stories By Alan McMahon

Alan McMahon has worked for Dell for the past 13 years and is involved in enterprise solution design across a range of products, from servers and storage to virtualization. He now focuses on marketing for Dell. He is based in Ireland and enjoys sailing as a pastime.
