Big Data, Open Data and Cloud Strategy

Open Data initiatives should be based on strong foundations of technologies such as Shared Services, Big Data and Cloud

The Big Data and Cloud market has been growing at a staggering pace. Data is becoming too large and unwieldy to be handled by relational database systems alone, and there is a need to provision and manage elastic, scalable systems effectively. Information technology is undergoing a major shift driven by new paradigms and a variety of delivery channels. The drivers for these technologies are social networks and the proliferation of devices such as tablets and phones. Social business and collaboration continue to develop, enhancing productivity and interaction. There has been a big void in the Big Data area and a need for solutions that can manage it. Part of the problem has been that so much focus went to user interfaces that few organizations were thinking about the core: data. Now, with the proliferation of large and unstructured data, it is important to extract and process large data sets from different systems expeditiously. To deliver strategic business value, organizations need the capability to process Big Data and the analytics to support enhanced decision making. In addition, systems that process Big Data can rely on the Cloud to rapidly provision and deploy elastic, scalable systems.

The key elements of a comprehensive strategy for Big Data, Open Data and Cloud include conducting a cost-benefit analysis, hiring resources with the right skills, evaluating requirements for data and analytics, developing a sound platform that can process and analyze large volumes of data quickly, and building strong analytic capabilities to answer important business questions. A sound strategy also includes assessing existing and future data, services and applications, as well as the projected growth. In addition, there should be a focus on ensuring that the infrastructure can support and store unstructured as well as structured data. Data protection, including security and privacy, is a critical part of the strategy. With the evolution to complex data sets, data can be compromised at the end points or while it is being transmitted, so proper security controls have to be developed to address these risks. Organizations also need to develop policies, practices and procedures that support an effective transition to these technologies.

As part of the strategic transition to Big Data and Cloud, it is important to select a platform that can handle such data, parse through records quickly, and provide adequate storage. With the high velocity of data coming through systems, in-memory analytics and fast processing are key capabilities the platform should support. It should also offer good application development capabilities along with effective management, provisioning and monitoring of systems. The platform should provide components and connectors for Big Data so that integrated solutions can be built. From a development perspective, open source software such as Hadoop, Hive, Pig and R is being leveraged for Big Data. Hadoop was developed as a framework for the distributed processing of large data sets and for scaling as data volumes grow. Hadoop can handle data from diverse systems, including structured, unstructured and media data. NoSQL databases are used by organizations to store data that is not structured. In addition, there are vendors who offer proprietary Hadoop-based solutions. The choice between a proprietary and an open source solution depends on many factors and requires a thorough assessment.
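
To make the distributed processing idea concrete, here is a minimal sketch of the kind of mapper and reducer that could be run with Hadoop Streaming to count records by category in a large, semi-structured file. The tab-separated layout, file names and field positions are assumptions for illustration, not part of any specific platform.

```python
#!/usr/bin/env python3
# Minimal Hadoop Streaming-style mapper and reducer (illustrative sketch).
# Assumes tab-separated input records whose first field is a category.
# Test locally: cat data.tsv | python3 count_by_category.py map | sort | python3 count_by_category.py reduce
import sys

def mapper():
    # Emit "category<TAB>1" for every input record.
    for line in sys.stdin:
        fields = line.rstrip("\n").split("\t")
        if fields and fields[0]:
            print(f"{fields[0]}\t1")

def reducer():
    # Sum counts per category; input arrives sorted by key, as Hadoop guarantees.
    current_key, total = None, 0
    for line in sys.stdin:
        key, _, value = line.rstrip("\n").partition("\t")
        if key != current_key:
            if current_key is not None:
                print(f"{current_key}\t{total}")
            current_key, total = key, 0
        total += int(value or 0)
    if current_key is not None:
        print(f"{current_key}\t{total}")

if __name__ == "__main__":
    mapper() if len(sys.argv) > 1 and sys.argv[1] == "map" else reducer()
```

Supplied to Hadoop Streaming as the mapper and reducer, scripts like these are distributed across the cluster by the framework; higher-level tools such as Hive or Pig express the same aggregation declaratively.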

Systems that process Big Data need the Cloud for rapid provisioning and deployment. The elastic and scalable nature of the Cloud supports the storage and management of massive amounts of data. Data can be stored directly in a Cloud-based storage solution, or database adapters can be used to pull data from existing databases into Hadoop, Pig and Hive. Vendors also offer data transfer services that move Big Data to and from the Cloud. The Cloud adds dynamic computing, elasticity, self-service and measured usage, along with rapid provisioning and on-demand access. Cloud solutions may offer lower life-cycle costs based on usage, and their monitoring capabilities can provide a holistic view of usage, cost assessments and chargeback information. All of this information enhances the organization's ability to plan and react to change based on performance and capacity metrics.
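
As one hedged example of staging data in Cloud storage so that Hadoop or Hive jobs can read it, the sketch below uses the AWS boto3 SDK to upload a local extract and list what has already been stored. The bucket name, prefix and file path are illustrative assumptions, and any object store with a comparable API would serve the same role.

```python
# Illustrative sketch: staging a local data extract in cloud object storage (Amazon S3 via boto3).
# The bucket name, key prefix and local path below are assumptions for illustration only.
import boto3

s3 = boto3.client("s3")  # credentials are resolved from the environment or AWS config

BUCKET = "example-bigdata-landing"      # hypothetical landing bucket
PREFIX = "raw/clickstream/2017-03-01/"  # hypothetical partition-style prefix

# Upload a local extract so that downstream Hadoop/Hive jobs can read it from the Cloud.
s3.upload_file("clickstream_extract.tsv", BUCKET, PREFIX + "clickstream_extract.tsv")

# List what has already been staged under the prefix, e.g. to drive capacity and provisioning decisions.
response = s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX)
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])
```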

Open Data initiatives should be based on strong foundations of technologies such as Shared Services, Big Data and Cloud. There are initiatives underway related to Open Data that drive the development and deployment of innovative applications. Making data accessible enables the development of new products and services. This data should be made available in a standardized manner so that developers can utilize it quickly and effectively. Open Data maximizes value creation built on existing structured and unstructured data.

An Open Data strategy and its initiatives should define specific requirements for what data will be made available, based on the utility of that information. Simply providing massive dumps of data that are hard to use is not the solution; there has to be proper processing that can extract useful information from the data. The data that is published should support automated processing so that custom applications can be developed, and it should be renderable as HTML, XML and other machine-readable formats, as in the sketch below. This can promote a greater number of not just traditional applications but also mobile applications. There has to be a strong emphasis on security and privacy, since any errors can compromise important information once the data is made accessible. A comprehensive strategy for Big Data, Cloud and Open Data will enable a smooth transition to achieve big wins!
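
To illustrate the machine-readable formats point, here is a small, hedged sketch that takes a hypothetical open dataset published as CSV and re-renders it as JSON and XML so that custom and mobile applications can consume it automatically. The file name and column handling are assumptions for illustration.

```python
# Illustrative sketch: re-rendering a hypothetical open dataset (CSV) as JSON and XML
# so it can be consumed programmatically. File and column names are assumptions.
import csv
import json
import xml.etree.ElementTree as ET

def load_rows(path):
    # Read the published CSV into a list of dictionaries keyed by column header.
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))

def to_json(rows, path):
    # Write the full dataset as a JSON array of records.
    with open(path, "w", encoding="utf-8") as f:
        json.dump(rows, f, indent=2)

def to_xml(rows, path, record_tag="record"):
    # Wrap each record in an XML element, sanitizing column names into valid tags.
    root = ET.Element("dataset")
    for row in rows:
        rec = ET.SubElement(root, record_tag)
        for key, value in row.items():
            tag = (key or "field").strip().replace(" ", "_")
            ET.SubElement(rec, tag).text = value
    ET.ElementTree(root).write(path, encoding="utf-8", xml_declaration=True)

if __name__ == "__main__":
    rows = load_rows("open_data_extract.csv")   # hypothetical published dataset
    to_json(rows, "open_data_extract.json")
    to_xml(rows, "open_data_extract.xml")
```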

(This has been extracted from and is a reference to a blog post. All views and information expressed here do not represent the positions and views of anyone else or any organization.)

More Stories By Ajay Budhraja

Ajay Budhraja has over 24 years in Information Technology with experience in areas such as Executive leadership, management, strategic planning, enterprise architecture, system architecture, software engineering, training, methodologies, networks, and databases. He has provided Senior Executive leadership for nationwide and global programs and has implemented integrated Enterprise Information Technology solutions.

Ajay has a Master's in Engineering (Computer Science), a Master's in Management, and a Bachelor's in Engineering. He is a Project Management Professional certified by PMI and also holds CICM, CSM, ECM (AIIM) Master, SOA, RUP, SEI-CMMI, ITIL-F, and Security+ certifications.

Ajay has led large-scale projects for big organizations and has extensive IT experience related to telecom, business, manufacturing, airlines, finance and government. He has delivered internet based technology solutions and strategies for e-business platforms, portals, mobile e-business, collaboration and content management. He has worked extensively in the areas of application development, infrastructure development, networks, security and has contributed significantly in the areas of Enterprise and Business Transformation, Strategic Planning, Change Management, Technology innovation, Performance management, Agile management and development, Service Oriented Architecture, Cloud.

Ajay has been leading organizations as a Senior Executive. He is the Chair of the Federal SOA COP and Chair of Cloud Solutions, is a MidTech Leadership Steering Committee member, and has served as President of DOL-APAC and AEA-DC and as Co-Chair of the Executive Forum, Federal Executive Institute SES Program. As adjunct faculty, he has taught courses for several universities. He has received many awards, authored articles, and presented papers at worldwide conferences.
