Big Data, Open Data and Cloud Strategy

Open Data initiatives should be based on strong foundations of technologies such as Shared Services, Big Data and Cloud

The Big Data and Cloud market has been growing at a staggering pace. Data volumes have become too large for relational database systems to handle alone, and there is a need to provision and manage elastic, scalable systems effectively. Information technology is undergoing a major shift driven by new paradigms and a variety of delivery channels; the drivers include social networks and the proliferation of devices such as tablets and phones. Social business and collaboration continue to develop, enhancing productivity and interaction. For some time there was a void in the Big Data area and a need for solutions that could manage it: so much focus went into user interfaces that few organizations were thinking about the core, the data itself. With the proliferation of large, unstructured data sets, it is now important to extract and process data from different systems expeditiously. To deliver strategic business value, organizations need the capability to process Big Data and the analytics for enhanced decision making. In addition, systems that process Big Data can rely on the Cloud to rapidly provision and deploy elastic, scalable infrastructure.

The key elements of a comprehensive strategy for Big Data, Open Data and Cloud include conducting a cost-benefit analysis, hiring resources with the right skills, evaluating data and analytics requirements, building a platform that can process and analyze large volumes of data quickly, and developing strong analytic capabilities to answer important business questions. A sound strategy also includes assessing existing and future data, services and applications, as well as projected growth, and ensuring that the infrastructure can store and support unstructured as well as structured data. Data protection, including security and privacy, is a critical part of the strategy: with the evolution to complex data sets, data can be compromised at the endpoints or in transit, so proper security controls must be developed to address these risks. Organizations also need policies, practices and procedures that support an effective transition to these technologies.

As part of the strategic transition to Big Data and Cloud, it is important to select a platform that can handle such data, parse records quickly and provide adequate storage. Given the high velocity of data flowing through systems, in-memory analytics and fast processing are key capabilities the platform should support, along with good application development tools and the ability to manage, provision and monitor systems effectively. The platform should provide components and connectors for Big Data so that integrated solutions can be built. From a development perspective, open source software such as Hadoop, Hive, Pig and R is widely leveraged for Big Data. Hadoop was developed as a framework for the distributed processing of large data sets that scales out across clusters, and it can handle data from diverse systems, whether structured, unstructured or media. NoSQL databases are used to store data that does not fit a relational schema. In addition, vendors offer proprietary Hadoop-based solutions; the choice between a proprietary and an open source solution depends on many factors and requires a thorough assessment.
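As a sketch of the programming model these frameworks are built around, the classic word-count example can be expressed as a map phase followed by a reduce phase. The version below runs locally in plain Python for clarity; on a real Hadoop cluster the same logic would be distributed across many nodes, and the data here is a made-up example.

```python
# Minimal local illustration of the MapReduce model popularized by Hadoop:
# a word-count job split into a map step and a reduce step.
from collections import defaultdict
from typing import Dict, Iterable, List, Tuple

def map_phase(lines: Iterable[str]) -> List[Tuple[str, int]]:
    """Emit a (word, 1) pair for every word, like a Hadoop mapper."""
    pairs = []
    for line in lines:
        for word in line.lower().split():
            pairs.append((word, 1))
    return pairs

def reduce_phase(pairs: Iterable[Tuple[str, int]]) -> Dict[str, int]:
    """Sum the counts for each key, like a Hadoop reducer."""
    counts: Dict[str, int] = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

records = ["big data needs big platforms", "open data needs standards"]
counts = reduce_phase(map_phase(records))
print(counts)
```

In a real deployment the framework handles the shuffle between the two phases, fault tolerance and data locality; the developer supplies only the map and reduce logic.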

Systems that process Big Data need the Cloud for rapid provisioning and deployment. The elastic, scalable nature of the Cloud supports the storage and management of massive amounts of data. Data can be ingested into cloud-based storage, or database adapters can be used to pull it from existing databases into Hadoop, Pig and Hive. Vendors also offer data transfer services that move Big Data to and from the Cloud. The Cloud adds dynamic computing, elasticity, self-service and measured service, among other characteristics, enabling rapid provisioning and on-demand access. Cloud solutions may offer lower life-cycle costs based on usage, and their monitoring capabilities can provide a holistic view of usage, cost assessments and chargeback information. All of this enhances an organization's ability to plan and react to changes based on performance and capacity metrics.
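To make the "measured service" and chargeback idea concrete, the sketch below computes a monthly charge for each team from raw usage metrics. The team names, metrics and rates are illustrative assumptions, not any vendor's actual pricing model.

```python
# Sketch of cloud chargeback: turn measured usage into a per-team cost.
# Rates and usage figures are invented for illustration only.
RATE_PER_VM_HOUR = 0.10   # assumed rate, USD per VM-hour
RATE_PER_GB      = 0.02   # assumed rate, USD per GB-month

usage = [
    {"team": "analytics", "vm_hours": 1200, "storage_gb": 500},
    {"team": "web",       "vm_hours": 300,  "storage_gb": 50},
]

def chargeback(record):
    """Return the monthly charge for one team's measured usage."""
    return (record["vm_hours"] * RATE_PER_VM_HOUR
            + record["storage_gb"] * RATE_PER_GB)

for r in usage:
    print(r["team"], round(chargeback(r), 2))
```

Aggregating these figures over time is what gives the holistic view of usage and cost that supports capacity planning.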

Open Data initiatives should be based on strong foundations of technologies such as Shared Services, Big Data and Cloud. There are initiatives underway related to Open Data that drive the development and deployment of innovative applications. Making data accessible enables the development of new products and services. This data should be made available in a standardized manner so that developers can utilize it quickly and effectively. Open Data maximizes value creation on top of existing structured and unstructured data.

An Open Data strategy should define specifically which data will be made available, based on the utility of that information. Simply publishing massive dumps of data that are hard to use is not the solution; there has to be proper processing that extracts useful information from the data. Published data should support automated processing so that custom applications can be built on it, and it should be renderable in formats such as HTML and XML. This can promote a greater number of not just traditional applications but also mobile applications. Strong emphasis on security and privacy is essential, since any errors can compromise important information once the data is made accessible. A comprehensive strategy for Big Data, Cloud and Open Data will enable a smooth transition to achieve big wins!
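A minimal sketch of publishing the same open-data records in more than one machine-readable format, so developers can consume whichever suits their application. The dataset (air-quality readings per station) is a made-up example, and only standard-library JSON and CSV rendering is shown.

```python
# Render one set of open-data records as both JSON and CSV,
# using only the Python standard library. The records are invented.
import csv
import io
import json

records = [
    {"station": "A-01", "pm25": 12.4},
    {"station": "B-07", "pm25": 30.1},
]

# JSON rendering: convenient for web and mobile applications.
as_json = json.dumps(records)

# CSV rendering: convenient for spreadsheets and bulk analysis.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["station", "pm25"])
writer.writeheader()
writer.writerows(records)
as_csv = buf.getvalue()

print(as_json)
print(as_csv)
```

Offering standardized formats like these, with documented field names and units, is what lets third-party developers build applications on the data quickly.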

(This article is extracted from and refers to the author's blog. All views and information expressed here do not represent the positions or views of any other person or organization.)

More Stories By Ajay Budhraja

Ajay Budhraja has over 24 years in Information Technology with experience in areas such as Executive leadership, management, strategic planning, enterprise architecture, system architecture, software engineering, training, methodologies, networks, and databases. He has provided Senior Executive leadership for nationwide and global programs and has implemented integrated Enterprise Information Technology solutions.

Ajay has a Masters in Engineering (Computer Science), and a Masters in Management and Bachelors in Engineering. He is a Project Management Professional certified by the PMI and is also CICM, CSM, ECM (AIIM) Master, SOA, RUP, SEI-CMMI, ITIL-F, Security + certified.

Ajay has led large-scale projects for big organizations and has extensive IT experience related to telecom, business, manufacturing, airlines, finance and government. He has delivered internet based technology solutions and strategies for e-business platforms, portals, mobile e-business, collaboration and content management. He has worked extensively in the areas of application development, infrastructure development, networks, security and has contributed significantly in the areas of Enterprise and Business Transformation, Strategic Planning, Change Management, Technology innovation, Performance management, Agile management and development, Service Oriented Architecture, Cloud.

Ajay has been leading organizations as a Senior Executive. He is the Chair of the Federal SOA COP, Chair of Cloud Solutions and a MidTech Leadership Steering Committee member, and has served as President of DOL-APAC and AEA-DC and as Co-Chair of the Executive Forum, Federal Executive Institute SES Program. As Adjunct Faculty, he has taught courses for several universities. He has received many awards, authored articles and presented papers at worldwide conferences.
