By Rupert Tagnipes
July 6, 2012 03:30 PM EDT
For many years, companies collected data from various sources that often found its way into relational databases like Oracle and MySQL. However, the rise of the Internet, Web 2.0, and, more recently, social media has driven an enormous increase in both the volume and the variety of data created. No longer is data limited to types that fit neatly into standard database fields; it now comes in the form of photos, geographic information, chats, Twitter feeds, and emails. The age of Big Data is upon us.
Big Data Beginnings
A study by IDC titled "The Digital Universe Decade" projects a 45-fold increase in annual data by 2020. In 2010, the amount of digital information was 1.2 zettabytes (1 zettabyte equals 1 trillion gigabytes). To put that in perspective, 1.2 zettabytes is the equivalent of a full-length episode of "24" running continuously for 125 million years, according to IDC. That's a lot of data. More importantly, this data has to go somewhere, and IDC's report projects that by 2020, more than one-third of all digital information created annually will either live in or pass through the cloud. With all this data being created, the challenge will be collecting, storing, and analyzing it to extract meaning.
Business intelligence (BI) systems have always had to deal with large data sets. Typically, the strategy was to pull in "atomic" data at the lowest level of granularity, then aggregate the information into a consumable format for end users. In fact, it was preferable to have a lot of data, because you could also drill down from the aggregation layer to get at more detailed information as needed.
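To make that pattern concrete, here is a minimal sketch in Python using the pandas library; the column names and sample records are hypothetical. It rolls atomic rows up into an aggregate, then drills back down to the detail behind one aggregate value:

```python
import pandas as pd

# Hypothetical atomic-level sales records (lowest granularity).
atomic = pd.DataFrame({
    "region":   ["East", "East", "West", "West", "West"],
    "store":    ["E1", "E2", "W1", "W1", "W2"],
    "sale_usd": [120.0, 80.0, 200.0, 50.0, 95.0],
})

# Aggregation layer: roll atomic rows up to a consumable summary.
by_region = atomic.groupby("region")["sale_usd"].sum()
print(by_region)

# Drill-down: from the "West" aggregate back to its detailed rows.
west_detail = atomic[atomic["region"] == "West"]
print(west_detail)
```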
In other words, large data sets have been around a long time, and there have been many attempts to manage, wrangle, and tame the onslaught of data being generated from everywhere. But it wasn't until Jeffrey Dean and Sanjay Ghemawat of Google Labs published their influential paper on MapReduce in 2004 that Big Data really started to take shape. Google had to deal with large amounts of raw data (such as crawled documents and web request logs) that needed to be analyzed in a timely manner. MapReduce was their way of abstracting the parallelization of computation, the distribution of data, fault tolerance, and load balancing away from developers, so they could focus on expressing the computations necessary to analyze the data. This seminal paper reportedly inspired Doug Cutting to develop an open-source implementation of the MapReduce framework called "Hadoop," named after his son's toy elephant. Yahoo famously embraced this implementation after hiring Cutting in 2006, continued to build on the technology, and first used Hadoop in production in 2008 for its search "webmap," an index of all known webpages and all the metadata needed to search them.
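The heart of the model is easy to illustrate. The following toy Python sketch runs the paper's canonical word-count example on a single machine; in a framework like Hadoop, the same map and reduce functions would be distributed across a cluster, with the shuffle handled by the framework:

```python
from collections import defaultdict

def map_phase(document):
    # Emit a (word, 1) pair for every word, as in the paper's word-count example.
    for word in document.split():
        yield (word, 1)

def shuffle(pairs):
    # Group intermediate values by key; a real framework does this across the cluster.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    # Sum the counts for each word.
    return (key, sum(values))

docs = ["big data big clusters", "big data everywhere"]
pairs = [pair for doc in docs for pair in map_phase(doc)]
counts = [reduce_phase(k, v) for k, v in shuffle(pairs).items()]
print(sorted(counts))  # [('big', 3), ('clusters', 1), ('data', 2), ('everywhere', 1)]
```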
One of the key characteristics of Hadoop is that it can run on commodity hardware and automatically distribute jobs. By its nature, it is designed to be fault tolerant, so jobs aren't impacted by the failure of a single node. According to an article in Wired magazine about Yahoo's use of Hadoop, "Hadoop could 'map' tasks across a cluster of machines, splitting them into tiny sub-tasks, before 'reducing' the results into one master calculation." Soon after, companies like eBay and Facebook began adopting the technology and implementing it internally. Reportedly, Facebook has the largest Hadoop cluster in the world, currently at 30 petabytes (PB).
Although early adopters of Hadoop and other Big Data technologies tended to cluster around the Internet, social media, and ad networks, Big Data solutions are intended to be general-purpose tools. With most companies now integrating social media into their offerings, the amount of data created internally, combined with the data gathered externally, will only increase. This indicates that companies in every industry will need to start investigating how to implement Big Data technologies to make use of all the data they're collecting and creating.
Making Sense of Big Data
The problem with the term Big Data is that it's used in a lot of different ways. One definition is that Big Data is any data set that is too large for on-hand data management tools. According to Martin Wattenberg, a scientist at IBM, "The real yardstick ... is how it [Big Data] compares with a natural human limit, like the sum total of all the words that you'll hear in your lifetime." Essentially, what makes something Big Data is that it:
- Is at a large scale (petabytes, not gigabytes)
- Has high velocity (frequently polled, generated, or collected)
- Is unstructured (not only from a relational database)
Collecting that data is a solvable problem; making sense of it, particularly in real time, is the challenge this technology tries to solve. This new class of technology is often grouped under the label NoSQL (or "Not Only SQL") and includes distributed databases that are a departure from relational databases like Oracle and MySQL. These systems are specifically designed to parallelize computation, distribute data, and tolerate faults across a large cluster of servers. Some examples of NoSQL projects and software are Cassandra, Hadoop, Membase, MongoDB, and Riak.
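For a flavor of what working with one of these systems looks like, here is a minimal MongoDB sketch using the pymongo driver. The host, database, collection, and field names are hypothetical, and it assumes a MongoDB instance running locally:

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017/")
events = client["analytics"]["events"]  # hypothetical database and collection

# Documents in the same collection need not share a fixed schema.
events.insert_one({"user": "u42", "action": "click", "page": "/home"})
events.insert_one({"user": "u7", "action": "tweet", "geo": [37.77, -122.42]})

# Query by field without declaring a table structure up front.
for doc in events.find({"action": "click"}):
    print(doc)
```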
The techniques vary, but there is a definite distinction between SQL relational databases and their NoSQL brethren. Most notably, NoSQL systems share the following characteristics:
- Do not use SQL as their primary query language
- May not require fixed table schemas
- May not give full ACID guarantees (Atomicity, Consistency, Isolation, Durability)
- Scale horizontally
Because of the lack of full ACID guarantees, NoSQL is used when performance and real-time results are more important than consistency. For example, if a company wants to update its website in real time based on an analysis of how a particular user interacts with the site, it will most likely turn to NoSQL technologies to solve this use case.
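As an illustration of that trade-off, the following sketch uses the DataStax Python driver for Cassandra to issue a write at a relaxed consistency level. The keyspace, table, and columns are hypothetical, and it assumes a reachable Cassandra cluster:

```python
from cassandra import ConsistencyLevel
from cassandra.cluster import Cluster
from cassandra.query import SimpleStatement

# Assumes a local Cassandra node and a hypothetical page_views table
# in a hypothetical "analytics" keyspace.
session = Cluster(["127.0.0.1"]).connect("analytics")

# ConsistencyLevel.ONE favors speed: the write is acknowledged by a single
# replica, trading strict consistency for low latency.
fast_write = SimpleStatement(
    "INSERT INTO page_views (user_id, page, ts) VALUES (%s, %s, toTimestamp(now()))",
    consistency_level=ConsistencyLevel.ONE,
)
session.execute(fast_write, ("u42", "/home"))
```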
However, this shortcoming doesn't mean relational databases are going away. In fact, in larger implementations, NoSQL and SQL are likely to function together. Just as NoSQL was designed to solve a particular set of use cases, relational databases solve theirs: they excel at organizing structured data and remain the standard for ad-hoc analytics and BI reporting. Apache Hadoop even has a companion project called Sqoop that is designed to link Hadoop with structured data stores. Most likely, those who implement NoSQL will maintain their relational databases for legacy systems and for reporting off their NoSQL clusters.
Big Data Moves to the Cloud
The early adopters of Big Data tended to be companies with capital budgets that could be invested in dedicated data centers. However, with the incredible increase in the amount of data generated, collected, and analyzed, smaller companies can now take advantage of the cloud and offload hardware management to those vendors. Two traits that many NoSQL solutions share make them a natural fit for the cloud: the nodes are distributed, and they run on commodity hardware. The cloud is designed for horizontal scaling and is often built on low-cost commodity hardware, especially at the infrastructure-as-a-service (IaaS) layer, where customers simply need infrastructure and have the application expertise to build and configure their own Big Data application (whether with Hadoop, Cassandra, or any number of products).
Not all clouds are built the same, however. One design element to look for is the ability to deploy each virtual server in the Big Data cluster on a different physical node. Although the servers are all on the same private VLAN, ensuring that each server is on different hardware solves two problems: (1) all the traffic and processing don't hit the same hardware, and (2) the cluster is protected against hardware failure because the servers are distributed. Whether the architecture assumes a name node and data node construct or a ring design, this setup ensures performance and reliability. In addition, using local storage on the virtual machine and a high-performance network will reduce latency and improve performance.
Given what most users are trying to achieve with Big Data applications (large-scale data sets and large-scale analysis, often in real time), performance is a key factor. Depending on the problem to be solved, users can also leverage a hybrid implementation that combines virtual and dedicated servers. This setup offers maximum flexibility, balancing the elastic, scalable nature of virtual machines with the single-tenancy of dedicated servers. Big Data projects don't happen in a vacuum: although a NoSQL database may run on dedicated servers, the app or web servers that present the results of the analysis to end users, or that add functionality such as log file processing, can easily be scaled out across as many virtual machines as needed to meet demand. In addition, using the cloud means users won't need to invest in expensive equipment, pay for power and connectivity, or hire additional staff to maintain hardware. Users simply pay for the infrastructure they need and can scale it as desired over time. The ability to scale up or down to match demand, and to pay only for the infrastructure actually used, is one of the core values of using the cloud for Big Data.
Conclusion: Succeeding with Big Data
Whatever solution you select, also take into account the nature of the application and where you'll want to house the processing and the output. The amount of data you collect, analyze, and present will only increase over time. The advantage will go to companies that can collect and analyze this data quickly and efficiently, allowing them to react instantly to customer sentiment and to changing trends in the ever-quickening pace of business. Make sure to select an infrastructure vendor that can match your performance criteria and has the capacity to grow with you as your data and application needs expand to match the changing demands of your business.