Big Data Journal: Blog Post

Public Sector Big Data: Five Ways Big Data Must Evolve in 2013

2012 will go down as a “Big” year for Big Data in the public sector

By Ray Muslimani

Editor's note: This guest post provides context on mission-focused data analytics in the federal space by one of the leaders of the federal big data movement, Ray Muslimani. -bg

2012 will go down as a "Big" year for Big Data in the public sector. Rhetoric and hype have been followed by tangible action on the part of both government and industry. The $200 million Big Data initiative unveiled by the White House in March 2012 injected R&D funding and credibility into efforts to develop tools and technologies that help solve the nation's most pressing challenges.

On the industry side, the recently issued TechAmerica report, “Demystifying Big Data,” provides agencies with a roadmap for using Big Data to better serve citizens. It also offers a set of policy recommendations and practical steps agencies can take to get started with Big Data initiatives.

For all of the enthusiasm around Big Data this year, every indication is that 2013 will be the year when Big Data transforms the business of government. Below are five steps that need to be taken in order for Big Data to evolve in 2013 and deliver on its promise.

Demystify Big Data
Government agencies warmed to the potential of Big Data throughout 2012, but more education is required to help decision makers wade through their options and justify further investments. Removing the ambiguities surrounding Big Data requires an emphasis in 2013 on education from both industry and government.

The TechAmerica Big Data report is a good example of how industry can play an active role in guiding agencies through Big Data initiatives. It also underscores that vendors can’t generate more Big Data RFPs through marketing slicks and sales tactics alone. This approach will not demystify Big Data – it will simply seed further doubt if providers of Big Data tools and solutions focus only on poking holes in competitor alternatives.

Industry and government should follow proven templates for education in 2013. For example, agencies can arrange "Big Data Days" modeled on today's Industry Tech Days. Big Data industry days can help IT providers gain better insight into how each agency plans to approach its Big Data challenges in 2013, and offer those agencies an opportunity to see a wide range of Big Data services.

The Big Data education process must also extend to contracting officers. Agencies need guidance on how RFPs can be constructed to address a service-based model.

Consumerize Big Data
While those within the public sector with the proper training and skills to analyze data have benefited from advanced Big Data tools, it has been far more difficult for everyday business users and decision makers to access the data in a useful way. Sluggish data query responses, data quality issues, and a clunky user experience are undermining the benefits Big Data analytics can deliver, forcing users to become de facto "data scientists" to make sense of it all.

Underscoring this challenge is a 2012 MeriTalk survey, "The Big Data Gap," which finds that just 60 percent of IT professionals say their agency is analyzing the data it collects, and a modest 40 percent are using data to make strategic decisions. All of this despite the fact that 96 percent of those surveyed expect their agency's stored data to grow in the next two years by an average of 64 percent. The gap here suggests that those who are not "data scientists" struggle to convert data into business decisions.

What if any government user could ask a question in natural language and receive the answer in a relevant visualization? For Big Data to evolve in 2013, we must consumerize the user experience by moving beyond spreadsheets and reports and placing the power of analytics in the hands of users at any level, regardless of analytics expertise.
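To make the "question in, visualization out" idea concrete, here is a deliberately minimal sketch of how a natural-language question might be parsed into a chart specification. Everything here (`parse_question`, `chart_for`, the pattern rules) is hypothetical and illustrative; a production system would use far more sophisticated language understanding.

```python
# Toy natural-language-to-visualization mapper. Illustrative only:
# the function names and the crude keyword rules are assumptions,
# not any real product's API.
import re

def parse_question(question):
    """Extract a metric, a grouping dimension, and a year from a simple English question."""
    q = question.lower()
    metric = "spending" if "spending" in q else "count"
    group_match = re.search(r"by (\w+)", q)
    group = group_match.group(1) if group_match else None
    year_match = re.search(r"\b(19|20)\d{2}\b", q)
    year = int(year_match.group(0)) if year_match else None
    return {"metric": metric, "group_by": group, "year": year}

def chart_for(query):
    """Pick a chart type: grouped data suggests a bar chart, otherwise a trend line."""
    return "bar" if query["group_by"] else "line"

spec = parse_question("What was total spending by agency in 2012?")
print(spec)             # {'metric': 'spending', 'group_by': 'agency', 'year': 2012}
print(chart_for(spec))  # bar
```

The point of the sketch is the shape of the pipeline, question to structured query to visualization choice, not the parsing itself.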

Mobilize Big Data
IDC Government Insights predicts that in 2013, 35 percent of new Federal and state applications will be mobile. At the same time, 65 percent of Federal IT executives expect mobile device use to increase by 20 percent in 2013, according to The 2012-2013 Telework/Mobile IT Almanac.

Part of consumerizing Big Data means building it for any device so that users do not need to be tethered to their desktops to analyze data. Agency decision makers must be empowered to easily view and analyze data on tablets and smartphones, while the increase of teleworking in the public sector requires Big Data to be accessible from anywhere, at any time, and on any device.

There is promising innovation at work by both established Federal IT providers and upstarts in taking a mobile-first path to Big Data, rather than the traditional approach of building BI dashboards for the desktop. The degree to which 2013 sees a shift in Big Data from the desktop to tablets and smartphones will depend on how forcefully solutions providers employ a mobile-first approach to Big Data.

Act on Big Data
A tremendous amount of “thought” energy went into Big Data in 2012. For Big Data to evolve in a meaningful way in 2013, initiatives and studies must generate more action in the form of Big Data RFIs and RFPs.

Within the tight budget climate, agencies will not act on Big Data if vendor proposals require massive investments in IT infrastructure and staffing. There must be a shift – to the extent possible – of the financial and resource burden from agency to vendor. For example, some vendors have developed "Big Data Clouds" that allow agencies to leverage a secure, scalable framework for storing and managing data, along with a toolset for performing consumer-grade search and analysis on that data.

Open Big Data
Adoption of Big Data solutions has been accelerated by open source tools such as Hadoop, MapReduce, Hive, and HBase. While some agencies will find it tempting to withdraw to the comfort of proprietary Big Data tools that they can control in closed systems, that path undermines the value Big Data can ultimately deliver.

One could argue that as open source goes in 2013, Big Data goes as well. If open source platforms and tools continue to address agency demands for security, scalability, and flexibility, the benefits from Big Data within and across agencies will increase exponentially. There are hundreds of thousands of viable open source technologies on the market today. Not all are suitable for agency requirements, but as agencies update and expand their uses of data, these tools offer limitless opportunities to innovate. Additionally, opting for open source instead of proprietary vendor solutions prevents an agency from being locked into a single vendor's tool that it may at some point outgrow or find ill-suited to its needs.
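For readers unfamiliar with the MapReduce paradigm named above, here is a minimal sketch of its map/shuffle/reduce flow in pure Python, using the classic word-count example. Real workloads would run on Hadoop across a cluster; this only illustrates the programming model.

```python
# Minimal word-count sketch of the MapReduce pattern. Illustrative only:
# a real Hadoop job distributes each phase across many machines.
from collections import defaultdict

def map_phase(records):
    """Map: emit a (word, 1) pair for every word in every input record."""
    for record in records:
        for word in record.split():
            yield (word, 1)

def shuffle(pairs):
    """Shuffle: group emitted values by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: combine each key's values; here, sum the counts per word."""
    return {key: sum(values) for key, values in groups.items()}

records = ["big data", "open data", "big government data"]
counts = reduce_phase(shuffle(map_phase(records)))
print(counts["data"])  # 3
```

The appeal for agencies is that each phase is a small, independent function, so the same logic scales from a laptop test to a cluster-sized dataset without rewriting the analysis.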


More Stories By Bob Gourley

Bob Gourley, former CTO of the Defense Intelligence Agency (DIA), is Founder and CTO of Crucial Point LLC, a technology research and advisory firm providing fact-based technology reviews in support of venture capital, private equity, and emerging technology firms. He has extensive industry experience in intelligence and security, was awarded an intelligence community meritorious achievement award by AFCEA in 2008, and has been recognized as an InfoWorld Top 25 CTO and as one of the most fascinating communicators in Government IT by GovFresh.
