
Predictive Real-Time Analytics Is the Big Data Lifeline

SAP is not the only firm driving this space, but its work to publicize its competencies in this arena is hard to miss

Big Data is everywhere. Predictive analytics and real-time in-memory computing aren't everywhere.

This truth (if we can accept it to be so) represents something of an imbalance.

As a subset of data mining, predictive analytics driven by in-memory computing efficiencies now has an opportunity to bring real-time analysis and insight to fast-moving live transactional data flows. Or to put it another (rather shorter) way, we can now start to manage and understand Big Data better than ever.

Application use cases here might typically include the following, to name but seven good examples:

  • Meteorology
  • Genetics
  • Economics
  • Climate simulation
  • Oil exploration
  • Financial analysis and scientific research
  • Telecommunications
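
To make the basic idea concrete, here is a minimal, hypothetical Python sketch of scoring a live transactional flow in memory: a pre-trained model is applied to each transaction as it arrives, rather than in an overnight batch. The weights, the score() function and the synthetic stream are all illustrative assumptions, not SAP HANA code.

```python
import math
import random

# Hypothetical model weights -- in practice these would come from a model
# trained offline against billions of stored historical records.
WEIGHTS = {"amount": 0.004, "hour": -0.05}
BIAS = -2.0

def score(txn):
    """Logistic score in [0, 1]: how 'interesting' this transaction looks."""
    z = BIAS + sum(w * txn[k] for k, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))

def transaction_stream(n):
    """Stand-in for a live transactional data flow."""
    for _ in range(n):
        yield {"amount": random.uniform(1, 2000), "hour": random.randint(0, 23)}

# Score each transaction as it arrives, instead of in an overnight batch.
flagged = sum(1 for txn in transaction_stream(10_000) if score(txn) > 0.5)
print(f"flagged {flagged} of 10,000 transactions in flight")
```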

If we combine contemporary approaches to predictive analytics with the newly arrived Intel Xeon Phi coprocessor, which is claimed to deliver over one teraflops (a trillion floating-point operations per second) on highly parallel workloads, then CIOs can start to think about what "50-processor core computing" will mean for us in the very near future.
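
As a rough illustration of why core counts matter for this class of workload, the sketch below fans an embarrassingly parallel job out across whatever cores are available. Here simulate_cell is a made-up stand-in for one independent slice of, say, a climate simulation; it is not Intel or SAP code.

```python
from multiprocessing import Pool
import os

def simulate_cell(seed):
    """Made-up stand-in for one independent slice of a highly parallel
    workload, e.g., one grid cell in a climate simulation."""
    x = (seed * 2654435761) % (2**32)
    for _ in range(100_000):
        x = (x * 1103515245 + 12345) % (2**31)  # cheap deterministic 'work'
    return x

if __name__ == "__main__":
    with Pool(os.cpu_count()) as pool:  # throughput scales with core count
        results = pool.map(simulate_cell, range(1_000))
    print(f"processed {len(results)} cells on {os.cpu_count()} cores")
```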

Pushing forward in this space is SAP with its HANA in-memory computing appliance and platform. The firm is openly partnering with HP, Oracle, Cognizant and variety of other big (and smaller specialist) players to form strategic alliances that will help further the uptake of this kind of technology.

Why Is Analytics Working Better?
Part of the reason that in-memory predictive analytics is now becoming so important, and so much easier to bring to bear, is hardware related; the other part is a software-related development.

On the hardware end, Intel has worked to sharpen data throughput between memory and processor cores. This means that work going on right in the heart of the machine, which might have traveled at around five gigabytes per second (GB/s) five years ago, now has a chance to move at 100 GB/s. On the software end, firms like HP and SAP have been working to produce what are typically referred to as "business process solutions" that can produce "context-aware experiences" to enable one-to-one sense-and-respond scenarios and, therefore, faster and more personalized interactions with customers.
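
Some back-of-the-envelope arithmetic shows what that throughput jump buys. Assuming a hypothetical 500 GB in-memory table (the size is an assumption, not a vendor figure), a full scan drops from roughly 100 seconds to 5:

```python
# Back-of-the-envelope: what the throughput jump means for a full scan
# of a hypothetical 500 GB in-memory table (size is an assumption).
TABLE_GB = 500

for bandwidth_gb_per_s in (5, 100):  # then vs. now, per the figures above
    seconds = TABLE_GB / bandwidth_gb_per_s
    print(f"{bandwidth_gb_per_s:>3} GB/s -> scan {TABLE_GB} GB in {seconds:.0f} s")
```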

As a matter of interest, HP also works to provide datacenter services to SAP that support its enterprise-wide hosting solutions for e-business applications - but that's another story.

In terms of actual application solutions running on SAP HANA, new products include the SAP Liquidity Risk Management application, the SAP Accelerated Trade Promotion Planning application and SAP Operational Process Intelligence software.

"These innovations show how SAP is rapidly delivering real-time, data-centric and industry-specific applications on the SAP HANA platform," said Dr Vishal Sikka, member of the SAP executive board for technology and innovation.

To take one example, SAP Liquidity Risk Management aims to provide banks with the ability to perform real-time, high-speed liquidity risk management and reporting on very large volumes of cash flows. In future, banks will be able to instantly measure key liquidity risk ratios (such as the Basel III liquidity coverage ratio) and cash flow gaps to resolve potential liquidity bottlenecks. The application aims to allow banks to apply different stress scenarios, such as adjusted run-off rates and bond haircuts, to gain a deeper understanding of how market volatility can impact liquidity positions.
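
For a sense of the arithmetic involved, here is a deliberately simplified, hypothetical liquidity coverage ratio calculation under one stress scenario. All figures, the run-off rate and the haircut are invented for illustration, and this is in no way SAP's implementation; the real Basel III standard applies category-specific rates and caps inflows at 75% of outflows.

```python
# Deliberately simplified, hypothetical Basel III liquidity coverage ratio
# (LCR) under one stress scenario. All figures are invented; the real
# standard applies category-specific rates and caps inflows at 75% of
# outflows. This is not SAP's implementation.
hqla = 120.0               # high-quality liquid assets, pre-haircut ($M)
bond_haircut = 0.15        # stress assumption: HQLA bonds marked down 15%
deposits = 300.0           # deposit base subject to run-off ($M)
runoff_rate = 0.10         # stress assumption: 10% of deposits run off
other_outflows = 100.0     # contractual outflows over 30 days ($M)
inflows = 20.0             # contractual inflows over 30 days ($M)

stressed_hqla = hqla * (1 - bond_haircut)
net_outflows = other_outflows + deposits * runoff_rate - inflows
lcr = stressed_hqla / net_outflows
print(f"LCR = {lcr:.0%} (Basel III requires >= 100%)")
```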

Lessons for CIOs
You don't have to be a bank CIO or a financial analyst to understand the wider importance of this technology: this is the "harnessing of Big Data" catchline that you've already heard bandied about by countless IT firms' press departments, except now it's really happening.

The lesson for CIOs and the software application developers serving them is that we now have a route to predictive real-time analysis and the power to view billions of stored records and live transactional data at the same time. CIOs should look to their solution architects and business process experts to compose the analytical data models that will drive the next phase of their technology growth.

SAP is not the only firm driving this space, but its work to publicize its competencies in this arena is hard to miss. The fact that SAP pushes many of the interfaces for managing the results of its data analysis to Apple iPad (and now Windows 8 format) mobile devices may have confounded Steve Jobs at the time, but this is the firm's proof point for showing off its Big Data number-crunching applications. One day, quite soon, none of this will be a surprise.

This post first appeared on Enterprise CIO Forum.

More Stories By Adrian Bridgwater

Adrian Bridgwater is a freelance journalist and corporate content creation specialist focusing on cross-platform software application development, as well as all related aspects of software engineering, project management and technology as a whole.
