By Lori MacVittie
February 1, 2013 10:30 AM EST
SOA, which grew out of the popular Object-Oriented development paradigm, was greatly hampered by the inability of architects to enforce its central premise of reuse. But it wasn't just the failure to reuse services that kept SOA from achieving the greatness predicted for it; it was the failure to adopt the idea of an authoritative source for business-critical objects, i.e. data.
A customer, an order, a lead, a prospect, a service call. These "business objects" within SOA were intended to be represented by a single, authoritative source as a means to ultimately provide a more holistic view of a customer that could then be used by various business applications to ensure higher-quality service.
It didn't turn out that way, more's the pity, and while organizations adopted the protocols and programmatic methods associated with SOA, they never really got down to the business of implementing authoritative sources for business-critical "objects". As organizations increasingly turn to SaaS solutions, particularly for CRM and SFA (Gartner's Market Trends: SaaS's Varied Levels of Cannibalization to On-Premises Applications, published 29 October 2012), the ability to enforce a single, authoritative source becomes even more impossible. What's perhaps even more disturbing is the potential inability to generate that holistic view of a customer that's so important to managing customer relationships and business processes.
The New Normal
Organizations have had to return to an integration-focused strategy in order to provide applications with the most current view of a customer. Unfortunately, that strategy often relies upon APIs from SaaS vendors who necessarily put limits on APIs that can interfere with that integration. As noted in "The Quest for a Cloud Integration Strategy", these limitations can strangle integration efforts to reassemble a holistic view of business objects as an organization grows:
"...many SaaS applications have very particular usage restrictions about how much data can be sent through their API in a given time window. It is critical that as data volumes increase that the solution adequately is aware of and handles those restrictions."
Note that the integration solution must be "aware of" and "handle" the restrictions. It is nearly a foregone conclusion that these limits will eventually be reached, and there is no real way around them save paying for more, if that's even an option.
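Being "aware of" and "handling" restrictions usually boils down to detecting the provider's rate-limit signal and backing off before retrying. Here's a minimal sketch of that pattern. The `RateLimitError`, `flaky_fetch` endpoint, and the specific delays are hypothetical stand-ins; real SaaS clients surface the limit differently (an HTTP 429 status, a Retry-After header, a vendor-specific error code), but the retry logic is the same shape.

```python
import time


class RateLimitError(Exception):
    """Hypothetical signal that the SaaS API has throttled us (e.g., HTTP 429)."""
    def __init__(self, retry_after=1.0):
        self.retry_after = retry_after


def call_with_backoff(fn, max_retries=5, base_delay=0.01):
    """Call fn(), retrying with exponential backoff when rate-limited."""
    for attempt in range(max_retries):
        try:
            return fn()
        except RateLimitError as e:
            # Honor the longer of the server's hint or our own backoff schedule.
            delay = max(e.retry_after, base_delay * (2 ** attempt))
            time.sleep(delay)
    raise RuntimeError("rate limit not lifted after %d retries" % max_retries)


# Simulated API that rejects the first two calls, then succeeds.
calls = {"n": 0}

def flaky_fetch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RateLimitError(retry_after=0.01)
    return {"customer": "ACME", "orders": 3}


result = call_with_backoff(flaky_fetch)
print(result)  # succeeds on the third attempt
```

Note that backoff only smooths over transient throttling; it does nothing for hard daily quotas, which is exactly why the data ends up merely "eventually" consistent.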
While that approach certainly works for the provider - it keeps the service available - the definition of availability with respect to data is that it's, well, available. That means accessible. The existence of limitations means that at times and under certain conditions, your data will not be accessible; ergo, by most folks' definition, it's not available.
If it's not available, the ability to put together a view of the customer is pretty much out of the question.
But eventually, it'll get there, right? Eventually, you'll have the data.
Eventually, the data you're basing decisions on, managing customers with, and basing manufacturing processes on, will be consistent with reality.
Kicking Costs Down the Road - and Over the Wall
Many point to exorbitant IT costs to set up, scale, and maintain on-premises systems such as CRM. It is true that a SaaS solution is faster and likely less expensive to maintain and scale. But it is also true that if the SaaS is unable to scale along with your business in terms of your ability to access, integrate, and analyze your own data, you're merely kicking those capital and operating expenses down the road - and over the wall to the business.
The problem of limitations on cloud integration (specifically SaaS integration) methods is not trivial. A perusal of support forums shows a variety of discussions on how to circumvent, avoid, and work around these limitations to enable timely integration of data with other critical systems upon which business stakeholders rely to carry out their daily responsibilities to each other, to their investors, and to their customers.
Fulfillment, for example, may rely on data it receives as a result of integration with a SaaS. It is difficult to estimate fulfillment based on data that may or may not be up to date and thus may not be consistent with the customer's view. Accounting may be relying on data it assumes is accurate but actually is not. Most SaaS systems impose a 24-hour interval in which they enforce API access limits, which may set the books off by as much as a day - or more, depending on how much of a backlog occurs. Customers may be interfacing with systems that integrate with back-office SaaS showing incomplete order histories, payments, and deliveries, which in turn can result in increasing call center costs to deal with the inaccuracies.
The inability to access critical business data has a domino effect on every other system in place. The more distributed the sources of authoritative data, the more disruptive an effect provider-imposed access limitations have on the entire business.
Eventually consistent business models are not optimal, yet the massive adoption of SaaS solutions makes such a model inevitable for organizations of all sizes as they encounter artificial limitations imposed to ensure system-wide availability but not necessarily individual data accessibility.
Being aware of such limitations can enable the development and implementation of strategies designed to keep data - especially authoritative data - as consistent as possible. But ultimately, any strategy is going to be highly dependent upon the provider and its ability to scale to meet demand - and loosen limitations on accessibility.
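One such strategy is to treat the daily API quota as a budget and spend it on the records that matter most - pushing authoritative "business objects" (customers, orders) to the front of the sync queue and deferring lower-priority records to the next quota window. The sketch below illustrates that prioritization; the record shapes, the `is_authoritative` predicate, and the quota number are all hypothetical, not any particular vendor's API.

```python
def sync_within_quota(records, daily_quota, is_authoritative):
    """Spend a limited daily API budget on the most important records.

    Authoritative records sort to the front (False < True, and sorted()
    is stable, so original order is preserved within each group); anything
    past the quota is deferred to the next window.
    """
    ordered = sorted(records, key=lambda r: not is_authoritative(r))
    synced, deferred = ordered[:daily_quota], ordered[daily_quota:]
    return synced, deferred


records = [
    {"type": "log"},
    {"type": "customer"},
    {"type": "order"},
    {"type": "log"},
]

synced, deferred = sync_within_quota(
    records,
    daily_quota=2,
    is_authoritative=lambda r: r["type"] in ("customer", "order"),
)
print([r["type"] for r in synced])  # authoritative records go first
```

Prioritization narrows the consistency gap for the data that drives decisions, but it cannot close it: the deferred records are still, at best, eventually consistent.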