By Drew Bartkiewicz
December 14, 2012 03:30 AM EST
Since the very beginnings of trade and commerce, most information exchanged between buyer and seller, customer and business, was treated as a discrete, confidential, and almost intimate affair. Trust was earned, not given.
Consider the not-so-distant history of the local American bank. Banks have been collecting personal information about their customers for decades, hearkening back to consultations over a notepad, paper deposit slips, and handwritten applications. The reputations of applicant and banker, buyer and seller, were local ones, with personal and professional references limited to the confines of the community and the reality of proximity.
Banks large and small managed piecemeal, disconnected snapshots of personal information through random, unstructured, and ultimately inefficient processes that unfolded without fanfare over the lifetime of the relationship between client and local banker. "Data collection" was nowhere to be found in the strategic plan, yet banks were the recipients of valuable information about their clients: income, investments, payment history, and the business and family relationships that involved money. Most customers relied solely on their local bankers to know them personally and therefore to be capable of making recommendations and offering personalized financial advice. Customer-to-computer interactions, later known as "self-service," were still a fantasy in the minds of fiction writers like George Orwell.
For over a hundred years banks stored their data locally, first in secured filing cabinets, then safes, and as time progressed, on local computers backed up centrally, just as a precaution. By and large, the customer relationship and the treatment of privacy were based on local proximity and personal discretion. If a breach of confidentiality occurred, it was entirely local in nature, usually involving only a handful of individuals, with minimal impact on the wider community and certainly little impact on the overall banking institution. The relationship was personal, much like the one embodied by George Bailey, the beleaguered banker in the Christmas classic film It's a Wonderful Life. But the landscape of customer information was unique for cultural reasons and societal norms as well. Individuals owned their personal information, not banks. This distinction is significant given the increasingly electronic world that now surrounds us.
Today, this data scenario and concept of "confidentiality" is as outdated as a black-and-white movie. Client confidentiality is no longer parsed out in handfuls among consenting and trusting individuals with personal and community ties. Few bank customers today live in the world of George Bailey and his town full of customers he knew by their first names. In fact, it is just the opposite. This is not merely the emerging era of data exchange; it is the beginning of the largest personal data explosion the world has ever seen.
What explosion? Consider this: the average company doubles its amount of data every year, adding more data to our cyber economy than pennies in the Treasury. Data is not the newest asset; it is the pivotal one. And cloud computing is making data aggregation cheaper and easier than ever, with APIs creating new capture nets across a multitude of mobile devices.
The explosion of data is also going international, and it isn't anywhere near over. Personal data aggregation is only expanding as more health, financial, and social information elements find their way from individuals and businesses into the "clouds" of networked computers, handheld devices, and massive data warehouses. Not only does the business and professional world know how to collect more data, it is also capable of storing it at ever lower costs. The old days, when confidentiality and personal privacy were held in trusted cocoons of discrete individual relationships, are over. Data, the lubricant of automated modern commerce, is essentially loose in the digital ecosystem, flowing without interruption across physical and legal borders and feeding the data-hungry environment we have created.
Company reputation, personal privacy, and business risk have new meaning and unprecedented exposures. Knowing how to succeed, or fail, in such a cyber world gives cause for a better understanding of the technology blind spots, for individuals and businesses alike. But how did we get here so quickly, and have we fully evaluated the unintended consequences of our technology addiction and Internet openness?
With the benefit of hindsight, we can see a convergence of three key forces that accelerated the data explosion: information economics, information technology, and information culture. All three emerged so quickly and in such parallel fashion that it is difficult to determine which came first or which caused the others. Each of these factors will be discussed in subsequent articles drawn from the published book, Unseen Liability: The Irreversible Collision of Technology and Business Risk.