What Is Software-Defined Datacenter (SDDC)?

I created this short FAQ to help answer some of those questions

At VMworld this year, both in San Francisco and Barcelona, VMware CEO Pat Gelsinger introduced the concept of the Software-Defined Datacenter (SDDC). This builds on the idea that as more and more of the Data Center becomes virtualized (servers, desktops), delivering greater cost savings and agility to customers, software-defined automation and functionality (network, security, storage, backup) become the next logical steps in helping IT deliver greater value to the business.

As with any new technology or vision, there are often many questions about how this will impact the market and how it will affect IT organizations. Wikibon did a nice job providing its view on "Software-led Infrastructure". It's one of many attempts I've seen to put a scope around this concept. Some portions are agreed upon, while others are creating some headaches.

I created this short FAQ to help answer some of those questions:

1. VMware is using a new term, "Software-Defined Datacenter" (SDDC), at the center of the 2012 conference. What is Software-Defined Datacenter?
[Steve Herrod blog]. Software-Defined Data Center is VMware's vision that greater business value can be created from IT when intelligent software is abstracted from standardized hardware. In the simplest technical definition, it is the separation (or abstraction) of the "control plane" (configuration, topology awareness, management, operations) from the "data plane" (moving data, storing data).
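The control-plane/data-plane separation can be sketched in a few lines of code. This is an illustrative toy, not VMware's implementation; all class and method names here (`DataPlane`, `ControlPlane`, `set_route`) are hypothetical:

```python
# Illustrative sketch of control-plane/data-plane separation.
# The data plane only moves traffic; the control plane owns
# configuration and topology, and programs the data plane.

class DataPlane:
    """Forwards traffic; knows nothing about policy or topology."""
    def __init__(self):
        self.rules = {}  # destination -> next hop

    def forward(self, dest):
        return self.rules.get(dest, "drop")

class ControlPlane:
    """Owns desired configuration; reconciles the data plane toward it."""
    def __init__(self, data_plane):
        self.data_plane = data_plane
        self.desired = {}  # desired destination -> next hop

    def set_route(self, dest, next_hop):
        self.desired[dest] = next_hop
        self.reconcile()

    def reconcile(self):
        # Push desired state down; the data plane stays "dumb".
        self.data_plane.rules = dict(self.desired)

dp = DataPlane()
cp = ControlPlane(dp)
cp.set_route("10.0.0.0/24", "vswitch-1")
print(dp.forward("10.0.0.0/24"))  # vswitch-1
print(dp.forward("10.9.9.9/32"))  # drop
```

The point of the split is that the "intelligence" (the reconcile loop, the desired state) lives in software that can evolve quickly, while the forwarding layer stays simple and standardized.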

1a. Is there a clear spelling of this term?

  • Meh. Maybe, but it will have at least 3-5 variations in 2013. Just call it "SDDC" and save yourself a lot of auto-correct headaches.

2. Is there a clear, agreed upon definition (or standard) for Software-Defined Datacenter at this time?

  • Software-Defined Datacenter is not defined by an existing standards body (e.g. IETF, ITU, NIST); rather, it is a vision for the evolution of how Data Center environments will become more flexible in responding to business demands. SDDC builds upon the abstraction that server virtualization has created and extends this to broader elements of the Data Center (e.g. network, storage), as well as expanding the role that automation will play in the future.

3. How is "Software-Defined Datacenter" different from "Cloud"?

  • Cloud (or Cloud Computing) is fundamentally a new operational model for IT, where resources are delivered on-demand. While Cloud uses technologies such as virtualization or converged infrastructure, it's primarily about the shift in delivery and consumption of IT services. Software Defined Data Center is the next evolution of the underlying technology, where software delivers greater levels of intelligence and value, on top of standardized hardware.

4. Does Software-Defined Datacenter eliminate the need for traditional Data Center hardware?

  • No. There will still be a need for physical servers (CPU, memory), network devices to connect ports and deliver bandwidth, and devices that can store data on flash/disk/tape. But the trend in the industry is that these devices are becoming more standardized on x86 chips, mass-produced memory/disks and mass-produced ASICs. This trend should allow faster, more simplified "fabrics" (interconnecting servers, networks and storage) to be built, with the intelligence for policy, security and operations continuing to move into software, which is faster to develop and adapt to changing business requirements. Leading companies have been shifting their product strategies to embrace this trend for the last few years.

5. Which market segments does Software-Defined Datacenter target, or which use cases?

  • Software-Defined Datacenter technologies are applicable to markets of all sizes (Enterprise, Mid-Market, Service Provider), but the initial adopters have been large Service Providers attempting to solve challenges with large-scale Data Centers. As the competition for Public and Hybrid Cloud services increases (Amazon, Google, Rackspace, Microsoft, Cloud Service Providers), the need to drive greater operational efficiency, reduce costs and shorten time-to-market is pushing them to solve problems in new software-centric ways.
    • As more Enterprise and Mid-Market customers adopt Private Cloud and deliver IT-as-a-Service, I also expect SDDC technologies to evolve to solve challenges at different scales, as well as user-centric challenges such as BYOD.

6. How will Software-Defined Datacenter impact IT organizations?

  • More than ever, the current era of IT is defined by rapid change, whether in new devices (smartphones, tablets), new application consumption models (PaaS, SaaS), or converging technology silos (virtualization, converged infrastructure). Software-Defined Datacenter is the next step in converging functional areas, while attempting to give IT the ability to respond to business challenges faster.

7. Is Software-Defined Datacenter a competitive threat to traditional hardware companies?

  • As mentioned above, Software-Defined Datacenter does not eliminate the need for physical hardware within the Data Center. Rather it is a vision to enable customers to better take advantage of the trend towards delivering software intelligence on standardized hardware. As with many technology transitions, there are opportunities to evolve technology portfolios, evolve business models and unlock new partnership opportunities.

8. Is Software-Defined Datacenter explicitly linked with open-source technologies such as OpenStack, OpenFlow or Open vSwitch?

  • While there are open-source projects today that will influence Software-Defined Datacenters, they are by no means the only delivery mechanism for the technology customers will need for this evolution. A few examples:
  • VMware's acquisition of Nicira - while Nicira was a major contributor to the OpenStack Quantum project (network virtualization) and the Open vSwitch project, both of which are open source, its core NVP product was a commercial offering.
  • OpenFlow is a standards-based protocol for network virtualization that can be implemented by any vendor, in either open-source or commercial products.
  • "Project Razor" is an open-source project jointly created by EMC and Puppet Labs to deliver advanced server and application automation for Data Center and Cloud environments. The software can be used with either commercial products (e.g. VMware vSphere, Cisco UCS, etc.) or open-source projects (OpenStack, KVM, CloudFoundry).
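To make the OpenFlow mention above a little more concrete, here is a rough sketch of the flow-table idea behind it: a switch matches packet headers against ordered rules and applies the first matching action. The field names and action strings are illustrative only, not the actual OpenFlow wire format:

```python
# Toy flow table in the spirit of OpenFlow: ordered match/action rules,
# with an empty match serving as the table-miss (catch-all) entry.
flow_table = [
    {"match": {"dst_port": 22},       "action": "drop"},       # block SSH
    {"match": {"dst_ip": "10.0.0.5"}, "action": "output:2"},   # send to port 2
    {"match": {},                     "action": "controller"}, # table-miss
]

def apply_flow_table(packet):
    """Return the action of the first rule whose fields all match."""
    for rule in flow_table:
        if all(packet.get(k) == v for k, v in rule["match"].items()):
            return rule["action"]
    return "drop"

print(apply_flow_table({"dst_ip": "10.0.0.5", "dst_port": 80}))  # output:2
print(apply_flow_table({"dst_ip": "10.0.0.9", "dst_port": 22}))  # drop
print(apply_flow_table({"dst_ip": "10.0.0.9", "dst_port": 80}))  # controller
```

Because the rules are just data pushed down from a controller, any vendor can implement the matching logic in hardware or software, which is what makes the protocol attractive for both open-source and commercial products.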


More Stories By Brian Gracely

A 20 year technology veteran, Brian Gracely is VP of product management at Virtustream. He holds a CCIE #3077 and an MBA from Wake Forest University.

Throughout his career Brian has led Cisco, NetApp, EMC and Virtustream into emerging markets and through technology transitions. An active participant in the virtualization and cloud computing communities, his industry viewpoints and writing can also be found on Twitter @bgracely, on his blog Clouds of Change and his podcast The Cloudcast (.net). He is a VMware vExpert and was named a "Top 100" Cloud Computing blogger by Cloud Computing Journal.
