By Ignacio M. Llorente
July 23, 2010 09:15 AM EDT
As many of you know, a new open-source cloud platform, OpenStack, was recently announced. Here at OpenNebula, we think this is a very exciting development in the cloud community, and we're glad to see so many major players coalescing around an open-source solution. However, we have also been concerned by the high-profile announcements and opinion pieces that describe OpenStack as the first initiative to define an open architecture for IaaS cloud computing and the first "real" open-source project in the space, criticizing some existing open-source cloud projects as closed, "open-core" initiatives (in some cases conflating "open-core" with "having an Enterprise edition") and claiming they lack extensibility and cannot efficiently scale to manage tens of thousands of VMs. That is why we have decided to write this post: to state our position clearly and avoid misunderstandings, particularly with our growing community of users.
OpenNebula is and always will be 100% Apache-licensed Open-Source Software
OpenNebula was first established as a research project back in 2005, with its first public release in March 2008. We have a strong commitment to open source, being one of the few cloud management tools available under the Apache license. The Apache license allows any cloud and virtualization player to innovate using the technology without the obligation to contribute those innovations back to the open-source community (although we encourage that such work be contributed back). This is the case for many third-party commercial products that embed OpenNebula.
OpenNebula is NOT "Open Core"
C12G Labs is a new start-up created to provide the professional integration, certification and technical support that many enterprise IT shops require for internal adoption, and to allow the OpenNebula project to not depend exclusively on public financing (research grants, etc.), contributing to its long-term sustainability. Although C12G Labs does provide an Enterprise edition of OpenNebula, all software extensions and patches created by C12G (and distributed in the Enterprise Edition of OpenNebula to support customers and partners) are fully contributed back to OpenNebula and its ecosystem under an OSI-compliant license. OpenNebula is therefore NOT a feature- or performance-limited edition of the Enterprise version. C12G Labs contributes to the sustainability of the community edition, is committed to enlarging the OpenNebula community, and dedicates part of its own engineering resources to supporting and developing OpenNebula, maintaining its position as the leading and most advanced open-source technology for building cloud infrastructures.
OpenNebula is an Open-Source Community
The OpenNebula technology has matured thanks to an active and engaged community of users and developers. OpenNebula development is driven by our community, to support the most requested features, and by the international research projects funding OpenNebula, to address the demanding requirements of several business and scientific use cases for cloud computing. We have also created the OpenNebula ecosystem, where related tools, extensions and plug-ins are available from and for the community.
OpenNebula is a Production-ready and Highly-scalable Technology
OpenNebula is an open-source project aimed at developing a production-ready cloud management tool for building any type of cloud deployment, in scientific or business environments alike. OpenNebula releases are tested to assess their scalability and robustness in large-scale VM deployments and under stress conditions. Of course, you don't have to take our word for it: several users have reported excellent performance results managing tens of thousands of VMs. We have been encouraging some of these users to write on our blog about their experiences with OpenNebula. So far, you can read a recent blog post on how OpenNebula is being used at CERN, with more user-experience posts to follow soon.
OpenNebula is a Flexible and Extensible Toolkit
Because no two datacenters are the same, OpenNebula offers an open, flexible and extensible architecture, with interfaces and components that fit into any existing datacenter and enable its integration with any product or service in the cloud and virtualization ecosystem, as well as with any management tool in the datacenter. OpenNebula is a framework: you can replace and adapt any component so it works efficiently in any environment.
OpenNebula is Hypervisor Agnostic and Standards-based
OpenNebula provides an abstraction layer independent from the underlying services for security, virtualization, networking and storage, avoiding vendor lock-in and enabling interoperability. OpenNebula is not only built on standards, but has also provided reference implementations of open community specifications, such as the OGF Open Cloud Computing Interface. OpenNebula additionally leverages the ecosystems being built around the most popular cloud interfaces: Amazon AWS, OGF OCCI and VMware vCloud.
OpenNebula Implements an Open Architecture Defined by Major Players in the Cloud Arena
OpenNebula is the result of many years of research and of interaction with some of the major players in the Cloud arena. The technology has been designed to address the requirements of business use cases from leading companies in the context of flagship international projects in cloud computing. The main international project funding OpenNebula is RESERVOIR. OpenNebula is an implementation of the IaaS management layer of the RESERVOIR open architecture, defined by its partners: IBM, Telefonica Investigacion y Desarrollo, University College London, Umeå University, SAP AG, Thales Services SAS, Sun Microsystems Germany, ElsagDatamat S.p.A, Universidad Complutense de Madrid, CETIC, Università della Svizzera italiana, Università degli Studi di Messina, and the European Chapter of the Open Grid Forum. The outcome of this collaboration is the unique functionality provided by OpenNebula.
OpenNebula will Continue Incorporating State-of-the-Art Features Demanded by Major Players
OpenNebula is used, together with other software components, in new innovative international projects in cloud computing. StratusLab, with the participation of Centre National de la Recherche Scientifique, Universidad Complutense de Madrid, Greek Research and Technology Network S.A., SixSq Sàrl, Telefonica Investigacion y Desarrollo and Trinity College Dublin, aims at bringing cloud and virtualization to grid computing infrastructures. BonFIRE, with the participation of Atos Origin, University of Edinburgh, SAP AG, Universitaet Stuttgart, FRAUNHOFER, Interdisciplinary Institute for Broadband Technology, Universidad Complutense de Madrid, Fundacio Privada I2CAT, Hewlett-Packard Limited, The 451 Group Limited, Technische Universitaet Berlin, IT-Innovation, and Institut National de Recherche en Informatique et en Automatique, aims at designing, building and operating a multi-site cloud-based facility to support research across applications, services and systems, targeting the services research community on the Future Internet. And there are many others, such as 4CaaSt, with the participation of UPM, 2nd Quadrant Limited, BonitaSoft, Bull SAS, Telefónica Investigación y Desarrollo, Ericsson GMBH, FlexiScale, France Telecom, Universitat St. Gallen, ICCS/NTUA, Nokia Siemens Networks, SAP AG, Telecom Italia, UCM, Universitaet Stuttgart, UvT-EISS, and ZIB, aimed at creating an advanced PaaS cloud platform that supports the optimized and elastic hosting of Internet-scale multi-tier applications.
* * *
All that said, we'd like to reiterate that we strongly support initiatives like OpenStack. This open-source initiative is fully aligned with our vision of what the cloud ecosystem should look like, and we will be happy to contribute to OpenStack with our significant track record in open-source and scalable cloud computing management, and with an implementation of the open APIs that will be defined in the context of the OpenStack architecture. However, we felt that some of the buzz surrounding OpenStack unfairly characterized existing open-source efforts, and that it was necessary to reaffirm our commitment to an open-source cloud ecosystem.
Disclaimer: The above represents our position, and may not reflect the positions of any of the projects and organizations referenced in this post.