Applying Advanced Agile Methodologies

Our big challenge now is no longer the speed of code propagation. It’s how we can manage effective communication among streams.

In the five years since I co-founded Bonitasoft with Miguel Valdes Faura and Rodrigue Le Gall, our organization has come a long way.

We started with seven developers. We now have 17 developers dedicated full time to Bonita BPM - along with a systems architect, a QA team, a documentation team, and a "human factors" engineer. We've logged 2.75 million downloads, booked 875 customers, and built a community of 60,000 contributors.

How do you triple the size of your development team in less than five years and keep consistent control over your processes? Well, even for a company that's in the business of helping others improve processes, it's been a challenge, a learning experience - and a great opportunity to apply some interesting "advanced" agile methodologies.

How We Started with Agile
Our initial small team focused on development of the Bonita Execution Engine, the Bonita User Experience (web), and the Bonita Studio, with each of these groups having a specific skill set and a technical leader. From the very start we applied agile development practices - with the entire team working together in the same two-week sprint cycle, participating in daily scrum meetings, and so on.

With a small team, we were able to make very efficient progress all working on the same code - we got the first release of Bonita Open Solution out in six months.

But as we grew our development team and as we dealt with the inevitable errors that crept in, we found ourselves being held up. If the build chain broke, everyone's progress was affected.

With the growing team, to avoid these compilation issues, we broke R&D up into three individual teams (still focused on the Engine, Web, and Studio components of the Bonita BPM suite) and gave each team an independent release process for its component. This greatly helped us isolate bugs, but for fixes, the Studio team was always last in line - they needed a stable build from the Web team, who needed a stable build from the Engine team. It might take as long as two weeks before a bug discovered and fixed on the same day by the Engine team actually propagated to the Studio team.
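To make that delay concrete, here is a minimal sketch of the chained release process described above. The five-day build cadence and the helper method are purely illustrative assumptions, not our actual schedule or tooling; the point is simply that each downstream team can only pick up a change once its upstream dependency publishes its next stable build.

```java
// Hypothetical sketch of the old chained release process: each upstream
// component publishes a stable build on a fixed cadence, and a downstream
// team can only pick up a change once that build is available. The five-day
// cadence here is illustrative, not our actual release schedule.
public class PropagationDelay {

    // Day of the next stable build strictly after "day", given a build cadence.
    static int nextStableBuild(int day, int cadenceInDays) {
        return ((day / cadenceInDays) + 1) * cadenceInDays;
    }

    public static void main(String[] args) {
        int fixCommitted = 0;                                // Engine team fixes the bug on day 0
        int engineStable = nextStableBuild(fixCommitted, 5); // Engine's next stable build
        int webStable = nextStableBuild(engineStable, 5);    // Web rebuilds against the stable Engine

        // The Studio team can only start working with the fix once Web is stable.
        System.out.println("Engine ships the fix in its stable build on working day " + engineStable);
        System.out.println("Web ships a stable build containing it on working day " + webStable);
        System.out.println("Studio can finally pick it up on working day " + webStable);
    }
}
```

With those assumed cadences, a same-day fix in the Engine still takes roughly two working weeks to reach the Studio team - which is exactly the bottleneck we needed to remove.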

The Business Pressure to Change Our Development Approach
The growth of our team was only one aspect of the pressures we faced in engineering. As we moved through our Bonita Open Source version 5 product releases and began to prepare for the release of our new product, Bonita BPM version 6, we began to work more and more closely with the Product Committee. Together we started looking at ways to allow R&D to work on multiple features simultaneously, end to end, without pulling resources from one team to another. We wanted to reduce the time needed to fully develop new, higher-quality features and to fix bugs. Bonitasoft's use of Value Streams at the strategic level offered a logical possibility: link R&D to corporate strategic goals for innovation and improvement.

The New R&D Organization: Agile Streams
Our development team is now organized into four streams: Innovation, Core Product, Integration, and Fast-Track. Strategically speaking, Innovation development keeps us at the leading edge of BPM suite capability, Core Product development keeps us competitive in the current market, Integration remains one of our key differentiators, and Fast-Track helps ensure that users' needs are given appropriate priority.

The Product Committee's guidance heavily influences the priorities of the first three streams. The Fast-Track stream's priorities come from Support, Customer Success, Pre-Sales, and Delivery - the customer-facing groups inside Bonitasoft. In this way we continue to improve our product through both radical innovation and incremental improvements (new and improved features).

Each stream is made up of Engine, Web, and Studio developers, plus a Product Manager and members of the documentation and Quality Assurance teams. Our systems architect and human factors engineer work across all four streams.

When a feature or improvement is developed in a stream, it is fully developed and tested on the stream's dedicated continuous integration server. A feature is "done" only when the language translation, the documentation, and the tests are all complete. When the entire stream's code is stable - then and only then - it is pushed to the shared continuous integration server, where it can be accessed and used by the other streams.

When it is time for a major release, the code is pushed to another dedicated server where the final QA is done.
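As a rough sketch of that gating logic - the feature fields, stream names, and flags below are hypothetical stand-ins, not our actual CI configuration - code leaves a stream only when every feature is "done" and the stream build is stable, and major releases then move on to the final-QA server:

```java
import java.util.List;

// Hypothetical sketch of the promotion gates described above. A feature counts
// as "done" only when translation, documentation, and tests are all complete;
// a stream's code moves to the shared CI server only when every feature is done
// and the stream build is stable; major releases go on to a final-QA server.
public class StreamPromotion {

    record Feature(String name, boolean translated, boolean documented, boolean tested) {
        boolean isDone() {
            return translated && documented && tested;
        }
    }

    static String promote(String stream, List<Feature> features,
                          boolean streamBuildStable, boolean majorRelease) {
        boolean allDone = features.stream().allMatch(Feature::isDone);
        if (!allDone || !streamBuildStable) {
            return stream + ": stays on the stream's dedicated CI server";
        }
        if (majorRelease) {
            return stream + ": pushed to the shared CI server, then to the final-QA server";
        }
        return stream + ": pushed to the shared CI server for the other streams to use";
    }

    public static void main(String[] args) {
        System.out.println(promote("Fast-Track",
                List.of(new Feature("connector-fix", true, true, true)), true, false));
        System.out.println(promote("Core Product",
                List.of(new Feature("new-form-builder", true, false, true)), true, false));
    }
}
```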

The advantages of this development approach are already being realized: the isolation of each stream and the involvement of QA inside each one mean that code is shared only when it is ready - and no stream depends on work outside itself in order to advance.

It's also much cleaner to always have one stream dedicated to maintenance. We use a round-robin approach so each stream has a turn, and only one stream is working on maintenance fixes at a time.
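As a trivial illustration of that rotation - the stream ordering and release numbering here are hypothetical - a round robin simply maps each maintenance release onto the next stream in turn:

```java
import java.util.List;

// Hypothetical sketch of the round-robin rotation: one stream at a time owns
// maintenance, and duty cycles to the next stream with each maintenance release.
public class MaintenanceRotation {
    private static final List<String> STREAMS =
            List.of("Innovation", "Core Product", "Integration", "Fast-Track");

    static String maintenanceOwner(int releaseNumber) {
        // Maintenance releases 1, 2, 3, ... map onto the streams in turn.
        return STREAMS.get((releaseNumber - 1) % STREAMS.size());
    }

    public static void main(String[] args) {
        for (int release = 1; release <= 6; release++) {
            System.out.printf("Maintenance release %d: %s stream%n",
                    release, maintenanceOwner(release));
        }
    }
}
```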

There's Always a Challenge
Our big challenge now is no longer the speed of code propagation. It's how we can manage effective communication among streams. Development may be appropriately isolated, but clear and timely communication on big changes is critical. We're addressing this challenge by sharing information frequently through informal presentations, and each stream has a team leader whose responsibilities include sharing information across streams. Their mornings are largely dedicated to coordination tasks, while their afternoons are dedicated to development.

Looking Ahead
We are already seeing excellent results from our agile stream approach. Our maintenance releases are coming regularly each month, and the implementation of the development roadmap is better balanced among the four strategic Value Streams. We released two versions of Bonita BPM in 2013, with two more on the way for 2014. With the Fast-Track stream, we have been able to respond quickly to customers' and users' innovative suggestions and business needs - with a flexibility that underscores and confirms the very concept of agile.

About the Author

Charles Souillard co-founded Bonitasoft in 2009 with Miguel Valdes Faura and Rodrigue Le Gall. As VP of Engineering and CTO, Charles leads the Bonitasoft product development organization. He was previously head of the Bonita core development team within Bull Information Systems. He has significant experience developing mission-critical applications with BPM and SOA technologies. He serves on a number of European Community projects. He holds a Master’s degree in Computer Science from Polytech de Grenoble.
