Applying Advanced Agile Methodologies

Our big challenge now is no longer the speed of code propagation. It’s how we can manage effective communication among streams.

In the five years since I co-founded Bonitasoft with Miguel Valdes Faura and Rodrigue Le Gall, our organization has come a long way.

We started with seven developers. We now have 17 developers dedicated full time to Bonita BPM - along with a systems architect, a QA team, a documentation team, and a "human factors" engineer. We've logged 2.75 million downloads, booked 875 customers and built a community of 60,000 contributors.

How do you triple the size of your development team in less than five years and keep consistent control over your processes? Well, even for a company that's in the business of helping others improve processes, it's been a challenge, a learning experience - and a great opportunity to apply some interesting "advanced" agile methodologies.

How We Started with Agile
Our initial small team focused on development of the Bonita Execution Engine, the Bonita User Experience (web), and the Bonita Studio, with each of these groups having a specific skill set and a technical leader. From the very start we applied agile development practices - with the entire team working in the same two-week sprints, participating in the daily scrum meetings, and so on.

With a small team, we were able to make very efficient progress, all working on the same code - we got the first release of Bonita Open Solution out in six months.

But as we grew our development team and as we dealt with the inevitable errors that crept in, we found ourselves being held up. If the build chain broke, everyone's progress was affected.

With the growing team, to avoid these compilation issues, we broke up R&D into three individual teams (still focused on the Engine, the Web, and the Studio components of the Bonita BPM suite) and gave each team an independent release process for its component. This greatly helped us isolate bugs, but for fixes, the Studio team was always last in line - they needed a stable build from the Web team, who needed a stable build from the Engine team. It might take as long as two weeks before a bug discovered and fixed on the same day by the Engine team actually propagated to the Studio team.

The Business Pressure to Change Our Development Approach
The growth of our team was only one aspect of the pressures we faced in engineering. As we moved through our Bonita Open Source version 5 product releases and began to prepare for the release of our new product, Bonita BPM version 6, we began to work more and more closely with the Product Committee. Together we started looking at ways to allow R&D to work on multiple features simultaneously, end-to-end, without pulling resources from one team to another. We wanted to reduce the time needed to fully develop new, better-quality features and to fix bugs. Bonitasoft's use of Value Streams at the strategic level offered a logical way forward: link R&D to corporate strategic goals for innovation and improvement.

The New R&D Organization: Agile Streams
Our development team is now organized into four streams: Innovation, Core Product, Integration, and Fast-Track. Strategically speaking, Innovation development keeps us at the leading edge of BPM suite capability, Core Product development keeps us competitive in the current market, Integration remains one of our key differentiators, and Fast-Track helps ensure that users' needs are given appropriate priority.

The product committee's guidance heavily influences the priorities of the first three streams. The Fast-Track development priorities come from Support, Customer Success, Pre-Sales, and Delivery, the customer-facing groups inside Bonitasoft. In this way we continue to improve our product through both radical innovation and incremental improvements (new and improved features).

Each stream comprises Engine, Web, and Studio developers, plus a Product Manager and members of the documentation and Quality Assurance teams. Our systems architect and human factors engineer work across all four streams.

When a feature or improvement is developed in a stream, it is fully developed and tested on the stream's dedicated continuous integration server. A feature is "done" only when the language translation, the documentation, and the tests are all done. When the entire code stream is stable, then and only then is it pushed to the shared continuous integration server, where it can be accessed and used by the other streams.

When it is time for a major release, the code is pushed to another dedicated server where the final QA is done.
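
To make these promotion gates concrete, here is a minimal sketch in Java of the checks described above. The class, method, and feature names are hypothetical, invented purely for illustration; this is not Bonitasoft's actual tooling, just a model of the "definition of done" and stream-stability rules.

import java.util.List;

public class StreamPromotionGate {

    // Hypothetical record of a feature's completion criteria.
    record Feature(String name, boolean translated, boolean documented, boolean tested) {
        boolean isDone() {
            return translated && documented && tested;
        }
    }

    // A stream is ready to share only when its dedicated CI build is green
    // and every feature in it meets the definition of done.
    static boolean streamIsStable(List<Feature> features, boolean dedicatedCiGreen) {
        return dedicatedCiGreen && features.stream().allMatch(Feature::isDone);
    }

    public static void main(String[] args) {
        List<Feature> coreProductStream = List.of(
                new Feature("connector-wizard", true, true, true),
                new Feature("form-builder-refresh", true, false, true)); // documentation pending

        if (streamIsStable(coreProductStream, true)) {
            System.out.println("Push the stream to the shared continuous integration server");
        } else {
            System.out.println("Keep iterating on the stream's dedicated CI server");
        }
    }
}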

The advantages of this development approach are already being realized: the isolation of each stream and the involvement of QA within each one mean that code is shared only when it is ready - and no stream depends on work outside itself in order to advance.

It's also much cleaner to always have one stream dedicated to maintenance. We use a round-robin approach so each stream has a turn, and only one stream is working on maintenance fixes at a time.
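
As a rough illustration of that rotation, here is a minimal sketch assuming a fixed rotation order. The stream names come from this article; the helper itself and its names are hypothetical, not Bonitasoft's actual process tooling.

import java.util.List;

public class MaintenanceRotation {

    private static final List<String> STREAMS =
            List.of("Innovation", "Core Product", "Integration", "Fast-Track");

    // Exactly one stream is on maintenance duty for any given rotation period.
    static String onMaintenanceDuty(int period) {
        return STREAMS.get(period % STREAMS.size());
    }

    public static void main(String[] args) {
        for (int period = 0; period < 6; period++) {
            System.out.printf("Maintenance release %d -> %s stream%n",
                    period + 1, onMaintenanceDuty(period));
        }
    }
}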

There's Always a Challenge
Our big challenge now is no longer the speed of code propagation. It's how we can manage effective communication among streams. Development may be appropriately isolated, but clear and timely communication about big changes is critical. We're addressing this challenge by sharing information frequently through informal presentations, and each stream has a team leader whose responsibilities include sharing information across streams. Their mornings are largely dedicated to coordination tasks, while their afternoons are dedicated to development.

Looking Ahead
We are already seeing excellent results from our agile stream approach. Our maintenance releases are coming regularly each month, and the implementation of the development roadmap is better balanced among the four strategic Value Streams. Bonita BPM had two releases in 2013, with two more on the way for 2014. With the Fast-Track stream, we have been able to respond quickly to customers' and users' innovative suggestions and business needs - with a flexibility that underscores and confirms the very concept of agile.

More Stories By Charles Souillard

Charles Souillard co-founded Bonitasoft in 2009 with Miguel Valdes Faura and Rodrigue Le Gall. As VP of Engineering and CTO, Charles leads the Bonitasoft product development organization. He was previously head of the Bonita core development team within Bull Information Systems. He has significant experience developing mission-critical applications with BPM and SOA technologies. He serves on a number of European Community projects. He holds a Master’s degree in Computer Science from Polytech de Grenoble.
