Five years ago I prepared a report for a Lean transformation at a client department. The Process Cycle Efficiency (PCE) was less than 1%! PCE is defined as the ratio of the time we actually spend working on a feature to the Lead Time. PCE is an average metric.
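As a minimal sketch of the definition above (the card data and field names here are hypothetical), PCE can be computed as total touch time over total lead time:

```python
# Process Cycle Efficiency (PCE) = touch time / lead time.
# Hypothetical card records: hours actually worked vs. total elapsed hours.
cards = [
    {"touch_hours": 6, "lead_hours": 800},
    {"touch_hours": 4, "lead_hours": 1200},
]

def pce(cards):
    """PCE across a set of cards, as a fraction of total lead time."""
    total_touch = sum(c["touch_hours"] for c in cards)
    total_lead = sum(c["lead_hours"] for c in cards)
    return total_touch / total_lead

print(f"PCE = {pce(cards):.1%}")  # a figure well under 1% is common
```

With these sample numbers the result is 0.5%, in the same range as the report mentioned above.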
The purpose of this blog post is to explore whether, if we could choose only one metric, it should be the PCE, and why. Read more…
Percentage of Completion (PoC) is a widely used metric in traditional organizations for monitoring their projects. Such projects typically involve 40+ people and multiple vendors. The project is structured into teams; some implement Agile, others do not. Sometimes program management wants to become Agile but is not sure how to structure its teams to meet compliance and regulatory expectations as well as inherited organizational practices. Read more…
In this post I describe a perspective on the visual board when it is used alongside an automated ticketing system. In my current project, where I facilitate the Kanban implementation, I introduced a practice of process review. During these reviews we found gaps between the board and the ticketing system; for a given card we can have:
- The visual board is in the correct state while the ticketing system is wrong: This was the case for the majority of cards with inconsistency issues. It is rather easy to fix by agreeing on, and communicating to the team, the mapping rules between the states of the Kanban board and those of the ticketing system.
- Both the visual board and the ticketing system are in incorrect states: This was a failure in the process implementation and would require us to reconsider the implementation approach.
- The visual board is incorrect while the ticketing system is correct: In practice we never encountered this situation.
Discontinuing the visual board and relying on an electronic tool on top of the ticketing system can lead to a decline in the Kanban process implementation. The visual board:
- Enabled collaboration among all team members to discover issues that could not have been found with an electronic solution alone.
- Acted as a data validation step, which is often missing from a measurement system. Data captured in the ticketing system is validated against the manual board, creating improvement opportunities for the whole process.
- Increased transparency between what actually takes place and what is being reported.
- As a corollary of the previous point, raised team morale and improved trust.
- Produced accurate data for top management to drive overall process improvement.
The ticketing system is not the process; it is a tracking system that helps implement the process. The real process is what happens at the visual board and through the agreed-on mechanics of engagement among team members. We cannot rely on the data in the ticketing system if the manual implementation of the process is unclear. Data is a reflection of a process, and in the absence of a process the data can be misleading.
My preferred approach is to run a manual board for a period of two months or so before introducing a tool on top of the ticketing system. This allows the process to become established among team members and the ticketing system to produce meaningful data.
The visual board has an orthogonal dimension that we cannot visualize: the quality of the engineering work. As Agile/Lean facilitators we should be proactive in improving such engineering competencies, even if we are not technical ourselves. I suggest implementing the following two concepts.
Maintain a waste basket
For me this represents ineffective requirements management. The waste basket contains the cards we decided not to complete after we had started working on them. For me this is the materialization of risks ignored during the estimation phase. Typically, such cards arise because:
- The requirement was a wish that we later discovered was not needed
- The requirement is uneconomical to implement
- The requirement is not feasible given the current product architecture.
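A simple way to track the waste basket over time is to measure the share of started cards that ended up discarded. A minimal sketch, with hypothetical card statuses:

```python
# Share of started cards that ended in the waste basket
# (abandoned after work had begun). Statuses are hypothetical.
cards = [
    {"id": 1, "status": "done"},
    {"id": 2, "status": "wasted"},   # e.g. a wish we discovered was not needed
    {"id": 3, "status": "done"},
    {"id": 4, "status": "wasted"},   # e.g. uneconomical to implement
    {"id": 5, "status": "done"},
]

def waste_ratio(cards):
    """Fraction of started cards that were discarded."""
    wasted = sum(1 for c in cards if c["status"] == "wasted")
    return wasted / len(cards)

print(f"waste ratio = {waste_ratio(cards):.0%}")
```

A rising ratio signals that risks are being ignored during estimation, which is exactly what the waste basket is meant to surface.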
Measure defect escape
For me this represents ineffective engineering development practices. Defect escape is defined as follows: for a given set of cards, it is the percentage of defects found after the cards are DONE relative to the overall defects of those cards. This percentage should be zero. The engineering failure is more severe if this measure is non-zero for cards outside New Product Development, which represent the repeatable type of work. Causes of a non-zero value include:
- Inadequate architecture
- Poor product owner review
- Overlooked unit testing
- Ineffective code review
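The defect escape definition above can be sketched in a few lines; the card records and field names here are hypothetical:

```python
# Defect escape: defects found after DONE divided by all defects
# of the card set. Counts per card are hypothetical.
cards = [
    {"defects_before_done": 5, "defects_after_done": 0},
    {"defects_before_done": 3, "defects_after_done": 1},
    {"defects_before_done": 2, "defects_after_done": 1},
]

def defect_escape(cards):
    """Escaped defects as a fraction of all defects (target: 0)."""
    escaped = sum(c["defects_after_done"] for c in cards)
    total = escaped + sum(c["defects_before_done"] for c in cards)
    return escaped / total if total else 0.0

print(f"defect escape = {defect_escape(cards):.1%}")  # target is 0.0%
```

Tracking this separately for repeatable (non-New-Product-Development) cards would highlight the more severe failures.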
Ben, an engineer on our development team, reported two issues from production but did not want to log them as bugs. A bug is visualized as a red card on the visual board and is visible to the whole organization. Read more…
Even when the delivered software has no bugs, it is not uncommon for clients to ask for changes. For me this is a Requirement Defect. It corresponds to the Validation activity of the V-model, which ensures that the software works and is acceptable in its intended environment. In Agile, software validation is implemented through product manager review. Read more…
The idea of a project in operations management helps us understand the demand on the organization by adding the dimensions of required effort, delivery interval, objectives/scope, client, and required skills. Project planning defines these attributes, while project execution and tracking is a common way of fulfilling demand. Read more…
The Japanese came up with the idea of delivering products according to client demand. This is less obvious in software development organizations, and this post intends to clarify the subject of demand analysis. I assume all organizations want to maximize their Throughput to meet and exceed client demand. The first step towards this goal is simple: understand client demand and identify its patterns. We should then build our software capability to address those patterns so that we can satisfy the demand. Read more…
Lead Time is the average time from when a client submits a request until the related software is produced. Cycle Time is the average time between two successive releases from the system. The lower the Lead Time, the higher the Throughput, which is the number of client-valued features released per time interval. Ultimately this impacts the bottom line through a lower cost per feature. Read more…
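These three measures can all be derived from request timestamps. A minimal sketch, assuming hypothetical submit and release dates:

```python
from datetime import date

# Hypothetical requests with submit and release dates.
requests = [
    {"submitted": date(2014, 1, 1), "released": date(2014, 1, 21)},
    {"submitted": date(2014, 1, 5), "released": date(2014, 1, 25)},
    {"submitted": date(2014, 1, 10), "released": date(2014, 1, 28)},
]

def avg_lead_time_days(requests):
    """Average days from submission to release."""
    return sum((r["released"] - r["submitted"]).days for r in requests) / len(requests)

def avg_cycle_time_days(requests):
    """Average days between two successive releases."""
    releases = sorted(r["released"] for r in requests)
    gaps = [(b - a).days for a, b in zip(releases, releases[1:])]
    return sum(gaps) / len(gaps)

def throughput_per_week(requests):
    """Features released per week, approximated as 7 / cycle time."""
    return 7 / avg_cycle_time_days(requests)
```

With this sample data the cycle time is 3.5 days, giving a throughput of two features per week, which illustrates the inverse relationship described above.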
Going back to plain basics, maturity is delivering what we promise. I heard this definition from a successful sales executive at a product development company. For me it summarizes everything!
Meeting commitments is arguably the most important criterion for success. I use the percentage of requests whose deviation from the target date exceeds “x” days. This measure helps quantify our maturity as an organization and represents the Voice of the Customer (VoC).
It can be applied organization-wide and used to drive the whole improvement initiative.
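A minimal sketch of this VoC measure, with hypothetical request records and a threshold of x = 5 days:

```python
# Percentage of requests whose delivery deviated from the target
# date by more than x days. Field names are hypothetical.
def late_request_pct(requests, x_days):
    """Fraction of requests missing the target date by more than x days."""
    late = sum(1 for r in requests if r["deviation_days"] > x_days)
    return late / len(requests)

requests = [
    {"id": "A", "deviation_days": 0},
    {"id": "B", "deviation_days": 12},
    {"id": "C", "deviation_days": 3},
    {"id": "D", "deviation_days": 30},
]

print(f"{late_request_pct(requests, x_days=5):.0%} of requests missed target by > 5 days")
```

The choice of x is a policy decision; the same data can be re-evaluated with different thresholds to see how tolerant the organization is being.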
The following chart gives a high-level analysis of the causes of immaturity as suggested by this measure.
The most fundamental cause is people's inability to communicate their issues in a timely manner. I have worked with developers who had month-long tasks and kept reporting that everything was on plan, only to report failure on the very last day! I share the responsibility for not creating a trusting environment that encourages them to speak up and share their concerns.
Sales people making commitments without consulting engineers is a well-known issue that can be solved by educating them about capacity measures, which can directly improve the VoC measure above. The capacity measures include the Average Lead Time for each Class of Service. This allows sales people to provide informed estimates in the very narrow window they may have to secure a deal. They can add a percentage for uncertainty based on the inherent risks; I suggest a maximum of 20%, with the promise of reducing it as we proceed into the project.
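The Average Lead Time per Class of Service can be derived from historical requests; a minimal sketch, where the request records and the 20% buffer figure are illustrative:

```python
from collections import defaultdict

# Average lead time (days) per Class of Service: a capacity measure
# sales can quote from. The request data is hypothetical.
def avg_lead_time_by_class(requests):
    """Map each class of service to its average lead time in days."""
    totals = defaultdict(lambda: [0, 0])  # class -> [sum_days, count]
    for r in requests:
        acc = totals[r["class_of_service"]]
        acc[0] += r["lead_time_days"]
        acc[1] += 1
    return {cos: s / n for cos, (s, n) in totals.items()}

requests = [
    {"class_of_service": "expedite", "lead_time_days": 5},
    {"class_of_service": "expedite", "lead_time_days": 7},
    {"class_of_service": "standard", "lead_time_days": 20},
    {"class_of_service": "standard", "lead_time_days": 30},
]

estimates = avg_lead_time_by_class(requests)
# Add the suggested maximum uncertainty buffer of 20% to the quote.
quote = estimates["standard"] * 1.20
```

The per-class averages give sales an informed baseline, and the buffer is the explicit, bounded allowance for risk suggested above.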
Finally, a key cause of failing to meet our commitments is poor definition of a customer-valued request. Delivering on-time requests that are not meaningful actually invalidates our improvement effort.