On March 7th, we hosted a webinar called Leading Your Team to Engineering Excellence. Guest speaker Steve Berczuk, Lead Software Engineer at Riva Health, joined Stuart James, Engineering Manager at Codacy, to discuss how you can help your team achieve engineering excellence. During the talk, they covered:
- What are some of the main challenges in managing engineering teams?
- What do some engineering metrics mean?
- How can one be sure that a team is not over-relying on metrics to achieve goals?
- How do you evaluate your team’s performance rather than that of individual developers?
In case you missed the talk live, don’t worry: you can (re)watch the recording here or below 👇
A live talk on engineering excellence
You can find the detailed discussion in the video recording; we even give you the specific timestamps! But we’ve also summarized the topics for you to read 🤓
What are the main challenges in how engineering teams manage their processes? (00:03:18)
Teams go wrong with metrics when they focus too much on what they see: a number. They don’t focus enough on why they’re measuring it or what goal they’re trying to achieve. The same happens in product planning.
People generally think numbers are very concrete; seeing them move can feel like progress. However, the trick is not to apply a metric blindly. Even if your behavior is changing in the right way, having the “why” gives you an understanding of the problem and a way to simplify the solution.
It’s the same for processes and practices. Doing them without thinking about why and what problems we’re trying to solve can result in resentment and changes afterward. So we need to know why we choose the metrics, what problems we’re trying to solve, and how we get alignment.
How do we approach choosing the right metrics, and how do we use them? (00:05:24)
This is a bit of an art. There are a couple of ways to do it: you can do a lot of research to figure out your problem, and it’s also legitimate to try some metrics that might help you understand it.
But first, we need to understand whether our product owners and product managers, or the business, are unhappy with the output, and then work back from that to figure out what could be happening.
For example, DORA metrics and products like Codacy Pulse allow you to focus on a few important metrics and build a feedback process around them. And retrospectives give you a lot of insight into where to go.
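As a concrete illustration of what two of the DORA metrics measure, here’s a minimal sketch computing Lead time for changes and Change failure rate from a deployment log. The data and field names are hypothetical, not the output of any real tool discussed in the talk:

```python
from datetime import datetime, timedelta

# Hypothetical deployment log: when the change was committed, when it
# reached production, and whether it caused a failure there.
deployments = [
    {"committed": datetime(2023, 3, 1, 9), "deployed": datetime(2023, 3, 1, 15), "failed": False},
    {"committed": datetime(2023, 3, 2, 10), "deployed": datetime(2023, 3, 3, 11), "failed": True},
    {"committed": datetime(2023, 3, 6, 14), "deployed": datetime(2023, 3, 6, 18), "failed": False},
]

# Lead time for changes: average time from commit to production.
lead_times = [d["deployed"] - d["committed"] for d in deployments]
avg_lead_time = sum(lead_times, timedelta()) / len(lead_times)

# Change failure rate: share of deployments that caused a failure.
failure_rate = sum(d["failed"] for d in deployments) / len(deployments)

print(f"Average lead time: {avg_lead_time}")
print(f"Change failure rate: {failure_rate:.0%}")
```

The point of the sketch is that both metrics fall out of data teams already have, which is why they make a convenient starting point for a feedback process.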
How can we avoid choosing the wrong metric, which leads to the wrong process or practice? (00:06:43)
In the past, we’ve looked at lines of code as a measure of productivity or Sprint velocity as a measure of performance. Sometimes we choose metrics for the wrong reasons and get the wrong outcomes. Metrics can be subverted as well.
There are certain things you can look at, but a general approach is to try specific metrics iteratively and see if they help your team. Metrics are a good input to a retrospective, allowing the engineering team to talk about them, check whether they’re improving, and iterate on them.
Context is crucial. For example, maybe your metrics tanked during one development cycle, but you had a company off-site and weren’t developing. Or somebody was sick, and you needed to reprioritize that work.
It can also be a bigger cultural issue. People might not review code because they have too many other things to do and code reviews aren’t a priority. But if code reviews are something you need to do, then your stated priorities and your organizational values aren’t aligned.
How do we interpret metrics? Are there any dangers? (00:10:12)
Many metrics are often indirect indicators. Your customers, business owners, or other stakeholders honestly don’t care what your Lead time is. They might care about your Change failure rate, but as long as what gets to production isn’t causing any problems, they don’t care about the metric itself.
So metrics are more like secondary indicators, where you can look at the trends, link them back to business, and look at business value.
What about the different tiers of metrics we have? Where should we focus? (00:11:58)
As an engineering team, we can probably focus on team-level metrics and then build feedback loops that tie them back to the customer.
At the end of the day, the most important thing is to bring value to customers. So we should also think about the business-level metrics. It’s good for engineering teams to work on being better. If the business is happy, let’s try to do things faster and with more quality.
At the end of each retro or Sprint, you should do your periodic reviews. If the metrics look bad but the business is happy, maybe the metrics aren’t telling us anything. If the metrics look good but the business is unhappy, perhaps the metrics are telling us the wrong thing.
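That cross-check can be sketched as a simple four-quadrant rule. This is a toy heuristic to make the idea concrete, not a tool or process from the talk:

```python
def review_metrics(metrics_look_good: bool, business_is_happy: bool) -> str:
    """Toy heuristic for the periodic review described above."""
    if metrics_look_good and business_is_happy:
        return "keep going"
    if not metrics_look_good and business_is_happy:
        return "metrics may not be telling us anything; revisit them"
    if metrics_look_good and not business_is_happy:
        return "metrics may be measuring the wrong thing; revisit them"
    return "both signals agree: investigate and improve"

print(review_metrics(metrics_look_good=False, business_is_happy=True))
```

The key design point is that the business signal is treated as the ground truth and the metrics as the instrument: whenever the two disagree, it’s the instrument that gets re-examined first.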
Can there also be times when we look at the metrics, and they tell us we’re doing the right things? (00:16:35)
Sometimes it could be riskier to make a change than to keep it, particularly when everything’s consistent and you’re delivering.
If it takes a day to merge a pull request, somebody might argue that one hour would be better. But maybe if you did that, you’d end up skimping on tests or missing a feedback loop.
How important are engineering metrics to achieve your goals? (00:18:31)
They are very important; it’s hard to fix what you can’t measure. But you need to make sure you’re measuring the right thing.
For example, say the Jira points closed by a developer have gone down. But then, when you talk with them, you realize that person has been at the company the longest and knows a bit about everything, so they’re spending most of their time helping other people. And Jira doesn’t reflect that.
So how do you choose your metrics? (00:23:57)
Start with some of the canonical ones. For example, Steve always has the instinct that Time to merge is important. When a pull request sits there for more than a couple of hours, it can be frustrating; he wants to get that work in so other people can build on it. There was a period when the team’s Time to merge was longer than usual, and he felt that validated his instinct.
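Time to merge is easy to track from pull-request timestamps. A minimal sketch, using made-up data rather than any real source-hosting API:

```python
from datetime import datetime

# Hypothetical pull requests: when each was opened and when it merged.
pull_requests = [
    {"opened": datetime(2023, 3, 6, 9, 0), "merged": datetime(2023, 3, 6, 11, 30)},
    {"opened": datetime(2023, 3, 6, 10, 0), "merged": datetime(2023, 3, 7, 10, 0)},
    {"opened": datetime(2023, 3, 7, 13, 0), "merged": datetime(2023, 3, 7, 14, 0)},
]

# Time to merge per PR, in hours.
hours_to_merge = [
    (pr["merged"] - pr["opened"]).total_seconds() / 3600 for pr in pull_requests
]
average = sum(hours_to_merge) / len(hours_to_merge)
print(f"Average time to merge: {average:.1f} hours")
```

Note how one slow PR (a full day) dominates the average; in practice, looking at the distribution or a trend over several cycles tells you more than a single number, which echoes the talk’s point about context.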
How do you evaluate the team’s performance versus the individual when we’re talking about the metrics? (00:25:24)
You need a culture of psychological safety, a very open environment, and agreement from everyone on the metrics. Buy-in is important because otherwise, people will want to skew the metrics. Individuals need to understand how the metrics will be used so they don’t feel imposed upon or controlled.
What about fear around metrics? Can we end up with a culture of blame and fear when using metrics as a tool? (00:29:25)
Metrics are about improvement; it’s hard to change for the better when you’re doing things out of fear. So if the team isn’t meeting the metrics, and those metrics are arbitrary ones chosen because somebody three levels up in the org will judge the team by them, that won’t help.
Everyone must align on the metrics, from top to bottom. So, for example, if you’re going to work with Lead time or Cycle time, the team needs to understand why and what the goals are.
Any advice for helping developers or businesses align on metrics? (00:32:25)
There’s a lot an engineering team can do on its own. The business may not care about Cycle time, but for the engineering team, it’s important. We can look at things that are generally good to improve, within a reasonable threshold.
Several engineering metrics can help you learn about your processes, and they have direct implications for the business. However, teams generally go wrong by looking at a number without context and without validating that it measures what they want to measure.
Can experimentation and innovation impact the metrics? (00:33:39)
That comes down to context. In a retrospective, one of your rituals might be to review the various metrics dashboards, and if you see a big dip or a big spike, you first need to understand why. You need to be able to trace it back to specific actions and learn from the process.
You also need to understand your team dynamics. Are they trying different things or redoing work? Is performance going down and then going up later? You want to look at trends. You can’t say, “we did something new this one cycle, and we’ll never do anything new again because it busted our metrics.” On the other hand, if you’re going two months without getting better and actually getting continually worse, then that’s a sign that something needs to change.
After the talk, we opened the floor to the audience’s questions. We were delighted to receive a ton of questions! We’ve listed them for you:
- Should story points estimation be subjective or objective?
- In a team of four developers, how do I keep my team engaged in the problem-solution process? Sometimes people just agree to what comes from the design team instead of assessing the actual feasibility.
- How do you deal with estimates when business dictates your deadlines?
- Are there metrics for managers themselves that the team or company should care about?
- How does technical debt impact engineering performance, and how should we deal with that?
- How would you level different individuals working on completely unrelated projects so it would be viable to have a standard generic set of metrics to serve as a starting point to measure productivity?
- How do you measure efficiency and ensure a good process when your team is working on multiple projects or pieces of work at a time?
- The engineering team can’t operate or be productive if they don’t feel secure that they won’t be punished for outages, bugs, missed estimates, and missed deadlines. How would you suggest making a case to an executive team for investing in a safer culture?
- How can we keep a balance regarding Sprint tickets and tasks so that you can complete technical tasks and still keep stakeholders happy?
- How would you compare the usefulness of hard metrics like story points and number of tickets with soft metrics like surveys, reviews, and satisfaction? Is one type more important?
- Speaking from experience, could you talk a bit more about ways and systems of creating or using values of story points?
Thank you to everyone who joined live and those who watched the recording later on! Let’s keep the conversation going in our community.