The World of Jay Little
Art or Science: Measuring Dev Productivity
4/10/2016 12:30 PM
I've recently spent some time mulling over the question of how one can adequately and consistently measure the productivity of software developers. The answer to this question interests me both as a software developer who is ranked against his teammates and as a member of a software development team that is ranked against other teams in the same company. How do you determine which dev is the most productive and which one is the least productive? How do you do that on a team level?

The real question to ask here is: "What kind of data could be gathered in an effort to track productivity?" Well, we could try something simple like tracking the lines of code a dev outputs on a daily/weekly/monthly basis, but such simple measurements have long since been rejected, and rightfully so. Lines of code have nothing to do with how productive a dev is, especially since more efficient solutions tend to require fewer lines of code than inefficient ones. The same reasoning debunks tracking the number of commits a dev makes to a source code repository as well.

Okay then, why not the number of stories completed, or a sum of the story points associated with those completed stories? Agile development is all the rage nowadays and most teams likely have access to these kinds of data streams as a result, so in theory this makes some sense. Nevertheless, at the end of the day the number of stories and story points mean very little in my mind. This mostly revolves around the fact that story points are largely fictional units that have no basis in reality whatsoever. While a well-disciplined team could perhaps use a tally of story points to compare its current performance with its past performance (assuming that story points are applied consistently and thoughtfully), it seems ludicrous to use that as a metric for comparing performance between teams, as the value of a story point is determined on a team-by-team basis. Moreover, the number of stories is even less appealing for comparing one team to another, as the number of stories on the board or in the backlog is more the responsibility of the Product Owner and the users he/she represents than of the team itself. Within a team, comparing story counts between developers is not very useful either, as stories vary wildly in complexity, so a raw count of completed stories should do little to bolster or dampen perceptions of any particular dev's productivity.

What about asking other team members? For instance, why not ask the team members tasked with QA to rank developers based on the amount of testing surface area their stories encompass? While I think this idea is better than the previous ones, it doesn't lend itself as well to empirical measurement. However, if you can circumvent any obvious personal bias, it seems to me that QA Engineers are probably in the best position to judge relative dev productivity within a team. Though to be frank, such a mechanism doesn't lend itself at all well to comparing one team to another. It also runs the risk of measuring the wrong thing: the time a QA Engineer spends in the recursive process of identifying bugs and testing the fixes provided by devs should detract from the perception of a dev's productivity rather than enhance it.

What about asking the users? Now to be fair, outside of extreme situations, users are unlikely to know which specific dev worked on the new features being delivered to them, but I think this is actually a good thing. At the end of the day, a craftsman is best judged by the fruit of their labors rather than the chaos leading up to it. What's most important here is that users are also in the best position to judge the relative merit of one feature over another. A user is in the unique position of providing the purest feedback precisely because they generally aren't acquainted with the people doing the work behind the scenes, only with whether or not the work provided a direct benefit to them. At the end of the day, software devs exist for one reason and one reason only: to solve problems and make users more efficient. Anything we produce that lacks those benefits may as well not exist at all.

Now I'm sure some will read this article and take issue with that conclusion (especially the architects of the world), but they won't get much sympathy from me. The reality of the situation for software devs is this: if your work doesn't benefit the user in some way, your work isn't valuable. We can muddle about with class design and service layers for months on end and still provide absolutely no benefit to current and prospective end users. Adopting such an approach to software development, especially in a world focused on agile development (which is itself focused on delivering features to the end user in a timely manner), is stunningly dangerous, ass-backwards and likely career-limiting.

In a perfect world, a dev who focuses on end user delivery and satisfaction will likely score well both in terms of QA feedback and on the less-than-useful story and point metrics. Lines of code and number of commits, sadly, are metrics that don't apply well in any situation, and they can be easily gamed by less-than-capable devs looking to mask their poor performance.
My Plans for Presentation Engine Version 5
12/5/2015 12:42 PM
The time for a new version of the Presentation Engine is almost upon us. This time, though, the visible up-front changes will be rather minimal. Instead, my intention for the V5 release (which is not currently in active development, just in the planning stages) is a large platform technology shift. Of course, anybody familiar with this project already knows that the major releases are all about keeping up with changes in the web dev ecosystem, but as for the rest of you: consider yourselves informed.

What will be changing? That's where it gets fun. For now the intention is to stick with the ASP.NET platform while transitioning away from every other Microsoft technology within the application. That includes Windows Server, SQL Server Compact (which is no longer supported by MS, of course) and Entity Framework. Instead, the general idea is that we'll move to an ASP.NET application that can be hosted on either Windows or Linux (with Linux being our preference), SQLite for the database and Dapper for the data layer. In addition, I intend to transition all of the interactive user interfaces in the application away from jQuery/Handlebars to Angular. Finally, all of the MVC Controllers used for AJAX calls will be transitioned to WebAPI Controllers, and the current session-based authentication will be replaced with a token-based authentication model.
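
To make that shift a bit more concrete, here is a minimal sketch of what a Dapper-over-SQLite data access class might look like, assuming the Microsoft.Data.Sqlite ADO.NET provider. To be clear, the Presentation type and Presentations table below are hypothetical placeholders for illustration, not actual Presentation Engine code:

    using System.Collections.Generic;
    using System.Linq;
    using Dapper;
    using Microsoft.Data.Sqlite;

    // Hypothetical entity for illustration only.
    public class Presentation
    {
        public int Id { get; set; }
        public string Title { get; set; }
    }

    public class PresentationRepository
    {
        private readonly string _connectionString;

        public PresentationRepository(string connectionString)
        {
            _connectionString = connectionString;
        }

        public IEnumerable<Presentation> GetAll()
        {
            using (var connection = new SqliteConnection(_connectionString))
            {
                // Dapper maps result columns to properties by name,
                // so there is no ORM mapping layer to maintain.
                return connection.Query<Presentation>(
                    "SELECT Id, Title FROM Presentations ORDER BY Title");
            }
        }

        public Presentation GetById(int id)
        {
            using (var connection = new SqliteConnection(_connectionString))
            {
                // Parameters go in as an anonymous object; Dapper handles
                // the underlying ADO.NET parameter plumbing.
                return connection.Query<Presentation>(
                    "SELECT Id, Title FROM Presentations WHERE Id = @id",
                    new { id }).FirstOrDefault();
            }
        }
    }

The appeal of this combination is that Dapper is just a thin set of extension methods over ADO.NET, so it runs anywhere an ADO.NET provider does (SQLite included), and the SQL stays visible and tunable instead of being generated for you the way Entity Framework does it.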

These goals are still tentative and adjustable, however. Development won't begin until I can design, develop, debug and host an ASP.NET application within a fully native Linux environment. Right now I can do all of those things except debug, courtesy of Visual Studio Code and Microsoft's latest cross-platform ASP.NET work. It's an exciting time for .NET developers like myself who thoroughly appreciate .NET but very little else about the Microsoft ecosystem it is a part of.

In any event, ASP.NET debugging on Linux is in the works, so development should begin sooner rather than later. It is going to be an interesting ride to say the least.