Just a Few Good Men

By Diogenes

For over two years now, the Drupal.org website has had a policy in place regarding contributions by new members. A contribution from a new member is sequestered in a sandbox until it has passed a rigorous code review process.

Members who have contributed one or more modules (or themes) or who are co-maintainers of an existing project are exempt from this process.

Metrics for the Drupal Project Review Process (DPRP) are tracked in a single issue queue. The table below shows selected figures from that queue for an arbitrary sample day.

A new member in this context is any Drupal.org member who joined less than 2 years, 9 weeks ago -- when this new policy was implemented.

Any member of Drupal.org who did not have CVS rights before the migration to Git would also be considered a new member.

The typical wait for a code review has climbed from 5 days, to 5 weeks, to 3 months.

Selected statistics from the Project Application issue queue - September 18, 2012

Group    Status                                  Oldest create date   Oldest last update   Most recent update   Most replies   Records   %
A        closed (fixed)                          2 years 9 weeks      1 year 30 weeks      3 days               103            697
         fixed                                   1 year 10 weeks      1 week 6 days        2 days               18             13
         Subtotal                                                                                                              710       37.1%
B        needs review                            1 year 29 weeks      12 weeks 4 days      1 hour               38             114
         active                                  34 weeks 3 days      7 weeks 3 days       7 weeks 3 days       25             1
         RTBC                                    1 year 26 weeks                           2 hours              39             19
         Subtotal                                                                                                              134       7.0%
C        needs work                              2 years 1 day        21 weeks 3 days      1 hour               53             260
         postponed                               1 year 37 weeks      21 weeks 3 days      3 days 3 hours       60             15
         postponed (maintainer needs more info)  1 year 12 weeks      16 weeks 2 days      21 hours             24             20
         Subtotal                                                                                                              295       15.4%
D        closed (won't fix)                      2 years 4 weeks      1 year 28 weeks      3 days 16 hours      72             596
         closed (duplicate)                      2 years 10 weeks     1 year 28 weeks      15 hours             76             172
         closed (worked as designed)             1 year 17 weeks      1 year 10 weeks      11 weeks             18             7
         Subtotal                                                                                                              775       40.5%
A-D      Total                                                                                                                 1914

Group A represents the modules that have been promoted to Full Project status in the last 2 years. The current rate is about one per day.

About 200 new projects appear on Drupal.org every month. At roughly 30 approvals a month coming through the review queue, that means about 85% of these new modules are from current FP members.
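The arithmetic behind that 85% figure can be sketched as follows (both inputs are the rough figures stated above, not exact counts):

```python
# Group A shows roughly one promotion per day, i.e. ~30 new-member
# projects per month, against ~200 new projects per month overall.
promotions_per_month = 30      # ~1/day via the review queue
new_projects_per_month = 200   # total new projects on Drupal.org

# Everything not coming through the queue is from existing FP members.
share_from_fp_members = 1 - promotions_per_month / new_projects_per_month
print(f"{share_from_fp_members:.0%}")  # 85%
```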

Group B represents the modules/themes that have applied and are active - they have been reviewed, corrected, and set back to "needs review". Note the Oldest last update: it represents the current wait time between a request and a review. The record count is the number of projects awaiting review.

Group C represents modules that have been reviewed and set back to "needs work". Some module reviews are quite recent, others are not. Abandonment can be inferred from the oldest last update or from a change in status to "postponed".

Group D represents the modules/themes that have not made the cut for one reason or another, or contributors who have simply given up. The closed (duplicate) total here might be a little alarming.

The Most replies column is a crude indicator of the effort put into some modules to get them approved. A typical cycle is a request; a review; a thank-you for the review; and finally a new update with a status change back to "needs review". Wait as required. Repeat as required.

Four cycles is not unreasonable. At the current three-month wait, that means waiting a year.

Some have waited longer and then given up because it is just not worth it anymore. And there is always the possibility that your module will be declared a duplicate... postponed until you can prove you can clean out the barn without supervision.

What does this all mean?

I am going to argue here that these statistics represent an attrition rate of about 55%. That is pretty high for an Open Source project. It is higher than the attrition rate in engineering. Do other open source projects behave like this?
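A quick recomputation from the table's subtotals shows where the "about 55%" comes from; treating Groups C and D as attrition (stalled or closed without promotion) is the reading used here:

```python
# Subtotals copied from the September 18, 2012 table above.
subtotals = {"A": 710, "B": 134, "C": 295, "D": 775}

total = sum(subtotals.values())  # 1914 applications overall

# Group A succeeded; Group B is still in flight;
# Groups C and D are counted as attrition.
attrition = (subtotals["C"] + subtotals["D"]) / total
print(f"{attrition:.1%}")  # 55.9%
```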

What is so unfair about this system is that current FP members are not subject to this level of scrutiny or quality control. I wish they were. I find myself doing plenty of code review on FP modules that could work better, or modules that don't work at all.

Collaboration anyone?

The fixation with avoiding duplication kills honest competition.

Would a qsort algorithm survive this process if someone could argue that an advanced bubble sort was already available?

Egos and collaboration are a tricky mix. There are modules that have become so bloated that a fresh start or even a binary refactoring (cold fission, anyone?) would be an improvement. Is this even possible now?

The answer is simple. Apply the same standards to everyone.

Make everyone climb the same wall or knock down the wall altogether.