CiviCRM Downloads - By The Numbers

Published
2011-04-08 12:07
Written by
Stoob - member of the CiviCRM community

These are some graphs I created from the data publicly available on CiviCRM's SourceForge page.  SourceForge provides only limited data: the release date of each version (e.g. 3.1.5) and the number of downloads to date.  Still, we can draw some useful conclusions from it.  Dave Greenberg shared one of these graphs at CiviCon.  The data is from March 1, 2011, so I wanted to publish these graphs before they became too stale.

 

The first graph is the simplest to understand - all CiviCRM 2.x versions compared to all 3.x versions.  Note that the two series cover timeframes of similar length, so we can conclude that downloads of 3.x increased by about 27% compared to 2.x.

 

The next graph shows the downloads of each version along with its release date.  The distribution here is roughly even, but remember that there have been four minor versions of CiviCRM 3 (3.0, 3.1, 3.2, 3.3), so the 3.x series has more downloads overall.

 

The final graph is, I think, the most interesting.  It focuses on revisions, the third number in the version string, such as 3.1.5 or 3.2.3.  It shows a tendency for the CiviCRM community to download more of the x.x.2 and x.x.3 revisions than the x.x.0 or x.x.1 revisions.  From speaking with other members of the community, my conclusion is that this behavior is due, at least in part, to concerns about the stability of the earlier (x.x.0 and x.x.1) revisions.

 

The CiviCRM core team and community are working hard to create automated unit tests for the software, which will increase the stability of revisions.

 

My own suggestion and approach, which I encourage others to consider, is to also do additional testing against live client data: set up test servers using copies of real sites and test beta versions to make sure the features important to each client still work.  I've already done this for a couple of clients and will do more this week.

 

If you haven't set up testing servers for some of your clients, consider doing so now.  You will help not only your clients but CiviCRM as a community.

 

Comments

...is that releases are too frequent. I know for some of my clients, by the time they were ready to consider an upgrade to 3.x, 3.1 had been released.

 

Implementors are cautious about upgrades because of having been burned in the past by CiviCRM, and users are cautious because of bad experiences with upgrades of other CRMs. I was told this week that it's common for Raiser's Edge users to find that they have to purchase a new add-on to the software after an upgrade in order to restore functionality they previously had, so happy Raiser's Edge users are the ones who never upgrade.

 

People also tend to think of CiviCRM as part of their website, and many organizations think of website upgrades as something that happens every couple of years, not several times a year.

 

Slowing down the release cycle should also result in higher quality x.x.0 releases, addressing the other concern.

 

 

I see your point, Matt.  Thanks for commenting.  I also have heard similar anecdotes about upgrades.  In addition, many of my smaller clients are cost-conscious.  They know that they can only afford an upgrade occasionally and cannot afford costly debugging and patches.  Therefore they require that I install only the most stable version (usually the last revision) in a series.

Hi Stoob - to some extent the data is cumulative. E.g. I have a new client, or an old one who finally undertakes an upgrade, and the latest version is 3.1.0 - so they then may also get taken up through 3.1.1, 3.1.2, etc.

Then another provider gets a new client and the latest version is 3.1.1 - so they also get taken up through 3.1.2, etc.

Ditto for another client who starts off at 3.1.3, and so on.

I am not sure how you can remove this noise from the figures - and I certainly don't want to take away from your underlying argument, but I thought I would share this.

If a new adopter of CiviCRM always installs the latest stable version, then theoretically they would also install all the subsequent revisions of that version.  However, in practice this is not always the case, of course.  Many clients skip revisions, upgrading directly from 3.1.0 to 3.1.5, or from 3.2.3 to 3.3.5.  All we can do with the limited data is measure the raw popularity of each revision number and take a guess at the reasoning behind it.  Some folks at CiviCon expressed their reasoning, but I'm sure there are many reasons.
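To make the cumulative effect described in the previous comment concrete, here is a purely illustrative toy model. All numbers are made up, and it assumes every adopter starts at the revision that is current when they arrive and then downloads every later revision of that minor version (no skipping):

```python
# Toy model of the "cumulative noise" effect - hypothetical numbers only,
# not real SourceForge data.
revisions = ["3.1.0", "3.1.1", "3.1.2", "3.1.3", "3.1.4", "3.1.5"]
new_adopters_per_revision = 100  # hypothetical: the same number of newcomers per revision

downloads = {}
active_installs = 0
for rev in revisions:
    active_installs += new_adopters_per_revision  # newcomers start at the current revision
    downloads[rev] = active_installs              # every active install downloads this revision

for rev, count in downloads.items():
    print(rev, count)
# 3.1.0 ends up with 100 downloads and 3.1.5 with 600, purely because of
# adoption timing, even though no one in this model prefers later revisions.
```

Under those assumptions the later revisions always look more popular, which is exactly the noise the commenter describes; the real question is how much of the observed x.x.2/x.x.3 bump is this effect versus deliberate caution.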

"If you haven't setup testing servers for some of your clients, consider doing so now. You will not only help your client but CiviCRM as a community." I'd be interested to hear more about your experiences with this - although I am fairly familiar now (for better or worse :-)) with the CiviCRM test suite I haven't figured out how to usefully extend this to client sites / client custom code.

 

Also, the frequent release cycle is a blessing & a curse - usually there is something I'm hanging out for every release but we also probably only upgrade major sites every second release due to the disruption it usually causes. We do run the latest dev version for any sites in development, however. And, of course, we run apiv3 off svn on all sites now :-)

How do I do it?  I have a VPS of my own where I install various versions of CiviCRM, then I export the client's data from their live site and copy the database into my test site.  After the usual resetting of settings and clearing of the compiled templates (you know the drill), I have real data to look at.  I also import any custom TPL files, PHP, and Drupal modules necessary.
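For anyone curious what the database-copy step looks like in practice, here is a minimal sketch, assuming a MySQL-backed Drupal test site and command-line access to both servers. The host names, credentials, database names, and the compiled-templates path are placeholders for your own setup, not a prescription:

```python
#!/usr/bin/env python
# Minimal sketch: refresh a CiviCRM test database from a live client site.
# All hosts, credentials, and paths below are placeholders.
import shutil
import subprocess

LIVE_DB = "client_civicrm"        # hypothetical database name on the client's live server
TEST_DB = "client_civicrm_test"   # database used by the local test install
TEMPLATES_C = "/var/www/test/sites/default/files/civicrm/templates_c"  # compiled Smarty templates

# 1. Dump the live database and load it into the test database.
dump = subprocess.run(
    ["mysqldump", "-h", "live.example.org", "-u", "backup", "-psecret", LIVE_DB],
    check=True, capture_output=True,
)
subprocess.run(
    ["mysql", "-u", "root", "-psecret", TEST_DB],
    input=dump.stdout, check=True,
)

# 2. Clear the compiled template cache so the test site rebuilds it.
shutil.rmtree(TEMPLATES_C, ignore_errors=True)

print("Test database refreshed; remember to reset site URLs and resource paths.")
```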

 

I test the features that particular client cares about most in the context of the new CiviCRM version, making sure they work.  I will be testing the new PCP functionality within the next few days.

I understand this is an investment in time and capital that not all clients can afford, but it's better than installing a version with no prior testing, being surprised that a favorite feature is not working properly, then needing a patch, etc.  You know the drill.

 

 

Ah OK - I was thinking you meant automated testing - yes, testing new features etc. does take time. I guess when you have more than one client using similar features, however, there is some efficiency (i.e. one client benefits from the testing work done for another client).

Given the many problems with this data, some of which you touch on, some of which are mentioned in the comments, would it not be more beneficial to instead analyze the data of CiviCRM sites pinging home? With this you could also get a lot more interesting information like percentage of sites that upgrade and how frequently, or attrition rate. In fact if you released the data (in obfuscated form), I'm sure that someone would take it and pull interesting trends from it.

That's a great idea.  But I've seen that data only once, and I no longer know where it is.  The senior devs do know where the data is, and might be able to point you in that direction.  From my recollection the data contains only some of the items you think it might.  Perhaps more significant is that I believe the 'pinging-home' features haven't been in place for very long, and not every site pings home.  I produced these graphs only from the public data I could glean from SourceForge.  For better or worse, vague or not, we can try to interpret something from it.

 

A few notes on the pinging-home data:

  • It's already stored in an anonymized form
  • We definitely should do a better analysis of it
  • We are willing to share it with some folks in the community

 

Please ping us on IRC / email and we can chat about what we are collecting right now and what else we can do. We'd like to get a few more details on the analysis the person will be doing, etc.

 

lobo

 

Hi,

 

Instead of displaying each version as a separate item on the x-axis, could you easily put time (between 2008 and now) on the axis and plot each release at its release date?

 

E.g. so you could see whether the 27% difference between 2.x and 3.x is due to an increase in downloads or simply because 3.x covered a longer period. The same goes for 3.0: fewer downloads, but it was current for only three months, while the other releases were current for longer.
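Something like the sketch below is what I have in mind. The release dates and download counts are placeholders rather than the real SourceForge numbers, but it shows how a time axis and a downloads-per-day normalization would look:

```python
# Sketch of the suggested timeline view: release dates on the x-axis,
# downloads normalized by the days each version was the newest release.
# Dates and counts are placeholders, not real data.
from datetime import date
import matplotlib.pyplot as plt

releases = [
    ("3.0", date(2009, 10, 1), 12000),   # (version, release date, downloads) - hypothetical
    ("3.1", date(2010, 2, 1), 20000),
    ("3.2", date(2010, 7, 1), 22000),
    ("3.3", date(2010, 12, 1), 18000),
]
end_of_data = date(2011, 3, 1)

dates = [d for _, d, _ in releases]
# Days each version was the newest release (until the next one, or the data cutoff).
periods = [(dates[i + 1] - dates[i]).days for i in range(len(dates) - 1)]
periods.append((end_of_data - dates[-1]).days)

per_day = [dl / days for (_, _, dl), days in zip(releases, periods)]

plt.plot(dates, per_day, marker="o")
for (version, d, _), y in zip(releases, per_day):
    plt.annotate(version, (d, y))
plt.xlabel("release date")
plt.ylabel("downloads per day while current")
plt.title("Downloads normalized by time each version was current (hypothetical data)")
plt.show()
```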

I don't know if it's done on purpose with Civi, but it's a common pattern to launch a .1 version quickly after the .0.

 

X+

 

P.S. I used to spend a lot of time on data visualisation; using a bar for each version without making explicit the period covered by each is a classic trick to make the data "lie" ;)