There is no Wikistats issue about which I received more mails than this one: since 2010 some metrics on article content were no longer updated: word count, articles above 200 characters, mean size in bytes, percentage above 0.5 or 2 kB, database size, images, and links (internal, interwiki, external). Word count in particular was often mentioned.
Example: Polish Wikipedia
All these metrics need to be collected from the ‘full archive dumps’, the dumps which contain the full raw content of every revision of every page. The sheer amount of data that needs to be processed made it no longer feasible to process those full dumps on a monthly basis. (It didn’t help that I do rather ambitious cleanup of the raw page content before counts are generated, e.g. so that the word count approaches ‘readable body text’.)
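To give a flavour of what such cleanup involves, here is a minimal sketch of stripping wiki markup before counting words. This is purely illustrative and not the actual Wikistats code; the real processing is far more ambitious, and nested templates in particular need a real parser rather than regexes.

```python
import re

def readable_word_count(wikitext: str) -> int:
    """Approximate a 'readable body text' word count for raw wikitext.

    Illustrative sketch only, not the actual Wikistats cleanup logic.
    """
    text = wikitext
    # Drop HTML comments.
    text = re.sub(r'<!--.*?-->', ' ', text, flags=re.DOTALL)
    # Drop footnote references.
    text = re.sub(r'<ref[^>]*>.*?</ref>', ' ', text, flags=re.DOTALL)
    # Drop templates (non-nested only; nested ones need a real parser).
    text = re.sub(r'\{\{[^{}]*\}\}', ' ', text)
    # Drop file/image and category links entirely.
    text = re.sub(r'\[\[(?:File|Image|Category):[^\]]*\]\]', ' ', text)
    # Keep the label of piped links, the target of plain links.
    text = re.sub(r'\[\[([^|\]]*\|)?([^\]]*)\]\]', r'\2', text)
    # Drop external link brackets but keep any label text.
    text = re.sub(r'\[https?://\S*\s*([^\]]*)\]', r'\1', text)
    # Strip remaining markup characters (bold/italic quotes, brackets, pipes).
    text = re.sub(r"[='|{}\[\]]", ' ', text)
    return len(text.split())
```

For example, `readable_word_count("{{Infobox|x=1}} '''Bold''' text with [[linked|a link]] and <ref>cite</ref> more.")` counts only the seven visible words, ignoring the infobox, the reference, and the link target.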
So in 2010, for most Wikipedias, I switched to processing stub dumps, which contain all metadata for every revision, but not the raw page content. For sister projects, with their much smaller dumps, I continued processing full archive dumps.
Now, finally, I can announce a fix which makes it possible to update those missing metrics on a roughly quarterly cycle. Full archive dumps are now processed on a separate server, running as a continuous low-priority job, and the reporting process combines metrics from both servers.
In the last two weeks some 260 wikis were processed. Only 10 large wikis remain to be done: Arabic, English, French, German, Hebrew, Italian, Japanese, Spanish, Swedish, Russian. I expect that in a month’s time all but English will be ready. English may arrive, fingers crossed, a month later.