All scripts have been coded in Perl. The code base is huge:

- WikiCounts job: 30 scripts, 18,660 lines, 630,000 characters
- WikiReports job: 53 files, 38,833 lines, 1,700,000 (!) characters

There is hardly any formal documentation (this page is a start), for several reasons:

- Lack of time: there is always some new report or other data/stats request that takes priority.
- Lack of motivation: for 6 years wikistats was a private volunteer project, with re-usability of the code as a lower priority (still, Wikia and other sites figured out how to use the scripts).
- Lack of confidence in inline comments: too often these are out of sync with what the code really does, or are riddles in themselves, or mere generalities (this is no comment on Wikimedia sources, I wouldn't even know).

Erik favors self-documenting code; descriptive variable and function names are crucial here. Caveat! Some of his scripts, even major ones, blatantly violate this rule and are therefore hard to maintain, especially certain parts of the WikiReports scripts.

All major scripts have been checked into git, section wikistats.

Some of the scripts date back to 2003 and have been overhauled many times, often to accommodate non-trivial database restructuring (in the pre-XML-dump era), but even more importantly to re-factor the scripts when processing time and limited resources (memory) again and again called for new measures. In 2003 the full history of the English Wikipedia could be rebuilt in 10 minutes. In 2010 complete processing of the English Wikipedia takes 20 days, even with much more efficient scripts. This is why in early 2010 a pragmatic decision was taken to parse only stub dumps on a monthly basis and omit some less
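To illustrate why stub dumps are so much cheaper to process than full-archive dumps: they contain per-revision metadata but no article text, so a single streaming pass over the file can already collect edit counts. The following is a minimal sketch, not taken from the actual WikiCounts scripts; the dump file name is hypothetical, and the real scripts track many more page and revision attributes (plus decompress the dump on the fly).

  #!/usr/bin/perl
  # Minimal sketch only, not the actual WikiCounts code: stream a stub dump
  # (revision metadata without article text) and tally revisions per page.
  # The file name below is hypothetical; real dumps are bz2/7z compressed
  # and would normally be piped through a decompressor first.
  use strict;
  use warnings;

  my $file = 'enwiki-stub-meta-history.xml';
  open my $in, '<', $file or die "Cannot open $file: $!";

  my $title = '';
  my %revisions;
  while (my $line = <$in>) {
      if ($line =~ m{<title>(.*?)</title>}) {
          $title = $1;                      # a new <page> block starts here
      }
      elsif ($line =~ m{<revision>} && $title ne '') {
          $revisions{$title}++;             # one edit recorded for this page
      }
  }
  close $in;

  my $total = 0;
  $total += $_ for values %revisions;
  printf "%d pages, %d revisions\n", scalar(keys %revisions), $total;

Even this toy version only has to keep one counter per page title in memory, which hints at why the stub dumps remained tractable long after full-text processing of the English Wikipedia had grown to multi-week runs.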