Commons:Wiki Loves Earth 2017/Jury tools

From Wikimedia Commons, the free media repository

High-level requirements for jury tools for WLE 2017

  • Should be easily approachable by users
    • Users can log in with their Wikimedia account
    • Tool allows users to create/join a campaign and configure/run agreed-upon scenarios without requiring assistance from the authors of the jury tool or technical support
    • Timely assistance should still be provided (to help users understand how the tools work and to resolve bugs or atypical issues)
    • Need to make sure several people who can help with the tool are available
  • Give enough time and possibility to evaluate the pictures
    • No delays waiting for tool authors/support; required features should be available to users.
    • Tool uptime and performance should be monitored and kept satisfactory.
    • Tools should allow preselection / prefiltering of images from the start of the contest by local organizers/volunteers. This would give May/June to prefilter images during the contest, so local juries can start in June/July, immediately after the contest, on a filtered set of images.
    • Users should be able to browse images by several criteria to make sure the ratings are consistent
      • Rating (selected/rejected/rated/unrated)
    • Users should be able to act on both the thumbnail and large-image views
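
The browsing criteria above (selected/rejected/rated/unrated) can be sketched as a simple status filter. The `Image` record and its fields here are illustrative assumptions for the sketch, not the actual data model of any jury tool.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Image:
    """Hypothetical per-image jury state (illustrative only)."""
    title: str
    rating: Optional[int] = None      # None = not yet rated
    selected: Optional[bool] = None   # True = selected, False = rejected, None = undecided

def filter_images(images: List[Image], status: str) -> List[Image]:
    """Return the subset of images matching one browsing criterion.

    status is one of: 'selected', 'rejected', 'rated', 'unrated'.
    """
    if status == "selected":
        return [i for i in images if i.selected is True]
    if status == "rejected":
        return [i for i in images if i.selected is False]
    if status == "rated":
        return [i for i in images if i.rating is not None]
    if status == "unrated":
        return [i for i in images if i.rating is None]
    raise ValueError(f"unknown status: {status}")
```

For example, with a campaign of three images where one is selected, one rejected, and one untouched, `filter_images(images, "unrated")` returns only the untouched image, which is the set a juror would want to work through next.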

Tool evaluation/improvement process

  • Should be done in close contact with local WLE organizers.
    • Requirements based on their input
    • During the last month before the contest, the tools should be deployed and open for testing by local organizers
  • Should be done in an open and transparent manner
    • Preferably on Commons, open for everybody to review all the arguments and participate
  • Give all interested parties enough time to participate
    • Allow for jury tool authors to implement the desired requirements during the process
    • Allow for local organizers to test the tools and give feedback
  • Should not be authoritative
    • Make sure negotiable baseline functionality/requirements are met
    • Additional features are well documented and explained
    • Final decision is up to specific users
  • Should be done in close contact with the tool authors
    • Not distancing them in order to feign an independent evaluation.
    • Independent evaluation is not possible - the WLE team is too tied to the WLX Jury tool and the WLM team too tied to Montage. So we just need to embrace this and make the process as unbiased as possible.

Timeline suggestion

  • 7 - 28 February
    • Contact and agree on the evaluation process with the tool authors and local organizers
    • Gather and agree on requirements
    • Let the tool authors start to implement/improve obviously missing/not polished functionality
    • Gather a team for technical support of the tools
  • 1 - 29 March
    • Iterate with tool authors/technical support on the tools’ functionality that is agreed upon.
    • Start working on documentation and translations
  • 30 March
    • Present the preliminary results of tool evaluation/improvement process
  • 30 March - 27 April
    • Open the tools for public testing
    • Improve remaining issues/documentation/translations
  • 28 April
    • Final evaluation results are ready
  • 1 May
    • Launch the tools for preselection / prefiltering of images from the start of the contest for countries that participate in May
  • 1 June
    • Countries that finished in May and have only a 1-month duration should have made good progress with preselection and should be ready to start the jury process.
    • Countries that run in June should start preselection / prefiltering of images from the start of their contest
  • 1 July
    • Countries that finished in May should have mostly finished their top-10
    • Countries that run in June start their jury process
  • 1 August
    • All countries have submitted their top-10
    • International jury starts
  • 15 September: deadline for the international results

Participating tools

Volunteers interested in providing early feedback