[FOSS4G2016] [Program] Abstract review process draft

Volker Mische volker.mische at gmail.com
Thu Apr 28 12:37:54 PDT 2016


Hi LOC and Program-Committee,

here's the draft of a blog post I'd like to publish tomorrow morning, so
there isn't much time for a review :) It's about our review process.
This blog post will be mentioned in the acceptance/decline emails I'll
also send out tomorrow.

Cheers,
  Volker


Abstract review process for FOSS4G 2016
=======================================

The final selection of presentations for FOSS4G has always been a source
of controversy. Reviewing so many abstracts takes a lot of effort, so it's
not possible to tell the rejected presenters in detail why their
abstract wasn't accepted. It is possible, however, to be transparent about
the process that was applied. That's what this blog post is about.

Let's start with some numbers. 281 abstracts were submitted and 181
were accepted. There was a community review and a separate review by the
program committee. Both reviews were blind: the reviewers did not know
the authors' names.

Next, the program committee met several times, working with a Google Docs
spreadsheet that contained the community rank, the reviewers' rank and the
submitted details. This time the authors' names were included.


The selection process
---------------------

### Step 1: Get rid of the lowest-ranked ones

We picked a cut-off and eliminated the abstracts that were ranked below
180 by *both* the community and the reviewers. This way we got the
number down by 47 to 233.
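As a rough illustration (this is not the actual tooling we used), the step 1
filter can be sketched like this, assuming each abstract carries its community
and reviewer rank, that rank 1 is the best, and with made-up field names:

```python
# Hypothetical sketch of the step 1 filter: an abstract is dropped only
# if *both* the community and the reviewers ranked it outside the top 180.
CUTOFF = 180

def eliminate_lowest(abstracts, cutoff=CUTOFF):
    return [a for a in abstracts
            if a["community_rank"] <= cutoff or a["reviewer_rank"] <= cutoff]
```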


### Step 2: Go through the highest-ranked presentations and limit them by author and company

Only the abstracts that both the community and the program committee
ranked above 180 were taken into account; that was 142 talks. We went
through them to limit the number of presentations to a maximum of two per
person (with exceptions: some people are so active in the FOSS4G
community and represent so many different projects that it makes sense to
have them give more than two talks). We also limited the number of talks
per company. This wasn't a hard limit, but depended on how many different
topics were covered. This reduced the total number of talks by 11 to
around 222.
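Again purely as an illustration, a per-author limit like the one described
above could be checked along these lines (the field names are invented, and it
ignores the exceptions we handled by hand):

```python
from collections import Counter

# Hypothetical check: which authors exceed the limit of two talks?
def over_author_limit(talks, max_per_author=2):
    counts = Counter(t["author"] for t in talks)
    return {author for author, n in counts.items() if n > max_per_author}
```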


### Step 3: Go through the "undecided" ones

The presentations that the community and the program committee didn't
agree on got special care. Those were about 90 talks. We sorted them by
the community voting. Then everyone on the committee was able to argue
for talks being in or out. This also included talks being excluded
because of the limitations outlined in step 2. This was the most
time-consuming step and led to a lot of good discussions. In the end it
brought the total count down by 41 to the 181 we accepted.
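To make it clearer where those roughly 90 "undecided" talks come from, here is
a small sketch under the same assumptions as in step 1 (rank 1 is the best,
field names are invented): a talk is undecided when exactly one of the two
rankings puts it in the top 180.

```python
# Hypothetical sketch: a talk is "undecided" if the community and the
# program committee disagree on whether it belongs in the top 180.
def undecided(abstracts, cutoff=180):
    return [a for a in abstracts
            if (a["community_rank"] <= cutoff) != (a["reviewer_rank"] <= cutoff)]
```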


### Step 4: Group talks by topic

As we had already met twice, we decided to do the grouping offline. All
members worked on the spreadsheet of the final selection, adding tags
describing what the talks are about. The program committee chair then
took the time to do the final grouping for the website.


### Impact of the community vote

If you compare the top 181 talks from the community voting with the
final program, the overlap is 77% (when the talks removed in step 2 are
taken into account, it's even 80%).

When the committee made decisions that went against the community
voting, there were various reasons. For example, every OSGeo project
should get a stage to present, whether it was voted highly by the
community or not. Other considerations were to include non-developer-centric
talks such as case studies and talks about business models, or talks
matching one of our four key topics: remote sensing for earth
observation, disaster management, open data and land information.

