[OSGeo-Conf] Conference software

b.j.kobben at utwente.nl
Thu Sep 26 01:35:43 PDT 2013


I will try to put the whole AT review process in a proper cookbook
document on the OSGEO wiki or elsewhere, but here's a quick overview:
 
We put out a first Call for Papers before the Christmas break, with a more
detailed one in January (see
http://2013.foss4g.org/academic-track/call-for-papers/). People had to
submit FULL papers (max. 6000 words) before Feb 1 -- later extended to March
1.

We had agreed beforehand with journals on publication space:

- Transactions in GIS for the top papers -- This is a high-ranking
ISI-indexed journal; you need such a journal as an incentive for academics
to find it worth submitting. It was agreed that the reviewers and the
review process were of a high enough standard for the AT review to stand in
for the journal's standard procedure, i.e. they'd only do a "light" review
and the copy-editing. This makes for a fast process (for such a journal) and
the papers are at this moment either published online already or in the last
stage of publishing.
I have an agreement with its chief editor that we can do this again in the
coming year(s) if we both like this year's outcomes.

- OSGEO Journal for the other accepted papers -- They'd also receive the
papers ready for publication, so the process should be able to move
forward quickly. Note that I arranged this with Landon Blake in a
somewhat informal fashion; we probably need a more explicit arrangement
next time.
 
- No publication: i.e. straight rejection (which happened with some really
bad ones and some obviously out-of-place ones) or referral to the normal
non-AT track (which happened with only 1 or 2, I think).

For the AT, 19 papers were selected out of the 30 or so submitted. This was
done in a double-blind review process (both reviewers and authors are
anonymous -- the OJS helps a lot with this). The reviewers (an AT committee
assembled beforehand -- see the list at http://2013.foss4g.org/academic-track/)
were assigned three to a paper, based on their stated fields of expertise.
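
Just to make the mechanics concrete, here is a minimal Python sketch of
assigning three reviewers per paper by expertise overlap. The data layout
and the greedy matching are my own assumptions for illustration, not how it
was actually done in OJS:

from collections import defaultdict

# Illustrative only: greedily pick, for each paper, the three reviewers whose
# stated expertise overlaps its keywords most, while balancing reviewer load.
def assign_reviewers(papers, reviewers, per_paper=3):
    """papers: {paper_id: set of keywords}; reviewers: {name: set of expertise}."""
    load = defaultdict(int)      # papers already assigned to each reviewer
    assignments = {}
    for paper_id, keywords in papers.items():
        # rank reviewers by expertise overlap (descending), then by current load
        ranked = sorted(reviewers,
                        key=lambda r: (-len(reviewers[r] & keywords), load[r]))
        chosen = ranked[:per_paper]
        for r in chosen:
            load[r] += 1
        assignments[paper_id] = chosen
    return assignments

# e.g. (hypothetical data):
print(assign_reviewers(
    {"p1": {"webgis", "osm"}, "p2": {"sdi", "metadata"}},
    {"A": {"webgis"}, "B": {"sdi", "osm"}, "C": {"metadata"}, "D": {"cartography"}}))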

Reviewers can use the OJS to add comments to authors and to editors, and
can rank the paper:
- Strong Accept and recommendation for inclusion in Transactions in GIS
- Strong Accept
- Weak Accept
- Reject

Reviewers also state whether they want certain revisions to be made before
the paper is accepted. All of this is nicely tracked in the OJS system:
emails are generated and sent, etcetera...
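
Purely as an illustration of how those three recommendations per paper could
be collapsed into a suggestion for the chairs (the thresholds below are
invented; the actual final call was made by the AT chairs, as described
next):

# Illustration only: the category names mirror the OJS ranking options above,
# the thresholds are made up.
RANK = {"Reject": 0,
        "Weak Accept": 1,
        "Strong Accept": 2,
        "Strong Accept + TGIS": 3}

def suggest_outcome(recommendations):
    """recommendations: the three reviewers' rankings for one paper."""
    scores = sorted(RANK[r] for r in recommendations)
    if scores[-1] == 0:                      # all three said Reject
        return "reject"
    if scores[0] >= 2 and scores[-1] == 3:   # all Strong, at least one TGIS nod
        return "shortlist for Transactions in GIS"
    if sum(scores) >= 4:                     # broadly positive
        return "accept for the OSGEO Journal"
    return "revisions needed / chairs to decide"

print(suggest_outcome(["Strong Accept + TGIS", "Strong Accept", "Strong Accept"]))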

After revisions where necessary -- here again the OJS is a great help for
tracking things -- the AT chairs did the final selection: out of the 19
accepted papers, 5 were selected for TGIS and 14 for the OSGEO Journal. We
then sent the papers and the reviews, plus the author details (up to this
point they were anonymous), to the journal editors.

Yours,

--
Barend Köbben 
ITC - University of Twente
PO Box 217, 7500AE Enschede (The Netherlands)
+31-(0)53 4874 253




On 26-09-13 08:59, "Barry Rowlingson" <b.rowlingson at lancaster.ac.uk> wrote:

>On Thu, Sep 26, 2013 at 1:05 AM, Darrell Fuhriman <darrell at garnix.org>
>wrote:
>
>> If OJS was used for the academic track, what was used for the regular
>>track (and the review process).
>
>Presentations and Workshops were submitted via SurveyMonkey. Someone
>must have paid for a Pro plan on there; possibly the AGI already had
>one.
>
>These were collected into a Google Spreadsheet.
>
>For voting we used Paul Ramsey's community vote system - I think we
>supplied him with a spreadsheet (or CSV?) in the right form and the
>magic happens: we get back the vote counts.
>
>We then had a selection process where the committee assessed all the
>submissions, so we had one community score and one committee score for
>each. Now, the exact numbers and details are hazy but the process went
>something like this:
>
>1. We need 250 presentations.
>2. The top 150 community score presentations are accepted
>3. The top 150 committee score presentations are accepted
>4. That came to, say, 190 presentations due to overlap.
>5. Look at the next top 50 community presentations, check for quality
>and overlaps with existing acceptances, select some.
>6. Look at the next top 50 committee presentations, check for
>quality/overlaps, accept some.
>7. Repeat 5 and 6 until we have 250 accepted presentations.
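
A rough sketch of that procedure in Python: the cut-offs (150 per list,
batches of 50, a target of 250) come from the steps above, while the data
layout and the stand-in for the manual quality/overlap check are my own
assumptions.

def passes_manual_check(submission, accepted):
    # Stand-in for the committee's judgement on quality and topic overlap.
    return True

def select_presentations(submissions, target=250, top=150, batch=50):
    """submissions: list of dicts with 'id', 'community' and 'committee' scores."""
    by_community = sorted(submissions, key=lambda s: s["community"], reverse=True)
    by_committee = sorted(submissions, key=lambda s: s["committee"], reverse=True)

    # Steps 2-4: the top 150 of each list go straight in (overlap means
    # this is fewer than 300 in total).
    accepted = ({s["id"] for s in by_community[:top]}
                | {s["id"] for s in by_committee[:top]})

    # Steps 5-7: walk down both lists in batches of 50, hand-checking each
    # candidate for quality and overlap with what is already accepted.
    offset = top
    while len(accepted) < target and offset < len(submissions):
        candidates = (by_community[offset:offset + batch]
                      + by_committee[offset:offset + batch])
        for s in candidates:
            if len(accepted) >= target:
                break
            if s["id"] not in accepted and passes_manual_check(s, accepted):
                accepted.add(s["id"])
        offset += batch
    return accepted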
>
>That seemed a reasonable method - the community vote was given a
>privileged position and top presentations got in, but once we got down to
>the lower orders we debated and considered more to get the programme
>right. The programme probably wasn't much different than if we'd just
>taken the top 250 community score presentations, but this made sure
>everything was eyeballed. Also, there were some interesting and
>important (in the cttee's opinion) talks that would have slipped
>through the net if we'd not done this.
>
>I'm not sure what the workshop sub-cttee did for selection.
>
>> Alternatively, could Symposion work for the academic track?  Obviously
>>it would be useful to have everything in one place if at all possible.
>
>You'll have to check with the journal whether it can fulfill the
>workflow with respect to anonymity and reviewing. Another possibility
>would be if the journal's own submission system (maybe they use
>something like Manuscript Central) could be used for the AT, then all
>the management is offloaded, and the only problem is then integrating
>selections back into the conference system.
>
>Barry
>_______________________________________________
>Conference_dev mailing list
>Conference_dev at lists.osgeo.org
>http://lists.osgeo.org/mailman/listinfo/conference_dev


