[OSGeo-Discuss] any ideas on how to Monitor and Review 'random' files?
davep at confluence.org
Fri Feb 17 05:24:48 PST 2012
We are brainstorming possible ways to address a problem, so any
thoughts, comments, or suggestions are welcome. The problem is
outlined below.
Within a corporate environment the users have workstations
running Microsoft Windows Vista. All users have access to some
network file shares, but different groups of users have
access to different file shares. All file shares are
using the NTFS filesystem.
A group of users - call them the Workers - have a common
file share that they use during the course of their business.
When an "event" takes place, and for some time after, various
Workers will add event-related files to the shared location.
How such files are organized is up to the Workers. No technical
mechanism (e.g. filesystem monitoring software) or procedural
mechanism (e.g. a business process) currently exists to 'monitor'
the addition of, or changes to, the event-related files.
A different user - call them the Reviewer - who works in a different
part of the corporate organization, has a need to 'review and organize'
some of the event-related files that are provided by the Workers.
This process typically takes place 'after the event'; however,
event-related files might be added by the Workers well after the
event took place (e.g. months or years later), so the Workers could
be making updates during the same period that the Reviewer is
doing their 'reviewing and organizing'.
For a particular event the Reviewer may want to review the
event-related files, 'organize' them, and be informed when Workers
add more files for that event. Eventually there may be a need to
make a copy of some of the event-related files, based on criteria
specified by the Reviewer.
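One way the 'be informed of additions' part could be approximated is
client-side polling: a script on the Reviewer's workstation periodically
snapshots the share and reports differences. The sketch below is a
minimal illustration, assuming only read access to the share and the
ability to run a scheduled script; the names `snapshot` and
`diff_snapshots` are illustrative, not an existing tool.

```python
# Hypothetical sketch: poll a file share from the Reviewer's workstation
# and report files added, changed, or removed since the last snapshot.
import os

def snapshot(root):
    """Map each file's path (relative to root) to its (mtime, size)."""
    state = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            rel = os.path.relpath(full, root)
            info = os.stat(full)
            state[rel] = (info.st_mtime, info.st_size)
    return state

def diff_snapshots(old, new):
    """Return (added, changed, removed) relative paths between two snapshots."""
    added = sorted(set(new) - set(old))
    removed = sorted(set(old) - set(new))
    changed = sorted(p for p in set(old) & set(new) if old[p] != new[p])
    return added, changed, removed
```

The previous snapshot could be persisted between runs (e.g. as JSON) and
the script run from the Windows Task Scheduler; the appeal of polling is
that nothing needs to be installed on the Workers' workstations or on
the servers hosting the shares.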
It may be possible to add software to the Reviewer's workstation
to assist with this process, but it is less likely that software
could be added to the Workers' workstations, and very unlikely
that anything could be installed on the servers hosting the file shares.
Although this isn't really about "geospatial processing", there
are some geospatial files involved in this process. As an example:
- an event takes place - call it "ABC123"
- a Worker who has files related to ABC123 will put the original
files, or copies, on a file share (e.g. some raster maps, some
shapefiles, some word processing documents, some emails, some
JPEG photos, KMZ files, etc.)
- other Workers will also have files related to ABC123, and they
will also put them on the file share
- the above process continues while the event ABC123 is 'active'
- over time the initial set of "ABC123 files" will stabilize,
and new files will only rarely be added
- the Reviewer gets involved sometime after the event, and starts
with the set of files that exist at that point for event ABC123
- the Reviewer may want to 'organize' the files for event ABC123,
though that might be accomplished by 'organizing' file metadata,
rather than by making copies of the ABC123 files and organizing
the copies
- when files for event ABC123 are updated (e.g. a Worker adds a
"One Year After" report for event ABC123), the Reviewer wants
to be able to know that there has been an update
- at some point the 'organized files' for event ABC123 (and possibly
some 'notes' or 'metadata' about those files) will need to be
copied from the Workers' file share to another file share, in
order to preserve a copy of the files and to provide a location
to use for processing the files as they are loaded into a
'document management system' that the Reviewer uses
(the last step of loading files into the document management system
is already in place, and isn't part of the brainstorming exercise)
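The 'organize via metadata, copy selected files later' idea above could
be sketched as a sidecar catalog: a small JSON file, kept outside the
Workers' share, recording the Reviewer's tags and notes per file, with a
later criteria-based copy step. This is purely illustrative; the catalog
layout, function names, and tag values are all assumptions.

```python
# Hypothetical sketch: 'organize' files by recording metadata in a
# sidecar JSON catalog instead of copying them, then copy only the
# files matching a criterion (here, a required tag) to another share.
import json
import os
import shutil

def tag_file(catalog_path, rel_path, tags, note=""):
    """Record the Reviewer's tags and a note for one file in the catalog."""
    catalog = {}
    if os.path.exists(catalog_path):
        with open(catalog_path) as f:
            catalog = json.load(f)
    catalog[rel_path] = {"tags": sorted(tags), "note": note}
    with open(catalog_path, "w") as f:
        json.dump(catalog, f, indent=2)

def copy_selected(catalog_path, src_root, dst_root, required_tag):
    """Copy catalogued files carrying required_tag from src_root to
    dst_root, preserving relative paths and file timestamps."""
    with open(catalog_path) as f:
        catalog = json.load(f)
    copied = []
    for rel, meta in catalog.items():
        if required_tag in meta["tags"]:
            dst = os.path.join(dst_root, rel)
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            shutil.copy2(os.path.join(src_root, rel), dst)
            copied.append(rel)
    return sorted(copied)
```

Keeping the catalog outside the Workers' share means the Reviewer never
needs write access to it, and the Workers' own organization of the files
is left untouched.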
Degree Confluence Project: