file size issue
Lowell Filak
lfilak at MEDINACO.ORG
Fri Jun 17 18:17:14 PDT 2005
The following message was sent by Jerl Simpson <jerl.simpso at GMAIL.COM>
on Fri, 17 Jun 2005 09:18:06 -0500.
> Lowell,
>
> That's a good idea. I was thinking of writing each INSERT query to a
> separate file, then running that after I made sure everything was
> present and accounted for.
>
> Thanks for the tip.
>
> Jerl
>
> On 6/16/05, Lowell Filak <lfilak at medinaco.org> wrote:
> > The following message was sent by Jerl Simpson <jerl.simpso at GMAIL.COM>
> > on Thu, 16 Jun 2005 08:41:02 -0500.
> >
> > > Hello,
> > >
> > > I have a particular issue with file size limits. I have a layer (2
> > > actually) that contains roads classified as "Local Roads" throughout
> > > the entire contiguous 48 US states. I had to split this up into 2
> > > files due to file size limits. My Apache server keeps throwing errors
> > > that say "[notice] child pid 12855 exit signal File size limit
> > > exceeded (25)". So I would like to add these to a MySQL table and
> > > pull the data from there.
> > >
> > > I am running into two problems.
> > >
> > > 1. I used one of my very small shape files to set up an SQL connection
> > > to grab the data. After I modified my .map file I now get an internal
> > > server error, which simply means some data was spit out before the HTTP
> > > headers terminated. I have a log file and DEBUG ON set in my map file
> > > and nothing gets placed in the file. Is there any documentation that
> > > tells one how to build the queries? I can't seem to find any.
> > >
> > > 2. The shape file I have for Local Roads is 1.2 GB, and the Perl
> > > Shapelib module reads the entire shape file into an array. Well, I
> > > don't have 1.2 GB of memory available on my machine, so when I run
> > > "shp2mysql.pl LocalRoad" the program terminates with the error
> > > "Terminated". I looked at the documentation and I don't see a way to
> > > just grab records out of the file. I tried get_record() but that
> > > threw error messages.
> > >
> > > I have tiled this file into 100 smaller files, the largest of which
> > > is 63 MB. I could write a script to run shp2mysql.pl on each file.
> > > I would rather have a program operate on the original file, though,
> > > as I'm afraid I'll end up with 100 separate tables with the same data,
> > > or I would need to modify the script to force them all into the same
> > > table.
> >
> > Jerl,
> >
> > If you want to venture into modifying shp2mysql.pl, what can be done
> > is to write each record of data to a tab-delimited text file and then
> > use 'load data local infile' instead of the insert.
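(For reference, the 'load data local infile' step quoted above would be a
single statement along these lines; the file name, table name, and column
layout are only placeholders for whatever the modified script writes out:

  LOAD DATA LOCAL INFILE '/tmp/localroad.txt'
  INTO TABLE localroad
  FIELDS TERMINATED BY '\t'
  LINES TERMINATED BY '\n';
)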
Jerl,
I think I may have missed something. If my understanding (from a
conference session today) is correct, the problem lies in Geo::Shapelib
itself. It appears that it reads the geometries into an array, which
would still mean you would need enough RAM to hold the entire shapefile.
The only workaround is to use Shapelib via C directly instead of through
the Perl module. From C you would be able to iterate over each entity.
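A rough, untested sketch of that loop (function names are from the
Shapelib docs; the file name and the output format are only
placeholders) would be something like:

#include <stdio.h>
#include "shapefil.h"

int main(void)
{
    SHPHandle hSHP;
    SHPObject *psShape;
    int nEntities, nShapeType, i, j;

    /* "LocalRoad" is a placeholder for the real shapefile base name */
    hSHP = SHPOpen("LocalRoad", "rb");
    if (hSHP == NULL) {
        fprintf(stderr, "cannot open shapefile\n");
        return 1;
    }

    SHPGetInfo(hSHP, &nEntities, &nShapeType, NULL, NULL);

    for (i = 0; i < nEntities; i++) {
        /* read one entity at a time, so the whole file never
           has to sit in memory */
        psShape = SHPReadObject(hSHP, i);
        if (psShape == NULL)
            continue;

        /* e.g. write one tab-delimited line per vertex here
           instead of building an INSERT for each record */
        for (j = 0; j < psShape->nVertices; j++)
            printf("%d\t%d\t%.10f\t%.10f\n",
                   i, j, psShape->padfX[j], psShape->padfY[j]);

        SHPDestroyObject(psShape);  /* free it before reading the next */
    }

    SHPClose(hSHP);
    return 0;
}

If the attribute values are needed in the table as well, they could be
pulled record by record the same way through the DBF side of the API
(DBFOpen() and the DBFRead*Attribute() functions).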
Lowell
More information about the MapServer-users
mailing list