[postgis-users] Insert data larger than 1 Gig...
David Blasby
dblasby at refractions.net
Mon Jul 7 11:57:46 PDT 2003
Is the resulting data (in binary form) > 1GB? If so, you might have
problems - PostgreSQL has a size limit (either 1GB or 2GB, depending on
the version) for a single chunk of data.
If the binary is < 1GB, then you could try using Base64 encoding
instead of hex - Base64 expands the data by about 33%, while hex
doubles it. You'll have to change the "chip_in()" and "chip_out()"
functions in postgis_chip.c.
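To see why Base64 helps, here is a small sketch (in Python, purely for illustration - the actual change would be in the C code of postgis_chip.c) comparing the encoded sizes of the same binary payload in hex and Base64:

```python
import base64
import binascii

# Sample binary "chip" payload; 1024 bytes of arbitrary data.
raw = bytes(range(256)) * 4

# Hex encoding: 2 output bytes per input byte (100% overhead).
hex_encoded = binascii.hexlify(raw)

# Base64 encoding: 4 output bytes per 3 input bytes (~33% overhead).
b64_encoded = base64.b64encode(raw)

print(len(raw), len(hex_encoded), len(b64_encoded))
```

So a payload that is just under the limit when stored raw may still overflow it once hex-encoded, while Base64 leaves more headroom.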
Alternatively, you could load the data in chunks. This would involve
writing a mergeChip(CHIP, CHIP) function in postgis_chip.c and a
chip-splitting function in your client:
insert into datatable values (<a chip 1>);
update datatable set mychip = mergeChip(mychip, <a chip 2>);
update datatable set mychip = mergeChip(mychip, <a chip 3>);
update datatable set mychip = mergeChip(mychip, <a chip 4>);
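The client side of this could be sketched roughly as follows. Note that mergeChip() does not exist in PostGIS - it is the function proposed above - and datatable/mychip are placeholder names; the chunks are shown hex-encoded for simplicity:

```python
# Hypothetical client-side splitter: break a large binary chip into
# fixed-size pieces and generate the insert/update statements above.
# mergeChip() is the proposed (not yet existing) server-side function.

def split_chip(data: bytes, chunk_size: int):
    """Yield successive chunk_size slices of data."""
    for i in range(0, len(data), chunk_size):
        yield data[i:i + chunk_size]

def chunked_statements(data: bytes, chunk_size: int):
    """Return the SQL statements to load data in pieces."""
    chunks = list(split_chip(data, chunk_size))
    stmts = ["insert into datatable values ('%s');" % chunks[0].hex()]
    for c in chunks[1:]:
        stmts.append(
            "update datatable set mychip = mergeChip(mychip, '%s');"
            % c.hex()
        )
    return stmts

# 10 bytes split into 4-byte chunks -> one insert plus two updates.
stmts = chunked_statements(b"\x00" * 10, 4)
```

Each statement stays well under the per-value size limit even when the full chip does not.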
But the real question is why you want to put something that big in
the database in the first place. Wouldn't it make more sense to cut it
up into smaller chunks?
dave