<div dir="ltr">Hi, <div><div>I have been trying to download s3 cloud stored gridded climate data and generate tables with variables, lat, lon and timestamp (year, yearday). To achieve this I used raster2pgsql and imported multiple netcdf files into a database table. <div><br></div><div>Question: How to achieve the extraction of variables using postgis? I tried using ST_value, ST_pixelaspoints but I was getting errors, mainly due to the format in which netcdfs are stored in the database (the error says can't load some characters like 00E30100082...), I even tried changing the datatype to float but still did not work. I mean it is probably not simple like selecting a variable from the netcdf. I have enclosed my sql query below: <br></div><div><br></div><div> -- Iterate through all raster files in the table<br> FOR raster_record IN (SELECT * FROM gfdl_03_prcp) LOOP<br> -- Determine the year from the raster file name, assuming the format is '<a href="http://prcp_03_1950.nc">prcp_03_1950.nc</a>'<br> SELECT<br> regexp_replace(raster_record.filename, '.*_(\d{4})\.nc', '\1')::integer<br> INTO<br> year;<br> <br> -- Calculate the start date of the year<br> year_start := (year || '-01-01')::date;<br> <br> -- Determine if the year is a leap year<br> is_leap_year := EXTRACT(ISODOW FROM (year_start + interval '1 year')) = 7;<br> <br> -- Set the number of bands for the year (365 for non-leap years, 366 for leap years)<br> FOR band_number IN 1..(CASE WHEN is_leap_year THEN 366 ELSE 365 END) LOOP<br> -- Calculate the observation_time using the year and band number<br> observation_time := year_start + (band_number - 1) * interval '1 day';<br> <br> -- Extract X (lon) and Y (lat) coordinates from the raster<br> SELECT<br> ST_X(raster_record.rast) AS lon,<br> ST_Y(raster_record.rast) AS lat<br> INTO<br> lon,<br> lat;<br> <br> -- Insert the lat, lon, prcp, and observation_time into the extracted_values table<br> INSERT INTO extracted_values (lat, lon, prcp, observation_time)<br> VALUES<br> (lat, lon, ST_Value(raster_record.rast, band_number), observation_time);<br> <br> -- Increment the counter<br> counter := counter + 1;<br> <br> -- Commit the transaction periodically in batches<br> IF counter % batch_size = 0 THEN<br> COMMIT;<br> END IF;<br> END LOOP;<br> END LOOP;<br> <br></div><div><br></div><div>The metadata for the two files is as follows: <br></div><div><br></div><div>File from database:</div><div><div class="gmail-output_wrapper"></div><div class="gmail-output_area"></div><div class="gmail-cell gmail-code_cell gmail-rendered gmail-unselected" tabindex="2"><div class="gmail-output_wrapper"><div class="gmail-output"><div class="gmail-output_area"><div class="gmail-output_subarea gmail-output_text gmail-output_result" dir="auto"><pre>{'NC_GLOBAL#Conventions': 'CF-1.5',
'NC_GLOBAL#GDAL': 'GDAL 3.6.4, released 2023/04/17',
'NC_GLOBAL#history': 'Wed Nov 15 13:32:13 2023: GDAL CreateCopy( <a href="http://not_clipped_prcp.nc">not_clipped_prcp.nc</a>, ... )'}</pre></div></div></div></div></div><div class="gmail-cell gmail-code_cell gmail-rendered gmail-selected" tabindex="2"><div class="gmail-input"><div class="gmail-prompt_container"></div></div></div><div class="gmail-input"><div class="gmail-prompt_container"></div><div class="gmail-inner_cell"><div class="gmail-input_area" aria-label="Edit code here"><div class="gmail-CodeMirror gmail-cm-s-ipython"><div style="overflow:hidden;width:3px;height:0px"><textarea style="padding:0px;width:1px;height:1em;min-height:1em;outline:none" tabindex="0"></textarea></div></div></div></div></div><div class="gmail-output_wrapper"></div><div class="gmail-output_area"></div><div class="gmail-output_subarea gmail-output_text gmail-output_result" dir="auto"><pre>File before loading into the database:</pre><pre>{'lat#units': 'degrees_north',
'lon#units': 'degrees_east',
'NC_GLOBAL#title': 'Daily statistically downscaled CMIP5 data for the United States and southern Canada east of the Rocky Mountains, version 1.0, realization 1, 0.1x0.1 degree spatial resolution.',
'NETCDF_DIM_EXTRA': '{time}',
'NETCDF_DIM_time_DEF': '{366,4}',
'NETCDF_DIM_time_VALUES': '{0,1,2,3,4,5,6,7,8,9,10,11,12,13,14....362,363,364,365}',
'prcp#add_offset': '819.17499',
'prcp#long_name': 'daily precipitation accumulation',
'prcp#missing_value': '-32768',
'prcp#scale_factor': '0.025',
'prcp#units': 'mm',
'prcp#_FillValue': '-32768',
'time#units': 'days since 1952-1-1 0:0:0.0'}

In case this information is useful: previously I used Python to extract the variable values and write them to a CSV, with the code below. There I read the coordinate arrays with lon = dataset.variables['lon'][:] and so on, and iterated over them with nested for loops to write every grid point to the CSV.

Python code:

import csv
import netCDF4 as nc

# Step 1: Read the NetCDF file
filename = "/home/manaswini/prcp_03_1950.nc"
dataset = nc.Dataset(filename)
dataset.set_auto_mask(False)   # return plain arrays instead of masked arrays
dataset.set_auto_scale(True)   # apply scale_factor/add_offset automatically

lon = dataset.variables['lon'][:]
lat = dataset.variables['lat'][:]
time = dataset.variables['time'][:]
prcp = dataset.variables['prcp'][:]

# Derive year and yearday from the time coordinate ('days since ...')
dates = nc.num2date(time, units=dataset.variables['time'].units)
year = [d.year for d in dates]
yearday = [d.timetuple().tm_yday for d in dates]

# Open the CSV buffer
csv_buffer = open('output.csv', 'w', newline='')
csv_writer = csv.writer(csv_buffer)

# Iterate through grid points and write to the CSV buffer;
# prcp is read as (time, lat, lon)
for i in range(len(lon)):
    for j in range(len(lat)):
        for k in range(len(time)):
            csv_writer.writerow([lat[j], lon[i], prcp[k][j][i], year[k], yearday[k]])

# Close the CSV buffer
csv_buffer.close()

Thank you,
Manaswini Ganjam
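P.S. In case it helps to show what I was aiming for on the PostGIS side: this is a rough, untested sketch of the set-based query I was hoping ST_PixelAsPoints would let me write instead of the row-by-row loop above. It assumes the imported table is gfdl_03_prcp with rast and filename columns (as in the loop), and that each band of a raster is one day of the year taken from the filename:

-- Sketch: expand every pixel of every band into a (lat, lon, value, date) row
INSERT INTO extracted_values (lat, lon, prcp, observation_time)
SELECT
    ST_Y(pp.geom) AS lat,
    ST_X(pp.geom) AS lon,
    pp.val        AS prcp,
    make_date(regexp_replace(r.filename, '.*_(\d{4})\.nc', '\1')::integer, 1, 1)
        + (b.band - 1) * interval '1 day' AS observation_time
FROM gfdl_03_prcp AS r
CROSS JOIN LATERAL generate_series(1, ST_NumBands(r.rast)) AS b(band)
CROSS JOIN LATERAL ST_PixelAsPoints(r.rast, b.band) AS pp;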
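P.P.S. The single-cell check I was attempting with ST_Value looked roughly like this; the point coordinates and the SRID 4326 are just placeholders for whatever the rasters were actually imported with:

-- Hypothetical single-cell lookup: band 1 at one (lon, lat) point
SELECT ST_Value(rast, 1, ST_SetSRID(ST_MakePoint(-85.0, 40.0), 4326)) AS prcp
FROM gfdl_03_prcp
LIMIT 1;

If the values that come back look like raw packed integers rather than millimetres, I am guessing the scale_factor (0.025) and add_offset (819.17499) from the metadata still have to be applied by hand, but I am not sure whether GDAL/raster2pgsql already unpacks them on import.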