[GRASS-SVN] r62287 - in sandbox/krejcmat/src: . mw2 mw2/test_suite

svn_grass at osgeo.org
Sat Oct 18 12:29:34 PDT 2014


Author: krejcmat
Date: 2014-10-18 12:29:34 -0700 (Sat, 18 Oct 2014)
New Revision: 62287

Added:
   sandbox/krejcmat/src/mw2/
   sandbox/krejcmat/src/mw2/db.sh
   sandbox/krejcmat/src/mw2/mv2.py
   sandbox/krejcmat/src/mw2/pgwrapper.py
   sandbox/krejcmat/src/mw2/test_suite/
   sandbox/krejcmat/src/mw2/test_suite/baseltime.ref
   sandbox/krejcmat/src/mw2/test_suite/color.ref
   sandbox/krejcmat/src/mw2/test_suite/test.r.mvprecip.sh
Log:
mw2: object-oriented version

Added: sandbox/krejcmat/src/mw2/db.sh
===================================================================
--- sandbox/krejcmat/src/mw2/db.sh	                        (rev 0)
+++ sandbox/krejcmat/src/mw2/db.sh	2014-10-18 19:29:34 UTC (rev 62287)
@@ -0,0 +1,5 @@
+#!/bin/bash
+sudo service postgresql restart
+dropdb new_mw
+createdb new_mw
+psql new_mw < /home/matt/Dropbox/BC/bp-krejci/data/ne15.09.2013.backup


Property changes on: sandbox/krejcmat/src/mw2/db.sh
___________________________________________________________________
Added: svn:executable
   + *

Added: sandbox/krejcmat/src/mw2/mv2.py
===================================================================
--- sandbox/krejcmat/src/mw2/mv2.py	                        (rev 0)
+++ sandbox/krejcmat/src/mw2/mv2.py	2014-10-18 19:29:34 UTC (rev 62287)
@@ -0,0 +1,1034 @@
+#!/usr/bin/env python
+# -*- coding: utf-8 -*-
+# __author__ = 'matt'
+
+from pgwrapper import pgwrapper as pg
+from math import sin, cos, atan2, degrees, radians, tan, sqrt, fabs
+import sys, os, shutil
+import psycopg2
+import string, random
+import time
+from datetime import datetime, timedelta
+import math
+import re
+import csv
+
+
+try:
+    from grass.script import core as grass
+except ImportError:
+    sys.exit("Cannot find the 'grass' Python module. The GRASS Python scripting library is available from GRASS GIS 6.4 onwards.")
+
+
+class LinkTemporalItem():
+    def __init__(self, time):
+        self.time = time
+        self.txpower = None
+        self.rxpower = None
+        self.freq = None
+        self.polariz = None
+        self.recordid = None
+        self.linkid = None
+        self.precip = None
+        self.timeSum = None
+
+
+class RGaugeTemporalItem():
+    def __init__(self, time):
+        self.time = time
+        self.recordid = None
+        self.gaugeid = None
+        self.precip = None
+        self.timeSum = None
+
+
+class VectorLoader():
+    def __init__(self, database):
+        self.database = database
+
+    def selectNodes(self):
+        sql = 'SELECT nodeid, lat, long FROM %s.node;' % self.database.dataScheme
+        return self.database.connection.executeSql(sql, True, True)
+
+    def getASCIInodes(self, selectedPoints):
+        newline = '\n'
+        pointsASCII = 'VERTI:\n'
+        for point in selectedPoints:
+            pointsASCII += 'P 1 1' + newline  # primitive P (point), number of coordinates, number of categories
+            pointsASCII += str(point[2]) + ' ' + str(point[1]) + newline  # coordinates of the point
+            pointsASCII += '1' + ' ' + str(point[0]) + newline  # first layer, cat = id of point (nodeid)
+        return pointsASCII
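+    # For illustration (made-up values): for a node with nodeid=7 at long=14.42,
+    # lat=50.08 the builder above appends the GRASS "standard format" record
+    #
+    #   P 1 1
+    #   14.42 50.08
+    #   1 7
+    #
+    # which v.in.ascii with format=standard reads in grass_vinASCII() below.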
+
+    def selectLinks(self):
+        sql = 'SELECT l.linkid, n1.lat, n1.long ,n2.lat, n2.long \
+            FROM %s.node as n1\
+            JOIN %s.link as l \
+            ON n1.nodeid=fromnodeid\
+            JOIN %s.node as n2 \
+            ON n2.nodeid= tonodeid;' % (self.database.dataScheme, self.database.dataScheme, self.database.dataScheme)
+        return self.database.connection.executeSql(sql, True, True)
+
+    def getASCIIlinks(self, selectedLinks):
+        newline = '\n'
+        linksASCII = 'VERTI:\n'
+        for link in selectedLinks:
+            linksASCII += 'L 2 1' + newline  # primitive L (line), number of coordinates, number of categories
+            linksASCII += str(link[2]) + ' ' + str(link[1]) + newline
+            linksASCII += str(link[4]) + ' ' + str(link[3]) + newline
+            linksASCII += '1' + ' ' + str(link[0]) + newline  # first layer, cat = id of link (linkid)
+        return linksASCII
+
+    def grass_vinASCII(self, asciiStr, outMapName):
+        currDir = os.path.dirname(os.path.realpath(__file__))
+        tmpFile = os.path.join(currDir, 'tmp')
+        f = open(tmpFile, 'w')
+        f.write(asciiStr)
+        f.close()
+        grass.run_command('v.in.ascii',
+                          input=tmpFile,
+                          format='standard',
+                          output=outMapName,
+                          quiet=True,
+                          overwrite=True)
+        os.remove(tmpFile)
+
+
+class Baseline():
+    def __init__(self, type, statFce=None, quantile=None, roundMode=None, aw=0, pathToFile=None):
+        self.quantile = quantile
+        self.roundMode = roundMode
+        self.aw = aw
+        self.pathToFile = pathToFile
+        self.type = type
+        self.statFce = statFce
+
+        if statFce == 'mode':
+            if self.roundMode is None:
+                grass.fatal('Value "round" is missing.')
+        if statFce == 'quantile':
+            if self.quantile is None:
+                grass.fatal('Value "quantile" is missing.')
+        if self.type == 'fromDryWin':
+            if self.pathToFile is None:
+                grass.fatal('Dry interval is not defined.')
+        if self.type == 'values':
+            if self.pathToFile is None:
+                grass.fatal('Baseline values are not defined.')
+
+
+class TimeWindows():
+    def __init__(self, database, IDtype, sumStep, startTime=None, endTime=None, ignoredID=None):
+
+        self.startTime = startTime
+        self.endTime = endTime
+        self.sumStep = sumStep
+        self.database = database
+        self.db = database.connection
+        self.path = database.pathWorkSchemeDir
+        self.schema = database.schema
+        self.typeID = IDtype
+        self.viewStatement = database.viewStatement
+        self.tbName = database.computedPrecip
+        self.ignoredIDpath = ignoredID
+
+        self.viewDB = None
+        self.intervalStr = None
+        self.timestamp_max = None
+        self.timestamp_min = None
+        self.temporalRegPath = None
+
+    def createWin(self):
+        self.sumValues()
+        # self.setTimestamp()
+        if self.ignoredIDpath is not None:
+            self.removeLinks()
+        self.createTimeWin()
+
+    def sumValues(self):
+        if self.sumStep == "minute":
+            self.intervalStr = 60
+        elif self.sumStep == "hour":
+            self.intervalStr = 3600
+        else:
+            self.intervalStr = 86400
+
+        # sum values per user-defined timestep interval
+        self.viewDB = 'computed_precip_summed'
+        sql = "CREATE %s %s.%s as  \
+               SELECT %s ,round(avg(precip)::numeric,3) as precip_mm_h, date_trunc('%s',time)as time  \
+               FROM %s.%s \
+               GROUP BY %s, date_trunc('%s',time)\
+               ORDER BY time" % (self.viewStatement, self.schema, self.viewDB, self.typeID, \
+                                 self.sumStep, self.schema, self.tbName, self.typeID, self.sumStep)
+        self.database.connection.executeSql(sql, False, True)
+
+    def setTimestamp(self):
+        # num of rows
+        record_num = self.database.connection.count("%s.%s" % (self.database.dataScheme, self.database.recordTableName))
+        # get first timestamp
+        sql = "SELECT time FROM %s.%s LIMIT 1" % (self.database.dataScheme, self.database.recordTableName)
+        self.timestamp_min = self.database.connection.executeSql(sql)[0][0]
+        # get last timestamp
+        sql = "SELECT time FROM  %s.%s OFFSET %s" % (
+            self.database.dataScheme, self.database.recordTableName, record_num - 1)
+        self.timestamp_max = self.database.connection.executeSql(sql)[0][0]
+
+        # check if set time by user is in dataset time interval
+        if self.startTime is not None:
+            self.startTime = datetime.strptime(str(self.startTime), "%Y-%m-%d %H:%M:%S")
+            if self.timestamp_min > self.startTime:
+                grass.message("'startTime' value is not in dataset time interval")
+            else:
+                self.timestamp_min = self.startTime
+
+        if self.endTime is not None:
+            self.endTime = datetime.strptime(str(self.endTime), "%Y-%m-%d %H:%M:%S")
+            if self.timestamp_max < self.endTime:
+                grass.message("'endTime' value is not in dataset time interval")
+            else:
+                self.timestamp_max = self.endTime
+
+    def removeLinks(self):
+        '''Remove ignored links'''
+        try:
+            with open(self.ignoredIDpath, 'r') as f:
+                for link in f.read().splitlines():
+                    sql = "DELETE FROM %s.%s WHERE %s=%s " % (self.schema, self.viewDB, self.typeID, link)
+                    self.database.connection.executeSql(sql, False, True)
+        except IOError as (errno, strerror):
+            print "I/O error({0}): {1}".format(errno, strerror)
+
+    def createTimeWin(self):
+        grass.message('Creating time windows')
+        time_const = 0
+        i = 0
+        nameList = []
+        tgrass_vector = []
+        cur_timestamp = self.timestamp_min
+        prefix = 'l'
+        if self.typeID == 'gaugeid':
+            prefix = 'g'
+
+        grass.message("from " + str(self.timestamp_min) + " to " + str(self.timestamp_max) + " per " + self.sumStep)
+        # make timewindows from time interval
+        while cur_timestamp < self.timestamp_max:
+            # create name of view
+            a = time.strftime("%Y_%m_%d_%H_%M", time.strptime(str(cur_timestamp), "%Y-%m-%d %H:%M:%S"))
+            view_name = "%s%s%s" % (prefix, self.database.viewStatement, a)
+            vw = view_name + "\n"
+            nameList.append(vw)
+
+            # text format for t.register (GRASS temporal framework)
+            tgrass = view_name + '|' + str(cur_timestamp) + "\n"
+            tgrass_vector.append(tgrass)
+
+            sql = "CREATE TABLE %s.%s as\
+                   SELECT * from %s.%s \
+                   WHERE time=(timestamp'%s'+ %s * interval '1 second')" % \
+                  ( self.schema, view_name,
+                    self.schema, self.viewDB,
+                    self.timestamp_min,
+                    time_const)
+
+            data = self.database.connection.executeSql(sql, False, True)
+
+            #compute current timestamp (need for loop)
+            sql = "SELECT (timestamp'%s')+ %s* interval '1 second'" % (cur_timestamp, self.intervalStr)
+            cur_timestamp = self.database.connection.executeSql(sql)[0][0]
+
+            #go to next time interval
+            time_const += self.intervalStr
+
+        # write values to flat file
+        if self.typeID == 'linkid':
+            try:
+                io2 = open(os.path.join(self.path, "l_timewindow"), "w")
+                io2.writelines(nameList)
+                io2.close()
+            except IOError as (errno, strerror):
+                print "I/O error({0}): {1}".format(errno, strerror)
+        else:
+            try:
+                io2 = open(os.path.join(self.path, "g_timewindow"), "w")
+                io2.writelines(nameList)
+                io2.close()
+            except IOError as (errno, strerror):
+                print "I/O error({0}): {1}".format(errno, strerror)
+
+        # creating textfile for t.register input
+        filename = "timewin_%s" % prefix + "vec_" + str(self.timestamp_min).replace(' ', '_') + "|" + str(
+            self.timestamp_max).replace(' ', '_')
+        try:
+            self.temporalRegPath = os.path.join(self.path, filename)
+            io4 = open(self.temporalRegPath, "w")
+            io4.writelines(tgrass_vector)
+            io4.close()
+        except IOError as (errno, strerror):
+            print "I/O error({0}): {1}".format(errno, strerror)
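+        # For illustration (names depend on the run): with typeID='linkid',
+        # viewStatement='table' and a window starting at 2013-09-10 04:00:00, each
+        # line written to the t.register file above has the form
+        #
+        #   ltable2013_09_10_04_00|2013-09-10 04:00:00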
+
+        #drop temp table
+        sql = "drop table %s.%s" % (self.schema, self.database.recordTableName)
+
+
+class Computor():
+    def __init__(self, baseline, timeWin, database):
+        self.awConst = baseline.aw
+        self.database = database
+        self.baselineConf = baseline
+        self.timeWin = timeWin
+
+        self.baselineDict = None
+
+        self.prepareDatabase()
+        self.computePrecip()
+
+        self.timeWin.createWin()
+
+    def prepareDatabase(self):
+        sql = "DROP SCHEMA IF EXISTS %s CASCADE" % self.database.schema
+        shutil.rmtree(self.database.pathWorkSchemeDir)
+        os.makedirs(self.database.pathWorkSchemeDir)
+        self.database.connection.executeSql(sql, False, True)
+        sql = "CREATE SCHEMA %s" % self.database.schema
+        self.database.connection.executeSql(sql, False, True)
+
+    def getBaselDict(self):
+        '''@note returns a dict keyed by linkid'''
+
+        def computeBaselineFromMode(recordTable):
+            sql = "SELECT linkid from %s group by 1" % recordTable
+            linksid = self.database.connection.executeSql(sql, True, True)
+            tmp = []
+
+            # round value
+            sql = "CREATE TABLE %s.tempround as SELECT round(a::numeric,%s) as a, linkid FROM %s" % (
+                self.database.schema, self.baselineConf.roundMode, recordTable)
+            self.database.connection.executeSql(sql, False, True)
+            # compute mode for current link
+            for linkid in linksid:
+                linkid = linkid[0]
+                sql = "SELECT mode(a) AS modal_value FROM %s.tempround where linkid=%s;" % (
+                    self.database.schema, linkid)
+                resu = self.database.connection.executeSql(sql, True, True)[0][0]
+                tmp.append(str(linkid) + ',' + str(resu) + '\n')
+
+            sql = "DROP TABLE %s.tempround" % self.database.schema
+            self.database.connection.executeSql(sql, False, True)
+            try:
+                io0 = open(os.path.join(self.database.pathWorkSchemeDir, "baseline"), "w")
+                io0.writelines(tmp)
+                io0.close()
+            except IOError as (errno, strerror):
+                print "I/O error({0}): {1}".format(errno, strerror)
+
+            try:
+                io1 = open(os.path.join(self.database.pathWorkSchemeDir, "compute_precip_info"), "w")
+                io1.write('mode|' + str(self.baselineConf.aw))
+                io1.close()
+            except IOError as (errno, strerror):
+                print "I/O error({0}): {1}".format(errno, strerror)
+
+        def computeBaselineFromTime():
+            # ################################
+            # @function reads a file of dry intervals, or of single dry moments
+            # (times when it is not raining), used for computing the baseline.
+            # @format of the input file (interval form, introduced by the key "interval"):
+            #  interval
+            #  2013-09-10 04:00:00
+            #  2013-09-11 04:00:00
+            #
+            # @format for single moments (one timestamp per line):
+            #  2013-09-11 04:00:00
+            #  2013-09-11 04:00:00
+            ################################
+            # @statFce chooses the statistical method for baseline computation:
+            # statFce='avg'
+            # statFce='mode'
+            # statFce='quantile'
+            ################################
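+            # A concrete example file ships with this commit as
+            # test_suite/baseltime.ref; note that any leading token containing the
+            # letter 'i' (e.g. "int") marks the start of an interval, as parsed below.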
+            tmp = []
+            st = ''
+            #print self.baselineConf.statFce
+            ######## AVG ##########
+            if self.baselineConf.statFce == 'avg':
+                try:
+                    f = open(self.baselineConf.pathToFile, 'r')
+                    #parse input file
+                    for line in f:
+                        st += line.replace("\n", "")
+                        if 'i' in line.split("\n")[0]:  # get baseline from an interval
+                            fromt = f.next()
+                            st += fromt.replace("\n", "")
+                            tot = f.next()
+
+                            #validate input data
+                            if not isTimeValid(fromt) or not isTimeValid(tot):
+                                grass.fatal("Input data is not valid. Parameter 'baseltime'")
+
+                            st += tot.replace("\n", "")
+                            sql = "SELECT linkid, avg(txpower-rxpower)as a from record where time >='%s' and time<='%s' group by linkid order by 1" % (
+                                fromt, tot)
+                            resu = self.database.connection.executeSql(sql, True, True)
+                            tmp.append(resu)
+
+                        else:  # get baseline from a single moment
+                            time = line.split("\n")[0]
+                            #validate input data
+                            if not isTimeValid(time):
+                                grass.fatal("Input data is not valid. Parameter 'baseltime'")
+
+                            time = datetime.strptime(time, "%Y-%m-%d %H:%M:%S")
+                            st += str(time).replace("\n", "")
+                            fromt = time + timedelta(seconds=-60)
+                            tot = time + timedelta(seconds=+60)
+                            sql = "SELECT linkid, avg(txpower-rxpower)as a from record where time >='%s' and time<='%s' group by linkid order by 1" % (
+                                fromt, tot)
+                            resu = self.database.connection.executeSql(sql, True, True)
+                            tmp.append(resu)
+
+                            continue
+                except IOError as (errno, strerror):
+                    print "I/O error({0}): {1}".format(errno, strerror)
+
+                mydict = {}
+                mydict1 = {}
+                i = True
+                # sum the baselines for every linkid over all baseline datasets (averaged in the next step)
+                for dataset in tmp:
+                    mydict = {int(rows[0]): float(rows[1]) for rows in dataset}
+                    if i is True:
+                        mydict1 = mydict
+                        i = False
+                        continue
+                    for link, a in dataset:
+                        mydict1[link] += mydict[link]
+
+                length = len(tmp)
+                links = len(tmp[0])
+                i = 0
+
+                #compute avg(divide sum by num of datasets)
+                for dataset in tmp:
+                    for link, a in dataset:
+                        i += 1
+                        mydict1[link] /= length
+                        if i == links:
+                            break
+                    break
+
+                #write values to baseline file
+                writer = csv.writer(open(os.path.join(self.database.pathWorkSchemeDir, 'baseline'), 'w'))
+                for key, value in mydict1.items():
+                    writer.writerow([key, value])
+
+            ######## MODE or QUANTILE ##########
+            elif self.baselineConf.statFce == 'mode' or self.baselineConf.statFce == 'quantile':
+                #print 'mode***'
+                try:
+                    f = open(self.baselineConf.pathToFile, 'r')
+                    # print 'open PathTOfile'
+                    #parse input file
+                    for line in f:
+                        st += line.replace("\n", "")
+                        if 'i' in line.split("\n")[0]:  # get baseline from an interval
+                            fromt = f.next()
+                            st += fromt.replace("\n", "")
+                            tot = f.next()
+                            #validate input data
+                            if not isTimeValid(fromt) or not isTimeValid(tot):
+                                grass.fatal("Input data is not valid. Parameter 'baseltime'")
+                            st += tot.replace("\n", "")
+                            sql = "SELECT linkid, txpower-rxpower as a from record WHERE time >='%s' and time<='%s'" % (
+                                fromt, tot)
+                            resu = self.database.connection.executeSql(sql, True, True)
+                            resu += resu
+
+                        else:  # get baseline from a single moment
+                            time = line.split("\n")[0]
+                            if not isTimeValid(time):
+                                grass.fatal("Input data is not valid. Parameter 'baseltime'")
+                            time = datetime.strptime(time, "%Y-%m-%d %H:%M:%S")
+                            st += str(time).replace("\n", "")
+                            fromt = time + timedelta(seconds=-60)
+                            tot = time + timedelta(seconds=+60)
+
+                            sql = "SELECT linkid, txpower-rxpower as a from record WHERE time >='%s' and time<='%s'" % (
+                                fromt, tot)
+                            resu = self.database.connection.executeSql(sql, True, True)
+                            resu += resu
+                            continue
+
+                except IOError as (errno, strerror):
+                    print "I/O error({0}): {1}".format(errno, strerror)
+
+                tmp.append(resu)
+                table_mode_tmp = "mode_tmp"
+                sql = "CREATE TABLE %s.%s ( linkid integer,a real);" % (self.database.schema, table_mode_tmp)
+                self.database.connection.executeSql(sql, False, True)
+                c = 0
+                #write values to flat file
+
+                try:
+                    io = open(os.path.join(self.database.pathWorkSchemeDir, "mode_tmp"), "w")
+                    c = 0
+                    for it in tmp:
+                        for i in it:
+                            a = str(i[0]) + "|" + str(i[1]) + "\n"
+                            io.write(a)
+                            c += 1
+                    io.close()
+                except IOError as (errno, strerror):
+                    print "I/O error({0}): {1}".format(errno, strerror)
+
+                # load the flat file into the temporary table
+                try:
+                    io1 = open(os.path.join(self.database.pathWorkSchemeDir, "mode_tmp"), "r")
+                    self.database.connection.copyfrom(io1, "%s.%s" % (self.database.schema, table_mode_tmp))
+                    io1.close()
+                    os.remove(os.path.join(self.database.pathWorkSchemeDir, "mode_tmp"))
+                except IOError as (errno, strerror):
+                    print "I/O error({0}): {1}".format(errno, strerror)
+
+                recname = self.database.schema + '.' + table_mode_tmp
+
+                if self.baselineConf.statFce == 'mode':
+                    computeBaselineFromMode(recname)
+
+                if self.baselineConf.statFce == 'quantile':
+                    computeBaselineFromQuantile(recname)
+
+                sql = "DROP TABLE %s.%s" % (self.database.schema, table_mode_tmp)
+                self.database.connection.executeSql(sql, False, True)
+
+
+
+            # write a unique mark to the file
+            try:
+                io1 = open(os.path.join(self.database.pathWorkSchemeDir, "compute_precip_info"), "w")
+                st = st + '|' + str(self.baselineConf.aw)
+                io1.write(st)
+                io1.close()
+            except IOError as (errno, strerror):
+                print "I/O error({0}): {1}".format(errno, strerror)
+
+        def computeBaselineFromQuantile(recordTable):
+            link_num = self.database.connection.count("link")
+            sql = "SELECT linkid from %s group by linkid" % recordTable
+            linksid = self.database.connection.executeSql(sql, True, True)
+            tmp = []
+            # for each link  compute baseline
+            for linkid in linksid:
+                linkid = linkid[0]
+                sql = "Select\
+                    max(a) as maxAmount,\
+                    avg(a) as avgAmount,\
+                    quartile\
+                    FROM (SELECT a, ntile(%s) OVER (order by a) as quartile\
+                    FROM %s where linkid=%s ) x\
+                    GROUP BY quartile\
+                    ORDER BY quartile\
+                    limit 1" % (self.baselineConf.quantile, recordTable, linkid)
+
+                resu = self.database.connection.executeSql(sql, True, True)[0][0]
+                tmp.append(str(linkid) + ',' + str(resu) + '\n')
+
+            try:
+                io0 = open(os.path.join(self.database.pathWorkSchemeDir, "baseline"), "w")
+                io0.writelines(tmp)
+                io0.close()
+            except IOError as (errno, strerror):
+                print "I/O error({0}): {1}".format(errno, strerror)
+
+            try:
+                io1 = open(os.path.join(self.database.pathWorkSchemeDir, "compute_precip_info"), "w")
+                io1.write('quantile' + str(self.baselineConf.quantile) + '|' + str(self.baselineConf.aw))
+                io1.close()
+            except IOError as (errno, strerror):
+                print "I/O error({0}): {1}".format(errno, strerror)
+
+        def readBaselineFromText(path):
+            with open(path, mode='r') as infile:
+                reader = csv.reader(infile, delimiter=',')
+                mydict = {float(rows[0]): float(rows[1]) for rows in reader}
+            return mydict
+
+        if self.baselineConf.type == 'values':
+            # print 'valuesDirectly'
+            self.baselineDict = readBaselineFromText(self.baselineConf.pathToFile)
+
+        elif self.baselineConf.type == 'fromDryWin':
+            grass.message('Computing baselines from dry time intervals (method "%s")...' % self.baselineConf.statFce)
+            # print 'fromDryWin'
+            computeBaselineFromTime()
+            self.baselineDict = readBaselineFromText(os.path.join(self.database.pathWorkSchemeDir, 'baseline'))
+
+    def computePrecip(self):
+        grass.message("Preparing database for computing precipitation...")
+        Aw = float(self.baselineConf.aw)
+
+        link_num = self.database.connection.count("link")
+
+        # self.timeWin.sumValues()
+        self.timeWin.setTimestamp()
+
+        sql = "SELECT n2.time, n2.txpower-n2.rxpower as a,n1.lenght,n1.polarization,n1.frequency,n1.linkid\
+                FROM %s.link AS n1 \
+                JOIN %s.record AS n2 ON n1.linkid = n2.linkid \
+                WHERE n2.linkid = n1.linkid AND\
+                time >= '%s' AND\
+                time <= '%s' \
+                ORDER by n2.recordid;" % \
+              (self.database.dataScheme,
+               self.database.dataScheme,
+               self.timeWin.timestamp_min,
+               self.timeWin.timestamp_max)
+        resu = self.database.connection.executeSql(sql, True, True)
+
+        sql = "CREATE TABLE %s.%s ( linkid integer,time timestamp, precip real);" % \
+              (self.database.schema,
+               self.database.computedPrecip)
+        self.database.connection.executeSql(sql, False, True)
+
+        # optimization of commits
+        self.database.connection.setIsoLvl(0)
+
+        # choose the baseline source (quantile, mode, user values) and build the dict {linkid: baseline}
+        self.getBaselDict()
+
+        # check that the baseline dictionary covers all links
+        if len(self.baselineDict) < link_num:
+            sql = "SELECT linkid FROM link"
+            links = self.database.connection.executeSql(sql, True, True)
+            for link in links:
+                #grass.message(type(link))
+                if not link[0] in self.baselineDict:
+                    grass.message("Linkid=%s is missing in the text file" % str(link[0]))
+                    grass.message("Link not included in the computation")
+            grass.message(
+                'HINT: values "linkid,baseline" are missing in the text file. The link probably reports ERROR "-99.9" in the selected time interval, or you omitted values in the input text. You can add the values manually to the file and use the method "read baseline from file".')
+
+        recordid = 1
+        temp = []
+        for record in resu:
+            # if the baseline is missing, the link is skipped
+            if record[5] in self.baselineDict and (record[4] / 1000000) > 10:
+                # coef_a_k = (alpha, k)
+                coef_a_k = self.computeAlphaK(record[4], record[3])
+
+                # read the baseline value from the dictionary
+                baseline_decibel = (self.baselineDict[record[5]])
+
+                # final precipitation is R1
+                Ar = record[1] - baseline_decibel - Aw
+                #TODO check this condition
+                if Ar > 0:
+                    yr = Ar / record[2]
+                    R1 = (yr / coef_a_k[1]) ** (1 / coef_a_k[0])
+                else:
+                    R1 = 0
+
+                #string for output flatfile
+                out = str(record[5]) + "|" + str(record[0]) + "|" + str(R1) + "\n"
+                temp.append(out)
+                recordid += 1
+
+        #write values to flat file
+        try:
+            io = open(os.path.join(self.database.pathWorkSchemeDir, "precip"), "w")
+            io.writelines(temp)
+            io.close()
+        except IOError as (errno, strerror):
+            print "I/O error({0}): {1}".format(errno, strerror)
+
+        grass.message("Writing computed precipitation to database...")
+        io1 = open(os.path.join(self.database.pathWorkSchemeDir, "precip"), "r")
+        self.database.connection.copyfrom(io1, "%s.%s" % (self.database.schema, self.database.computedPrecip))
+        io1.close()
+        os.remove(os.path.join(self.database.pathWorkSchemeDir, "precip"))
+
+    def computeAlphaK(self, freq, polarization):
+        """@RECOMMENDATION ITU-R P.838-3
+        Specific attenuation model for rain for use in prediction methods
+        γR = kR^α
+        returns (αv, kv) for vertical polarization
+        returns (αh, kh) for horizontal polarization
+        """
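+        # Illustrative use of the returned pair (not executed here, names are only
+        # examples): computePrecip() above inverts the ITU-R relation to obtain the
+        # rain rate from the measured specific attenuation, e.g. for vertical polarization
+        #
+        #   alpha, k = self.computeAlphaK(freq, 'v')
+        #   gamma = (attenuation_dB - baseline_dB - Aw) / link_length_km   # dB/km
+        #   R = (gamma / k) ** (1.0 / alpha)                               # mm/h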
+        freq /= 1000000
+
+        if polarization == "h":
+            # Coefficients for kH    1
+            aj_kh = (-5.33980, -0.35351, -0.23789, -0.94158)
+            bj_kh = (-0.10008, 1.26970, 0.86036, 0.64552)
+            cj_kh = (1.13098, 0.45400, 0.15354, 0.16817)
+            mk_kh = -0.18961
+            ck_kh = 0.71147
+
+            # Coefficients for αH    3
+            aj_ah = (-0.14318, 0.29591, 0.32177, -5.37610, 16.1721)
+            bj_ah = (1.82442, 0.77564, 0.63773, -0.96230, -3.29980)
+            cj_ah = (-0.55187, 0.19822, 0.13164, 1.47828, 3.43990)
+            ma_ah = 0.67849
+            ca_ah = -1.95537
+            kh = 0
+            ah = 0
+
+            # kh.. coefficient k of horizontal polarization
+            for j in range(0, len(aj_kh)):
+                frac_kh = -math.pow(((math.log10(freq) - bj_kh[j]) / cj_kh[j]), 2)
+                kh += aj_kh[j] * math.exp(frac_kh)
+
+            kh = 10 ** (kh + mk_kh * math.log10(freq) + ck_kh)
+
+            # ah.. coefficient α of horizontal polarization
+            for j in range(0, len(aj_ah)):
+                frac_ah = -math.pow(((math.log10(freq) - bj_ah[j]) / cj_ah[j]), 2)
+                ah += aj_ah[j] * math.exp(frac_ah)
+
+            ah = ah + ma_ah * math.log10(freq) + ca_ah
+
+            return ah, kh
+
+        else:
+            # Coefficients for kV    2
+            aj_kv = [-3.80595, -3.44965, -0.39902, 0.50167]
+            bj_kv = [0.56934, -0.22911, 0.73042, 1.07319]
+            cj_kv = [0.81061, 0.51059, 0.11899, 0.27195]
+            mk_kv = -0.16398
+            ck_kv = 0.63297
+
+            # Coefficients for αV   4
+            aj_av = [-0.07771, 0.56727, -0.20238, -48.2991, 48.5833]
+            bj_av = [2.33840, 0.95545, 1.14520, 0.791669, 0.791459]
+            cj_av = [-0.76284, 0.54039, 0.26809, 0.116226, 0.116479]
+            ma_av = -0.053739
+            ca_av = 0.83433
+            kv = 0
+            av = 0
+            # kv.. coefficient k of vertical polarization
+            for j in range(0, len(aj_kv)):
+                frac_kv = -math.pow(((math.log10(freq) - bj_kv[j]) / cj_kv[j]), 2)
+                kv += aj_kv[j] * math.exp(frac_kv)
+
+            kv = 10 ** (kv + mk_kv * math.log10(freq) + ck_kv)
+
+            # av.. coefficient α of vertical polarization
+            for j in range(0, len(aj_av)):
+                frac_av = -math.pow(((math.log10(freq) - bj_av[j]) / cj_av[j]), 2)
+                av += aj_av[j] * math.exp(frac_av)
+
+            av = (av + ma_av * math.log10(freq) + ca_av)
+
+            return av, kv
+
+class GrassLayerManager():
+    def __init__(self, database):
+        self.database = database
+        self.connectDBaLayers()
+
+    def connectDBaLayers(self):
+        layerNum = 0
+        try:
+            with open(os.path.join(self.database.pathWorkSchemeDir, "l_timewindow"), 'r') as f:
+                for win in f.read().splitlines():
+                    layerNum += 1
+                    win = self.database.schema + '.' + win
+                    grass.run_command('v.db.connect',
+                                      driver='pg',
+                                      database=self.database.dbName,
+                                      map=self.database.linkVecMapName,
+                                      table=win,
+                                      key='linkid',
+                                      layer=layerNum,
+                                      quiet=True)
+        except:
+            print "Unexpected error:", sys.exc_info()[0]
+            grass.fatal('Cannot connect tables (time windows) to the vector layer')
+
+class GrassTemporalManagement():
+    def __init__(self, database, timeWinConf):
+        self.database = database
+        self.datasetName = database.schema
+        self.datasetTitle = 'MW time dataset'
+        self.datasetTdescription = "IDtype=%s,\
+                                    sumStep=%s,\
+                                    startTime=%s,\
+                                    endTime=%s" % (timeWinConf.typeID,
+                                                   timeWinConf.sumStep,
+                                                   timeWinConf.startTime,
+                                                   timeWinConf.endTime)
+        self.createTimedataset()
+
+    def createTimedataset(self,datasetName=None,datasetTitle=None,
+                          datasetTdescription=None,temporalType='absolute',
+                          semanticType='mean'):
+
+        if datasetName is not None:
+            self.datasetName = datasetName
+        if datasetTitle is not None:
+            self.datasetTitle = datasetTitle
+        if datasetTdescription is not None:
+            self.datasetTdescription = datasetTdescription
+
+        grass.run_command('t.create',
+                          type='stvds',
+                          output=self.datasetName,
+                          title=self.datasetTitle,
+                          description=self.datasetTdescription,
+                          temporaltype=temporalType,
+                          semantictype=semanticType)
+
+    def registerMaps(self):
+        grass.run_command('t.register',
+                          input=self.datasetName,
+                          map=self.database.linkVecMapName,
+                          type='vect')
+
+
+class Database():
+    def __init__(self, name=None, user=None, password=None,
+                 host=None, nodeVecMapName='node', linkVecMapName='link',
+                 workScheme='tmp5', dataScheme='public'):
+
+        self.dbName = name
+        self.user = user
+        self.password = password
+        self.host = host
+
+        self.schema = workScheme
+        self.dataScheme = dataScheme
+        self.connection = None
+        self.viewStatement = 'table'
+        self.recordTableName = 'record'
+        self.computedPrecip = 'computed_precip'
+        self.computedPrecipGauge = 'rgage_precip'
+
+        self.nodeVecMapName = nodeVecMapName
+        self.linkVecMapName = linkVecMapName
+
+        self.pathWorkSchemeDir = os.path.join(os.path.dirname(os.path.realpath(__file__)), "tmp_%s" % self.schema)
+
+        self.pyConnection()
+        self.grassConnection()
+        self.grassTemporalConnection()
+        self.firstPreparation()
+        self.prepareDB()
+        self.prepareDir()
+
+    def grassTemporalConnection(self):
+        conninfo = 'dbname=' + self.dbName
+        if self.user:
+            conninfo += ' user=' + self.user
+        if self.password:
+            conninfo += ' passwd=' + self.password
+        if self.host:
+            conninfo += ' host=' + self.host
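+        # e.g. with name='new_mw' and user='matt' (names are only examples) the
+        # connection string passed to t.connect is "dbname=new_mw user=matt"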
+
+        grass.run_command('t.connect',
+                         driver='pg',
+                         database=conninfo)
+
+    def grassConnection(self):
+        grass.message("Connecting to db-GRASS...")
+        # Unfortunately we cannot test until user/password is set
+        if self.user or self.password:
+            if grass.run_command('db.login',
+                                 driver="pg",
+                                 database=self.dbName,
+                                 user=self.user,
+                                 password=self.password) != 0:
+
+                grass.fatal("Cannot login")
+
+        # Try to connect
+        if grass.run_command('db.select',
+                             quiet=True,
+                             flags='c',
+                             driver="pg",
+                             database=self.dbName,
+                             sql="select version()") != 0:
+
+            if self.user or self.password:
+                grass.message("Deleting login (db.login) ...")
+                if grass.run_command('db.login',
+                                     quiet=True,
+                                     driver="pg",
+                                     database=self.dbName,
+                                     user="", password="") != 0:
+
+                    grass.message("Cannot delete login.")
+            grass.fatal("Cannot connect to database.")
+
+        if grass.run_command('db.connect', driver="pg", database=self.dbName) != 0:
+            grass.fatal("Cannot connect to database.")
+
+    def pyConnection(self):
+        try:
+            conninfo = {'dbname': self.dbName}
+            if self.user:
+                conninfo['user'] = self.user
+            if self.password:
+                conninfo['passwd'] = self.password
+            if self.host:
+                conninfo['host'] = self.host
+            self.connection = pg(**conninfo)
+
+        except psycopg2.OperationalError, e:
+            grass.fatal("Unable to connect to the database <%s>. %s" % (self.dbName, e))
+
+    def firstPreparation(self):
+        if not self.isAttributExist('public', 'link', 'frequency'):
+            grass.message("Add column lenght")
+            sql = "ALTER TABLE link ADD COLUMN lenght real; "
+            self.connection.executeSql(sql, False, True)
+
+            grass.message("Add column frequency")
+            sql = "ALTER TABLE link ADD COLUMN frequency real; "
+            self.connection.executeSql(sql, False, True)
+
+            grass.message("Optimization of frequency attribute")
+            sql = "UPDATE link\
+                    SET frequency = record.frequency\
+                    FROM record\
+                    WHERE record.linkid = link.linkid;"
+            self.connection.executeSql(sql, False, True)
+
+            sql = "ALTER TABLE record DROP COLUMN frequency;"
+            self.connection.executeSql(sql, False, True)
+
+            # TODO
+            '''
+            grass.message("Add function for computing distance ")
+            sql="CREATE OR REPLACE FUNCTION get_earth_distance1 \
+                (lon1 Float, lat1 Float, lon2 Float, lat2 Float, Radius Float DEFAULT 6387.7)\
+                RETURNS FLOAT AS '\
+                -- Convert degrees to radians\
+                DECLARE K FLOAT := 57.29577951; v_dist FLOAT;\
+                BEGIN\
+                -- calculate\
+                v_dist := (Radius * ACOS((SIN(Lat1 / K) * SIN(Lat2 / K)) \
+                + (COS(Lat1 / K) * COS(Lat2 / K) * COS(Lon2 / K - Lon1 / K))));\
+                -- Return distance in Km\
+                RETURN round(CAST (v_dist AS Numeric),3);\
+                END;\
+                ' LANGUAGE 'plpgsql';"
+            self.connection.executeSql(sql, False, True)
+            '''
+            grass.message("Computing column lenght")
+            sql = "UPDATE link SET lenght = get_earth_distance1(n1.long,n1.lat,n2.long,n2.lat) \
+                FROM node AS n1 JOIN \
+                link AS l ON n1.nodeid = fromnodeid \
+                JOIN node AS n2 ON n2.nodeid = tonodeid \
+                WHERE link.linkid = l.linkid; "
+            self.connection.executeSql(sql, False, True)
+
+            grass.message("Add column precip")
+            sql = "ALTER TABLE record ADD COLUMN precip real; "
+            self.connection.executeSql(sql, False, True)
+
+            grass.message("Create sequence")
+            sql = "CREATE SEQUENCE serial START 1; "
+            self.connection.executeSql(sql, False, True)
+
+            grass.message("Add column recordid")
+            sql = "ALTER TABLE record add column recordid integer default nextval('serial'); "
+            self.connection.executeSql(sql, False, True)
+
+            grass.message("Create index on recordid")
+            sql = "CREATE INDEX idindex ON record USING btree(recordid); "
+            self.connection.executeSql(sql, False, True)
+
+            sql = "CREATE INDEX timeindex ON record USING btree (time); "
+            self.connection.executeSql(sql, False, True)
+
+            grass.message("Add mode function")
+            sql = "CREATE OR REPLACE FUNCTION _final_mode(anyarray)\
+                    RETURNS anyelement AS $BODY$ SELECT a FROM unnest($1)\
+                    a GROUP BY 1  ORDER BY COUNT(1) DESC, 1 LIMIT 1;\
+                    $BODY$ LANGUAGE 'sql' IMMUTABLE;"
+            self.connection.executeSql(sql, False, True)
+
+            sql = "CREATE AGGREGATE mode(anyelement) (\
+                    SFUNC=array_append, \
+                    STYPE=anyarray,\
+                    FINALFUNC=_final_mode, \
+                    INITCOND='{}');"
+            self.connection.executeSql(sql, False, True)
+
+    def prepareDB(self):
+        sql = "DROP SCHEMA IF EXISTS %s CASCADE" % self.schema
+        try:
+            shutil.rmtree(self.pathWorkSchemeDir)
+        except:
+            os.makedirs(self.pathWorkSchemeDir)
+        self.connection.executeSql(sql, False, True)
+        sql = "CREATE SCHEMA %s" % self.schema
+        self.connection.executeSql(sql, False, True)
+
+    def prepareDir(self):
+        try:
+            os.makedirs(self.pathWorkSchemeDir)
+        except OSError:
+            if not os.path.isdir(self.pathWorkSchemeDir):
+                raise
+
+    def isAttributExist(self, schema, table, columns):
+        sql = "SELECT EXISTS( SELECT * FROM information_schema.columns WHERE \
+              table_schema = '%s' AND \
+              table_name = '%s' AND\
+              column_name='%s');" % (schema, table, columns)
+        return self.connection.executeSql(sql, True, True)[0][0]
+
+    def isTableExist(self, schema, table):
+        sql = "SELECT EXISTS( SELECT * \
+              FROM information_schema.tables \
+              WHERE table_schema = '%s' AND \
+              table_name = '%s');" % (schema, table)
+        return self.connection.executeSql(sql, True, True)[0][0]
+
+
+def randomWord(length):
+    return ''.join(random.choice(string.lowercase) for i in range(length))
+
+
+def isTimeValid(time):
+    RE = re.compile(r'^\d{4}-\d{2}-\d{2}[ T]\d{2}:\d{2}:\d{2}$')
+    return bool(RE.search(time))
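+    # e.g. isTimeValid('2013-09-10 04:00:00') -> True, isTimeValid('10.9.2013') -> False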
+
+
+def main():
+    db = Database(name='new_mw')
+    convertor = VectorLoader(db)
+
+    #create native vector map(nodes)
+    pointsSQL = convertor.selectNodes()
+    pointsASCII = convertor.getASCIInodes(pointsSQL)
+    convertor.grass_vinASCII(pointsASCII, db.nodeVecMapName)
+
+    #create native vector map(links)
+    linksSQL = convertor.selectLinks()
+    linksASCII = convertor.getASCIIlinks(linksSQL)
+    convertor.grass_vinASCII(linksASCII, db.linkVecMapName)
+
+    twin = TimeWindows(IDtype='linkid',
+                       sumStep='minute',
+                       startTime='2013-09-10 04:00:00',
+                       endTime='2013-09-10 05:00:00',
+                       database=db)
+
+    baseline = Baseline(type='fromDryWin',
+                        statFce='mode',
+                        pathToFile='/home/matt/Dropbox/gsoc/sandbox/krejcmat/src/test_suite/baseltime.ref',
+                        roundMode=3)
+
+    Computor(baseline, twin, db)
+    GrassLayerManager(db)
+
+    GrassTemporalManagement(db,twin)
+
+
+main()
\ No newline at end of file


Property changes on: sandbox/krejcmat/src/mw2/mv2.py
___________________________________________________________________
Added: svn:executable
   + *

Added: sandbox/krejcmat/src/mw2/pgwrapper.py
===================================================================
--- sandbox/krejcmat/src/mw2/pgwrapper.py	                        (rev 0)
+++ sandbox/krejcmat/src/mw2/pgwrapper.py	2014-10-18 19:29:34 UTC (rev 62287)
@@ -0,0 +1,117 @@
+#!/usr/bin/env python
+# -*- coding: utf-8 -*-
+
+import sys, os, argparse
+import psycopg2 as ppg
+import psycopg2.extensions
+
+
+class pgwrapper:
+    def __init__(self, dbname, host='', user='', passwd=''):
+        self.dbname = dbname  # Database name to connect to.
+        self.host = host  # Host name (default is "localhost")
+        self.user = user  # User name for login to the database.
+        self.password = passwd  # Password for login to the database.
+        self.connection = self.setConnect()  # Set a connection to the database
+        self.cursor = self.setCursor()  # Generate cursor.
+
+    def setConnect(self):
+        conn_string = "dbname='%s'" % self.dbname
+        if self.user:
+            conn_string += " user='%s'" % self.user
+        if self.host:
+            conn_string += " host='%s'" % self.host
+        if self.password:
+            conn_string += " password='%s'" % self.password
+
+        conn = ppg.connect(conn_string)
+        return conn
+
+    def setCursor(self):
+        return self.connection.cursor()
+
+
+    def setIsoLvl(self, lvl='0'):
+        if lvl == 0:
+            self.connection.set_session('read committed')
+        elif lvl == 1:
+            self.connection.set_session(readonly=True, autocommit=False)
+
+
+    def copyfrom(self, afile, table, sep='|'):
+        try:
+            self.cursor.copy_from(afile, table, sep=sep)
+            self.connection.commit()
+
+        except Exception, err:
+            self.connection.rollback()
+            self.print_message("Caught error:\n")
+            self.print_message(err)
+
+            pass
+
+    def copyexpert(self, sql, data):
+        try:
+            self.cursor.copy_expert(sql, data)
+        except Exception:
+            self.connection.rollback()
+            pass
+
+
+    def executeSql(self, sql, results=True, commit=False):
+        # Execute the SQL statement.
+        # self.print_message (sql)
+        #print '*'*50
+        #print sql
+        #print '*'*50
+        try:
+            self.cursor.execute(sql)
+        except Exception, e:
+            self.connection.rollback()
+            print e.pgerror
+            pass
+
+        if commit:
+            self.connection.commit()
+
+        if results:
+            # Get the results.
+            results = self.cursor.fetchall()
+            # Return the results.
+            return results
+
+
+    def count(self, table):
+        """!Count the number of rows.
+        @param table         : Name of the table to count rows in."""
+        sql_count = 'SELECT COUNT(*) FROM ' + table
+        self.cursor.execute(sql_count)
+        n = self.cursor.fetchall()[0][0]
+        return n
+
+    def updatecol(self, table, columns, where=''):
+        """!Update the values of columns.
+        @param table            : Name of the table to parse.
+        @param columns          : Keys values pair of column names and values to update.
+        @param where            : Advanced search option for 'where' statement.
+        """
+        # Make a SQL statement.
+        parse = ''
+        for i in range(len(columns)):
+            parse = parse + '"' + str(columns.keys()[i]) + '"=' + str(columns.values()[i]) + ','
+        parse = parse.rstrip(',')
+
+        if where == '':
+            sql_update_col = 'UPDATE "' + table + '" SET ' + parse
+        else:
+            sql_update_col = 'UPDATE "' + table + '" SET ' + parse + ' WHERE ' + where
+        print "upcol %s" % sql_update_col
+        # Execute the SQL statement.
+        self.cursor.execute(sql_update_col)
+
+    def print_message(self, msg):
+        print '-' * 80
+        print msg
+        print '-' * 80
+        print
+        sys.stdout.flush()


Property changes on: sandbox/krejcmat/src/mw2/pgwrapper.py
___________________________________________________________________
Added: svn:executable
   + *
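
A minimal usage sketch for pgwrapper (the database and table names below are only
examples; the calls mirror how mv2.py drives the class):

    from pgwrapper import pgwrapper as pg

    db = pg('new_mw')                                      # connect via psycopg2
    rows = db.executeSql('SELECT version();', results=True, commit=False)
    print rows[0][0]
    print 'links:', db.count('public.link')                # simple row-count helper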

Added: sandbox/krejcmat/src/mw2/test_suite/baseltime.ref
===================================================================
--- sandbox/krejcmat/src/mw2/test_suite/baseltime.ref	                        (rev 0)
+++ sandbox/krejcmat/src/mw2/test_suite/baseltime.ref	2014-10-18 19:29:34 UTC (rev 62287)
@@ -0,0 +1,6 @@
+int
+2013-09-09 00:00:00
+2013-09-09 02:00:00
+int
+2013-09-10 06:15:00
+2013-09-10 07:30:00

Added: sandbox/krejcmat/src/mw2/test_suite/color.ref
===================================================================
--- sandbox/krejcmat/src/mw2/test_suite/color.ref	                        (rev 0)
+++ sandbox/krejcmat/src/mw2/test_suite/color.ref	2014-10-18 19:29:34 UTC (rev 62287)
@@ -0,0 +1,19 @@
+0.2 240:240:240
+0.5 230:230:230
+1 220:220:255
+2 190:190:255
+3 150:150:255
+4 100:100:240
+5 70:70:200
+6 40:40:150
+7 0:120:0
+8 0:160:0
+9 0:210:0
+10 0:240:0
+13 255:250:0
+16 255:200:0
+19 255:150:0
+22 255:70:0
+35 180:0:0
+30 90:0:0
+35 40:0:0
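
The rules above follow the r.colors rules format ("value R:G:B" per line), so from
Python the file could be applied to a precipitation raster with something like (the
map name is only an example):

    from grass.script import core as grass
    grass.run_command('r.colors', map='precip_2013_09_10', rules='color.ref', quiet=True)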

Added: sandbox/krejcmat/src/mw2/test_suite/test.r.mvprecip.sh
===================================================================
--- sandbox/krejcmat/src/mw2/test_suite/test.r.mvprecip.sh	                        (rev 0)
+++ sandbox/krejcmat/src/mw2/test_suite/test.r.mvprecip.sh	2014-10-18 19:29:34 UTC (rev 62287)
@@ -0,0 +1,10 @@
+#!/bin/sh
+
+DB="mvdb"
+#../r.mvprecip.py -r database=$DB baseltime=baseltime.ref
+
+../r.mvprecip.py -g database=$DB \
+    fromtime="2013-09-09 06:00:00" totime="2013-09-09 06:03:00" \
+    baseltime=baseltime.ref color=radarcolor.ref lignore=ignore.ref
+
+exit 0


Property changes on: sandbox/krejcmat/src/mw2/test_suite/test.r.mvprecip.sh
___________________________________________________________________
Added: svn:executable
   + *


