[Qgis-developer] QgsRasterLayer.identify() returns wrongly encoded keys?
Martin Dobias
wonder.sk at gmail.com
Tue Dec 2 17:17:12 EST 2008
2008/12/2 Borys Jurgiel <borys at wolf.most.org.pl>:
> the 'sample' contains the mentioned dictionary. Then if I try to print the
> first key, which should contain QString 'Kanał 1':
>
> print sample.keys()[0]
>
> I get the error message:
>
> exceptions.UnicodeEncodeError: 'ascii' codec can't encode character u'\u0142'
> in position 4: ordinal not in range(128)
>
> So the string is not properly encoded as Unicode. But I can't find any error,
> neither in qgsrasterlayer.cpp nor in the qgsrasterlayer.sip file. It's passed
> from tr() directly into the QgsRasterLayer::identify() header. I'm probably
> missing something obvious.
There's nothing wrong with the string. The problem is that the print
statement tries to convert the unicode string to an 8-bit string in
order to display it in the console. By default it uses the 'ascii'
codec, which raises an exception when it encounters a character with a
value >= 128 (ASCII is only a 7-bit encoding). To simulate it:
>>> u'\u0142'.encode('ascii')
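As a runnable sketch of that failure (plain Python, no QGIS needed), the one-liner above can be wrapped to show the exception being raised; u'\u0142' is the 'ł' from 'Kanał':

```python
# Encoding a non-ASCII character with the 'ascii' codec raises
# UnicodeEncodeError, matching the traceback in the original report.
try:
    u'\u0142'.encode('ascii')  # U+0142, LATIN SMALL LETTER L WITH STROKE
except UnicodeEncodeError as err:
    print(err)  # "'ascii' codec can't encode character ..."
```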
A resolution for your situation is to encode with e.g. the utf8 codec,
or you can directly use QString's methods toUtf8() or toLocal8Bit().
Happy hacking
Martin