4

I've been working on a problem and I still can't find the solution. I have a UTF-8 table and a view on that table which is published as a WFS service in GeoServer. I can export UTF-8 characters in GML, but I can't change the encoding of the JSON and SHP exports, which appear to be Western (Windows-1254).

How can I change the encoding to UTF-8 so that the special characters are displayed correctly?

Best regards.

A.

Alper Dincer
  • You can change the character set of the DBF files in the datastore that holds your files. Play around with that and see if it changes anything. On the admin page, go to 'Datastores' and click on the datastore your files are in; there you will see the charset setting. – hnrch Aug 17 '15 at 17:16
  • The problem is with the shapefile output (&outputformat=shape-zip), not with the input. So far I have not found any way to configure the output code page; on Windows it is always ISO-8859-1 for me. However, the shape-zip contains a .cst file that records the real encoding, and if the reading application can interpret the .cst file the end result should be fine (see the sketch below). With GeoJSON output I did get the correct encoding, but I am not sure my test was reliable enough. – user30184 Aug 17 '15 at 20:31
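
For example, a minimal sketch of reading that .cst file from a downloaded shape-zip (the archive name "export.zip" and the surrounding setup are placeholders, not from the original post):

    import zipfile

    # A minimal sketch, assuming the shape-zip was saved as "export.zip".
    with zipfile.ZipFile("export.zip") as zf:
        names = zf.namelist()
        # GeoServer's shape-zip output includes a .cst sidecar file that
        # records the code page actually used for the .dbf attribute table.
        cst_name = next(n for n in names if n.endswith(".cst"))
        charset = zf.read(cst_name).decode("ascii").strip()  # e.g. "ISO-8859-1"
        print("DBF encoding declared by GeoServer:", charset)

        # Attribute text from the .dbf should then be decoded with that
        # charset, e.g. raw_bytes.decode(charset), instead of assuming UTF-8.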

3 Answers

8

GeoSolutions.it gave me the solution: adding "&format_options=CHARSET:UTF-8" to the WFS URL fixes the SHP export.
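
As a minimal sketch of such a request (the GeoServer base URL and the layer name "myws:mylayer" are placeholders for your own service, not from the question):

    import requests

    params = {
        "service": "WFS",
        "version": "1.0.0",
        "request": "GetFeature",
        "typeName": "myws:mylayer",
        "outputFormat": "SHAPE-ZIP",
        # The key part from the answer: tell GeoServer which charset to use
        # when writing the .dbf inside the shape-zip.
        "format_options": "CHARSET:UTF-8",
    }

    resp = requests.get("http://localhost:8080/geoserver/wfs", params=params)
    resp.raise_for_status()

    with open("export.zip", "wb") as f:
        f.write(resp.content)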

Alper Dincer
1

I encountered a similar issue and implemented the approach from this answer. I then exported the same shapefile under a new name using QGIS. After that, you need to change the store path of the WFS layer in GeoServer, making sure that UTF-8 is also selected as the charset. In my case this fixed the UTF-8 encoding for WFS.
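
If you prefer scripting over the admin UI, the charset of a shapefile store can, as far as I know, also be set through GeoServer's REST API. This is only a sketch under assumptions: the workspace "myws", the store "mystore", and the default admin credentials are placeholders, and you should verify the partial-update behaviour against your GeoServer version.

    import requests

    # Hedged sketch: point the request at your own GeoServer, workspace and store.
    url = "http://localhost:8080/geoserver/rest/workspaces/myws/datastores/mystore"
    body = """
    <dataStore>
      <connectionParameters>
        <entry key="charset">UTF-8</entry>
      </connectionParameters>
    </dataStore>
    """

    resp = requests.put(
        url,
        data=body,
        headers={"Content-Type": "application/xml"},
        auth=("admin", "geoserver"),  # assumed default credentials; replace with yours
    )
    resp.raise_for_status()
    print("Datastore updated:", resp.status_code)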

lujjas
0

Yes, confirmed!

The issue is also fixed for Cyrillic shapefile (DBF) output in Windows-1251, which works around the Cyrillic alphabet being shown as "???".

We used: &format_options=charset:cp1251

MrXsquared