I am working on a large command-line tool, written for Python 2.6+ and supported on Windows, OS X and Linux. The target users are developers, but the tool is also auto-invoked by CI systems and the like. On Windows our tool runs in cmd.exe; on the other operating systems we allow whatever terminal the users have.
I would really like to add general Unicode support to this tool, in particular the ability to print non-ASCII characters to the terminal. Python generally handles Unicode well, but cmd.exe does not handle my small tests well, due to some encoding issue. I can probably “get it to work” on Windows one way or another, but can encoding issues also occur in other terminals? Do “most modern terminals” use UTF-8 on GNU/Linux and OS X, or is it locale-dependent? Also, if an encoding mismatch occurs, will “most terminals” just output garbled characters, or will the program actually crash? Is it in general possible to reliably determine the encoding used by the terminal?
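For reference, the kind of small test that fails: encoding non-ASCII text to a legacy code page such as cp437 (a common cmd.exe default in US locales) raises `UnicodeEncodeError` for characters the code page cannot represent. The snippet below simulates this without needing a real cmd.exe session:

```python
# -*- coding: utf-8 -*-
# Simulate what printing does on a cp437 cmd.exe console: Python
# encodes the text to the stream's encoding and raises
# UnicodeEncodeError for characters the code page cannot represent.
text = u'\u4e2d\u6587'  # two CJK characters, not representable in cp437

try:
    text.encode('cp437')
    crashed = False
except UnicodeEncodeError:
    crashed = True  # this is the "crash" case, not the terminal's fault

# With errors='replace' the output is merely garbled instead of fatal:
garbled = text.encode('cp437', errors='replace')  # b'??'
```

So the crash happens inside Python, before the terminal ever sees the bytes; a terminal given bytes in the wrong encoding will just render mojibake.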
I’m looking mainly for people who have attempted to implement cross-platform command-line tools with Unicode support. My goal with this question is to determine whether I can implement Unicode support, with little to moderate effort, that works for the vast majority of our users.
We’re using the code from https://stackoverflow.com/questions/878972/windows-cmd-encoding-change-causes-python-crash/3259271#3259271 whenever we run into problems on cmd.exe and Windows. Linux terminals seem to default to UTF-8, and the only problem we’ve seen there is when Windows users’ PuTTY settings have been off. I don’t know about OS X.
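For the non-Windows side, here is a sketch of the best-effort approach: ask the stream what encoding it expects, fall back to the locale, and degrade to replacement characters instead of crashing. This is not the linked answer’s code; `detect_output_encoding` and `safe_print` are illustrative names I made up:

```python
import locale
import sys


def detect_output_encoding(stream=None):
    """Best-effort guess at the encoding a terminal expects.

    Falls back to the locale's preferred encoding when the stream
    does not advertise one (e.g. when output is piped to a file).
    """
    stream = stream or sys.stdout
    enc = getattr(stream, 'encoding', None)
    if enc:
        return enc
    return locale.getpreferredencoding() or 'utf-8'


def safe_print(text, stream=None):
    """Encode with errors='replace' so an encoding mismatch produces
    garbled characters instead of raising UnicodeEncodeError."""
    stream = stream or sys.stdout
    enc = detect_output_encoding(stream)
    data = text.encode(enc, errors='replace')
    # Write bytes to the underlying binary buffer if there is one
    # (Python 3 text streams), otherwise to the stream directly.
    buffer = getattr(stream, 'buffer', stream)
    buffer.write(data + b'\n')
```

Note that this is only a guess: `sys.stdout.encoding` reflects what Python detected at startup, and the user can change the terminal’s actual encoding underneath you (e.g. `chcp` on Windows), so the replacement fallback is what keeps the tool from dying.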
Can you implement Unicode support with little to moderate effort? Yes and no. Plugging in the code above takes very little effort and works for most users, but users can change their terminal encoding out from under you, and other tools in the pipeline aren’t always Unicode-safe.