Printing UTF-8 Text to the Windows Console

Let’s suppose you have a string encoded in UTF-8, and you want to print it to the Windows console.

You might have heard of the _setmode function and the _O_U8TEXT flag.

The MSDN documentation contains this compilable code snippet that you can use as the starting point for your experimentation:

// crt_setmodeunicode.c
// This program uses _setmode to change
// stdout to Unicode. Cyrillic and Ideographic
// characters will appear on the console (if
// your console font supports those character sets).

#include <fcntl.h>
#include <io.h>
#include <stdio.h>

int main(void) {
    _setmode(_fileno(stdout), _O_U16TEXT);
    wprintf(L"\x043a\x043e\x0448\x043a\x0430 \x65e5\x672c\x56fd\n");
    return 0;
}

So, to print UTF-8-encoded text, you may think of substituting the _O_U16TEXT flag with _O_U8TEXT, and using printf (or cout) with a byte sequence representing your UTF-8-encoded string.

For example, let’s consider the Japanese name for Japan, written using the kanji 日本.

The first kanji is the Unicode code point U+65E5; the second kanji is U+672C. Their UTF-8 encodings are the 3-byte sequences 0xE6 0x97 0xA5 and 0xE6 0x9C 0xAC, respectively.

So, let’s consider this compilable code snippet that tries to print a UTF-8-encoded string:

#include <fcntl.h>
#include <io.h>
#include <stdint.h>
#include <iostream>

int main()
{
    _setmode(_fileno(stdout), _O_U8TEXT);

    // Japanese name for Japan, 
    // encoded in UTF-8
    uint8_t utf8[] = { 
        0xE6, 0x97, 0xA5, // U+65E5
        0xE6, 0x9C, 0xAC, // U+672C
        0x00 
    };

    std::cout << reinterpret_cast<const char*>(utf8) << '\n';
}

This code compiles fine. However, if you run it, you’ll get this error message:

Error when trying to print UTF-8 text to the console

So, how can you print UTF-8-encoded text to the Windows command prompt?

Well, it seems that you have to first convert from UTF-8 to UTF-16, and then use wprintf or wcout to print the UTF-16-encoded text. This isn’t optimal, but at least it seems to work.
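For instance, a minimal sketch of that two-step approach could look like this (the Utf8ToUtf16 helper name and the simplified error handling are my own, just for illustration):

#include <fcntl.h>
#include <io.h>
#include <stdio.h>
#include <windows.h>
#include <iostream>
#include <stdexcept>
#include <string>

// Minimal sketch: convert a UTF-8-encoded std::string
// to a UTF-16-encoded std::wstring using MultiByteToWideChar.
std::wstring Utf8ToUtf16(const std::string& utf8)
{
    if (utf8.empty())
        return std::wstring{};

    // First call: ask for the length of the destination UTF-16 string
    const int utf16Length = ::MultiByteToWideChar(
        CP_UTF8,                        // source is UTF-8
        MB_ERR_INVALID_CHARS,           // fail on invalid byte sequences
        utf8.data(),                    // source UTF-8 string
        static_cast<int>(utf8.size()),  // source length, in bytes
        nullptr,                        // no destination buffer yet
        0                               // request the required length
    );
    if (utf16Length == 0)
        throw std::runtime_error("MultiByteToWideChar failed.");

    std::wstring utf16(utf16Length, L'\0');

    // Second call: do the actual conversion
    if (::MultiByteToWideChar(
            CP_UTF8,
            MB_ERR_INVALID_CHARS,
            utf8.data(),
            static_cast<int>(utf8.size()),
            &utf16[0],                  // destination buffer
            utf16Length) == 0)
        throw std::runtime_error("MultiByteToWideChar failed.");

    return utf16;
}

int main()
{
    _setmode(_fileno(stdout), _O_U16TEXT);

    // Japanese name for Japan (日本), encoded in UTF-8
    const char utf8[] = "\xE6\x97\xA5\xE6\x9C\xAC";

    std::wcout << Utf8ToUtf16(utf8) << L'\n';
}

The key point is simply to do the UTF-8-to-UTF-16 conversion first, and only then hand the text to the wide-character output functions.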

 

What Is the Encoding Used by the error_code message String?

std::system_error is an exception class, introduced in C++11, that is thrown by various functions that interact with OS-specific APIs. The platform-dependent error code is represented using the std::error_code class (returned by system_error::code). The error_code::message method returns an explanatory string for the error code. So, what is the encoding used to store the text in the returned std::string object? UTF-8? Some other code page?
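To make the question concrete, this is the kind of code we're talking about (using ERROR_FILE_NOT_FOUND, i.e. the raw value 2, just as an arbitrary example):

#include <iostream>
#include <system_error>

int main()
{
    // Build an error_code from a raw Windows error code
    // (ERROR_FILE_NOT_FOUND is 2)
    const std::error_code ec(2, std::system_category());

    // message() returns a std::string with the explanatory text;
    // the question is: what encoding does this char-string use?
    const std::string msg = ec.message();
    std::cout << msg << '\n';
}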

To answer this question, I spelunked inside the MSVC STL implementation code, and found this _Winerror_message helper function that is used to get the description of a Windows error code.

STL _Winerror_message helper function

This function first calls the FormatMessageW API to get the error message encoded in Unicode UTF-16. Then, the returned wchar_t-string is converted to a char-string, which is written to an output buffer allocated by the caller.

The conversion is done by invoking the WideCharToMultiByte API, with the CodePage parameter set to CP_ACP, meaning “the system default Windows ANSI code page” (copy-and-paste’d from the official MSDN documentation).

I think that in modern C++ code it’s generally good practice to store UTF-8-encoded text in std::strings. Code pages are a source of subtle bugs: there are many of them, they can change, and you end up with garbage characters (mojibake) when char-strings using different code pages are mixed and appended together (e.g. when written to UTF-8 log files).

So, I’d have preferred using the CP_UTF8 flag with the WideCharToMultiByte call above, getting a char-string containing the error message encoded as a UTF-8 string.

However, this would cause mojibake bugs for C++/Windows code that uses cout or printf to print message strings, as this code assumes CP_ACP by default.

So, my point is still that char-strings should in general use the UTF-8 encoding; but unless the Windows console and cout/printf move to UTF-8 as their default encoding, it sounds like the current usage of CP_ACP in the error message string is understandable.

Anyway, due to the use of CP_ACP in the wchar_t-to-char string conversion discussed above, you should pay attention when writing error_code::message strings to UTF-8-encoded log files. Maybe the best thing would be writing custom code to get the message string from the error code identifier, and encoding it using UTF-8 (basically invoking FormatMessage followed by WideCharToMultiByte with CP_UTF8).
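Such custom code could look more or less like this minimal sketch (the WinErrorMessageUtf8 name is my own, and error handling is simplified for clarity):

#include <windows.h>
#include <string>

// Minimal sketch: get the description of a Windows error code
// as a UTF-8-encoded std::string, invoking FormatMessageW
// followed by WideCharToMultiByte with CP_UTF8.
std::string WinErrorMessageUtf8(DWORD errorCode)
{
    // Get the error message, encoded in UTF-16, in a buffer
    // allocated by FormatMessageW itself
    wchar_t* buffer = nullptr;
    const DWORD length = ::FormatMessageW(
        FORMAT_MESSAGE_ALLOCATE_BUFFER |
        FORMAT_MESSAGE_FROM_SYSTEM |
        FORMAT_MESSAGE_IGNORE_INSERTS,
        nullptr,                               // no message source: use the system tables
        errorCode,
        0,                                     // default language
        reinterpret_cast<wchar_t*>(&buffer),   // receives the allocated buffer
        0,
        nullptr);
    if (length == 0 || buffer == nullptr)
        return "Unknown error";

    // Convert the UTF-16 message to UTF-8 (instead of CP_ACP):
    // first ask for the required destination length in bytes
    const int utf8Length = ::WideCharToMultiByte(
        CP_UTF8, 0, buffer, static_cast<int>(length),
        nullptr, 0, nullptr, nullptr);

    std::string utf8;
    if (utf8Length > 0)
    {
        utf8.resize(utf8Length);

        // Then do the actual conversion
        ::WideCharToMultiByte(
            CP_UTF8, 0, buffer, static_cast<int>(length),
            &utf8[0], utf8Length, nullptr, nullptr);
    }

    ::LocalFree(buffer);
    return utf8;
}

This mirrors what the STL’s _Winerror_message helper does, except that the final conversion uses CP_UTF8 instead of CP_ACP.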

Thanks to Stephan T. Lavavej, Casey Carter and Billy O’Neal for e-mail communication on this issue.

 

The CStringW with wcout Bug Under the Hood

I discussed in a previous blog post a subtle bug involving CStringW and wcout, and later I showed how to fix it.

In this blog post, I’d like to discuss in more detail what’s happening under the hood, and what triggers that bug.

Well, to understand the dynamics of that bug, you can consider the following simplified case of a function and a function template, implemented like this:

void f(const void*) {
  cout << "f(const void*)\n";
}

template <typename CharT> 
void f(const CharT*) {
  cout << "f(const CharT*)\n";
}

If s is a CStringW object, and you write f(s), which function will be invoked?

Well, you can write a simple compilable program containing these two functions, the required headers, and a simple main implementation like this:

int main() {
  CStringW s = L"Connie";
  f(s);
}

Then compile it, and observe the output. You know, printf-debugging™ is so cool! 🙂

Well, you’ll see that the program outputs “f(const void*)”. This means that the first function (the non-templated one, taking a const void*) is invoked.

So, why did the C++ compiler choose that overload? Why not f(const wchar_t*), synthesized from the second function template?

Well, the answer is in the rules that C++ compilers follow when doing template argument deduction. In particular, when deducing template arguments, implicit conversions are not considered. So, in this case, the implicit conversion from CStringW to const wchar_t* is not considered.

So, when overload resolution happens later, the only candidate available is f(const void*). Now, the implicit CStringW conversion to const wchar_t* is considered, and the first function is invoked.

Out of curiosity, if you comment out the first function, you’ll get a compiler error. MSVC complains with a message like this:

error C2672: 'f': no matching overloaded function found

error C2784: 'void f(const CharT *)': could not deduce template argument for 'const CharT *' from 'ATL::CStringW'

The message is clear (almost…): “Could not deduce template argument for const CharT* from CStringW”: that’s because implicit conversions like this are not considered when deducing template arguments.

Well, what I’ve described above in a simplified case is basically what happens in the slightly more complex case of wcout.

wcout is an instance of wostream. wostream is declared in <iosfwd> as:

typedef basic_ostream<wchar_t, char_traits<wchar_t>> wostream;

Instead of the initial function f, in this case you have operator<<. In particular, here the candidates are an operator<< overload that is a member function of basic_ostream:

basic_ostream& basic_ostream::operator<<(const void *_Val)

and a template non-member function:

template<class _Elem, class _Traits> 
inline basic_ostream<_Elem, _Traits>& 
operator<<(basic_ostream<_Elem, _Traits>& _Ostr, const _Elem *_Val)

(This code is edited from the <ostream> standard header that comes with MSVC.)

When you write code like “wcout << s” (for a CStringW s), the implicit conversion from CStringW to const wchar_t* is not considered during template argument deduction. Then, overload resolution picks the basic_ostream::operator<<(const void*) member function (corresponding to the first f in the initial simplified case), so the string’s address is printed via this “const void*” overload (instead of the string itself).

On the other hand, when CStringW::GetString is explicitly invoked (as in “wcout << s.GetString()”), the compiler successfully deduces the template arguments for the non-member operator<< (deducing wchar_t for _Elem). And this operator<<(wostream&, const wchar_t*) prints the expected wchar_t string.
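To summarize, here’s a minimal compilable sketch showing both behaviors side by side:

#include <atlstr.h>   // for CStringW
#include <iostream>

int main()
{
    CStringW s = L"Connie";

    // Picks basic_ostream::operator<<(const void*):
    // prints the string's address, not its characters
    std::wcout << s << L'\n';

    // GetString() returns a const wchar_t*, so the non-member
    // operator<<(wostream&, const wchar_t*) is deduced and used:
    // prints "Connie" as expected
    std::wcout << s.GetString() << L'\n';
}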

I know… There are aspects of C++ templates that are not easy.