Someone asked why, in this article, I used std::wstring instead of the new C++11 std::u16string for Unicode UTF-16 text.
The key point is that Win32 Unicode UTF-16 APIs use wchar_t as their code unit type; wstring is based on wchar_t, so it works fine with those APIs.
On the other hand, u16string is based on char16_t, a new built-in type introduced in C++11 that is distinct from wchar_t.
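To see that the two really are distinct types (and that this is a matter of type identity, not size), here's a minimal standalone check — no Win32 headers required:

```cpp
#include <type_traits>

// char16_t is its own built-in type in C++11, not an alias for
// wchar_t (nor for unsigned short), so the compiler treats pointers
// to the two as unrelated.
static_assert(!std::is_same<wchar_t, char16_t>::value,
              "wchar_t and char16_t are distinct types");

// On Windows both happen to be 16 bits wide, but type identity,
// not size, is what overload resolution and pointer conversions check.
```

This is why the error below talks about "unrelated" pointed-to types even though, on Windows, both types have the same size and representation.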
So, if you have a u16string variable and you try to use it with a Win32 Unicode API, e.g.:
std::u16string s;
SetWindowText(hWnd, s.c_str());
Visual Studio 2015 complains (emphasis mine):
error C2664: 'BOOL SetWindowTextW(HWND,LPCWSTR)': cannot convert argument 2 from 'const char16_t *' to 'LPCWSTR'
note: Types pointed to are unrelated; conversion requires reinterpret_cast, C-style cast or function-style cast
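If you really do have UTF-16 text in a u16string, one way to bridge the gap is to copy the code units into a wstring before calling the API. The helper below is a sketch of my own (ToWString is a hypothetical name, not a library function); on Windows, where wchar_t is a 16-bit UTF-16 code unit just like char16_t, the copy is lossless:

```cpp
#include <string>

// Sketch: build a std::wstring from a std::u16string by copying
// code units one by one. On Windows (16-bit wchar_t) this is a
// straight UTF-16-to-UTF-16 copy; on platforms with 32-bit wchar_t
// each unit is just widened, which is NOT a proper UTF-16-to-UTF-32
// conversion (surrogate pairs would be copied as-is).
std::wstring ToWString(const std::u16string& s) {
    return std::wstring(s.begin(), s.end());
}
```

With that helper, SetWindowText(hWnd, ToWString(s).c_str()) compiles cleanly. The reinterpret_cast route the compiler note hints at also works on Windows, since the two types share size and representation there, but the explicit copy keeps the type system honest at the cost of an allocation.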
wchar_t is non-portable in the sense that its size isn't specified by the standard. But, all in all, if you are invoking Win32 APIs you are already in an area of code that is non-portable (Win32 APIs are Windows-specific), so adding wstring (or even CString!) to that mix doesn't change anything with respect to portability (or lack thereof).