Typing Unicode symbols using WM_CHAR


How can I type Unicode symbols (Cyrillic, for example) using the WM_CHAR message? At the moment the Cyrillic characters come out wrong. Here is my code:

DWORD dwCurrentThreadID = GetCurrentThreadId();
HWND hForeground = GetForegroundWindow();
DWORD dwForegroundThreadID = GetWindowThreadProcessId(hForeground, NULL);
AttachThreadInput(dwForegroundThreadID, dwCurrentThreadID, TRUE);
PostMessageW(GetFocus(), WM_CHAR, character, 1);

You cannot simulate keyboard input with PostMessage(). Use SendInput() instead:

INPUT input = {0};
input.type = INPUT_KEYBOARD;
input.ki.wScan = (WORD) character;    // the UTF-16 code unit to inject
input.ki.dwFlags = KEYEVENTF_UNICODE; // wScan carries the character; wVk is ignored
SendInput(1, &input, sizeof(INPUT));

Unicode on Windows is UTF-16. wScan is 16 bits wide, so it can hold only a single UTF-16 code unit. Code points up to U+FFFF fit in one code unit, but to send a code point above U+FFFF (which requires two code units) you must supply two INPUT values, one per code unit:

INPUT input[2] = {0};
int numInput;
// character should be a 32-bit code point and must not exceed 0x10FFFF...
if (character <= 0xFFFF)
{
    // fits in a single UTF-16 code unit
    input[0].type = INPUT_KEYBOARD;
    input[0].ki.wScan = (WORD) character;
    input[0].ki.dwFlags = KEYEVENTF_UNICODE;
    numInput = 1;
}
else
{
    // encode as a UTF-16 surrogate pair
    character -= 0x010000;
    input[0].type = INPUT_KEYBOARD;
    input[0].ki.wScan = (WORD) (((character >> 10) & 0x03FF) + 0xD800); // high surrogate
    input[0].ki.dwFlags = KEYEVENTF_UNICODE;
    input[1].type = INPUT_KEYBOARD;
    input[1].ki.wScan = (WORD) ((character & 0x03FF) + 0xDC00); // low surrogate
    input[1].ki.dwFlags = KEYEVENTF_UNICODE;
    numInput = 2;
}
SendInput(numInput, input, sizeof(INPUT));

You can wrap this in a function that sends an entire UTF-16-encoded string as input:

void SendInputStr(const std::wstring &str) // in C++11, use std::u16string instead...
{
    if (str.empty()) return;
    std::vector<INPUT> input(str.length()); // elements are zero-initialized
    for (std::size_t i = 0; i < str.length(); ++i)
    {
        input[i].type = INPUT_KEYBOARD;
        input[i].ki.wScan = (WORD) str[i];
        input[i].ki.dwFlags = KEYEVENTF_UNICODE;
    }
    SendInput((UINT) input.size(), input.data(), sizeof(INPUT));
}