Convert array of long long into string

In C++ I have an array of signed long longs (63-bit numbers), of variable length:

std::array<long long, n> encodedString

This array actually contains a UTF-8 encoded string. That is, if you concatenate the binary representation of each element of the array, the result is UTF-8 encoded text.

For example, the array:

(621878499550 , 2339461068677718049) 

If you translate those signed long longs into 63-bit binary, you get:

621878499550 = 000000000000000000000001001000011001010110110001101100011011110

2339461068677718049 = 010000001110111011011110111001001101100011001000010000000100001

If you concatenate those binaries you get: 000000000000000000000001001000011001010110110001101100011011110010000001110111011011110111001001101100011001000010000000100001

This is the UTF-8 of "Hello world!".

So the question is: starting from the array (621878499550, 2339461068677718049), what is the easiest way to get the "Hello world!" string?

My current best solution is to write the array to a file in binary mode (fwrite) and then read the file back into a string in text mode.
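
A minimal sketch of what that workaround looks like, with encoded.bin as a placeholder file name; note that fwrite dumps each long long's in-memory bytes in the platform's native byte order, so the bytes in the file are not necessarily the big-endian bit stream shown above.

#include <array>
#include <cstdio>
#include <fstream>
#include <iostream>
#include <iterator>
#include <string>
int main()
{
    std::array<long long, 2> encodedString = { 621878499550, 2339461068677718049LL };
    // write the raw bytes of the array to a file in binary mode
    std::FILE* f = std::fopen("encoded.bin", "wb");
    if (!f)
        return 1;
    std::fwrite(encodedString.data(), sizeof(long long), encodedString.size(), f);
    std::fclose(f);
    // read the file back as a string in text mode
    std::ifstream in("encoded.bin");
    std::string s((std::istreambuf_iterator<char>(in)), std::istreambuf_iterator<char>());
    std::cout << s << std::endl;
}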

Use std::bitset to convert each long long to binary and a std::stringstream to stream them together:

#include <sstream>
#include <iostream>
#include <bitset>
#include <array>
int main()
{
    std::array<long long, 2> array = { 621878499550, 2339461068677718049ll };
    std::stringstream ss;
    // append the 64-bit binary representation of each element to the stream
    for (auto& n : array)
    {
        ss << std::bitset<64>(n);
    }
    std::cout << ss.str() << std::endl;
}

Output: 00000000000000000000000010010000110010101101100011011000110111100010000001110111011011110111001001101100011001000010000000100001
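
Note that std::bitset<64> contributes 64 bits per element, so the output above is 128 bits rather than the 126-bit stream (63 bits per element) described in the question. If the 63-bit stream is wanted, one possible tweak (a sketch, not part of the original answer) is to drop the top bit of each element's 64-bit string inside the loop:

    for (auto& n : array)
    {
        // to_string() gives 64 bits; substr(1) drops the most significant bit,
        // leaving the 63-bit representation used in the question
        ss << std::bitset<64>(n).to_string().substr(1);
    }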

Try this:

#include <bitset>
#include <iostream>
#include <sstream>
#include <string>
int main()
{
    // 48 & 56 avoid the extra padding you would get with a 64-bit bitset,
    // but I think that's what you are looking for
    std::string binary = std::bitset<48>(114784820031264).to_string();
    std::string binary2 = std::bitset<56>(2339461068677718049).to_string();
    binary += binary2;
    std::stringstream sstream(binary);
    std::string output;
    std::bitset<8> bits;
    // read the combined bit string eight bits at a time and turn each group into a char
    while (sstream >> bits)
    {
        output += char(bits.to_ulong());
    }
    std::cout << output;
}
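
For the 63-bit-per-element layout described in the question, the same idea can be generalized to any array length. The following is only a sketch under the question's assumptions; the variable names and the front padding to a byte boundary are my own additions, and the NUL bytes produced by the leading zero bits are simply skipped:

#include <array>
#include <bitset>
#include <cstddef>
#include <iostream>
#include <string>
int main()
{
    std::array<long long, 2> encodedString = { 621878499550, 2339461068677718049LL };
    // rebuild the 63-bit-per-element stream described in the question
    std::string bits;
    for (long long n : encodedString)
        bits += std::bitset<64>(n).to_string().substr(1);
    // pad at the front to a multiple of 8 so the UTF-8 bytes line up at the end
    bits.insert(0, (8 - bits.size() % 8) % 8, '0');
    // decode eight bits at a time, dropping the NUL bytes that come from the padding
    std::string text;
    for (std::size_t i = 0; i < bits.size(); i += 8)
    {
        char c = static_cast<char>(std::bitset<8>(bits.substr(i, 8)).to_ulong());
        if (c != '\0')
            text += c;
    }
    std::cout << text << std::endl;   // prints the text encoded in the array
}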