I've done many tests and some research, but I can't find a solution.
In SQL Server, how can I convert a string to its hex UTF-16 little-endian representation, so that I get a result like this?
SELECT CONVERT(VARBINARY(MAX), 'P384992') as key
Actual result:
0x50333834393932
Needed result:
0x5000330038003400390039003200
The main problem here is that you are using a varchar for your string. varchars use the collation of the database, and the vast majority of those are single-byte character collations, using code pages such as Windows-1252 (commonly called ANSI). As a result, converting a varchar to a varbinary yields a single byte per character.
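You can see the size difference directly with DATALENGTH, which reports storage size in bytes (a quick sketch; the column aliases are just illustrative):
SELECT DATALENGTH('P384992')  AS varchar_bytes,  -- 7 bytes: one per character
       DATALENGTH(N'P384992') AS nvarchar_bytes; -- 14 bytes: two per character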
To get UTF-16, you need to be using an nvarchar. As siggemannen notes, just popping an N at the start might not yield a UTF-16 value, but a UCS-2 value. To get a UTF-16 value, you would need to ensure you use a collation that supports supplementary characters, which is denoted by SC in the collation name. For example Latin1_General_100_CI_AS_SC (Latin1-General-100, case-insensitive, accent-sensitive, kanatype-insensitive, width-insensitive, supplementary characters).
So, that would look like this:
SELECT CONVERT(varbinary(MAX), N'P384992' COLLATE Latin1_General_100_CI_AS_SC) as [key];
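Which should return 0x5000330038003400390039003200, matching the needed result.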
If you, ironically, are only going to have ANSI characters in your string, then just using an nvarchar literal will be fine, as UCS-2 and UTF-16 encode those characters identically.
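In that case, a minimal sketch without the explicit collation should produce the same bytes:
-- Assumes the value only ever contains characters from the Basic Multilingual Plane
SELECT CONVERT(varbinary(MAX), N'P384992') AS [key];
-- Should return 0x5000330038003400390039003200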