I'm creating a key pair using
ECDSA<ECP, SHA256>::Signer signer;
signer.AccessKey().Initialize(randomGeneratorM, ASN1::secp160r1());
...
ECDSA<ECP, SHA256>::PublicKey publicKey;
signer.AccessKey().MakePublicKey(publicKey);
Loading, saving, signing and verification all work fine. But my signature is always 42 bytes, when I expected it to be 40 (twice the 20-byte size of the 160-bit key).
The signature is generated as:
std::string data ...
ECDSA<ECP, SHA256>::Signer signer(privateKey);
byte signatureBuffer[42];
size_t signatureLength = signer.SignMessage(
randomGeneratorM, (const byte*)data.c_str(), data.size(), signatureBuffer);
I couldn't find any documentation that the signature might be encoded. So I'm curious: where do the extra bytes come from?
In fact, signer.MaxSignatureLength() returns 42, and for all the signatures generated so far, the signature length has never been anything else.
I am aware that "42" is the answer to life, the universe and everything ;)
Oh, and CryptoPP V5.6.2, Visual Studio 2008
Sample signature (in hex):
00F9C6853895481DDA23517DE16AA44518CDB2C9A900FF9AACA718DFB2AAA9C10E45265224EC40C7FD63
I couldn't find any documentation that the signature might be encoded. So I'm curious: where do the extra bytes come from?
Crypto++ uses the IEEE P1363 encoding of the {R,S} tuple, which is a simple concatenation of the two values. The 42 is the MaxImage() size, which is the largest the signature can be; that's based on the encoding of the {R,S} tuple under the P1363 standard.
If interested, digital signature formats are discussed at Cryptographic Interoperability: Digital Signatures.
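For reference, here's roughly what the signing path looks like end to end, with MaxSignatureLength() used to size the output buffer. It's an untested sketch: the message string, variable names and header paths are mine, the rest is the same API the question already uses.

#include <cryptopp/eccrypto.h>
#include <cryptopp/oids.h>
#include <cryptopp/osrng.h>
#include <cryptopp/secblock.h>
#include <cryptopp/sha.h>
#include <iostream>
#include <string>

int main()
{
    using namespace CryptoPP;

    AutoSeededRandomPool rng;

    // Generate a private key on secp160r1, as in the question
    ECDSA<ECP, SHA256>::Signer signer;
    signer.AccessKey().Initialize(rng, ASN1::secp160r1());

    std::string data = "message to sign";

    // MaxSignatureLength() sizes the buffer for the concatenated {R,S} output
    SecByteBlock signature(signer.MaxSignatureLength());
    size_t length = signer.SignMessage(
        rng, (const byte*)data.data(), data.size(), signature);
    signature.resize(length);

    std::cout << "MaxSignatureLength: " << signer.MaxSignatureLength() << std::endl;
    std::cout << "Signature length:   " << length << std::endl;

    return 0;
}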
00F9C6853895481DDA23517DE16AA44518CDB2C9A900FF9AACA718DFB2AAA9C10E45265224EC40C7FD63
This breaks out to:
00F9C6853895481DDA23517DE16AA44518CDB2C9A9
00FF9AACA718DFB2AAA9C10E45265224EC40C7FD63
which is a concatenation of two ASN.1-encoded values without the integer type information. Crypto++ knows the size of R and S because it's an integral part of the domain parameters. Specifically, it's ASN1::secp160r1(), which is a 160-bit curve.
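If you want to see where that per-component size comes from, you can ask the domain parameters directly. A small sketch (DL_GroupParameters_EC<ECP> is the class behind ASN1::secp160r1(); the printed number is whatever your build reports):

#include <cryptopp/eccrypto.h>
#include <cryptopp/oids.h>
#include <iostream>

int main()
{
    using namespace CryptoPP;

    // The subgroup order fixes how many octets Crypto++ reserves
    // for each of R and S in the concatenated signature
    DL_GroupParameters_EC<ECP> params(ASN1::secp160r1());
    std::cout << "Order octets: " << params.GetSubgroupOrder().ByteCount() << std::endl;

    return 0;
}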
ASN.1 integers are two's-complement encoded (see ITU's X.690, Specification of Basic Encoding Rules (BER), Canonical Encoding Rules (CER) and Distinguished Encoding Rules (DER), and friends), and R and S are positive per IEEE P1363 (see IEEE P1363 Section 5.6.1, Converting Between Integers and Bit Strings). So when the leading value octet would otherwise have its high bit set, you have to have a leading 0x00 octet to keep the value positive.
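To pull the two components out of a signature you already have, split the buffer in half and decode each half as an unsigned, big-endian integer. A sketch; PrintComponents is my own helper, not a library function:

#include <cryptopp/integer.h>
#include <cstddef>
#include <iostream>

using namespace CryptoPP;

// Split a P1363 signature (R || S) into its two fixed-width halves
void PrintComponents(const byte* signature, size_t length)
{
    const size_t half = length / 2;     // 21 octets per component for secp160r1

    // Each half is unsigned and big-endian; in the sample signature above,
    // both halves carry the leading 0x00 octet
    Integer r(signature, half);
    Integer s(signature + half, half);

    std::cout << "R: " << std::hex << r << std::endl;
    std::cout << "S: " << std::hex << s << std::endl;
}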
If you used secp256r1(), then MaxImage() and MaxSignatureLength() would increase accordingly.
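You can check that quickly by initializing a signer on each curve and comparing; a sketch, with the actual numbers left to whatever your Crypto++ build reports:

#include <cryptopp/eccrypto.h>
#include <cryptopp/oids.h>
#include <cryptopp/osrng.h>
#include <cryptopp/sha.h>
#include <iostream>

int main()
{
    using namespace CryptoPP;
    AutoSeededRandomPool rng;

    ECDSA<ECP, SHA256>::Signer signer160, signer256;
    signer160.AccessKey().Initialize(rng, ASN1::secp160r1());
    signer256.AccessKey().Initialize(rng, ASN1::secp256r1());

    // Larger curve, larger R and S, larger concatenated signature
    std::cout << "secp160r1: " << signer160.MaxSignatureLength() << std::endl;
    std::cout << "secp256r1: " << signer256.MaxSignatureLength() << std::endl;

    return 0;
}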
For completeness, Java uses a different form of encoding. Here it is:
SEQUENCE ::= {
    r INTEGER,
    s INTEGER
}
That means you have a 20-octet r plus 2 octets for the ASN.1 INTEGER encoding, a 20-octet s plus 2 octets for the ASN.1 INTEGER encoding, and 2 octets for the SEQUENCE encoding, for a total of 46 octets. If r or s would be negative in two's-complement form, then you need to add a byte for the leading 0x00 octet, so it could be up to 48 bytes.
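If you want to check that arithmetic, Crypto++ can produce the same SEQUENCE with a DERSequenceEncoder. A sketch, assuming r and s are the Integers decoded from the P1363 halves as above (PrintDerLength is my own name):

#include <cryptopp/asn.h>
#include <cryptopp/integer.h>
#include <cryptopp/queue.h>
#include <iostream>

using namespace CryptoPP;

// Wrap R and S in the ASN.1 SEQUENCE form that Java produces
void PrintDerLength(const Integer& r, const Integer& s)
{
    ByteQueue der;
    DERSequenceEncoder seq(der);
    r.DEREncode(seq);
    s.DEREncode(seq);
    seq.MessageEnd();

    // Typically 46 octets for a 160-bit curve, plus one octet for
    // each integer that needs the leading 0x00
    std::cout << "DER length: " << der.CurrentSize() << std::endl;
}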
Crypto++ offers a function for converting between the P1363, Java and OpenPGP signature formats called DSAConvertSignatureFormat. You can see the source code in dsa.cpp (there's not much to it).
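For example, converting the sample signature from the question into the DER form would look something like this. It's a sketch: the buffer sizing is mine, and the enum names (DSA_P1363, DSA_DER) are the ones declared in dsa.h, so double-check them against your version.

#include <cryptopp/dsa.h>
#include <cryptopp/filters.h>
#include <cryptopp/hex.h>
#include <cryptopp/secblock.h>
#include <iostream>
#include <string>

int main()
{
    using namespace CryptoPP;

    // The 42-byte sample signature from the question, hex-encoded
    const std::string hex =
        "00F9C6853895481DDA23517DE16AA44518CDB2C9A9"
        "00FF9AACA718DFB2AAA9C10E45265224EC40C7FD63";

    std::string p1363;
    StringSource ss(hex, true, new HexDecoder(new StringSink(p1363)));

    // Leave a little room for the ASN.1 SEQUENCE/INTEGER overhead
    SecByteBlock der(p1363.size() + 16);
    size_t derLength = DSAConvertSignatureFormat(
        der, der.size(), DSA_DER,
        (const byte*)p1363.data(), p1363.size(), DSA_P1363);
    der.resize(derLength);

    std::cout << "P1363 length: " << p1363.size() << std::endl;
    std::cout << "DER length:   " << derLength << std::endl;

    return 0;
}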