I want to encrypt a string in Java and decrypt that encrypted value in Python, using the AES-GCM algorithm. Below is my Java code:
import java.nio.ByteBuffer;
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import java.util.Arrays;
import java.util.Base64;
import javax.crypto.Cipher;
import javax.crypto.spec.GCMParameterSpec;
import javax.crypto.spec.SecretKeySpec;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class AESEncryptionUtil {

    public static void main(String[] args) {
        String encString = "Hello, World!";
        String secKey = "hellow world";
        String encrypted = encrypt(encString, secKey);
        System.out.println("Encrypted (Java): " + encrypted);
        String decrypted = decrypt(encrypted, secKey);
        System.out.println("Decrypted (Java): " + decrypted);
    }

    private static final Logger logger = LoggerFactory.getLogger(AESEncryptionUtil.class);
    private static final String ENCRYPT_ALGO = "AES/GCM/NoPadding";
    private static final int TAG_LENGTH_BIT = 128;
    private static final int IV_LENGTH_BYTE = 12;
    private static final int SALT_LENGTH_BYTE = 16;
    private static final Charset UTF_8 = StandardCharsets.UTF_8;

    public static String encrypt(String pText, String secKey) {
        try {
            if (pText == null || pText.equals("null")) {
                return null;
            }
            byte[] salt = getRandomNonce(SALT_LENGTH_BYTE);
            byte[] iv = getRandomNonce(IV_LENGTH_BYTE);
            byte[] keyBytes = secKey.getBytes(StandardCharsets.UTF_16);
            SecretKeySpec skeySpec = new SecretKeySpec(Arrays.copyOf(keyBytes, 16), "AES");
            Cipher cipher = Cipher.getInstance(ENCRYPT_ALGO);
            cipher.init(Cipher.ENCRYPT_MODE, skeySpec, new GCMParameterSpec(TAG_LENGTH_BIT, iv));
            byte[] cipherText = cipher.doFinal(pText.getBytes());
            // Payload layout: IV (12 bytes) | salt (16 bytes) | ciphertext + GCM tag
            byte[] cipherTextWithIvSalt =
                ByteBuffer.allocate(iv.length + salt.length + cipherText.length)
                    .put(iv)
                    .put(salt)
                    .put(cipherText)
                    .array();
            return Base64.getEncoder().encodeToString(cipherTextWithIvSalt);
        } catch (Exception ex) {
            logger.error("Error while encrypting:", ex);
        }
        return null;
    }

    public static String decrypt(String cText, String secKey) {
        try {
            if (cText == null || cText.equals("null")) {
                return null;
            }
            byte[] decode = Base64.getDecoder().decode(cText.getBytes(UTF_8));
            ByteBuffer bb = ByteBuffer.wrap(decode);
            byte[] iv = new byte[IV_LENGTH_BYTE];
            bb.get(iv);
            byte[] salt = new byte[SALT_LENGTH_BYTE];
            bb.get(salt);
            byte[] cipherText = new byte[bb.remaining()];
            bb.get(cipherText);
            byte[] keyBytes = secKey.getBytes(StandardCharsets.UTF_16);
            SecretKeySpec skeySpec = new SecretKeySpec(Arrays.copyOf(keyBytes, 16), "AES");
            Cipher cipher = Cipher.getInstance(ENCRYPT_ALGO);
            cipher.init(Cipher.DECRYPT_MODE, skeySpec, new GCMParameterSpec(TAG_LENGTH_BIT, iv));
            byte[] plainText = cipher.doFinal(cipherText);
            return new String(plainText, UTF_8);
        } catch (Exception ex) {
            logger.error("Error while decrypting:", ex);
        }
        return null;
    }

    public static byte[] getRandomNonce(int numBytes) {
        byte[] nonce = new byte[numBytes];
        new SecureRandom().nextBytes(nonce);
        return nonce;
    }
}
I cannot change my Java code. I tried many approaches in Python but was not able to get it working; most of the time I get a secret-key decoding error or cryptography.exceptions.InvalidTag on the Python side. Your suggestions are appreciated.
Python code:
import base64
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
from cryptography.hazmat.backends import default_backend
def decrypt(cipher_text_base64, secret_key):
    cipher_text_with_iv_salt = base64.b64decode(cipher_text_base64)
    iv = cipher_text_with_iv_salt[:12]
    salt = cipher_text_with_iv_salt[12:28]
    tag = cipher_text_with_iv_salt[-16:]  # The last 16 bytes are the tag
    ciphertext = cipher_text_with_iv_salt[28:-16]  # Ciphertext excluding tag
    key = secret_key.encode('utf-16')
    key = key[:16].ljust(16, b'\0')
    decryptor = Cipher(algorithms.AES(key), modes.GCM(iv, tag), backend=default_backend()).decryptor()
    plaintext = decryptor.update(ciphertext) + decryptor.finalize()
    return plaintext.decode('utf-8')

if __name__ == "__main__":
    text_to_encrypt = "Hello, World!"
    secret_key = "hellow world"
    # Decrypt in Python ("encrypted_text" is a placeholder for the Base64 output of the Java code)
    decrypted_text = decrypt("encrypted_text", secret_key)
    print(f"Decrypted (Python): {decrypted_text}")
The problem is caused by different encodings of the key.
secKey.getBytes(StandardCharsets.UTF_16)
in the Java code means by definition big endian with BOM (byte order mark); see Charset, sec. Standard charsets:
When decoding, the UTF-16 and UTF-32 charsets...; when encoding, it uses big-endian byte order and writes a big-endian byte-order mark.
In contrast, encode('utf-16')
in the Python code is little endian with BOM (at least on my machine; Python's utf-16 codec uses the platform's native byte order).
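The mismatch is easy to verify by dumping the first bytes of the encoded key on both sides; a quick check (output as on a little-endian machine):

import codecs

secret_key = "hellow world"

# Python 'utf-16': BOM plus native byte order -> little endian here
print(secret_key.encode('utf-16')[:4].hex())                             # fffe6800

# What Java's StandardCharsets.UTF_16 produces: BOM plus big-endian data
print((codecs.BOM_UTF16_BE + secret_key.encode('utf-16be'))[:4].hex())   # feff0068

Already the first byte differs (ff vs fe), so the truncated 16-byte AES keys differ and GCM authentication fails with InvalidTag.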
In the Python code, big endian with BOM can be reliably achieved, e.g. with:
import codecs
...
key = codecs.BOM_UTF16_BE + secret_key.encode('utf-16be')
key = key[:16].ljust(16, b'\0')
...
With this fix, decryption works in the Python code.
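For reference, the question's decrypt function with just this change applied (everything else as posted) looks like this:

import base64
import codecs
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
from cryptography.hazmat.backends import default_backend

def decrypt(cipher_text_base64, secret_key):
    cipher_text_with_iv_salt = base64.b64decode(cipher_text_base64)
    iv = cipher_text_with_iv_salt[:12]             # 12-byte GCM IV
    salt = cipher_text_with_iv_salt[12:28]         # 16-byte salt (unused, see below)
    tag = cipher_text_with_iv_salt[-16:]           # last 16 bytes are the GCM tag
    ciphertext = cipher_text_with_iv_salt[28:-16]
    # Big endian with BOM, matching Java's StandardCharsets.UTF_16,
    # then truncated/padded to 16 bytes like Arrays.copyOf(keyBytes, 16)
    key = codecs.BOM_UTF16_BE + secret_key.encode('utf-16be')
    key = key[:16].ljust(16, b'\0')
    decryptor = Cipher(algorithms.AES(key), modes.GCM(iv, tag), backend=default_backend()).decryptor()
    plaintext = decryptor.update(ciphertext) + decryptor.finalize()
    return plaintext.decode('utf-8')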
Although the Java code cannot be changed, future readers should be aware of a vulnerability, namely the derivation of the key from a string with a charset encoding.
Instead, a key derivation function such as Argon2 or at least PBKDF2 should be used in conjunction with a random salt.
Interestingly, a random salt is generated (and concatenated into the output) in the Java code, but it is never actually used. Possibly the use of a dedicated key derivation function was considered but then not implemented for some reason.
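If the Java side could be changed in the future, the salt that is already generated and transported would be the natural input to such a KDF. A minimal sketch on the Python side, assuming PBKDF2-HMAC-SHA256 from the cryptography package (the iteration count and hash are illustrative choices, not taken from the original code):

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def derive_key(password: str, salt: bytes) -> bytes:
    # Derives a 16-byte AES key from the password and the transmitted salt
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=16, salt=salt, iterations=600_000)
    return kdf.derive(password.encode('utf-8'))

The Java side would then have to derive the key the same way (e.g. via PBEKeySpec and SecretKeyFactory) instead of encoding the string directly.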
In addition, for text encodings (such as pText.getBytes()), a specific charset should always be specified explicitly, as otherwise the platform-specific default encoding is used (at least for Java versions before 18, where UTF-8 became the standard default). In this setup that also affects the Python side: plaintext.decode('utf-8') in the decrypt function only matches if the Java platform's default charset was UTF-8.