Tags: sql-server, oracle-database, datastage, ora-12899

NTEXT on SQL Server to NVARCHAR2(2000) on Oracle (ORA-12899: value too large for column)


My source is SQL Server and the target is Oracle. Some source tables have columns defined as NTEXT, and to store that data I created target columns of type NVARCHAR2(2000), which I understood to allow 4000 bytes.
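
For context, the target columns look roughly like the sketch below; the table and column names are simplified placeholders, not the real schema.

    -- Simplified Oracle-side DDL for one of the target columns
    -- (placeholder names; the real tables have more columns).
    CREATE TABLE stage_target (
        id     NUMBER PRIMARY KEY,
        "TEXT" NVARCHAR2(2000)   -- receives the SQL Server NTEXT data
    );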

When I pull the NTEXT data from SQL Server, I cast and substring it so that it fits into 4000 bytes on the target. I'm using IBM DataStage to extract from SQL Server, and the derivation below converts the data type to varchar(4000) and takes a substring so the value stays within 4000 bytes:

cast(substring([text],1,3950) as varchar(4000)) as "TEXT"

However, the insert into the NVARCHAR2(2000) column on Oracle, which I sized for 4000 bytes, frequently fails with ORA-12899.

Error message: ORA-12899: value too large for column (actual: 3095, maximum: 2000).

First, it is hard to understand why the error occurs at all, since the destination column should hold 4000 bytes and I already trim the data with SUBSTRING.

Second, I was wondering whether I am missing anything in how I handle this, given that my team does not want to use CLOB on Oracle for the NTEXT data.

Please help me to resolve this issue. I'm working on many tables and the error occurs often.


Solution

  • NVARCHAR2(2000) is limited to 2000 characters, not 4000 bytes: the length in an NVARCHAR2 declaration is a character count, and with the AL16UTF16 national character set each character takes 2 bytes, so 2000 characters already uses up the 4000-byte limit. You have to trim to the limit in characters (2000), not bytes (4000). Try changing your derivation (it runs against SQL Server, so keep the T-SQL substring/nvarchar syntax) to:

    cast(substring([text], 1, 2000) as nvarchar(2000)) as "TEXT"
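
  • If it helps to verify the character vs. byte semantics on the Oracle side, the sketch below (assuming a placeholder table name STAGE_TARGET) uses the standard LENGTH/LENGTHB functions and the USER_TAB_COLUMNS dictionary view:

    -- LENGTH() counts characters, LENGTHB() counts bytes; with the AL16UTF16
    -- national character set every character of an N-string takes 2 bytes.
    SELECT LENGTH(N'abc')  AS char_len,   -- 3
           LENGTHB(N'abc') AS byte_len    -- 6
    FROM dual;

    -- The data dictionary shows the declared limits for the target column
    -- (replace STAGE_TARGET with the real table name).
    SELECT column_name, data_type, char_length, data_length
    FROM   user_tab_columns
    WHERE  table_name  = 'STAGE_TARGET'
    AND    column_name = 'TEXT';
    -- Expect CHAR_LENGTH = 2000 (characters) and DATA_LENGTH = 4000 (bytes).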