oracle-database, google-cloud-platform, google-bigquery, google-cloud-data-fusion

Oracle to BigQuery using Data Fusion


Column_ID   Data_Type
1           VARCHAR2(10 BYTE)
2           VARCHAR2(50 BYTE)
3           NUMBER
4           VARCHAR2(25 BYTE)
5           NUMBER(2,0)
6           VARCHAR2(50 BYTE)
7           VARCHAR2(4000 BYTE)
8           DATE
9           VARCHAR2(15 BYTE)

I have this Oracle table that I'm trying to move to BigQuery using GCP Data Fusion. I'm using Multiple Database Tables as the source, but when I add this specific table to the list of tables the pipeline fails. I believe this is because of the unqualified NUMBER data type: when I ran a custom SQL query on the table in Data Fusion and cast the NUMBER column to a decimal with an explicit precision and scale, the pipeline worked.
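For reference, the custom SQL I used to confirm the diagnosis looked roughly like the sketch below. The table and column names (MY_TABLE, AMOUNT, etc.) are placeholders rather than my real schema; the point is that casting the unqualified NUMBER column to an explicit precision and scale lets Data Fusion infer a schema for it.

    -- Sketch of the custom SQL source query (Oracle side); names are placeholders
    SELECT
      ID,
      NAME,
      CAST(AMOUNT AS NUMBER(38, 10)) AS AMOUNT,  -- unqualified NUMBER given an explicit precision/scale
      CREATED_DATE
    FROM MY_TABLE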

ERROR Error getting table schemas from database.

Is there a workaround for this, or will I have to change all of my columns from NUMBER to INT?


Solution

  • Apparently this issue persists in Data Fusion. As a workaround, I used a custom SQL query as the table source and manually cast the NUMBER column to VARCHAR (I didn't cast to FLOAT, since that broke Data Fusion too). Then, in BigQuery, I altered the table to add a new FLOAT column, copied the values from the VARCHAR column into it, and dropped the VARCHAR column. A sketch of both steps is below.
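A rough sketch of the two steps, assuming hypothetical names MY_TABLE / AMOUNT on the Oracle side and my_dataset.my_table / amount_str in BigQuery (substitute your own identifiers):

    -- Step 1: Data Fusion custom SQL source (Oracle side)
    -- Carry the NUMBER column across as text to avoid the schema error
    SELECT
      ID,
      NAME,
      TO_CHAR(AMOUNT) AS AMOUNT_STR,
      CREATED_DATE
    FROM MY_TABLE

After the pipeline loads the data, the string column can be converted back to a numeric type on the BigQuery side:

    -- Step 2: in BigQuery, add a FLOAT64 column, backfill it from the string column,
    -- then drop the string column
    ALTER TABLE my_dataset.my_table ADD COLUMN amount FLOAT64;

    UPDATE my_dataset.my_table
    SET amount = SAFE_CAST(amount_str AS FLOAT64)
    WHERE TRUE;

    ALTER TABLE my_dataset.my_table DROP COLUMN amount_str;

If you need exact decimal precision rather than floating point, NUMERIC or BIGNUMERIC could be used in place of FLOAT64 in the same pattern.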