oracle-database, performance, bulk-operations

Read and update an Oracle table with 20 million records


I have a situation where I am doing a data fix from a backup.

The tables are MAIN_TABLE(PrimaryKey, Value) and BACKUP(PrimaryKey, Value).

I want to find all the records in MAIN_TABLE with value = 0, then fetch the value for the same primary key from BACKUP and update MAIN_TABLE.

  1. There are 20 million records with value = 0
  2. Both the fetch and the update are done using the primary key

Questions

  1. Should I use a stored procedure or a script?

  2. The fetch and the update are done on the same table; are there any concerns with that?

  3. How much time do you think it will take (ballpark figure), and how can I test this?

Solution I was thinking of:

Open a cursor on MAIN_TABLE with my condition (value = 0), fetch the value for each key from BACKUP, update MAIN_TABLE, and commit every 10K updates in a loop, roughly as sketched below.
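
Something like this is roughly what I have in mind (untested; the names main_table, backup_table, and primarykey are my stand-ins for the actual schema):

    DECLARE
        l_count PLS_INTEGER := 0;
    BEGIN
        -- loop over every record that needs fixing
        FOR r IN (SELECT primarykey FROM main_table WHERE value = 0) LOOP
            -- fetch the value for the same primary key from BACKUP and update MAIN
            UPDATE main_table m
            SET    m.value = (SELECT b.value
                              FROM   backup_table b
                              WHERE  b.primarykey = r.primarykey)
            WHERE  m.primarykey = r.primarykey;

            l_count := l_count + 1;
            IF MOD(l_count, 10000) = 0 THEN
                COMMIT;  -- commit every 10K updates
            END IF;
        END LOOP;
        COMMIT;          -- commit the final partial batch
    END;
    /

One thing I am not sure about is whether committing while the cursor is still open on the same table risks an ORA-01555 (snapshot too old) error on a run this long.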

Any thoughts?


Solution

  • You can give Oracle's MERGE a try.

    Make sure you test it on test tables before applying the query to the main tables.

    MERGE INTO main_table m
    USING backup_table b
    ON (m.primary_key = b.primary_key)
    WHEN MATCHED THEN
        UPDATE SET m.value = b.value 
        WHERE m.value = 0;
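
    Depending on the execution plan, this form may join every row of both tables before discarding the ones where m.value <> 0. A possible variant (untested, same assumed table and column names) is to filter the source set up front so only the rows that actually need fixing are joined:

    MERGE INTO main_table m
    USING (SELECT b.primary_key, b.value
           FROM   backup_table b
           JOIN   main_table m2
             ON   m2.primary_key = b.primary_key
           WHERE  m2.value = 0) src
    ON (m.primary_key = src.primary_key)
    WHEN MATCHED THEN
        UPDATE SET m.value = src.value;

    Either way, a single set-based MERGE will usually finish far sooner than a row-by-row cursor loop, because it avoids 20 million individual fetch/update round trips.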