I want to copy one big database table to another. This is my current approach:
OPEN CURSOR WITH HOLD lv_db_cursor FOR
  SELECT * FROM zcustomers.
DO.
  REFRESH gt_custom.
  FETCH NEXT CURSOR lv_db_cursor
    INTO TABLE gt_custom
    PACKAGE SIZE lv_package_size.
  IF sy-subrc NE 0.
    CLOSE CURSOR lv_db_cursor.
    EXIT.
  ENDIF.
  INSERT zcustomers1 FROM TABLE gt_custom.
* Here I write code to modify my custom table from gt_custom.
ENDDO.
But the problem is that I get the error "Enterprise ASE has run out of LOCKS". I tried to use a COMMIT statement after inserting a batch of records, but it closes the cursor.
I don't want to increase the maximum number of locks in the database settings or make the copy on the database level. I want to understand how I can do the copy in ABAP with good performance and low memory usage...
Thank you.
By using COMMIT CONNECTION instead of COMMIT WORK, it is possible to commit only the transaction that writes to zcustomers1 while keeping the transaction that reads from zcustomers open.
Note that having multiple transactions (one reading, several writing) can create inconsistencies in the database if zcustomers or zcustomers1 is written to while this code runs. Also, while the copy is running, reading from zcustomers1 shows only a part of the entries from zcustomers.
DATA:
  gt_custom       TYPE TABLE OF zcustomers,
  lv_package_size TYPE i,
  lv_db_cursor    TYPE cursor.

lv_package_size = 10000.

OPEN CURSOR WITH HOLD lv_db_cursor FOR
  SELECT * FROM zcustomers.
DO.
  REFRESH gt_custom.
  FETCH NEXT CURSOR lv_db_cursor
    INTO TABLE gt_custom
    PACKAGE SIZE lv_package_size.
  IF sy-subrc NE 0.
    CLOSE CURSOR lv_db_cursor.
    EXIT.
  ENDIF.
  " Insert or update the target table with the current package
  MODIFY zcustomers1 FROM TABLE gt_custom.
  " Regularly committing here releases a partial state to the database.
  " This releases the accumulated locks and avoids ASE error SQL1204
  " ("run out of LOCKS") without closing the WITH HOLD cursor.
  COMMIT CONNECTION default.
ENDDO.
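The reason this works, as far as I understand it: COMMIT WORK ends the whole SAP LUW and closes the open cursor (which is what you observed), while COMMIT CONNECTION default only triggers a database commit on that connection, which a cursor opened WITH HOLD survives. At each commit, the locks ASE has piled up for the uncommitted writes are released.

If you want the writes to be completely independent of the reading transaction, a variant (untested sketch, not what I ran myself) is to write through a separate service connection and commit only that connection. The connection name 'R/3*COPY' is my own choice; any name starting with 'R/3*' opens a service connection to the standard database:

DATA:
  gt_custom       TYPE TABLE OF zcustomers,
  lv_package_size TYPE i VALUE 10000,
  lv_db_cursor    TYPE cursor,
  lv_dbcon        TYPE dbcon-con_name VALUE 'R/3*COPY'. " hypothetical service connection name

OPEN CURSOR WITH HOLD lv_db_cursor FOR
  SELECT * FROM zcustomers.
DO.
  REFRESH gt_custom.
  FETCH NEXT CURSOR lv_db_cursor
    INTO TABLE gt_custom
    PACKAGE SIZE lv_package_size.
  IF sy-subrc NE 0.
    CLOSE CURSOR lv_db_cursor.
    EXIT.
  ENDIF.
  " Write the package over the separate connection ...
  MODIFY zcustomers1 CONNECTION (lv_dbcon) FROM TABLE gt_custom.
  " ... and commit only that connection; the reading cursor on the
  " default connection is not touched at all.
  COMMIT CONNECTION (lv_dbcon).
ENDDO.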