typeorm, node-oracledb

How to insert 20k records with TypeORM+Oracle?


We want to insert 20k records into a single Oracle DB table using TypeORM. Usually this insertion will happen only once a day, but it may also run multiple times a day, so we want to insert in bulk.

The code below fails with the error:

ORA-01745: invalid host/bind variable name

// Single bulk INSERT of all 20k rows at once.
await AppDataSource.createQueryBuilder()
  .insert()
  .into(UsersCacheORM)
  .values(cacheusers)
  .execute();

Contents of cacheusers printed to the console. It should be an array of user objects, but for some reason TypeORM is not treating it as an array of objects (see the sketch after the output for the shape .values() expects):

[
    "3E3F2BF1-AD4C-4903-993E-503E5AB3713C",
    "usr1",
    "ldapalias1",
    3,
    "CN=usr1, OU=Users, DC=ad, DC=ent, DC=example, DC=com",
    "7AECDFA9-2B7B-419F-8BA6-55810B731EC1",
    "usr2",
    "ldapalias1",
    3,
    "CN=usr2, OU=Users, DC=ad, DC=ent, DC=example, DC=com"
]
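For reference, .values() expects an array of objects keyed by the entity's property names, not a flat array of field values. A minimal sketch of that shape, assuming the UsersCacheORM entity shown further down and reusing the two users from the output above:

// Each element must be an object whose keys match the entity's
// property names (userGUID, userName, ...), one object per row.
const cacheusers: Partial<UsersCacheORM>[] = [
  {
    userGUID: "3E3F2BF1-AD4C-4903-993E-503E5AB3713C",
    userName: "usr1",
    ldapAlias: "ldapalias1",
    ldapContainerID: 3,
    ldapDN: "CN=usr1, OU=Users, DC=ad, DC=ent, DC=example, DC=com",
  },
  {
    userGUID: "7AECDFA9-2B7B-419F-8BA6-55810B731EC1",
    userName: "usr2",
    ldapAlias: "ldapalias1",
    ldapContainerID: 3,
    ldapDN: "CN=usr2, OU=Users, DC=ad, DC=ent, DC=example, DC=com",
  },
];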

The code below works, but takes more than 10 minutes to load all the user records in my local testing (a transaction-wrapped variant is sketched after it):

// One INSERT statement (and one round trip) per row.
for (const singleuser of cacheusers) {
    await AppDataSource.createQueryBuilder()
      .insert()
      .into(UsersCacheORM)
      .values(singleuser)
      .execute();
}
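One common mitigation for row-at-a-time inserts, short of batching, is to run the whole loop inside a single transaction so each row does not pay for its own commit. A sketch, assuming the same AppDataSource and entity as above:

// The EntityManager passed by AppDataSource.transaction is bound to a
// single connection, so all inserts share one transaction and commit.
await AppDataSource.transaction(async (manager) => {
  for (const singleuser of cacheusers) {
    await manager
      .createQueryBuilder()
      .insert()
      .into(UsersCacheORM)
      .values(singleuser)
      .execute();
  }
});

This still issues one INSERT round trip per row, so on its own it will not fix a 10-minute load; batching (see the Solution below) is what actually reduces the number of statements.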

TypeORM Entity:

import {
  Column,
  Entity,
  PrimaryColumn,
  Unique,
} from "typeorm";

@Entity("USERS_CACHE")
@Unique("USERSCACHE_UC", ["userName", "ldapContainerID"])
export class UsersCacheORM {
  @PrimaryColumn({
    name: "USERGUID",
    type: "nvarchar2",
    length: 36,
    primaryKeyConstraintName: "USERSCACHE_PK",
  })
  userGUID?: string;

  @Column({ name: "USERNAME", nullable: false, type: "varchar2", length: 256 })
  userName!: string;

  @Column({
    name: "LDAP_ALIAS",
    nullable: false,
    type: "varchar2",
    length: 256,
  })
  ldapAlias!: string;

  @Column({ name: "LDAP_CONTAINER_ID", nullable: false, type: "number" })
  ldapContainerID!: number;

  @Column({ name: "LDAP_DN", nullable: false, type: "varchar2", length: 4000 })
  ldapDN!: string;
}

Solution

  • Just tried inserting the records in batches with the batch size set to 100, and it worked. The single 20k-row statement most likely failed because TypeORM generates one bind variable per column per row (5 × 20,000 = 100,000 binds), which exceeds Oracle's per-statement limit of 65,535 bind variables and produces invalid bind names, hence ORA-01745. Next I will increase the batch size to 200, 500, etc. to see what the maximum workable batch size is. A tunable version of this loop is sketched after the code.

    // Insert in batches of 100 rows per statement.
    for (let i = 0; i < cacheusers.length; i += 100) {
      await AppDataSource.createQueryBuilder()
        .insert()
        .into(UsersCacheORM)
        .values(cacheusers.slice(i, i + 100))
        .execute();
      }
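
To make the batch size tunable while experimenting, the loop can be wrapped in a small helper. insertInBatches is a name made up for this sketch; it assumes the AppDataSource and entity from the question:

    // Hypothetical helper: splits rows into fixed-size batches so each
    // INSERT statement stays well under Oracle's bind-variable limit.
    async function insertInBatches(
      rows: Partial<UsersCacheORM>[],
      batchSize = 100,
    ): Promise<void> {
      for (let i = 0; i < rows.length; i += batchSize) {
        await AppDataSource.createQueryBuilder()
          .insert()
          .into(UsersCacheORM)
          .values(rows.slice(i, i + batchSize))
          .execute();
      }
    }

    await insertInBatches(cacheusers, 100);

If you go through a repository instead of the query builder, TypeORM's save() also accepts a chunk option in SaveOptions that performs similar splitting, at the cost of the extra entity-lifecycle work save() does.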