Tags: documentum, documentum6.5, dfc, documentum-dfc, documentum-dql

Unable to delete documents in Documentum application using DFC


I have written the following code using the approach given in the EMC DFC 7.2 Development Guide. With this code I'm able to delete only 50 documents, even though there are more records. Before deletion I take a dump of each object ID. I'm not sure if there is any limit with IDfDeleteOperation. Since this deletes only 50 documents, I also tried the DQL DELETE command, but it was likewise limited to 50 documents. I also tried the document's destroy() and destroyAllVersions() methods, and that didn't work for me either. Everything is written in the main method.

import com.documentum.com.DfClientX;
import com.documentum.com.IDfClientX;
import com.documentum.fc.client.*;
import com.documentum.fc.common.DfException;
import com.documentum.fc.common.DfId;
import com.documentum.fc.common.IDfLoginInfo;
import com.documentum.operations.IDfCancelCheckoutNode;
import com.documentum.operations.IDfCancelCheckoutOperation;
import com.documentum.operations.IDfDeleteNode;
import com.documentum.operations.IDfDeleteOperation;

import java.io.BufferedWriter;
import java.io.FileWriter;

public class DeleteDoCAll {

    public static void main(String[] args) throws DfException {
       System.out.println("Started...");

        IDfClientX clientX  = new DfClientX();
        IDfClient dfClient = clientX.getLocalClient();
        IDfSessionManager sessionManager = dfClient.newSessionManager();
        IDfLoginInfo loginInfo = clientX.getLoginInfo();
        loginInfo.setUser("username");
        loginInfo.setPassword("password");
        sessionManager.setIdentity("repo", loginInfo);

        IDfSession dfSession = sessionManager.getSession("repo");
        System.out.println(dfSession);

        IDfDeleteOperation delo = clientX.getDeleteOperation();
        IDfCancelCheckoutOperation cco = clientX.getCancelCheckoutOperation();

       try {
           String dql = "select r_object_id from my_report where folder('/Home', descend)";
           IDfQuery idfquery = new DfQuery();
           IDfCollection collection1 = null;

           try {
               idfquery.setDQL(dql);
               collection1 = idfquery.execute(dfSession, IDfQuery.DF_READ_QUERY);
               int i = 1;
               while(collection1 != null && collection1.next()) {
                   String r_object_id = collection1.getString("r_object_id");

                   StringBuilder attributes = new StringBuilder();

                   IDfDocument iDfDocument = (IDfDocument)dfSession.getObject(new DfId(r_object_id));
                   attributes.append(iDfDocument.dump());

                   BufferedWriter writer = new BufferedWriter(new FileWriter("path to file", true));
                   writer.write(attributes.toString());
                   writer.close();

                   cco.setKeepLocalFile(true);
                   IDfCancelCheckoutNode cnode;


                   if(iDfDocument.isCheckedOut()) {
                       if(iDfDocument.isVirtualDocument()) {
                           IDfVirtualDocument vdoc = iDfDocument.asVirtualDocument("CURRENT", false);
                           cnode = (IDfCancelCheckoutNode)cco.add(iDfDocument);
                       } else {
                           cnode = (IDfCancelCheckoutNode)cco.add(iDfDocument);
                       }

                       if(cnode == null) {
                           System.out.println("Node is null");
                       }
                       if(!cco.execute()) {
                           System.out.println("Cancel check out operation failed");
                       } else {
                           System.out.println("Cancelled check out for " + r_object_id);
                       }
                   }

                   delo.setVersionDeletionPolicy(IDfDeleteOperation.ALL_VERSIONS);
                   IDfDeleteNode node = (IDfDeleteNode)delo.add(iDfDocument);
                   if(node == null) {
                       System.out.println("Node is null");
                       System.out.println(i);
                       i += 1;
                   }
                   if(delo.execute()) {
                       System.out.println("Delete operation done");
                       System.out.println(i);
                       i += 1;
                   } else {
                       System.out.println("Delete operation failed");
                       System.out.println(i);
                       i += 1;
                   }
               }
           } finally {
               if(collection1 != null) {
                   collection1.close();
               }
           }

       } catch(Exception e) {
           e.printStackTrace();
       } finally {
           sessionManager.release(dfSession);
       }
   }
}

I don't know where I'm making a mistake; every time I try, the program stops at the 50th iteration. Can you please help me delete all the documents in the proper way? Thanks a lot!


Solution

  • First, select all document IDs into a List<IDfId>, for example, and close the collection. Don't perform other expensive operations while the collection is open, because you are unnecessarily blocking it.

    This is why it processed only 50 documents: you had one main collection open, and each execution of the delete operation opened another collection, which probably hit some limit. So, as I said, it is better to consume the collection first and then work with that data:

        // Read all IDs into memory first, then close the collection immediately.
        IDfQuery query = new DfQuery();
        IDfCollection collection = null;
        List<IDfId> ids = new ArrayList<>();
        try {
            query.setDQL("SELECT r_object_id FROM my_report WHERE FOLDER('/Home', DESCEND)");
            collection = query.execute(session, IDfQuery.DF_READ_QUERY);
            while (collection.next()) {
                ids.add(collection.getId("r_object_id"));
            }
        } finally {
            if (collection != null) {
                collection.close();
            }
        }
    

    After that you can iterate through the list and perform whatever actions you need on each document. But don't execute the delete operation in every iteration - that is inefficient. Instead, add all documents to one operation and execute it once at the end.

        // Add every document to a single delete operation and execute it once.
        IDfDeleteOperation deleteOperation = clientX.getDeleteOperation();
        deleteOperation.setVersionDeletionPolicy(IDfDeleteOperation.ALL_VERSIONS);
        for (IDfId id : ids) {
            IDfDocument document = (IDfDocument) session.getObject(id);
            ...
            deleteOperation.add(document);
        }
        deleteOperation.execute();
    

    The same applies to the IDfCancelCheckoutOperation: add all checked-out documents to one operation and execute it once, as sketched below.
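
    For completeness, here is a minimal sketch of that batching for the cancel-checkout step. It assumes the same clientX, session and ids variables from the snippets above, and only adds documents that are actually checked out:

        // Sketch: batch every checked-out document into one cancel-checkout
        // operation and execute it once, mirroring the delete example above.
        IDfCancelCheckoutOperation cancelCheckout = clientX.getCancelCheckoutOperation();
        cancelCheckout.setKeepLocalFile(true);
        boolean anyCheckedOut = false;
        for (IDfId id : ids) {
            IDfDocument document = (IDfDocument) session.getObject(id);
            if (document.isCheckedOut()) {
                cancelCheckout.add(document);
                anyCheckedOut = true;
            }
        }
        if (anyCheckedOut && !cancelCheckout.execute()) {
            System.out.println("Cancel checkout operation failed");
        }
    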

    And another thing - when you use a FileWriter, call close() in a finally block or use try-with-resources like this:

        try (BufferedWriter writer = new BufferedWriter(new FileWriter("file.path", true))) {
            writer.write(document.dump());
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    

    Using a StringBuilder is a good idea, but create it only once at the beginning, append the attributes in each iteration, and write the content of the StringBuilder to the file once at the end rather than during each iteration - writing in every iteration is slow.
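
    A rough sketch of that pattern, again assuming the ids list from above (the output path is just a placeholder):

        // Sketch: collect all attribute dumps in one StringBuilder, write the file once.
        StringBuilder attributes = new StringBuilder();
        for (IDfId id : ids) {
            IDfDocument document = (IDfDocument) session.getObject(id);
            attributes.append(document.dump());
        }
        // "attributes-dump.txt" is a placeholder path for this example.
        try (BufferedWriter writer = new BufferedWriter(new FileWriter("attributes-dump.txt", true))) {
            writer.write(attributes.toString());
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    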