I have a requirement to read user names from a file (one per line) and insert them into an Oracle database. I am using GetFile -> SplitText -> ExtractText -> PutSQL in NiFi. Everything works fine for a small number of records (around 10). But when I try 50 or 100+ records, on the connection between ExtractText and PutSQL I get "A FlowFile is currently penalized and data cannot be processed at this time", and all the flow files remain stuck in the queue.
Note: in PutSQL I am using a simple statement (insert into users_table(user_name) values ('user123')). It works properly for 10 or 20 records, but when I provide 100+ records it gets stuck.
Hello, I had asked this question, and I think I have found a solution. The PutSQL processor has a property called "Support Fragmented Transactions". If I set its value to false, the flow proceeds and the flow files are no longer penalized. (Reading the documentation further, when this property is true, PutSQL uses the fragment.identifier and fragment.count attributes to wait until all flow files from the same split have arrived, so it can execute them together.)
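To make that gating behavior concrete, here is a rough Python sketch (this is illustrative logic, not NiFi code, though fragment.identifier and fragment.count are the real attribute names SplitText writes) of how PutSQL holds flow files back when "Support Fragmented Transactions" is true:

```python
from collections import defaultdict

class FragmentGate:
    """Holds flow files until every fragment of a split set has arrived,
    mimicking PutSQL with "Support Fragmented Transactions" = true."""

    def __init__(self):
        # fragment.identifier -> flow files received so far
        self.pending = defaultdict(list)

    def offer(self, flowfile):
        """Queue a flow file; return the complete batch once the last
        fragment arrives, else None (the set stays penalized/queued)."""
        ident = flowfile["fragment.identifier"]
        self.pending[ident].append(flowfile)
        if len(self.pending[ident]) == int(flowfile["fragment.count"]):
            return self.pending.pop(ident)  # whole set -> one transaction
        return None

gate = FragmentGate()
batches = [gate.offer({"fragment.identifier": "split-1",
                       "fragment.count": "3",
                       "sql": f"insert into users_table(user_name) values ('user{i}')"})
           for i in range(3)]
# the first two offers return None; only the third releases all 3 fragments
```

If any fragment of a split never reaches PutSQL (dropped by a failure path, or held back by back pressure), the rest of the set waits forever, which matches the stuck-queue symptom above.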
Note: the documentation also mentions this property exists to respect atomicity. Can anyone advise whether setting it to false is proper? Right now, with the property set to false, the flow files are no longer penalized.
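The trade-off the documentation hints at can be shown with a small, self-contained sketch. This uses an in-memory sqlite3 database and a hypothetical table (standing in for the Oracle users_table) to show what happens when one statement in a batch fails: one transaction for the whole set rolls everything back, while per-statement commits leave partial data behind, which is the risk of setting the property to false.

```python
import sqlite3

# One bad value (NULL into a NOT NULL column) to force a mid-batch failure.
rows = ["user1", "user2", None, "user4"]

def insert_atomic(conn):
    """Whole set in one transaction: any failure rolls back every row
    (the behavior "Support Fragmented Transactions" = true preserves)."""
    try:
        with conn:  # commits on success, rolls back on exception
            conn.executemany(
                "insert into users_table(user_name) values (?)",
                [(r,) for r in rows])
    except sqlite3.IntegrityError:
        pass  # transaction already rolled back by the context manager

def insert_per_row(conn):
    """Each statement commits independently: a failure leaves the rows
    inserted before it (the risk when the property is false)."""
    for r in rows:
        try:
            conn.execute("insert into users_table(user_name) values (?)", (r,))
            conn.commit()
        except sqlite3.IntegrityError:
            conn.rollback()

results = {}
for fn in (insert_atomic, insert_per_row):
    conn = sqlite3.connect(":memory:")
    conn.execute("create table users_table(user_name text not null)")
    fn(conn)
    results[fn.__name__] = conn.execute(
        "select count(*) from users_table").fetchone()[0]
    conn.close()
# results: {'insert_atomic': 0, 'insert_per_row': 3}
```

So if every split line is an independent insert and a partial load is acceptable (or re-runnable), false should be fine; if the file must load all-or-nothing, keep it true and instead make sure every fragment can actually reach PutSQL (check the failure relationships of SplitText/ExtractText and the queue back-pressure settings).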