I have an MTA application deployed on SAP BTP Cloud Foundry with a UI5 module and a service-layer module, both of type nodejs. The UI5 module is allocated 250 MB of memory and the service module 500 MB.
The UI application has a SearchField with live search implemented via its liveChange event, using an OData list binding as below:
```
// Filter and FilterOperator come from "sap/ui/model/Filter" and
// "sap/ui/model/FilterOperator" (declared in the controller's sap.ui.define)
var oList = this.getView().byId("idObjects");
var oBinding = oList.getBinding("items");
var oFilter = new Filter({
    path: path,
    operator: FilterOperator.Contains,
    value1: value,
    caseSensitive: false
});
oBinding.filter(oFilter);
```
For every character typed into the search field an OData $batch request is sent to the service layer; if the next character is typed immediately, the pending request is cancelled. Only the last request is processed to completion.
The maximum length of the search field is 256 characters. If the user enters all 256 characters, then out of 256 OData requests 255 get cancelled and the 256th one does the actual search/filter work.
The problem is that when all 256 characters are entered quickly (e.g. by holding a key down), both the UI application and the service module crash due to Out of Memory. Even though the first 255 requests are cancelled on the client side, the service layer still receives them and starts processing.
I could not find any clue about a memory leak here, as the live search is handled entirely by the OData binding.
If I replace the liveChange event with the search event, it works fine, as only one request is triggered on click of the search icon.
Is there any way to resolve this OOM issue while keeping liveChange?
Regards, Ravindra
Do you really need a liveChange event here? If your user can enter up to 256 characters, what is the minimum number of characters a user must enter before receiving a reasonable set of results? Limiting the event to fire only after a certain input length has been reached, or switching to a change/submit event entirely, may fix your issue. Otherwise, your input field is currently behaving as expected.
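A common middle ground between liveChange and search is to keep liveChange but debounce it, so a request is only sent once the user pauses typing, optionally combined with a minimum length. A minimal sketch — the debounce helper is plain JavaScript; the controller wiring and names like `_applyFilter` are illustrative, not from the original code:

```javascript
// Generic debounce helper: each call restarts the timer, so fn only runs
// once `wait` ms have passed with no further calls in between.
function debounce(fn, wait) {
  let timer = null;
  return function (...args) {
    clearTimeout(timer);
    timer = setTimeout(() => fn.apply(this, args), wait);
  };
}

// Hypothetical UI5 controller wiring (names are illustrative):
// onInit: function () {
//   this._debouncedFilter = debounce(this._applyFilter.bind(this), 300);
// },
// onLiveChange: function (oEvent) {
//   var sValue = oEvent.getParameter("newValue");
//   if (sValue.length >= 3) {        // minimum length before searching
//     this._debouncedFilter(sValue); // one OData request per typing pause
//   }
// }
```

With a 300 ms debounce, holding a key down produces a single OData request when the key is released, instead of one cancelled request per character.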