COM Port Latency

As you can see in this tutorial, the default latency timer for the COM port is 16ms.
But in most cases we want the lowest possible latency.
On the internet there are plenty of explanations of why this value should be as small as possible, but rarely any of when it is better to choose a larger value.
So why is the default value 16ms when it could be 1ms?
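For reference, on Linux the FTDI driver exposes this timer through sysfs, so it can be inspected and changed at runtime. The sketch below is illustrative only: the device name (ttyUSB0) and the 1ms target are assumptions, and writing the value usually requires root privileges.

```python
# Minimal sketch: read and lower the FTDI latency timer on Linux via sysfs.
# Assumes the adapter shows up as ttyUSB0; adjust the path for your device.
from pathlib import Path

LATENCY_FILE = Path("/sys/bus/usb-serial/devices/ttyUSB0/latency_timer")

def get_latency_ms() -> int:
    """Return the current latency timer in milliseconds."""
    return int(LATENCY_FILE.read_text().strip())

def set_latency_ms(ms: int) -> None:
    """Set the latency timer (needs write permission on the sysfs file)."""
    LATENCY_FILE.write_text(f"{ms}\n")

if __name__ == "__main__":
    print("current latency timer:", get_latency_ms(), "ms")  # typically 16 by default
    # set_latency_ms(1)  # uncomment to request the minimum 1 ms latency
```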
It's more CPU-efficient to transfer data in larger chunks. With a 1ms latency, your serial port can cause up to 1000 transfers per second through the OS (interrupt handling, deferred/bottom-half processing, a context switch, the user callback, etc.). With a 16ms latency, you process the same amount of data in only about 60 transfers per second, with each transfer handling a larger block.
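A quick back-of-the-envelope check makes the trade-off concrete; the 115200 baud rate below is just an assumed example, and 10 bits per byte accounts for 8N1 framing:

```python
# Rough sketch of transfers per second and bytes per transfer vs. latency timer.
BAUD = 115200
BYTES_PER_SEC = BAUD / 10  # ~11520 bytes/s of continuous data with 8N1 framing

for latency_ms in (1, 16):
    transfers_per_sec = 1000 / latency_ms
    bytes_per_transfer = BYTES_PER_SEC / transfers_per_sec
    print(f"{latency_ms:2d} ms latency: ~{transfers_per_sec:4.0f} transfers/s, "
          f"~{bytes_per_transfer:5.1f} bytes per transfer")
```

At full line rate this works out to roughly 11 bytes per transfer at 1ms versus roughly 180 bytes per transfer at 16ms, for the same total throughput.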
Reducing the interrupt count matters a lot less on a modern multicore system than it did on a single-core one, where all time spent in a serial (or USB) interrupt meant delays in processing other I/O such as disk transfers. Now that work can be spread across multiple cores, although inefficient processing still hurts things like battery life.