I am working on a musical-instrument-style app. It uses a keyboard-like set of UIButtons that trigger a musical note on the touchDown event; the note stops playing on the touchUpInside or touchUpOutside event.
Everything works as expected except when playing multiple notes at once. The touchDown events arrive, but when one button is released while another is still being touched, that button's touchUpInside event is not sent as expected. Instead, touchUpInside is sent to all of the buttons only once every button has been released.
The system appears to buffer subsequent touch events until the button being held is released. If one button is released and another is pressed, the second button's touchDown and touchUpInside events are not delivered until all of the other buttons are released; only then are the touchUpInside events sent to the first buttons and the queued events triggered.
I have traced this flow through the IBAction code for these events. I have enabled multitouch and tried allowing multiple gesture recognizers; nothing affects this behavior.
There is no real code to share, as the buttons and their events were wired up in Interface Builder (storyboard).
Is there something holding up the processing of events pending the release of previous touch events?
Any suggestions would be appreciated. I have searched for similar questions but could not find this issue.
Thank you!
For your piano keys, use custom views, not UIButtons. You can probably most easily track the kind of touches you want to detect by attaching a UILongPressGestureRecognizer to each view. Or, at a lower level, you could handle the touch events directly by overriding the touches... methods in a UIView subclass.
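Here is a minimal sketch of the gesture-recognizer approach. It assumes the keys are plain UIViews laid out in the storyboard and collected in a hypothetical `keyViews` outlet collection; `noteOn`/`noteOff` are stand-ins for whatever audio code you use:

```swift
import UIKit

class KeyboardViewController: UIViewController {

    // Hypothetical outlet collection holding one plain UIView per key.
    @IBOutlet var keyViews: [UIView]!

    override func viewDidLoad() {
        super.viewDidLoad()
        for key in keyViews {
            let press = UILongPressGestureRecognizer(target: self,
                                                     action: #selector(keyPressed(_:)))
            press.minimumPressDuration = 0  // fire on touch down, not after a delay
            key.addGestureRecognizer(press)
        }
    }

    @objc private func keyPressed(_ gesture: UILongPressGestureRecognizer) {
        guard let key = gesture.view else { return }
        switch gesture.state {
        case .began:
            noteOn(for: key)    // finger down on this key: start the note
        case .ended, .cancelled, .failed:
            noteOff(for: key)   // finger up (or touch cancelled): stop the note
        default:
            break
        }
    }

    private func noteOn(for key: UIView) { /* start playing this key's note */ }
    private func noteOff(for key: UIView) { /* stop playing this key's note */ }
}
```

With `minimumPressDuration` set to 0, the recognizer reports `.began` on touch down and `.ended` on touch up, which gives you the same press/release semantics as the button events, and recognizers on sibling views track their touches independently.

And a sketch of the lower-level approach, where each key view handles its own touches; the `onNote` closure is again a hypothetical hook for your audio code:

```swift
import UIKit

final class PianoKeyView: UIView {

    // Called with true on touch down, false on touch up/cancel.
    var onNote: ((_ isOn: Bool) -> Void)?

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesBegan(touches, with: event)
        onNote?(true)    // finger landed on this key: start the note
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesEnded(touches, with: event)
        onNote?(false)   // finger lifted: stop the note
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesCancelled(touches, with: event)
        onNote?(false)   // touch cancelled by the system: stop the note too
    }
}
```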