I have created a Unity scene where I have set up the following objects:
> ARCamera
> ImageTarget
> Quad
> Video Player
I've set the Video Player's source to a URL pointing to an external MP4 file and linked the Quad to the Video Player through its "Target Texture". In Unity, the setup works as expected, but after exporting the project as an Android library and integrating it into React Native, the video doesn't play. Instead, I only see the Quad with its original white color.
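For reference, this is roughly what my setup does, expressed in code. It's a minimal sketch, not the exact scene wiring: the field names (quadRenderer, videoUrl) and the RenderTexture size are placeholders.

```csharp
using UnityEngine;
using UnityEngine.Video;

// Minimal sketch of the described setup: a VideoPlayer with a URL source
// rendering into a RenderTexture that is shown on the Quad's material.
public class QuadVideoSetup : MonoBehaviour
{
    [SerializeField] private Renderer quadRenderer;   // the Quad's MeshRenderer
    [SerializeField] private string videoUrl = "https://example.com/video.mp4"; // placeholder URL

    private VideoPlayer videoPlayer;
    private RenderTexture renderTexture;

    private void Start()
    {
        // Texture the VideoPlayer will draw into (size is an assumption).
        renderTexture = new RenderTexture(1920, 1080, 0);

        videoPlayer = gameObject.AddComponent<VideoPlayer>();
        videoPlayer.source = VideoSource.Url;
        videoPlayer.url = videoUrl;
        videoPlayer.renderMode = VideoRenderMode.RenderTexture;
        videoPlayer.targetTexture = renderTexture;

        // Display the render texture on the Quad.
        quadRenderer.material.mainTexture = renderTexture;

        videoPlayer.Play();
    }
}
```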
Here's my setup:
Unity version: 2022.3.19f1
React Native version: 0.73.5
Android version: 13
Expected behavior:
The video should play on the Quad object in the React Native app, just as it does in Unity.
Actual behavior:
The video does not play, and only the Quad object with its original white color is visible.
I have already tried:
Could someone please provide insights or solutions on why the video isn't playing after integrating the Unity project into React Native on Android? Any help would be appreciated.
After further investigation, I discovered that the issue wasn't specific to the integration of the Unity project into React Native. Instead, the video file itself was the culprit.
Upon testing the video playback in a regular React Native environment, I encountered the same problem: the video didn't play. Subsequently, I decided to try a different video file, and to my surprise, it played without any issues.
Both files had the .mp4 extension, so the problem was most likely with the original video's encoding rather than the container.
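If someone hits the same symptom, one way to see what is going wrong on the device is to subscribe to the VideoPlayer's error and prepare events, which will surface decode/unsupported-codec failures in logcat. This is only a diagnostic sketch; the component and handler names are my own, and the "videoPlayer" reference is assumed to be the player from the scene.

```csharp
using UnityEngine;
using UnityEngine.Video;

// Diagnostic sketch: logs whether the clip can actually be prepared and
// decoded on the device, which helps distinguish an integration problem
// from an encoding/codec problem.
public class VideoPlaybackDiagnostics : MonoBehaviour
{
    [SerializeField] private VideoPlayer videoPlayer; // assumed: the scene's Video Player

    private void OnEnable()
    {
        videoPlayer.prepareCompleted += OnPrepared;
        videoPlayer.errorReceived += OnError;
        videoPlayer.Prepare();
    }

    private void OnDisable()
    {
        videoPlayer.prepareCompleted -= OnPrepared;
        videoPlayer.errorReceived -= OnError;
    }

    private void OnPrepared(VideoPlayer source)
    {
        Debug.Log($"Video prepared ({source.width}x{source.height}), starting playback.");
        source.Play();
    }

    private void OnError(VideoPlayer source, string message)
    {
        // On Android this is typically where unsupported-encoding errors show up.
        Debug.LogError($"VideoPlayer error: {message}");
    }
}
```

In my case, re-encoding the original file (or simply using a differently encoded MP4) was enough to get playback working in both React Native and the Unity-as-a-library build.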