I am trying to use media streams with getUserMedia on Chrome on Android. To test, I worked up the script below, which simply connects the input stream to the output. This code works as expected on Chrome under Windows, but on Android I do not hear anything. The user is prompted to allow microphone access, but no audio comes out of the speaker, handset speaker, or headphone jack.
navigator.webkitGetUserMedia({
    video: false,
    audio: true
}, function (stream) {
    // Route the microphone stream straight back to the audio output.
    var audioContext = new webkitAudioContext();
    var input = audioContext.createMediaStreamSource(stream);
    input.connect(audioContext.destination);
});
In addition, the feedback beeps that normally sound when rolling the volume up and down do not play, as though Chrome were already playing audio to the system (even though nothing is audible).
Is it true that this functionality isn't supported on Chrome for Android yet? A couple of existing questions are along similar lines, but neither really has a definitive answer or explanation.
As I am new to using getUserMedia, I wanted to make sure there wasn't something I was doing in my code that could break compatibility.
I should also note that this problem doesn't seem to be with getUserMedia itself. It is possible to play a getUserMedia stream through an <audio> element, as demonstrated by the following code (which uses jQuery):
navigator.webkitGetUserMedia({
    video: false,
    audio: true
}, function (stream) {
    // Wrap the stream in a blob URL and feed it to an autoplaying <audio> element.
    $('body').append(
        $('<audio>').attr('autoplay', 'true').attr('src', webkitURL.createObjectURL(stream))
    );
});
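Note for anyone finding this later: recent browsers no longer accept a MediaStream in createObjectURL, so the snippet above no longer works in current Chrome. A rough equivalent, assuming the promise-based navigator.mediaDevices.getUserMedia is available, assigns the stream to the element's srcObject property instead:

navigator.mediaDevices.getUserMedia({
    video: false,
    audio: true
}).then(function (stream) {
    // srcObject replaces createObjectURL(stream) for attaching a live stream.
    var audio = $('<audio>').attr('autoplay', 'true');
    audio[0].srcObject = stream;
    $('body').append(audio);
}).catch(function (err) {
    console.error('getUserMedia failed:', err);
});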
Chrome on Android now properly supports getUserMedia with Web Audio. I suspect that the original problem had something to do with a difference in sample rate between recording and playback (a mismatch that exhibits the same issue on desktop Chrome). In any case, it all started working in the stable channel some time around February 2014.
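For completeness, here is a minimal sketch of the same loopback using the unprefixed APIs that current Chrome exposes (navigator.mediaDevices.getUserMedia and the standard AudioContext constructor). It also logs the context's sample rate, which is the value I suspected was mismatched with the recording rate; exact autoplay/user-gesture behavior may vary by Chrome version.

navigator.mediaDevices.getUserMedia({
    video: false,
    audio: true
}).then(function (stream) {
    var audioContext = new AudioContext();
    // The playback sample rate; I suspected this differed from the recording rate.
    console.log('AudioContext sample rate:', audioContext.sampleRate);
    // Route the microphone straight to the output, as in the original snippet.
    var input = audioContext.createMediaStreamSource(stream);
    input.connect(audioContext.destination);
    // Recent Chrome versions may keep the context suspended until a user gesture.
}).catch(function (err) {
    console.error('getUserMedia failed:', err);
});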