I am trying to share an Android device's screen and display it on a web client with JavaScript.
On the Android device I use the MediaProjection API to start the screen capture:
private fun startScreenCapture() {
    val mediaProjectionManager = getSystemService(Context.MEDIA_PROJECTION_SERVICE) as MediaProjectionManager
    ActivityCompat.startActivityForResult(this, mediaProjectionManager.createScreenCaptureIntent(), 1, null)
}
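When the user grants permission, the result is delivered to onActivityResult, roughly like this (webRtcClient, localView, and peerDevice are placeholder names for my WebRTC wrapper, the SurfaceViewRenderer, and the target peer):

override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
    super.onActivityResult(requestCode, resultCode, data)
    // Only handle the screen capture request started above (request code 1)
    if (requestCode == 1 && resultCode == Activity.RESULT_OK && data != null) {
        webRtcClient.startScreenCapture(data, localView, peerDevice)
    }
}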
I take the intent data from the result and pass it to my WebRTC client to create the track:
fun startScreenCapture(permissionIntent: Intent, view: SurfaceViewRenderer, peerDevice: String) {
    if (peerConnection == null) {
        capturer = ScreenCapturerAndroid(permissionIntent, object : MediaProjection.Callback() {
            override fun onStop() {
                super.onStop()
                Log.d("TAG", "onStop: stopped screen casting permission")
            }
        })
        peerConnection = peerConnectionFactory.createPeerConnection(iceServer, object : PeerConnectionObserver() {
            override fun onIceCandidate(candidate: IceCandidate?) {
                super.onIceCandidate(candidate)
                candidate?.let {
                    sendIceCandidate(it, peerDevice)
                }
            }

            override fun onConnectionChange(newState: PeerConnection.PeerConnectionState?) {
                super.onConnectionChange(newState)
                if (newState == PeerConnection.PeerConnectionState.CONNECTED) {
                }
            }

            override fun onRenegotiationNeeded() {
                super.onRenegotiationNeeded()
                createOffer(peerDevice)
            }
        })

        // Capture at the device's native resolution
        val defaultDisplay = DisplayManagerCompat.getInstance(context).getDisplay(Display.DEFAULT_DISPLAY)
        val displayContext = context.createDisplayContext(defaultDisplay!!)
        val screenWidthPixels = displayContext.resources.displayMetrics.widthPixels
        val screenHeightPixels = displayContext.resources.displayMetrics.heightPixels

        val surfaceTextureHelper = SurfaceTextureHelper.create("CaptureThread", eglBaseContext)
        val localVideoSource = peerConnectionFactory.createVideoSource(capturer!!.isScreencast)
        capturer!!.initialize(surfaceTextureHelper, context, localVideoSource.capturerObserver)
        capturer!!.startCapture(screenWidthPixels, screenHeightPixels, 30)

        localVideoTrack = peerConnectionFactory.createVideoTrack("${deviceId}_video", localVideoSource)
        // Render a local preview on the passed-in view (a SurfaceViewRenderer is a VideoSink; the capturer is not a valid sink for its own track)
        localVideoTrack!!.addSink(view)
        try {
            peerConnection?.addTrack(localVideoTrack)
        } catch (e: Exception) {
            e.printStackTrace()
        }
    }
}
On the JavaScript side peerConnection.ontrack gets called, but when I try to set the stream on my video element there are no streams attached to the track:
peerConnection.ontrack = (event) => {
    console.log(`New track ${event.track}`)
    console.log(`Track ID: ${event.track.id}`)
    console.log(`Track kind: ${event.track.kind}`)
    console.log(`Track label: ${event.track.label}`)
    console.log(`Streams: ${event.streams.length}`)
    console.log(`First stream: ${event.streams[0]}`)
    remoteVideo.srcObject = event.streams[0];
};
Here are the logs from the browser console:
New track [object MediaStreamTrack]
main.js:107 Track ID: 12345_video
main.js:108 Track kind: video
main.js:109 Track label: 12345_video
main.js:110 Streams: 0
main.js:111 First stream: undefined
I omitted all the other back-and-forth required between the WebRTC clients because that all appears to be working fine. Why is there no stream available for the track?
You are calling addTrack without its optional second parameter, the list of IDs of the media streams the track belongs to. Without any stream IDs, the browser fires ontrack with an empty event.streams array. You should be able to pass them the same way the apprtc demo does (translated to Kotlin):
val mediaStreamLabels = listOf("ARDAMS")
peerConnection?.addTrack(localVideoTrack, mediaStreamLabels)
(Note that on Android this is a list of plain string IDs, which differs from the JS API, where addTrack takes MediaStream objects.)
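Alternatively, if you would rather not rely on stream IDs at all, the receiving page can wrap the track in a stream itself, e.g. remoteVideo.srcObject = new MediaStream([event.track]) inside the ontrack handler.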