Hello there,
I am hoping the community here might be able to help us resolve an issue. We have an existing proof-of-concept PTT application that runs on Android devices and delivers PTT voice services to users. Whenever the PTT button is pressed, the application creates an AudioRecord object in the Android OS to capture the microphone's audio output, and it encodes that audio into AMR for transmission.

We now want to run the video capture capabilities of the MCU BVPU library alongside this existing PTT application: first to record video capture files to the local Android device carried by the user, and then, as a phase 2 development, to live-stream the video through to our control room sites.

The issue we see is that the video library/code takes exclusive control of the MediaRecorder object so that it can include audio in its video recordings, and this locks other apps, such as our PTT APK, out of AudioRecord access. If the video library starts first, it grabs the mic and the PTT app can only transmit silence. If the PTT app starts first and initiates its AudioRecord object, the video app does not get the audio stream into its video recording. Both APKs need to use the same audio capture source simultaneously, but how can we make this work?

Clearly, one idea is to embed the video SDK library into our PTT application manifest so there is a single APK in control, but then what is your recommendation for mutually sharing the audio source between the two activities? The video side needs to encode the source into AAC, and when the PTT button is pressed we need the same audio capture source, which we encode into AMR. Your ideas on how we might solve this challenge would be greatly appreciated!
The PTT app must access the audio capture directly to keep latency to a minimum during these critical voice broadcasts, so we want to avoid the overhead of encoding once for the first app and then transcoding that encoded audio buffer into the other codec needed of the same stream on the device for the second app. Ideally we need a solution that generates two streams prior to encoding, one for each APK, usable whether the app is in the foreground or background, and each app must be able to reach the audio capture source of the mic on the device.
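To frame the kind of answer we are after: the shape we imagine is a single component owning the one AudioRecord instance and fanning the raw PCM buffers out to both consumers (the AMR path for PTT, the AAC path for video) before any encoding happens. A minimal plain-Java sketch of such a fan-out follows; the class and method names here are hypothetical, and the AudioRecord read loop is only indicated in a comment since it is Android-specific:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

/** Hypothetical fan-out: one capture thread, N pre-encode consumers. */
public class PcmFanOut {
    private final List<BlockingQueue<byte[]>> consumers = new ArrayList<>();

    /** Register a consumer (e.g. the AMR encoder or the AAC/video encoder). */
    public BlockingQueue<byte[]> addConsumer() {
        BlockingQueue<byte[]> q = new ArrayBlockingQueue<>(64);
        consumers.add(q);
        return q;
    }

    /**
     * Called once per buffer from the single capture thread that owns the
     * one AudioRecord instance, e.g. (Android-side, not shown here):
     *   int n = audioRecord.read(buf, 0, buf.length);
     *   fanOut.publish(buf, n);
     * Each consumer gets its own copy, so the two encoders never share a
     * mutable buffer.
     */
    public void publish(byte[] pcm, int length) {
        byte[] copy = Arrays.copyOf(pcm, length);
        for (BlockingQueue<byte[]> q : consumers) {
            // Drop the oldest chunk if a slow consumer falls behind, rather
            // than blocking the capture thread (a design choice, not a rule).
            if (!q.offer(copy)) {
                q.poll();
                q.offer(copy);
            }
        }
    }
}
```

Dropping rather than blocking on a slow consumer matters for us because the PTT path is latency-critical and must never be stalled by the video encoder.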
Thanks in advance
Brett