I can get audio from the camera's microphone and save it in .mp3 format using the code below. I am trying to combine the video and audio data and play them at the same time. How can I do this?
File ses = new File(Environment.getExternalStorageDirectory().getAbsolutePath(), "ses.mp3");
String path2 = ses.getAbsolutePath();

// Note: THREE_GPP + AMR_NB actually produces a 3GP/AMR file,
// despite the .mp3 file name.
MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
recorder.setOutputFile(path2);
try {
    recorder.prepare();
} catch (IOException e) {
    e.printStackTrace();
}
recorder.start();
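When the recording is finished, the recorder should be stopped and released so the output file is finalized and the microphone is freed. A minimal sketch, assuming `recorder` is the instance above:

```java
// stop() finalizes the audio file; release() frees the recorder resources.
recorder.stop();
recorder.release();
```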
I can convert the NV21 byte data to .h264 format and play the video from the incoming camera data:
private CameraProxy.CameraDataCallBack callBack = new CameraProxy.CameraDataCallBack() {
    @Override
    public void onDataBack(byte[] data, long length) {
        encode(data);
    }
};
Here is the video encoding process:
//Video format H264
private synchronized void encode(byte[] data) {
    ByteBuffer[] inputBuffers = mMediaCodec.getInputBuffers();
    ByteBuffer[] outputBuffers = mMediaCodec.getOutputBuffers();

    int inputBufferIndex = mMediaCodec.dequeueInputBuffer(-1);
    if (inputBufferIndex >= 0) {
        ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
        inputBuffer.clear();
        inputBuffer.put(data);
        // A real presentationTimeUs should increase per frame; 0 is used here.
        mMediaCodec.queueInputBuffer(inputBufferIndex, 0, data.length, 0, 0);
    } else {
        return;
    }

    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
    int outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 0);
    Log.i(TAG, "outputBufferIndex-->" + outputBufferIndex);
    do {
        if (outputBufferIndex >= 0) {
            ByteBuffer outBuffer = outputBuffers[outputBufferIndex];
            Log.i(TAG, "buffer info-->" + bufferInfo.offset + "--"
                    + bufferInfo.size + "--" + bufferInfo.flags + "--"
                    + bufferInfo.presentationTimeUs);
            // Honour the offset/size the codec reports before copying out.
            outBuffer.position(bufferInfo.offset);
            outBuffer.limit(bufferInfo.offset + bufferInfo.size);
            byte[] outData = new byte[bufferInfo.size];
            outBuffer.get(outData);
            try {
                fos.write(outData, 0, outData.length);
                fos.flush();
                Log.i(TAG, "out data -- > " + outData.length);
            } catch (IOException e) {
                e.printStackTrace();
            }
            mMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
            outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 0);
        } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
            outputBuffers = mMediaCodec.getOutputBuffers();
        } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            MediaFormat format = mMediaCodec.getOutputFormat();
        }
    } while (outputBufferIndex >= 0);
}
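Note that this writes a raw H.264 elementary stream to `fos`, so the file has no container and no timestamps. On API 18+ the framework's `MediaMuxer` can wrap the same encoder output (plus an audio track) into a playable MP4 instead. A minimal sketch of the video side only, not a drop-in replacement; the output path is a placeholder:

```java
// Sketch: route encoder output into MediaMuxer instead of a FileOutputStream.
MediaMuxer muxer = new MediaMuxer("/sdcard/out.mp4",
        MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);

// In the INFO_OUTPUT_FORMAT_CHANGED branch:
int videoTrack = muxer.addTrack(mMediaCodec.getOutputFormat());
muxer.start();

// For each encoded buffer, instead of fos.write(...):
muxer.writeSampleData(videoTrack, outBuffer, bufferInfo);

// When encoding is finished:
muxer.stop();
muxer.release();
```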
For this purpose you can use a library called FFmpeg Android. It takes its parameters as command-line arguments and can process any video or audio. You may need to go through some of the FFmpeg Android documentation. I used this library to add a watermark to a video by splitting the video into frames and then adding the watermark to it. It did a really good job, and I also tested it for combining audio, and it helped.
Here is a code sample that I used:
ffmpeg.loadBinary(new LoadBinaryResponseHandler() {
    @Override
    public void onStart() {
    }

    @Override
    public void onFailure() {
    }

    @Override
    public void onSuccess() {
        final String fileP = lipModel.filePath;
        String[] cmd = {"-i", lipModel.filePath, "-i", imagePath,
                "-preset", "ultrafast", "-filter_complex",
                "[1:v]scale=" + width * 0.21 + ":" + height * 0.35
                        + " [ovrl],[0:v][ovrl] overlay=x=(main_w-overlay_w):y=(main_h-overlay_h)",
                outputPath};
        try {
            // to execute "ffmpeg -version" command you just need to pass "-version"
            ffmpeg.execute(cmd, new ExecuteBinaryResponseHandler() {
                @Override
                public void onStart() {
                }

                @Override
                public void onProgress(String message) {
                    Log.d(TAG, "onProgress: " + message);
                }

                @Override
                public void onFailure(String message) {
                    Log.d(TAG, "onFailure: " + message);
                }

                @Override
                public void onSuccess(String message) {
                    Log.d(TAG, "onSuccess: " + message);
                    new AsyncDispatcher(new IAsync() {
                        @Override
                        public void IOnPreExecute() {
                        }

                        @Override
                        public Object IdoInBackGround(Object... params) {
                            File file = new File(lipModel.filePath);
                            if (file.exists()) {
                                file.delete();
                            }
                            lipModel.filePath = outputPath;
                            lipModel.contentUri = Uri.parse(new File(lipModel.filePath).toString()).toString();
                            lipSyncSerializedModel.lipSyncMap.put(lipModel.uniqueName, lipModel);
                            ObjectSerializer.getInstance(getApplicationContext())
                                    .serialize(SerTag.LIP_HISTORy, lipSyncSerializedModel);
                            HomeActivity.this.runOnUiThread(new Runnable() {
                                @Override
                                public void run() {
                                    if (LipSyncFragment.iOnNewDataAddedRef != null) {
                                        LipSyncFragment.iOnNewDataAddedRef.newDataAdded();
                                    }
                                    LipsyncHistoryFragment lipHistory = new LipsyncHistoryFragment();
                                    File file = new File(fileP);
                                    if (file.exists()) {
                                        file.delete();
                                        Log.d(TAG, "run: Deleted the original video");
                                    }
                                    new FragmentUtils(HomeActivity.this,
                                            lipHistory, R.id.fragContainer);
                                }
                            });
                            return null;
                        }

                        @Override
                        public void IOnPostExecute(Object result) {
                        }
                    });
                }

                @Override
                public void onFinish() {
                }
            });
        } catch (FFmpegCommandAlreadyRunningException e) {
            // Handle if FFmpeg is already running
            e.printStackTrace();
        }
    }
});
FFMPEG Documentation Link: http://writingminds.github.io/ffmpeg-android-java/
FFMPEG Library: https://github.com/writingminds/ffmpeg-android-java
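The `cmd` in the sample above overlays a watermark. For the question's actual goal, merging a video file with an audio file, the argument array would look more like the following sketch. The file paths and codec choices are assumptions, not taken from the question:

```java
public class MuxCommand {

    // Builds an ffmpeg argument array that copies the video stream and
    // re-encodes the audio to AAC so both fit in one MP4 container.
    static String[] buildCmd(String videoPath, String audioPath, String outputPath) {
        return new String[] {
                "-i", videoPath,   // first input: the recorded video
                "-i", audioPath,   // second input: the recorded audio
                "-c:v", "copy",    // do not re-encode the video stream
                "-c:a", "aac",     // AMR audio must be re-encoded for MP4
                "-shortest",       // stop at the end of the shorter input
                outputPath
        };
    }

    public static void main(String[] args) {
        // Placeholder paths, for illustration only.
        System.out.println(String.join(" ",
                buildCmd("/sdcard/video.mp4", "/sdcard/ses.mp3", "/sdcard/out.mp4")));
    }
}
```

The resulting array is passed to `ffmpeg.execute(cmd, ...)` exactly like the watermark example above.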
There is another library, mp4parser, that does the same thing:
public class Mp4ParserAudioMuxer implements AudioMuxer {

    @Override
    public boolean mux(String videoFile, String audioFile, String outputFile) {
        Movie video;
        try {
            video = new MovieCreator().build(videoFile);
        } catch (RuntimeException | IOException e) {
            e.printStackTrace();
            return false;
        }
        Movie audio;
        try {
            audio = new MovieCreator().build(audioFile);
        } catch (IOException | NullPointerException e) {
            e.printStackTrace();
            return false;
        }
        Track audioTrack = audio.getTracks().get(0);
        video.addTrack(audioTrack);
        Container out = new DefaultMp4Builder().build(video);
        FileOutputStream fos;
        try {
            fos = new FileOutputStream(outputFile);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
            return false;
        }
        BufferedWritableFileByteChannel byteBufferByteChannel =
                new BufferedWritableFileByteChannel(fos);
        try {
            out.writeContainer(byteBufferByteChannel);
            byteBufferByteChannel.close();
            fos.close();
        } catch (IOException e) {
            e.printStackTrace();
            return false;
        }
        return true;
    }
}
https://github.com/sannies/mp4parser
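Note that `MovieCreator` parses container files (e.g. MP4/3GP), so both inputs must already be in a container; a raw `.h264` stream would need to be wrapped first. A hypothetical call, with made-up paths:

```java
// All paths are illustrative; both inputs must be container files.
boolean ok = new Mp4ParserAudioMuxer().mux(
        "/sdcard/video.mp4",     // video already in an MP4 container
        "/sdcard/audio.mp4",     // audio track in a container mp4parser can read
        "/sdcard/combined.mp4"); // the muxed result
```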
You can also try these, but it will not be so easy; you need to learn about these APIs. You can use all three of them together if you need to, or individually. You can find some example code here.