buffer video-encoding openh264

ParamValidationExt error and WelsInitEncoderExt failure while setting up the OpenH264 encoder


Scenario:
I am using OpenH264 in my app to encode frames into a video_file.mp4.

Environment:
Platform : macOS Sierra
Compiler : Clang++

The code:
Following is the crux of the code I have:

void EncodeVideoFile() {

  ISVCEncoder     * encoder_;
  std::string     video_file_name = "/Path/to/some/folder/video_file.mp4";
  EncodeFileParam * pEncFileParam;
  SEncParamExt    * pEnxParamExt;

  float frameRate = 1000;
  EUsageType usageType = EUsageType::CAMERA_VIDEO_REAL_TIME;
  bool denoise = false;
  bool lossless = true;
  bool enable_ltr = false;
  int layers = 1;
  bool cabac = false;
  int sliceMode = 1;

  pEncFileParam = new EncodeFileParam;

  pEncFileParam->eUsageType = EUsageType::CAMERA_VIDEO_REAL_TIME;
  pEncFileParam->pkcFileName = video_file_name.c_str();
  pEncFileParam->iWidth = frame_width;
  pEncFileParam->iHeight = frame_height;
  pEncFileParam->fFrameRate = frameRate;
  pEncFileParam->iLayerNum = layers;
  pEncFileParam->bDenoise = denoise;
  pEncFileParam->bLossless = lossless;
  pEncFileParam->bEnableLtr = enable_ltr;
  pEncFileParam->bCabac = cabac;

  int rv = WelsCreateSVCEncoder (&encoder_);

  pEnxParamExt = new SEncParamExt;

  pEnxParamExt->iUsageType                = pEncFileParam->eUsageType;
  pEnxParamExt->iPicWidth                 = pEncFileParam->iWidth;
  pEnxParamExt->iPicHeight                = pEncFileParam->iHeight;
  pEnxParamExt->fMaxFrameRate             = pEncFileParam->fFrameRate;
  pEnxParamExt->iSpatialLayerNum          = pEncFileParam->iLayerNum;

  pEnxParamExt->bEnableDenoise            = pEncFileParam->bDenoise;
  pEnxParamExt->bIsLosslessLink           = pEncFileParam->bLossless;
  pEnxParamExt->bEnableLongTermReference  = pEncFileParam->bEnableLtr;
  pEnxParamExt->iEntropyCodingModeFlag    = pEncFileParam->bCabac ? 1 : 0;

  for (int i = 0; i < pEnxParamExt->iSpatialLayerNum; i++) {
      pEnxParamExt->sSpatialLayers[i].sSliceArgument.uiSliceMode = pEncFileParam->eSliceMode;
  }


  encoder_->InitializeExt(pEnxParamExt);

  int videoFormat = videoFormatI420;
  encoder_->SetOption (ENCODER_OPTION_DATAFORMAT, &videoFormat);

  int frameSize = frame_width * frame_height * 3 / 2;
  int total_num = 500;
  BufferedData buf;
  buf.SetLength (frameSize);

  // check the buffer before proceeding
  if (buf.Length() != (size_t)frameSize) {
      CloseEncoder();
      return;
  }

  SFrameBSInfo info;
  memset (&info, 0, sizeof (SFrameBSInfo));
  SSourcePicture pic;
  memset (&pic, 0, sizeof (SSourcePicture));
  pic.iPicWidth = frame_width;
  pic.iPicHeight = frame_height;
  pic.iColorFormat = videoFormatI420;
  pic.iStride[0] = pic.iPicWidth;
  pic.iStride[1] = pic.iStride[2] = pic.iPicWidth >> 1;
  pic.pData[0] = buf.data();
  pic.pData[1] = pic.pData[0] + frame_width * frame_height;
  pic.pData[2] = pic.pData[1] + (frame_width * frame_height >> 2);

  for(int num = 0; num < total_num; num++) {
     // try to encode the frame
     rv = encoder_->EncodeFrame (&pic, &info);
  }

  if (encoder_) {
      encoder_->Uninitialize();
      WelsDestroySVCEncoder (encoder_);
  }
}

The code above is adapted from the official OpenH264 usage examples; BufferedData.h is a class I reused from the OpenH264 test utilities.

Issue:
However, I am getting the following error:

[OpenH264] this = 0x0x1038bc8c0, Error:ParamValidationExt(), width > 0, height > 0, width * height <= 9437184, invalid 0 x 0 in dependency layer settings!
[OpenH264] this = 0x0x1038bc8c0, Error:WelsInitEncoderExt(), ParamValidationExt failed return 2.
[OpenH264] this = 0x0x1038bc8c0, Error:CWelsH264SVCEncoder::Initialize(), WelsInitEncoderExt failed.

The above does not crash the application, but the run completes without creating video_file.mp4 from the dummy data I am trying to write into it.

Question:
There seems to be something wrong with the setup config I am applying to pEnxParamExt, which goes into encoder_->InitializeExt.
What am I doing wrong in setting up the encoder?

Note:
I am not trying to hook up to any camera device. I am just trying to create a .mp4 video out of some dummy image data.


Solution

  • If you want a complete and working OpenH264 encoder initialization procedure, you can click... here. A minimal initialization sketch is also included further down in this answer.

    According to your scenario, you are trying to create a video file (.mp4/.avi) from some dummy images. This task needs two different kinds of libraries: i) a codec library, and ii) a container library.

    i) Codec library: OpenH264 is quite easy to use for compressing data. One thing I must mention is that OpenH264 only works with raw frames, e.g. YUV420 data. So, if you want to compress your image data, you first have to convert it into the YUV420 color format (a rough conversion sketch is included further down in this answer). To get OpenH264 click... here

    ii) Container library: After getting the encoded data, you have to use another library to create the container with the extension .mp4, .avi, .flv, etc. There are plenty of libraries on GitHub that do this, such as FFmpeg, OpenCV, Bento4, MP4Maker and mp4parser; before using any of them, check their licenses carefully. If you use FFmpeg, you will not need OpenH264 at all, because FFmpeg ships with several codecs of its own, and you will also find many more working examples, since so many developers work with video data out there.
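
    For reference, here is a minimal initialization sketch of my own (not copied from the OpenH264 sources), assuming a recent OpenH264 build with the headers installed as wels/codec_api.h. The point that matters for your "invalid 0 x 0 in dependency layer settings" error is that every spatial layer in sSpatialLayers[] needs its own non-zero width, height, frame rate and bitrate; starting from GetDefaultParams() keeps the remaining fields at valid values.

    #include <cstring>
    #include "wels/codec_api.h"

    // Create an encoder and initialize it with a single spatial layer.
    // Returns nullptr on failure.
    ISVCEncoder* CreateAndInitEncoder (int width, int height, float fps) {
      ISVCEncoder* encoder = nullptr;
      if (WelsCreateSVCEncoder (&encoder) != 0 || encoder == nullptr)
          return nullptr;

      SEncParamExt param;
      memset (&param, 0, sizeof (SEncParamExt));
      encoder->GetDefaultParams (&param);            // start from valid defaults

      param.iUsageType       = CAMERA_VIDEO_REAL_TIME;
      param.iPicWidth        = width;                // overall picture size, must be > 0
      param.iPicHeight       = height;
      param.fMaxFrameRate    = fps;
      param.iTargetBitrate   = 5000000;              // bits per second, pick to taste
      param.iSpatialLayerNum = 1;

      // This is what ParamValidationExt checks: each dependency (spatial)
      // layer must carry non-zero dimensions of its own.
      for (int i = 0; i < param.iSpatialLayerNum; i++) {
          param.sSpatialLayers[i].iVideoWidth     = width;
          param.sSpatialLayers[i].iVideoHeight    = height;
          param.sSpatialLayers[i].fFrameRate      = fps;
          param.sSpatialLayers[i].iSpatialBitrate = param.iTargetBitrate;
      }

      if (encoder->InitializeExt (&param) != cmResultSuccess) {
          WelsDestroySVCEncoder (encoder);
          return nullptr;
      }

      int videoFormat = videoFormatI420;
      encoder->SetOption (ENCODER_OPTION_DATAFORMAT, &videoFormat);
      return encoder;
    }

    In your code, new SEncParamExt does not fill in any of the fields (the per-layer sizes in particular stay unset), which is exactly what the error message is complaining about.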
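
    And since OpenH264 only accepts raw YUV420 input, here is a rough RGB24-to-I420 conversion helper of my own (not part of OpenH264), using the usual integer BT.601 coefficients and 2x2 chroma subsampling. dst must be width*height*3/2 bytes, and width/height are assumed to be even.

    #include <cstdint>

    void Rgb24ToI420 (const uint8_t* rgb, int width, int height, uint8_t* dst) {
      uint8_t* yPlane = dst;
      uint8_t* uPlane = dst + width * height;
      uint8_t* vPlane = uPlane + (width * height) / 4;

      for (int y = 0; y < height; y++) {
          for (int x = 0; x < width; x++) {
              const uint8_t* p = rgb + (y * width + x) * 3;
              int r = p[0], g = p[1], b = p[2];

              // Luma for every pixel.
              yPlane[y * width + x] = (uint8_t)((66 * r + 129 * g + 25 * b + 128) / 256 + 16);

              // Chroma once per 2x2 block (top-left sample of the block).
              if ((x % 2 == 0) && (y % 2 == 0)) {
                  int idx = (y / 2) * (width / 2) + (x / 2);
                  uPlane[idx] = (uint8_t)((-38 * r - 74 * g + 112 * b + 128) / 256 + 128);
                  vPlane[idx] = (uint8_t)((112 * r - 94 * g - 18 * b + 128) / 256 + 128);
              }
          }
      }
    }

    Each converted frame can then go through EncodeFrame() exactly as in your code, and the resulting NAL units are what you hand to the container library of your choice.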

    Hope it helps. :)