ioscameraflash-builderair-native-extension

Video camera native extension for iOS


I'm building a native extension for iOS in which I want to implement a barcode scanner.

I've followed the AVCam example and tried it in a native (pure Xcode) application, where it works fine.

Now I want to use this code from a Flex mobile project. I've been able to create the ANE, add it to a Flex mobile project, and call the functions of the ANE.

It seems to work, but my problem is that I can't see what the camera is seeing. That is, I have a method that starts the camera and initializes the capture, and I've also implemented the captureOutput delegate method. The strangest thing is that when I run my app, I can see the log messages inside initCapture and captureOutput, as if the application were capturing frames, but on the iPad I don't see the camera preview.

This is part of the code I use:

- (void)initCapture
{
    NSLog(@"camera view capture init");
    /* Set up the input */
    self.device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput deviceInputWithDevice:self.device error:nil];

    /* Set up the output */
    captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    // If the queue is blocked when new frames are captured, those frames are automatically dropped
    captureOutput.alwaysDiscardsLateVideoFrames = YES;
    //captureOutput.minFrameDuration = CMTimeMake(1, 10); // Uncomment to specify a minimum duration for each video frame
    [captureOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

    // Store frames as 420YpCbCr8 bi-planar (supposed to be faster than BGRA)
    NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;

    //************************ Note this line
    NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange];

    NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [captureOutput setVideoSettings:videoSettings];

    // Create a capture session and add the input and output
    self.captureSession = [[AVCaptureSession alloc] init];
    [self.captureSession addInput:captureInput];
    [self.captureSession addOutput:captureOutput];


    if ([self.captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720])
    {
        NSLog(@"camera view Set preview port to 1280X720");
        self.captureSession.sessionPreset = AVCaptureSessionPreset1280x720;
    }
    else if ([self.captureSession canSetSessionPreset:AVCaptureSessionPreset640x480])
    {
        // Fall back to 640x480 if 1280x720 is not supported on the device
        NSLog(@"camera view Set preview port to 640X480");
        self.captureSession.sessionPreset = AVCaptureSessionPreset640x480;
    }


    /* Add the preview layer */

    self.prevLayer = [AVCaptureVideoPreviewLayer layerWithSession: self.captureSession];

    if ([self.prevLayer respondsToSelector:@selector(connection)])
        self.prevLayer.connection.videoOrientation = AVCaptureVideoOrientationLandscapeLeft;
    else
        self.prevLayer.orientation = AVCaptureVideoOrientationLandscapeLeft;

    self.prevLayer.frame = CGRectMake(150, 0, 700, 700);
    self.prevLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    [self.view.layer addSublayer: self.prevLayer];
}

- (void) startScanning {
    NSLog(@"camera view start scanning");
    self.state = LAUNCHING_CAMERA;
    [self.captureSession startRunning];
    self.prevLayer.hidden = NO;
    self.state = CAMERA;
}

#pragma mark AVCaptureVideoDataOutputSampleBufferDelegate

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    NSLog(@"camera view Capture output");
}
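
For a barcode scanner, the delegate above will eventually need the raw pixel data rather than just a log line. This is not part of the original question, only a minimal sketch of that step, assuming the 420YpCbCr8 bi-planar format configured in initCapture and some hypothetical decoder to feed:

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);

        // Plane 0 of the bi-planar 420YpCbCr8 format is the luma (Y) plane,
        // which is usually all a barcode decoder needs.
        uint8_t *luma   = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
        size_t   width  = CVPixelBufferGetWidthOfPlane(imageBuffer, 0);
        size_t   height = CVPixelBufferGetHeightOfPlane(imageBuffer, 0);

        // ... feed luma/width/height to the barcode decoder here ...
        (void)luma; (void)width; (void)height;

        CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
    }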

How should I solve this?

Thank you very much.


Solution

  • I think I've solved it.

    Instead of:

    [self.view.layer addSublayer: self.prevLayer];
    

    I put:

    UIViewController *mainController = [UIApplication sharedApplication].keyWindow.rootViewController;
    [mainController.view.layer addSublayer: self.prevLayer];
    

    Now I can see the camera in my Flex application. (Apparently the ANE's own view controller's view is never attached to the AIR application's view hierarchy, so its sublayers are never rendered; attaching the preview layer to the key window's root view controller's view puts it on screen.)
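
    As a follow-up (not from the original answer): since the preview layer is now attached to a view the extension doesn't own, it seems worth detaching it again when scanning stops. A sketch of a hypothetical stopScanning counterpart to startScanning:

        - (void)stopScanning {
            NSLog(@"camera view stop scanning");
            [self.captureSession stopRunning];
            self.prevLayer.hidden = YES;
            // Detach the preview layer from the host app's root view,
            // where the fix above added it.
            [self.prevLayer removeFromSuperlayer];
        }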