ios · objective-c · camera · avcam

AVCam Customize PreviewLayer


It's my first time working with the iOS camera. I'm trying to create a simple app that can take only photos (still images). I'm using the sample code from WWDC:

https://developer.apple.com/library/ios/samplecode/AVCam/Introduction/Intro.html#//apple_ref/doc/uid/DTS40010112-Intro-DontLinkElementID_2

I want to create a custom photo size, like in this picture:

[image: mockup of the desired square preview]

But this is the result:

[image: the actual result]

How can I fit the preview to the size of the square?

Thank you!

Edit: I am attaching a picture of the result. [image: the current result] How can I fix it?

Edit 2:

CMPCameraViewController:

- (void)viewDidLoad
{
    [super viewDidLoad];

    // Disable UI. The UI is enabled if and only if the session starts running.
    self.stillButton.enabled = NO;

    // Create the AVCaptureSession.
    self.session = [[AVCaptureSession alloc] init];

    // Setup the preview view.
    self.previewView.session = self.session;

    // Communicate with the session and other session objects on this queue.
    self.sessionQueue = dispatch_queue_create( "session queue", DISPATCH_QUEUE_SERIAL );

    self.setupResult = AVCamSetupResultSuccess;

    // Setup the capture session.
    // In general it is not safe to mutate an AVCaptureSession or any of its inputs, outputs, or connections from multiple threads at the same time.
    // Why not do all of this on the main queue?
    // Because -[AVCaptureSession startRunning] is a blocking call which can take a long time. We dispatch session setup to the sessionQueue
    // so that the main queue isn't blocked, which keeps the UI responsive.
    dispatch_async( self.sessionQueue, ^{
        if ( self.setupResult != AVCamSetupResultSuccess ) {
            return;
        }

        self.backgroundRecordingID = UIBackgroundTaskInvalid;
        NSError *error = nil;

        AVCaptureDevice *videoDevice = [CMPCameraViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionBack];
        AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];

        if ( ! videoDeviceInput ) {
            NSLog( @"Could not create video device input: %@", error );
        }

        [self.session beginConfiguration];

        if ( [self.session canAddInput:videoDeviceInput] ) {
            [self.session addInput:videoDeviceInput];
            self.videoDeviceInput = videoDeviceInput;

            dispatch_async( dispatch_get_main_queue(), ^{
                // Why are we dispatching this to the main queue?
                // Because AVCaptureVideoPreviewLayer is the backing layer for AAPLPreviewView and UIView
                // can only be manipulated on the main thread.
                // Note: As an exception to the above rule, it is not necessary to serialize video orientation changes
                // on the AVCaptureVideoPreviewLayer’s connection with other session manipulation.

                // Use the status bar orientation as the initial video orientation. Subsequent orientation changes are handled by
                // -[viewWillTransitionToSize:withTransitionCoordinator:].
                UIInterfaceOrientation statusBarOrientation = [UIApplication sharedApplication].statusBarOrientation;
                AVCaptureVideoOrientation initialVideoOrientation = AVCaptureVideoOrientationPortrait;
                if ( statusBarOrientation != UIInterfaceOrientationUnknown ) {
                    initialVideoOrientation = (AVCaptureVideoOrientation)statusBarOrientation;
                }

                AVCaptureVideoPreviewLayer *previewLayer = (AVCaptureVideoPreviewLayer *)self.previewView.layer;
                previewLayer.connection.videoOrientation = initialVideoOrientation;
                previewLayer.bounds = _previewView.frame;
                //previewLayer.connection.videoOrientation = UIInterfaceOrientationLandscapeLeft;
            } );
        }
        else {
            NSLog( @"Could not add video device input to the session" );
            self.setupResult = AVCamSetupResultSessionConfigurationFailed;
        }

        AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
        AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];

        if ( ! audioDeviceInput ) {
            NSLog( @"Could not create audio device input: %@", error );
        }

        if ( [self.session canAddInput:audioDeviceInput] ) {
            [self.session addInput:audioDeviceInput];
        }
        else {
            NSLog( @"Could not add audio device input to the session" );
        }

        AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
        if ( [self.session canAddOutput:movieFileOutput] ) {
            [self.session addOutput:movieFileOutput];
            AVCaptureConnection *connection = [movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
            if ( connection.isVideoStabilizationSupported ) {
                connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeAuto;
            }
            self.movieFileOutput = movieFileOutput;
        }
        else {
            NSLog( @"Could not add movie file output to the session" );
            self.setupResult = AVCamSetupResultSessionConfigurationFailed;
        }

        AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
        if ( [self.session canAddOutput:stillImageOutput] ) {
            stillImageOutput.outputSettings = @{AVVideoCodecKey : AVVideoCodecJPEG};
            [self.session addOutput:stillImageOutput];
            self.stillImageOutput = stillImageOutput;
        }
        else {
            NSLog( @"Could not add still image output to the session" );
            self.setupResult = AVCamSetupResultSessionConfigurationFailed;
        }

        [self.session commitConfiguration];
    } );
}

CMPPreviewView:

+ (Class)layerClass
{
    return [AVCaptureVideoPreviewLayer class];
}

- (AVCaptureSession *)session
{
    AVCaptureVideoPreviewLayer *previewLayer = (AVCaptureVideoPreviewLayer *)self.layer;
    return previewLayer.session;
}

- (void)setSession:(AVCaptureSession *)session
{
    AVCaptureVideoPreviewLayer *previewLayer = (AVCaptureVideoPreviewLayer *)self.layer;
    previewLayer.session = session;
    // The backing layer is an AVCaptureVideoPreviewLayer, not an AVPlayerLayer,
    // so set videoGravity on the preview layer directly (the original cast to
    // AVPlayerLayer only happened to work because of Objective-C's dynamic dispatch).
    previewLayer.videoGravity = AVLayerVideoGravityResize;
}

Solution

  • Apple's AVCam code is a great starting point for getting into photography development.

    What you are trying to do is modify the size of your video preview layer. This is done by changing the videoGravity setting. Here's an example for aspect-fill display:

    [Swift 3]
    
    previewView.videoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
    

    Now, for your situation of filling a square, you need to define the layer's bounds and then use AVLayerVideoGravityResize, as sketched below.
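
    For example, a minimal sketch (the 300×300 square is an assumption; `previewView` and its `videoPreviewLayer` property are carried over from the AVCam sample — substitute whatever your layout uses):

    [Swift 3]

    // Sketch: assumes `previewView` is the AVCam-style preview view whose
    // backing layer is an AVCaptureVideoPreviewLayer, and a 300x300 square.
    let squareSide: CGFloat = 300
    previewView.videoPreviewLayer.bounds = CGRect(x: 0, y: 0, width: squareSide, height: squareSide)
    // For a view's backing layer, the layer's position corresponds to the view's center.
    previewView.videoPreviewLayer.position = previewView.center
    // Stretch the video to fill the square exactly (this can distort the image);
    // AVLayerVideoGravityResizeAspectFill fills without distortion but crops instead.
    previewView.videoPreviewLayer.videoGravity = AVLayerVideoGravityResize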

    Please note: this will not affect the size of the captured photo; it simply modifies the size of the video preview layer. This is an important distinction. To modify the size of the captured photo you'll need to perform a crop operation (which can be done fairly easily in a variety of ways), but it seems that is not your intention.

    Best of luck.

    Edit: Now it seems you're interested in cropping the captured UIImage.

    [Swift 3]

    // I'm going to assume you've done something like this to store the captured data in a UIImage object.
    // If not, I would do so.
    let myImage = UIImage(data: capturedImageData)!

    // Using Core Graphics (the CG in CGImage) you can perform all kinds of image manipulations: crop, rotation, mirror, etc.
    // Here's a crop to a rectangle -- fill in your desired values.
    let myRect = CGRect(x: ..., y: ..., width: ..., height: ...)
    if let croppedCGImage = myImage.cgImage?.cropping(to: myRect) {
        let croppedImage = UIImage(cgImage: croppedCGImage)
        // croppedImage is your cropped photo
    }
    
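    Note that cropping(to:) operates on the underlying CGImage in pixel coordinates, while UIImage sizes are in points. If your rect is in points, here's a sketch of the conversion (assuming the image carries no orientation transform):

    [Swift 3]

    // Sketch: scale a point-based rect into the CGImage's pixel space.
    let scale = myImage.scale
    let pixelRect = CGRect(x: myRect.origin.x * scale,
                           y: myRect.origin.y * scale,
                           width: myRect.size.width * scale,
                           height: myRect.size.height * scale)
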

    Hopefully, this answers your question.