So you want to record a video and play music from the user's library at the same time? Look no further. Below is the answer.
For the audio playback you will use AVAudioPlayer. All you have to do is declare the AVAudioPlayer as a global variable (I named it audioPlayer), make sure you import AVFoundation and MediaPlayer, and implement the code below.
Use this after the user has chosen the song they want to play:
func mediaPicker(mediaPicker: MPMediaPickerController, didPickMediaItems mediaItemCollection: MPMediaItemCollection) {
    let pickerItem: MPMediaItem = mediaItemCollection.items[0]
    let songURL = pickerItem.valueForProperty(MPMediaItemPropertyAssetURL)
    if let sURL = songURL as? NSURL
    {
        songTitle = pickerItem.title!
        do
        {
            audioPlayer = try AVAudioPlayer(contentsOfURL: sURL)
        }
        catch
        {
            print("Can't Create Audio Player: \(error)")
        }
    }
    dismissViewControllerAnimated(true, completion: { () -> Void in
        audioPlayer.play()
    })
}
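In case you haven't presented the picker itself yet, doing so can be as simple as the following (a minimal sketch; it assumes your view controller conforms to MPMediaPickerControllerDelegate, and the mediaPicker name is just for illustration):
// Present the media picker, e.g. from a "Pick Song" button action
let mediaPicker = MPMediaPickerController(mediaTypes: .Music)
mediaPicker.delegate = self // so the delegate method above gets called
mediaPicker.allowsPickingMultipleItems = false
presentViewController(mediaPicker, animated: true, completion: nil)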
You will also need to set up the audio session (in viewDidLoad). It's crucial if you want audio to play while recording:
// Audio Session Setup
let audioSession = AVAudioSession.sharedInstance()
do
{
    // .MixWithOthers is what lets the music keep playing while you record
    try audioSession.setCategory(AVAudioSessionCategoryPlayAndRecord, withOptions: .MixWithOthers)
}
catch
{
    print("Can't Set Audio Session Category: \(error)")
}
do
{
    try audioSession.setMode(AVAudioSessionModeVideoRecording)
}
catch
{
    print("Can't Set Audio Session Mode: \(error)")
}
// Start Session
do
{
    try audioSession.setActive(true)
}
catch
{
    print("Can't Start Audio Session: \(error)")
}
Now for the video recording. You will use AVCaptureSession. Declare the following as global variables:
let captureSession = AVCaptureSession()
var currentDevice: AVCaptureDevice?
var videoFileOutput: AVCaptureMovieFileOutput?
var cameraPreviewLayer: AVCaptureVideoPreviewLayer?
Then configure the session in viewDidLoad. Note: in my project the video preview sits in a container view and all the video-related code lives in a separate view controller, but using a plain view instead of a container should work just as well:
// Preset For 720p
captureSession.sessionPreset = AVCaptureSessionPreset1280x720
// Get Available Devices Capable Of Recording Video
let devices = AVCaptureDevice.devicesWithMediaType(AVMediaTypeVideo) as! [AVCaptureDevice]
// Get Back Camera
for device in devices
{
    if device.position == AVCaptureDevicePosition.Back
    {
        currentDevice = device
    }
}
let camera = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo) // default video device (normally the back camera); you could use currentDevice here instead
// Audio Input
let audioInputDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeAudio)
do
{
    let audioInput = try AVCaptureDeviceInput(device: audioInputDevice)
    // Add Audio Input
    if captureSession.canAddInput(audioInput)
    {
        captureSession.addInput(audioInput)
    }
    else
    {
        NSLog("Can't Add Audio Input")
    }
}
catch let error
{
    NSLog("Error Getting Input Device: \(error)")
}
// Video Input
let videoInput: AVCaptureDeviceInput
do
{
    videoInput = try AVCaptureDeviceInput(device: camera)
    // Add Video Input
    if captureSession.canAddInput(videoInput)
    {
        captureSession.addInput(videoInput)
    }
    else
    {
        NSLog("ERROR: Can't add video input")
    }
}
catch let error
{
    NSLog("ERROR: Getting input device: \(error)")
}
// Video Output
videoFileOutput = AVCaptureMovieFileOutput()
captureSession.addOutput(videoFileOutput)
// Show Camera Preview
cameraPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
view.layer.addSublayer(cameraPreviewLayer!)
cameraPreviewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
let width = view.bounds.width
cameraPreviewLayer?.frame = CGRectMake(0, 0, width, width)
// Bring Record Button To Front & Start Session
view.bringSubviewToFront(recordButton)
captureSession.startRunning()
print(captureSession.inputs)
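A caveat if the music still stops the moment the capture session starts running: by default, AVCaptureSession configures the app's audio session on its own, which can override the PlayAndRecord/MixWithOthers setup from viewDidLoad. Opting out of that (before calling startRunning()) should keep your own configuration in effect:
// Keep the capture session from overriding our own audio session setup
captureSession.automaticallyConfiguresApplicationAudioSession = false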
Then you create an @IBAction that handles the user pressing the record button (I just used a simple button which I made red and round). You will also need an isRecording Bool and an initialOutputURL NSURL declared as globals:
@IBAction func capture(sender: AnyObject) {
    do
    {
        initialOutputURL = try NSFileManager.defaultManager().URLForDirectory(.DocumentDirectory, inDomain: .UserDomainMask, appropriateForURL: nil, create: true).URLByAppendingPathComponent("output").URLByAppendingPathExtension("mov")
    }
    catch
    {
        print(error)
    }
    if !isRecording
    {
        isRecording = true
        UIView.animateWithDuration(0.5, delay: 0.0, options: [.Repeat, .Autoreverse, .AllowUserInteraction], animations: { () -> Void in
            self.recordButton.transform = CGAffineTransformMakeScale(0.75, 0.75)
        }, completion: nil)
        videoFileOutput?.startRecordingToOutputFileURL(initialOutputURL, recordingDelegate: self)
    }
    else
    {
        isRecording = false
        UIView.animateWithDuration(0.5, delay: 0, options: [], animations: { () -> Void in
            self.recordButton.transform = CGAffineTransformMakeScale(1.0, 1.0)
        }, completion: nil)
        recordButton.layer.removeAllAnimations()
        videoFileOutput?.stopRecording()
    }
}
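Since startRecordingToOutputFileURL passes self as the recordingDelegate, your view controller also needs to conform to AVCaptureFileOutputRecordingDelegate. A bare-bones sketch of its one required method, which is where the finished file URL arrives, might look like this:
func captureOutput(captureOutput: AVCaptureFileOutput!, didFinishRecordingToOutputFileAtURL outputFileURL: NSURL!, fromConnections connections: [AnyObject]!, error: NSError!) {
    if error != nil
    {
        print("Recording Failed: \(error)")
        return
    }
    // outputFileURL points to the recorded movie; do whatever you need with it here
    print("Finished Recording To: \(outputFileURL)")
}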
Then all that's left for you to do is save the video to (presumably) the camera roll. But I won't include that; you must put in some effort yourselves. (Hint: UISaveVideoAtPathToSavedPhotosAlbum.)
So that's it, folks. That's how you use AVFoundation to record a video and play music from the library at the same time.