
I have been working with a custom camera, and I recently upgraded to the Xcode 8 beta along with Swift 3. I originally had this:

var stillImageOutput: AVCaptureStillImageOutput?

However, I am now getting the warning:

'AVCaptureStillImageOutput' was deprecated in iOS 10.0: Use AVCapturePhotoOutput instead

As this is fairly new, I have not seen much information on this. Here is my current code:

var captureSession: AVCaptureSession?
var stillImageOutput: AVCaptureStillImageOutput?
var previewLayer: AVCaptureVideoPreviewLayer?

func clickPicture() {

    if let videoConnection = stillImageOutput?.connection(withMediaType: AVMediaTypeVideo) {

        videoConnection.videoOrientation = .portrait
        stillImageOutput?.captureStillImageAsynchronously(from: videoConnection, completionHandler: { (sampleBuffer, error) -> Void in

            if sampleBuffer != nil {

                let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
                let dataProvider = CGDataProvider(data: imageData!)
                let cgImageRef = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .defaultIntent)

                let image = UIImage(cgImage: cgImageRef!, scale: 1, orientation: .right)

            }

        })

    }

}

I have tried to look at AVCapturePhotoCaptureDelegate, but I am not quite sure how to use it. Does anybody know how to use this? Thanks.

  • You need to watch the WWDC 2016 session 511 video.
    – LC 웃
    Commented Jun 17, 2016 at 8:57
  • Okay! I will watch the video and post an answer if I can. Thanks! Commented Jun 17, 2016 at 10:43
  • Looking at the docs might help too.
    – rickster
    Commented Jun 17, 2016 at 18:57

7 Answers

---

Updated to Swift 4. It's really easy to use AVCapturePhotoOutput.

You need an AVCapturePhotoCaptureDelegate, which returns the CMSampleBuffer.

You can also get a preview image if you specify the previewPhotoFormat in the AVCapturePhotoSettings:

    import UIKit
    import AVFoundation

    class CameraCaptureOutput: NSObject, AVCapturePhotoCaptureDelegate {

        let cameraOutput = AVCapturePhotoOutput()

        func capturePhoto() {

          let settings = AVCapturePhotoSettings()
          let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
          let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                               kCVPixelBufferWidthKey as String: 160,
                               kCVPixelBufferHeightKey as String: 160]
          settings.previewPhotoFormat = previewFormat
          self.cameraOutput.capturePhoto(with: settings, delegate: self)

        }

        func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?, previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {
            if let error = error {
                print(error.localizedDescription)
            }

            if let sampleBuffer = photoSampleBuffer, let previewBuffer = previewPhotoSampleBuffer, let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
              print("image: \(UIImage(data: dataImage)?.size)") // Your Image
            }   
        }
    }

For more information visit https://developer.apple.com/reference/AVFoundation/AVCapturePhotoOutput

Note: You have to add the AVCapturePhotoOutput to the AVCaptureSession before taking the picture, i.e. session.addOutput(output), and then output.capturePhoto(with: settings, delegate: self). Thanks @BigHeadCreations.
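For context, a minimal sketch (Swift 4) of that session wiring; it assumes the default back camera and omits error handling:

    // Sketch: the AVCapturePhotoOutput must be attached to the session
    // before capturePhoto(with:delegate:) is called.
    let session = AVCaptureSession()
    session.sessionPreset = .photo // see the preset comment below

    if let device = AVCaptureDevice.default(for: .video),
       let input = try? AVCaptureDeviceInput(device: device),
       session.canAddInput(input) {
        session.addInput(input)
    }

    let output = AVCapturePhotoOutput()
    if session.canAddOutput(output) {
        session.addOutput(output) // do this before taking the picture
    }
    session.startRunning()

    // Later, e.g. from a button handler:
    // output.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)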

  • Gives error: "[AVCapturePhotoOutput capturePhotoWithSettings:delegate:] No active and enabled video connection". Could you please provide a full example for iOS 10 / Swift 3? Commented Jul 22, 2016 at 19:40
  • @TuomasLaatikainen you likely need to set the capture session preset to AVCaptureSessionPresetPhoto Commented Sep 10, 2016 at 20:46
  • I have watched the video, surfed the entire web, rewritten code, changed iPhones, and cannot resolve the exception "No active and enabled video connection." The Apple doc is classically vague and devoid of details. Help! Is there a working project to share?
    – mobibob
    Commented Dec 28, 2016 at 3:44
  • @TuomasLaatikainen did you figure out what the problem was for you? Having the same issue
    – SRMR
    Commented Apr 2, 2017 at 21:58
  • @TuomasLaatikainen you have to add the AVCapturePhotoOutput to the AVCaptureSession before taking the picture. So something like session.addOutput(output), and then output.capturePhoto(with: settings, delegate: self). Commented Nov 27, 2017 at 23:57
---

Here is my full implementation:

import UIKit
import AVFoundation

class ViewController: UIViewController, AVCapturePhotoCaptureDelegate {

var captureSession: AVCaptureSession!
var cameraOutput: AVCapturePhotoOutput!
var previewLayer: AVCaptureVideoPreviewLayer!

@IBOutlet weak var capturedImage: UIImageView!
@IBOutlet weak var previewView: UIView!

override func viewDidLoad() {
    super.viewDidLoad()
    captureSession = AVCaptureSession()
    captureSession.sessionPreset = AVCaptureSessionPresetPhoto
    cameraOutput = AVCapturePhotoOutput()

    let device = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)

    if let input = try? AVCaptureDeviceInput(device: device) {
        if (captureSession.canAddInput(input)) {
            captureSession.addInput(input)
            if (captureSession.canAddOutput(cameraOutput)) {
                captureSession.addOutput(cameraOutput)
                previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
                previewLayer.frame = previewView.bounds
                previewView.layer.addSublayer(previewLayer)
                captureSession.startRunning()
            }
        } else {
            print("issue here: captureSession.canAddInput")
        }
    } else {
        print("could not create AVCaptureDeviceInput")
    }
}

// Take picture button
@IBAction func didPressTakePhoto(_ sender: UIButton) {
    let settings = AVCapturePhotoSettings()
    let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
    let previewFormat = [
         kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
         kCVPixelBufferWidthKey as String: 160,
         kCVPixelBufferHeightKey as String: 160
    ]
    settings.previewPhotoFormat = previewFormat
    cameraOutput.capturePhoto(with: settings, delegate: self)
}

// callBack from take picture
func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {

    if let error = error {
        print("error occure : \(error.localizedDescription)")
    }

    if  let sampleBuffer = photoSampleBuffer,
        let previewBuffer = previewPhotoSampleBuffer,
        let dataImage =  AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer:  sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
        print(UIImage(data: dataImage)?.size as Any)

        let dataProvider = CGDataProvider(data: dataImage as CFData)
        let cgImageRef: CGImage! = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .defaultIntent)
        let image = UIImage(cgImage: cgImageRef, scale: 1.0, orientation: UIImageOrientation.right)

        self.capturedImage.image = image
    } else {
        print("some error here")
    }
}

// Use this method wherever you need to check the camera permission state
func askPermission() {
    print("here")
    let cameraPermissionStatus = AVCaptureDevice.authorizationStatus(forMediaType: AVMediaTypeVideo)

    switch cameraPermissionStatus {
    case .authorized:
        print("Already Authorized")
    case .denied:
        print("denied")

        let alert = UIAlertController(title: "Sorry :(" , message: "But  could you please grant permission for camera within device settings",  preferredStyle: .alert)
        let action = UIAlertAction(title: "Ok", style: .cancel,  handler: nil)
        alert.addAction(action)
        present(alert, animated: true, completion: nil)

    case .restricted:
        print("restricted")
    default:
        AVCaptureDevice.requestAccess(forMediaType: AVMediaTypeVideo, completionHandler: { [weak self] (granted: Bool) -> Void in
            if granted {
                // User granted
                print("User granted")
                DispatchQueue.main.async {
                    // Do something that you need on the main thread
                }
            } else {
                // User rejected
                print("User Rejected")
                DispatchQueue.main.async {
                    let alert = UIAlertController(title: "WHY?", message: "The camera is the main feature of our application", preferredStyle: .alert)
                    let action = UIAlertAction(title: "Ok", style: .cancel, handler: nil)
                    alert.addAction(action)
                    self?.present(alert, animated: true, completion: nil)
                }
            }
        })
    }
}
}
  • How did you set flashMode to it?
    – coolly
    Commented May 20, 2017 at 2:59
  • Working on iOS 10.0.2. To turn the flash on: settings.flashMode = .on (see the sketch after these comments). Commented Jun 25, 2017 at 3:29
  • Why UIImageOrientation.right? That gives the wrong orientation on iPad.
    – Makalele
    Commented Apr 10, 2018 at 9:58
  • Works like a charm :) Commented Jan 1, 2019 at 19:44
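Regarding the flash comments above, a minimal sketch (Swift 4 naming; cameraOutput is the AVCapturePhotoOutput configured in this answer):

    // Sketch: turning the flash on for a single capture.
    // In production, check cameraOutput.supportedFlashModes first.
    let settings = AVCapturePhotoSettings()
    settings.flashMode = .on
    cameraOutput.capturePhoto(with: settings, delegate: self)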
---

In iOS 11 "photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?, previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {}" is deprecated.

Use the following method instead:

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    let imageData = photo.fileDataRepresentation()
    if let data = imageData, let img = UIImage(data: data) {
        print(img)
    }
}
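As a follow-on usage example (not part of the original answer), you could extend the method above to save the capture to the photo library; this assumes your Info.plist contains the NSPhotoLibraryAddUsageDescription key:

    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        guard error == nil,
              let data = photo.fileDataRepresentation(),
              let image = UIImage(data: data) else { return }
        // Hypothetical follow-up: write the capture to the photo library.
        UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
    }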
---

I took @Aleksey Timoshchenko's excellent answer and updated it to Swift 4.x.

Note that for my use-case I allow the user to take multiple photos, which is why I save them in the images array.

Note that you need to wire up the @IBAction takePhoto method via your storyboard or in code. In my case, I use a storyboard.
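If you wire it up in code instead, a minimal sketch (hypothetical; placed in CameraVC.viewDidLoad) could look like:

    // Sketch: attach the tap gesture in code rather than in a storyboard.
    let tap = UITapGestureRecognizer(target: self, action: #selector(takePhoto(_:)))
    cameraView.addGestureRecognizer(tap)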

As of iOS 11, the AVCapturePhotoOutput.jpegPhotoDataRepresentation that is used in @Aleksey Timoshchenko's answer is deprecated.

Swift 4.x

class CameraVC: UIViewController {

    @IBOutlet weak var cameraView: UIView!

    var images = [UIImage]()

    var captureSession: AVCaptureSession!
    var cameraOutput: AVCapturePhotoOutput!
    var previewLayer: AVCaptureVideoPreviewLayer!

    override func viewDidLoad() {
        super.viewDidLoad()
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        startCamera()
    }

    func startCamera() {
        captureSession = AVCaptureSession()
        captureSession.sessionPreset = AVCaptureSession.Preset.photo
        cameraOutput = AVCapturePhotoOutput()

        if let device = AVCaptureDevice.default(for: .video),
           let input = try? AVCaptureDeviceInput(device: device) {
            if (captureSession.canAddInput(input)) {
                captureSession.addInput(input)
                if (captureSession.canAddOutput(cameraOutput)) {
                    captureSession.addOutput(cameraOutput)
                    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
                    previewLayer.frame = cameraView.bounds
                    cameraView.layer.addSublayer(previewLayer)
                    captureSession.startRunning()
                }
            } else {
                print("issue here : captureSesssion.canAddInput")
            }
        } else {
            print("some problem here")
        }
    }

    @IBAction func takePhoto(_ sender: UITapGestureRecognizer) {
        let settings = AVCapturePhotoSettings()
        let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
        let previewFormat = [
            kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
            kCVPixelBufferWidthKey as String: 160,
            kCVPixelBufferHeightKey as String: 160
        ]
        settings.previewPhotoFormat = previewFormat
        cameraOutput.capturePhoto(with: settings, delegate: self)   
    }
}

extension CameraVC : AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {

        if let error = error {
            print("error occured : \(error.localizedDescription)")
        }

        if let dataImage = photo.fileDataRepresentation() {
            print(UIImage(data: dataImage)?.size as Any)

            let dataProvider = CGDataProvider(data: dataImage as CFData)
            let cgImageRef: CGImage! = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .defaultIntent)
            let image = UIImage(cgImage: cgImageRef, scale: 1.0, orientation: UIImage.Orientation.right)

            /**
               save image in array / do whatever you want to do with the image here
            */
            self.images.append(image)

        } else {
            print("some error here")
        }
    }
}
  • This is the best answer. It focuses on the core aspects to make it work!
    – eharo2
    Commented Mar 14, 2019 at 3:08
  • Great answer. But note that fileDataRepresentation() requires iOS 11.
    – Fraser
    Commented Oct 1, 2019 at 22:45
  • Thank you, this solved it for me. Works even in 2022. Commented Apr 1, 2022 at 9:45
---

I found this project on GitHub that helped me understand device initialization and capture-session setup:

AVCapturePhotoOutput_test by inoue0426

---

The capture delegate function has been changed to photoOutput. Here's the updated function for Swift 4.

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?, previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {
    if let error = error {
        print(error.localizedDescription)
    }

    if let sampleBuffer = photoSampleBuffer, let previewBuffer = previewPhotoSampleBuffer, let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
        print("image: \(String(describing: UIImage(data: dataImage)?.size))") // Your Image
    }
}
---

Exact same answer as the one given by @productioncoder, but I had to call startCamera() in viewDidLoad() instead of viewDidAppear().
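In other words, a sketch of that change:

    override func viewDidLoad() {
        super.viewDidLoad()
        startCamera() // moved here from viewDidAppear(_:)
    }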
