12

AVFoundation is great for those who want to get their hands dirty, but there is still a lot of basic functionality that isn't easy to figure out, such as how to save a picture taken with the device to the Photo Album.

Any ideas?

1
  • It doesn't appear that this topic involves Cocoa.
    – El Tomato
    Commented Jan 17, 2014 at 6:19

4 Answers

61
+100

Here is a step by step tutorial on how to capture an image using AVFoundation and save it to photo album.

Add a UIView object to the NIB (or as a subview), and create a @property in your controller:

@property(nonatomic, retain) IBOutlet UIView *vImagePreview;

Connect the UIView to the outlet above in IB, or assign it directly if you’re using code instead of a NIB.

Then edit your UIViewController, and give it the following viewDidAppear method:

-(void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];

    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetMedium;

    CALayer *viewLayer = self.vImagePreview.layer;
    NSLog(@"viewLayer = %@", viewLayer);

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];

    captureVideoPreviewLayer.frame = self.vImagePreview.bounds;
    [self.vImagePreview.layer addSublayer:captureVideoPreviewLayer];

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately (e.g. no camera, or camera access denied).
        NSLog(@"ERROR: trying to open camera: %@", error);
        return;
    }
    [session addInput:input];

    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    [session addOutput:stillImageOutput];

    [session startRunning];
}

Create a new @property to hold a reference to the output object:

@property(nonatomic, retain) AVCaptureStillImageOutput *stillImageOutput;

Then make a UIImageView where we’ll display the captured photo. Add this to your NIB, or programmatically.

Hook it up to another @property, or assign it manually, e.g.:

@property(nonatomic, retain) IBOutlet UIImageView *vImage;

Finally, create a UIButton, so that you can take the photo.

Again, add it to your NIB (or programmatically to your screen), and hook it up to the following method:

-(IBAction)captureNow {
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo] )
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection)
        {
            break;
        }
    }

    NSLog(@"about to request a capture from: %@", stillImageOutput);
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments)
        {
            // Do something with the attachments.
            NSLog(@"attachments: %@", exifAttachments);
        } else {
            NSLog(@"no attachments");
        }

        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];

        self.vImage.image = image;

        UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
     }];
}

You might also need to #import <ImageIO/CGImageProperties.h>.
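Note that UIImageWriteToSavedPhotosAlbum above is called with nil callbacks, so a failed save (for example, when the user denies Photos access) goes unnoticed. A minimal sketch of the callback variant; the selector signature is fixed by UIKit:

```objc
// In captureNow's completion handler, replace the fire-and-forget call with:
UIImageWriteToSavedPhotosAlbum(image, self,
    @selector(image:didFinishSavingWithError:contextInfo:), NULL);
```

Then add the callback method to the same view controller:

```objc
- (void)image:(UIImage *)image
    didFinishSavingWithError:(NSError *)error
                 contextInfo:(void *)contextInfo
{
    if (error) {
        NSLog(@"Error saving photo: %@", error);
    } else {
        NSLog(@"Photo saved to album");
    }
}
```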

Source. Also check this.

3
  • 2
I strongly recommend that others visit the websites linked in this post. They give a deeper insight into what is happening. Great post, iDev!
    – user353955
    Commented Sep 19, 2013 at 8:33
  • Worked great. Maybe [super viewDidAppear:animated] should be called also. Commented Jun 16, 2016 at 7:51
Thanks a lot. I have its Swift version. Anyone who needs it, please let me know.
    – nitin.agam
    Commented Aug 22, 2017 at 7:20
1

According to your question, it sounds like you already have the picture from the camera as NSData or a UIImage. If so, you can add this picture to the album in different ways. AVFoundation itself doesn't provide classes for storing images, so you can use the AssetsLibrary framework (ALAssetsLibrary) to save the image to the Photo Album, or just the UIKit framework with its UIImageWriteToSavedPhotosAlbum function. Both work well.

If you haven't captured the still image yet, look at AVCaptureStillImageOutput's captureStillImageAsynchronouslyFromConnection:completionHandler: method. Anyway, those are the ideas; you can easily find examples on the Internet. Good luck :)
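To make the two routes concrete, here is a sketch assuming you already hold a UIImage named image (and, for the second route, that your target links AssetsLibrary.framework):

```objc
#import <UIKit/UIKit.h>
#import <AssetsLibrary/AssetsLibrary.h>

// Route 1: UIKit, fire-and-forget (pass a target/selector pair instead of
// the nils to receive a completion callback).
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);

// Route 2: ALAssetsLibrary, with a completion block that reports
// the new asset URL or the error.
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageToSavedPhotosAlbum:image.CGImage
                          orientation:(ALAssetOrientation)image.imageOrientation
                      completionBlock:^(NSURL *assetURL, NSError *error) {
    NSLog(@"saved to %@ (error: %@)", assetURL, error);
}];
```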

0

There is a lot to setting up the camera this way that you either aren't doing or aren't showing.

The best place to look would be AVCamCaptureManager.m in the AVCam sample project; particularly setupSession and captureStillImage (which writes the photo to the library).

0

This method works for me:

-(void)saveImageDataToLibrary:(UIImage *)image
{
    NSData *imageData = UIImageJPEGRepresentation(image, 1.0);
    [appDelegate.library writeImageDataToSavedPhotosAlbum:imageData
                                                 metadata:nil
                                          completionBlock:^(NSURL *assetURL, NSError *error)
    {
        if (error) {
            NSLog(@"%@", error.description);
        }
        else {
            NSLog(@"Photo saved successfully");
        }
    }];
}

where appDelegate.library is an ALAssetsLibrary instance.

1
  • thanks, but how did you extract the UIImage from the front camera device using AVFoundation?
    – RollRoll
    Commented Nov 2, 2012 at 14:59
