
Direct Access to the Camera

Using the AVFoundation framework you can access the raw data coming from the camera, which allows you to process images in real time. Direct access to the camera is used for Augmented Reality, or to modify an image before saving it to the camera roll. Once you access the camera data using the method described in this tutorial, you can do almost anything with the captured image.

Today I will explain how to access the raw data of the camera using the AVFoundation framework, and how to apply a Core Image filter to the captured data. To do this, you need to use the following classes:

  • AVCaptureSession: the component that connects the inputs and the outputs.
  • AVCaptureDevice: the input (camera, microphone…).
  • AVCaptureDeviceInput: mediates between the device and the AVCaptureSession.
  • AVCaptureVideoDataOutput: passes the data from the camera to its delegate.

Step 1: Create the Project

Create a new Single View Application project, choose iPhone in the Device Family drop-down, and check Use Automatic Reference Counting and Use Storyboards.

Step 2: Adding Frameworks

For this tutorial, you will need to add four frameworks: AVFoundation, Core Image, Core Media, and Core Video. In the project file, in the Summary tab, scroll down to Linked Frameworks and Libraries and click the plus button. Select the four frameworks and click Add.

Step 3: Setting Up the UI

We are going to display the captured data in a UIImageView, so add the following line in ViewController.h:

@property (nonatomic, weak) IBOutlet UIImageView *imageView;

Synthesize it in the implementation file, and in the storyboard drag a UIImageView onto the View Controller. Then Ctrl-drag from the View Controller to the UIImageView and select imageView when prompted.
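
If you are synthesizing explicitly, the line in ViewController.m would simply be:

@synthesize imageView = _imageView;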

Step 4: More Properties

You need to add four more properties to ViewController.h, one for each of the classes we described earlier that are needed to access the camera data. Moreover, you have to import the AVFoundation framework.

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface ViewController : UIViewController

@property (nonatomic, weak) IBOutlet UIImageView *imageView;
@property (nonatomic, strong) AVCaptureSession *session;
@property (nonatomic, strong) AVCaptureDevice *device;
@property (nonatomic, strong) AVCaptureDeviceInput *input;
@property (nonatomic, strong) AVCaptureVideoDataOutput *output;

@end

Don’t forget to synthesize them in ViewController.m.
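
For example, at the top of the @implementation block:

@synthesize session = _session;
@synthesize device = _device;
@synthesize input = _input;
@synthesize output = _output;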

Step 5: Creating the AVCaptureSession

Now you need to set up those four components in viewDidLoad to start capturing the data. You can instantiate the session with a plain alloc init, and then you need to set the resolution you want to work with. Since we want our app to run smoothly, we will use a very low resolution.

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.

    self.session = [[AVCaptureSession alloc] init];
    self.session.sessionPreset = AVCaptureSessionPreset352x288;

Next, get the default AVCaptureDevice for video and create the AVCaptureDeviceInput by passing the device to it. As for the AVCaptureVideoDataOutput, you need to set its video settings so that frames are delivered in the BGRA pixel format.

    self.device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    self.input = [AVCaptureDeviceInput deviceInputWithDevice:self.device error:nil];

    self.output = [[AVCaptureVideoDataOutput alloc] init];
    self.output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                             forKey:(id)kCVPixelBufferPixelFormatTypeKey];

Finally, add the input and the output you have just created to the AVCaptureSession, then call startRunning on it.

    [self.session addInput:self.input];
    [self.session addOutput:self.output];

    [self.session startRunning];
}
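
Note that the snippet above passes nil as the error parameter of deviceInputWithDevice:error: to keep the code short. In a real app you would probably want to check for failure; a minimal sketch could look like this:

NSError *error = nil;
self.input = [AVCaptureDeviceInput deviceInputWithDevice:self.device error:&error];
if (!self.input) {
    // The camera may be unavailable (for example, in the Simulator).
    NSLog(@"Could not create the capture input: %@", error);
    return;
}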

Step 6: Implementing AVCaptureVideoDataOutputSampleBufferDelegate

The code you have written captures the data from the camera, but we aren’t doing anything with that data. To do something, we first need our ViewController to adopt the AVCaptureVideoDataOutputSampleBufferDelegate protocol.

This way, we can set it as the delegate of the AVCaptureVideoDataOutput, and the output object will call our view controller with the pixel data for each captured frame.

The interface declaration in ViewController.h should now look like this:

@interface ViewController : UIViewController <AVCaptureVideoDataOutputSampleBufferDelegate>

When you set the delegate for AVCaptureVideoDataOutput, you need to specify the queue where the frames will be processed. We will use a separate queue to avoid blocking the main thread. If you are not sure what I mean by queues, you should check out this tutorial.

The final viewDidLoad method looks like this:

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.

    self.session = [[AVCaptureSession alloc] init];
    self.session.sessionPreset = AVCaptureSessionPreset352x288;

    self.device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    self.input = [AVCaptureDeviceInput deviceInputWithDevice:self.device error:nil];

    self.output = [[AVCaptureVideoDataOutput alloc] init];
    self.output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                             forKey:(id)kCVPixelBufferPixelFormatTypeKey];

    dispatch_queue_t queue;
    queue = dispatch_queue_create("new_queue", NULL);

    [self.output setSampleBufferDelegate:self queue:queue];

    [self.session addInput:self.input];
    [self.session addOutput:self.output];

    [self.session startRunning];
}
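
Depending on your app, you may also want to stop the capture session when the view controller goes off screen, for example:

- (void)viewWillDisappear:(BOOL)animated
{
    [super viewWillDisappear:animated];
    [self.session stopRunning];
}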

Step 7: Capturing the Output

Now, you need to implement the method -(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection, which will be called every time a new video frame has been captured. From this method, you set the received image as the UIImageView‘s image, so the app behaves like a live camera preview.

However, to set it as the UIImageView‘s image, you first need to create a UIImage from the CMSampleBufferRef received. To do that, you convert it to a CIImage, and to render that CIImage into a CGImage you will need a Core Image context. Go back to ViewController.h and add the following property:

@property (nonatomic, strong) CIContext *context;

To avoid creating a new CIContext every time we capture a frame, we will lazily instantiate this context. Synthesize it in ViewController.m, and add the following method:

@synthesize context = _context;

- (CIContext *)context
{
    if (!_context) {
        _context = [CIContext contextWithOptions:nil];
    }

    return _context;
}

With the CIImage and the CIContext, you can now create a CGImageRef and therefore a UIImage:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVPixelBufferRef pixel_buffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixel_buffer];

    CGImageRef ref = [self.context createCGImage:ciImage fromRect:ciImage.extent];
    UIImage *image = [UIImage imageWithCGImage:ref scale:1.0 orientation:UIImageOrientationRight];

    CGImageRelease(ref);

You cannot set this image as the UIImageView‘s image directly, since this delegate method is called on the queue we created, not on the main thread. UI updates must happen on the main thread, so you have to change imageView‘s image like this:

    [self.imageView performSelectorOnMainThread:@selector(setImage:) withObject:image waitUntilDone:YES];
}
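
If you prefer Grand Central Dispatch blocks over performSelectorOnMainThread:, an equivalent way to push the update onto the main queue would be:

dispatch_async(dispatch_get_main_queue(), ^{
    // Update the UI on the main thread.
    self.imageView.image = image;
});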

Step 8: Applying a CIFilter

If you run the app now, you will see in the screen what is captured with the camera. However, the whole point of accessing the camera using AVFoundation is to be able to process the images in real time. We will now use a CIFilter to see what the camera is capturing in sepia.

To use a filter, you create it by passing in the name of the filter you want to use. On iOS there are a total of 48 different filters, which you can check in the official documentation provided by Apple. Once you have created the filter, you set its attributes and then get the resulting CIImage by calling outputImage on it.
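
If you want to check at runtime which filters are available on a given device, you can also ask Core Image for the names of its built-in filters; a quick sketch:

// Log the names of all built-in Core Image filters.
NSArray *filterNames = [CIFilter filterNamesInCategory:kCICategoryBuiltIn];
NSLog(@"Available filters: %@", filterNames);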

This is what the delegate method looks like with the sepia filter applied:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVPixelBufferRef pixel_buffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixel_buffer];

    CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"];
    [filter setDefaults];
    [filter setValue:ciImage forKey:@"inputImage"];
    [filter setValue:[NSNumber numberWithFloat:1.0] forKey:@"inputIntensity"];

    CIImage *result = [filter outputImage];

    CGImageRef ref = [self.context createCGImage:result fromRect:result.extent];
    UIImage *image = [UIImage imageWithCGImage:ref scale:1.0 orientation:UIImageOrientationRight];

    CGImageRelease(ref);

    [self.imageView performSelectorOnMainThread:@selector(setImage:) withObject:image waitUntilDone:YES];
}

Conclusion

As you can see, accessing the camera directly allows you to process images in real time, so you can start experimenting with Augmented Reality or almost anything else you have in mind. In a future tutorial I will show you how to use iOS 5 face detection. In the meantime, let me know if you have any doubts!

You can find the whole project for this tutorial here:
