How to get picture frames frame by frame in Xcode

Hi, I want to grab frames from the iPhone's rear camera. Here is what I have done so far:

  • I open the camera in full-screen mode.
    - (IBAction)showCameraUI {
        [self startCameraControllerFromViewController:self usingDelegate:self];
    }

    - (BOOL)startCameraControllerFromViewController:(UIViewController *)controller usingDelegate:(id)delegate {

    if (([UIImagePickerController isSourceTypeAvailable:
          UIImagePickerControllerSourceTypeCamera] == NO)
        || (delegate == nil)
        || (controller == nil))
        return NO;
    
    
    UIImagePickerController *cameraUI = [[UIImagePickerController alloc] init];
    cameraUI.sourceType = UIImagePickerControllerSourceTypeCamera;
    
    // Displays a control that allows the user to choose picture or
    // movie capture, if both are available:
    cameraUI.mediaTypes =
    [UIImagePickerController availableMediaTypesForSourceType:
     UIImagePickerControllerSourceTypeCamera];
    
    // Hides the controls for moving & scaling pictures, or for
    // trimming movies. To instead show the controls, use YES.
    cameraUI.allowsEditing = NO;
    
    cameraUI.delegate = delegate;
    cameraUI.showsCameraControls=NO;
    cameraUI.navigationBarHidden=YES;
    cameraUI.toolbarHidden=YES;
    cameraUI.wantsFullScreenLayout=YES;
    cameraUI.cameraViewTransform = CGAffineTransformScale(cameraUI.cameraViewTransform, CAMERA_SCALAR_SX, CAMERA_SCALAR_SY);
    UIButton *btnRecording = [UIButton buttonWithType:UIButtonTypeRoundedRect];
    CGRect buttonRect = CGRectMake(190, 420, 100, 39); // position in the parent view and set the size of the button
    btnRecording.frame = buttonRect;
    [btnRecording setTitle:@"Recording" forState:UIControlStateNormal];
    // add targets and actions
    [btnRecording addTarget:self action:@selector(buttonClicked:) forControlEvents:UIControlEventTouchUpInside];
    cameraUI.cameraOverlayView = btnRecording;
    [controller presentModalViewController: cameraUI animated: YES];
    return YES;
    

    }
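The Recording button above is wired to `buttonClicked:`, which isn't shown. A minimal sketch of what that handler presumably does, assuming the tap is simply meant to kick off the frame capture (the body here is an assumption, not code from the original):

    - (void)buttonClicked:(UIButton *)sender {
        // Assumed intent: start the AVCapture pipeline when Recording is tapped.
        [self setupCaptureSession];
    }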

  • Set up AVCapture so that pictures can be grabbed frame by frame.

    - (void)setupCaptureSession {
    NSError *error = nil;

    // Create the session
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    

    // session.AVCaptureTorchModeOn = YES;

    // Configure the session to produce lower resolution video frames, if your
    // processing algorithm can cope. We'll specify medium quality for the
    // chosen device.
    session.sessionPreset = AVCaptureSessionPresetLow;
    
    // Find a suitable AVCaptureDevice
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([device hasTorch]) {
        [device lockForConfiguration:nil];
        [device setTorchMode:AVCaptureTorchModeOn];  // use AVCaptureTorchModeOff to turn off
        [device unlockForConfiguration];
    }
    // Create a device input with the device and add it to the session.
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (!input) {
        // Handle the error appropriately.
        NSLog(@"deviceInputWithDevice failed: %@", error);
        return;
    }
    [session addInput:input];
    
    // Create a VideoDataOutput and add it to the session
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init] ;
    

    // output.alwaysDiscardsLateVideoFrames = YES;
    [session addOutput:output];

    dispatch_queue_t queue = dispatch_queue_create("MyQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);
    
    // Specify the pixel format
    output.videoSettings =
    [NSDictionary dictionaryWithObject:
     [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    
    
    // If you wish to cap the frame rate to a known value, such as 15 fps, set
    // minFrameDuration.
    

    // output.minFrameDuration = CMTimeMake(1,1);

    // Start the session running to start the flow of data
      NSLog(@"session is going to start at here");
    [session startRunning];
    
    // Assign session to an ivar.
    //[self setSession:session];
    

    }
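One detail worth flagging: the sample's own comment says to assign the session to an ivar, but that call is commented out, so `session` stays a local variable and can be released as soon as `setupCaptureSession` returns, silently stopping the flow of frames. A minimal sketch of the retained variant, assuming a `session` property that the original doesn't show:

    // Hypothetical: a property on the class that owns setupCaptureSession
    // (not shown in the original), e.g. in the class extension:
    //
    //     @property (nonatomic, retain) AVCaptureSession *session;

    // ...and at the end of -setupCaptureSession:
    [session startRunning];
    [self setSession:session];  // keep a strong reference so the session outlives this method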

    // Create a UIImage from sample buffer data
    - (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    NSLog(@"picture is getting");
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    
    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    if (!colorSpace)
    {
        NSLog(@"CGColorSpaceCreateDeviceRGB failure");
        return nil;
    }
    
    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    // Get the data size for contiguous planes of the pixel buffer.
    size_t bufferSize = CVPixelBufferGetDataSize(imageBuffer);
    
    // Create a Quartz direct-access data provider that uses data we supply
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, baseAddress, bufferSize,
                                                              NULL);
    // Create a bitmap image from data supplied by our data provider
    CGImageRef cgImage =
    CGImageCreate(width,
                  height,
                  8,
                  32,
                  bytesPerRow,
                  colorSpace,
                  kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Little,
                  provider,
                  NULL,
                  true,
                  kCGRenderingIntentDefault);
    CGDataProviderRelease(provider);
    CGColorSpaceRelease(colorSpace);
    
    // Create and return an image object representing the specified Quartz image
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    
    return image;
    

    }

    // Delegate routine that gets called when a sample buffer is written
    - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // Create a UIImage from the sample buffer data
    NSLog(@"picture is getting");
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];

    // [self.delegate cameraCaptureGotFrame:image];
    }
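A practical note on this callback: AVFoundation invokes it on the "MyQueue" dispatch queue set up earlier, not on the main thread, so any UIKit work with the resulting image has to be bounced back to the main queue. A sketch, using a hypothetical `imageView` property that is not in the original:

    // Inside captureOutput:didOutputSampleBuffer:fromConnection:
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = image;  // imageView is an assumed outlet, not in the original
    });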

Right now the delegate method captureOutput:didOutputSampleBuffer:fromConnection: is never called. I don't know what I am doing wrong. Any help would be appreciated. Thanks in advance.
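For what it's worth, the usual suspects when this delegate stays silent are: the output never got added to the session, the delegate was never set, the session was released before any frames arrived (see the note above), or the camera is already owned by the presented UIImagePickerController, since the picker and a separate AVCaptureSession cannot both use the camera at the same time. A quick diagnostic sketch one could drop in right after `startRunning` (assumptions to narrow things down, not a confirmed fix):

    NSLog(@"inputs: %@ outputs: %@ running: %d",
          session.inputs, session.outputs, session.isRunning);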
