How to Obtain the Ambient Light Level on iOS Using the Camera
This article describes a way to read the ambient light (brightness) level on iOS through the camera, shared here for reference. The details are as follows.
Without further ado, here is the code:
#import"LightSensitiveViewController.h" @importAVFoundation; #import@interfaceLightSensitiveViewController() @property(nonatomic,strong)AVCaptureSession*session; @end @implementationLightSensitiveViewController -(void)viewDidLoad{ [superviewDidLoad]; //Doanyadditionalsetupafterloadingtheview. self.view.backgroundColor=[UIColorwhiteColor]; self.navigationItem.title=@"光感"; [selflightSensitive]; } -(void)didReceiveMemoryWarning{ [superdidReceiveMemoryWarning]; //Disposeofanyresourcesthatcanberecreated. } #pragmamark-光感 -(void)lightSensitive{ //1.获取硬件设备 AVCaptureDevice*device=[AVCaptureDevicedefaultDeviceWithMediaType:AVMediaTypeVideo]; //2.创建输入流 AVCaptureDeviceInput*input=[[AVCaptureDeviceInputalloc]initWithDevice:deviceerror:nil]; //3.创建设备输出流 AVCaptureVideoDataOutput*output=[[AVCaptureVideoDataOutputalloc]init]; [outputsetSampleBufferDelegate:selfqueue:dispatch_get_main_queue()]; //AVCaptureSession属性 self.session=[[AVCaptureSessionalloc]init]; //设置为高质量采集率 [self.sessionsetSessionPreset:AVCaptureSessionPresetHigh]; //添加会话输入和输出 if([self.sessioncanAddInput:input]){ [self.sessionaddInput:input]; } if([self.sessioncanAddOutput:output]){ [self.sessionaddOutput:output]; } //9.启动会话 [self.sessionstartRunning]; } #pragmamark-AVCaptureVideoDataOutputSampleBufferDelegate的方法 -(void)captureOutput:(AVCaptureOutput*)captureOutputdidOutputSampleBuffer:(CMSampleBufferRef)sampleBufferfromConnection:(AVCaptureConnection*)connection{ CFDictionaryRefmetadataDict=CMCopyDictionaryOfAttachments(NULL,sampleBuffer,kCMAttachmentMode_ShouldPropagate); NSDictionary*metadata=[[NSMutableDictionaryalloc]initWithDictionary:(__bridgeNSDictionary*)metadataDict]; CFRelease(metadataDict); NSDictionary*exifMetadata=[[metadataobjectForKey:(NSString*)kCGImagePropertyExifDictionary]mutableCopy]; floatbrightnessValue=[[exifMetadataobjectForKey:(NSString*)kCGImagePropertyExifBrightnessValue]floatValue]; NSLog(@"%f",brightnessValue); //根据brightnessValue的值来打开和关闭闪光灯 AVCaptureDevice*device=[AVCaptureDevicedefaultDeviceWithMediaType:AVMediaTypeVideo]; BOOLresult=[devicehasTorch];//判断设备是否有闪光灯 if((brightnessValue<0)&&result){//打开闪光灯 [devicelockForConfiguration:nil]; [devicesetTorchMode:AVCaptureTorchModeOn];//开 [deviceunlockForConfiguration]; }elseif((brightnessValue>0)&&result){//关闭闪光灯 [devicelockForConfiguration:nil]; [devicesetTorchMode:AVCaptureTorchModeOff];//关 [deviceunlockForConfiguration]; } } @end
Notes:
- First, import the AVFoundation framework and the ImageIO/ImageIO.h header.
- Make the view controller conform to the AVCaptureVideoDataOutputSampleBufferDelegate protocol.
- Declare the AVCaptureSession object as a property so that something keeps a strong reference to it. If the session is only defined and initialized locally inside the lightSensitive method, it will be released prematurely and [self.session startRunning]; will have no effect.
- In the AVCaptureVideoDataOutputSampleBufferDelegate callback, brightnessValue is the ambient brightness reading. It ranges roughly from -5 to 12; the larger the value, the brighter the surroundings (a rough interpretation sketch follows this list).
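To give a feel for that range, here is a small, purely illustrative helper; the method name and thresholds are assumptions, not values from the original article. You could call it from the delegate method above after reading brightnessValue:

// Hypothetical helper: rough buckets for the EXIF BrightnessValue.
// The thresholds below are illustrative assumptions only.
- (NSString *)descriptionForBrightnessValue:(float)brightnessValue {
    if (brightnessValue < 0) {
        return @"dark";    // e.g. a covered lens or a very dark room
    } else if (brightnessValue < 5) {
        return @"indoor";  // typical indoor lighting
    } else {
        return @"bright";  // daylight or strong artificial light
    }
}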