How to convert a Color32[] image from WebCamTexture to a UIImage in iOS?

Does anyone know how to do this efficiently? I am using the EncodeToPNG() function, but the performance is really slow.

I am trying to capture the iPad camera image using WebCamTexture and send it to the Objective-C side for processing. I noticed that it is possible to send the native address of the texture, but how should I handle it on the Objective-C side? Does anyone have any tips for this?
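For reference, if the Color32[] itself ends up being marshalled to the native side (e.g. pinned and passed as a pointer through DllImport("__Internal")), I imagine the raw bytes could be wrapped into a UIImage along these lines — a rough sketch only, where the function name is made up and the RGBA byte layout and bottom-up row order of Unity pixel data are assumptions:

#import <UIKit/UIKit.h>
#import <string.h>

// Hypothetical entry point, called from C# via [DllImport("__Internal")].
// `pixels` is assumed to point at Color32[] data: 4 bytes per pixel in
// R,G,B,A order, with rows stored bottom-to-top, hence the vertical flip.
UIImage *UIImageFromColor32(const unsigned char *pixels, int width, int height)
{
    size_t bytesPerRow = (size_t)width * 4;
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // Core Graphics has no non-premultiplied RGBA context, so premultiplied-last
    // is the usual compromise; camera frames are opaque, so nothing changes.
    CGContextRef context = CGBitmapContextCreate(NULL, width, height, 8, bytesPerRow,
                                                 colorSpace,
                                                 kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    // Copy rows in reverse order to flip the image right side up.
    unsigned char *dst = CGBitmapContextGetData(context);
    for (int row = 0; row < height; row++)
        memcpy(dst + (size_t)row * bytesPerRow,
               pixels + (size_t)(height - 1 - row) * bytesPerRow,
               bytesPerRow);

    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    return image;
}

On the C# side this would pair with something like pinning the Color32[] with GCHandle before the call, but I'm not sure it beats EncodeToPNG by much since the GetPixels32() readback is itself costly.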

Thanks!


Since, as you say, EncodeToPNG() is too slow, what you need to do is hook into the code before the camera feed is sent from the iOS (Objective-C) side to Unity's WebCamTexture.

We used a similar technique in a plugin called CameraCaptureKit ( https://www.assetstore.unity3d.com/en/#!/content/56673 ) to freeze the image sent to Unity while waiting for the flash and anti-shake to be turned on.

In the Xcode project generated by Unity, open up CameraCapture.mm and find this function:

- (void)captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection*)connection

You can then alter the image handed to Unity's WebCamTexture by modifying this part of the function; UnityDidCaptureVideoFrame is the call you want to hook into:

// Sample the camera frame into the texture that backs Unity's WebCamTexture.
intptr_t tex = (intptr_t)CMVideoSampling_SampleBuffer(&self->_cmVideoSampling, sampleBuffer, &self->_width, &self->_height);
// Notify Unity that a new frame is available.
UnityDidCaptureVideoFrame(tex, self->_userData);
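If you also need a UIImage at that point for your own processing, the sampleBuffer already delivered to that callback can be converted directly — a minimal sketch, assuming the session delivers BGRA pixel buffers (the common format for this capture pipeline):

#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>

// Sketch: wrap the CMSampleBufferRef from captureOutput:... in a UIImage.
static UIImage *UIImageFromSampleBuffer(CMSampleBufferRef sampleBuffer)
{
    // Lock the pixel buffer so its base address can be read from the CPU.
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    void *base = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t width = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);

    // BGRA little-endian maps to premultiplied-first alpha, 32-bit little-endian order.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(base, width, height, 8, bytesPerRow,
                                                 colorSpace,
                                                 kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage *image = [UIImage imageWithCGImage:cgImage];

    CGImageRelease(cgImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    return image;
}

Doing the conversion here, on the frame the camera already delivers in CPU memory, avoids both the GPU readback and the PNG encoding that make EncodeToPNG() so slow.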

Cheers
