iOS Tech Share | Adding a Beauty Filter to iOS WebRTC


When using WebRTC, there are generally two ways to apply beauty (skin-smoothing) effects to video: replacing WebRTC's capture module, or processing the video data directly.

1. Replacing WebRTC's Capture Module

Replacing WebRTC's capture module is relatively simple: use GPUImageVideoCamera in place of WebRTC's video capture, take the frames that GPUImage has already run through the beauty filter, and hand them to WebRTC's OnFrame method.

For reference, see a full-platform push/pull streaming SDK built on the WebRTC framework: GitHub

Setting up the beauty filter

- (void)setBeautyFace:(BOOL)beautyFace {
    if (_beautyFace == beautyFace) return;

    _beautyFace = beautyFace;
    // Tear down the old filter chain before rebuilding it.
    [_emptyFilter removeAllTargets];
    [_filter removeAllTargets];
    [_videoCamera removeAllTargets];

    if (_beautyFace) {
        _filter = [[GPUImageBeautifyFilter alloc] init];
        _emptyFilter = [[GPUImageEmptyFilter alloc] init];
    } else {
        _filter = [[GPUImageEmptyFilter alloc] init];
    }

    // Hand each processed frame to processVideo: for conversion and delivery to WebRTC.
    __weak typeof(self) _self = self;
    [_filter setFrameProcessingCompletionBlock:^(GPUImageOutput *output, CMTime time) {
        [_self processVideo:output];
    }];

    [_videoCamera addTarget:_filter];
    if (beautyFace) {
        [_filter addTarget:_emptyFilter];
        if (_gpuImageView) [_emptyFilter addTarget:_gpuImageView];
    } else {
        if (_gpuImageView) [_filter addTarget:_gpuImageView];
    }
}
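
For context, a typical capture setup before toggling the filter might look like the following sketch (the camera and view properties follow the naming in the snippet above):

GPUImageVideoCamera *_videoCamera =
    [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset1280x720
                                        cameraPosition:AVCaptureDevicePositionFront];
_videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
GPUImageView *_gpuImageView = [[GPUImageView alloc] initWithFrame:self.view.bounds];
[self.view addSubview:_gpuImageView];

[self setBeautyFace:YES];          // build the beautify chain
[_videoCamera startCameraCapture]; // frames now flow camera -> filter -> processVideo: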

Format conversion

GPUImage outputs pixels in BGRA format; once processing finishes, they need to be converted to I420 for internal handling and rendering.

At encode time, WebRTC consumes NV12-format pixels, so a second format conversion takes place during encoding.

- (void)processVideo:(GPUImageOutput *)output {
    rtc::CritScope cs(&cs_capture_);
    if (!_isRunning) {
        return;
    }
    @autoreleasepool {
        GPUImageFramebuffer *imageFramebuffer = output.framebufferForOutput;

        size_t width = imageFramebuffer.size.width;
        size_t height = imageFramebuffer.size.height;
        uint32_t size = width * height * 3 / 2; // I420: full-size Y plane + quarter-size U and V planes

        // Reallocate the destination buffer when the frame size changes.
        if (self.nWidth != width || self.nHeight != height) {
            self.nWidth = width;
            self.nHeight = height;
            if (_dst)
                delete[] _dst;
            _dst = NULL;
        }
        if (_dst == NULL) {
            _dst = new uint8_t[size];
        }

        uint8_t *y_pointer = (uint8_t *)_dst;
        uint8_t *u_pointer = y_pointer + width * height;
        uint8_t *v_pointer = u_pointer + width * height / 4;
        int y_pitch = (int)width;
        int u_pitch = (int)((width + 1) >> 1);
        int v_pitch = (int)((width + 1) >> 1);

        // BGRA -> I420 (libyuv's "ARGB" is BGRA in byte order on little-endian).
        libyuv::ARGBToI420([imageFramebuffer byteBuffer], width * 4,
                           y_pointer, y_pitch,
                           u_pointer, u_pitch,
                           v_pointer, v_pitch,
                           width, height);
        // Overwrite the whole frame with a solid near-black color (Y=32, U=V=128) when enabled.
        if (self.bVideoEnable)
            libyuv::I420Rect(y_pointer, y_pitch, u_pointer, u_pitch, v_pointer, v_pitch,
                             0, 0, width, height, 32, 128, 128);

        if (_capturer != nil)
            _capturer->CaptureYUVData(_dst, width, height, size);
    }
}
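
Since the encoder wants NV12, the second conversion can again go through libyuv. A minimal sketch, assuming the I420 source planes laid out as above and a caller-provided nv12_buffer (hypothetical) of width*height*3/2 bytes:

// I420 -> NV12: the Y plane is copied as-is; U and V are interleaved into one UV plane.
int w = (int)width, h = (int)height;
uint8_t *dst_y  = nv12_buffer;         // full-size Y plane
uint8_t *dst_uv = nv12_buffer + w * h; // interleaved UV plane
libyuv::I420ToNV12(y_pointer, y_pitch,
                   u_pointer, u_pitch,
                   v_pointer, v_pitch,
                   dst_y, w,
                   dst_uv, w,          // UV stride = width (U and V interleaved)
                   w, h);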

Sending the beautified data to WebRTC's OnFrame method

The GPUImageVideoCapturer class is a camera class wrapped around GPUImage that mirrors the functionality of WebRTC's own capture class; by inheriting from cricket::VideoCapturer, it can feed the captured video stream into WebRTC.

namespace webrtc {
    // Inherit from cricket::VideoCapturer
    class GPUImageVideoCapturer : public cricket::VideoCapturer {
        ...
    };
}

void GPUImageVideoCapturer::CaptureYUVData(const webrtc::VideoFrame& frame, int width, int height)
{
    VideoCapturer::OnFrame(frame, width, height);
}
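
Note that processVideo: passes raw I420 bytes, while the overload above receives a ready-made webrtc::VideoFrame. Bridging the two might look like the sketch below, assuming an older WebRTC branch where cricket::VideoCapturer still exists (I420Buffer and this VideoFrame constructor are from that era):

void GPUImageVideoCapturer::CaptureYUVData(uint8_t* data, int width, int height, uint32_t size)
{
    // Copy the raw I420 bytes into a reference-counted I420Buffer.
    rtc::scoped_refptr<webrtc::I420Buffer> buffer =
        webrtc::I420Buffer::Create(width, height);
    const uint8_t* src_y = data;
    const uint8_t* src_u = src_y + width * height;
    const uint8_t* src_v = src_u + width * height / 4;
    libyuv::I420Copy(src_y, width,
                     src_u, (width + 1) / 2,
                     src_v, (width + 1) / 2,
                     buffer->MutableDataY(), buffer->StrideY(),
                     buffer->MutableDataU(), buffer->StrideU(),
                     buffer->MutableDataV(), buffer->StrideV(),
                     width, height);

    webrtc::VideoFrame frame(buffer, webrtc::kVideoRotation_0, rtc::TimeMicros());
    VideoCapturer::OnFrame(frame, width, height);
}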

2. Beautifying the Video Data Directly

The idea here is the same one traditional third-party beauty SDKs use: process the internally captured video data in place. The pipeline is: captured data (CVPixelBufferRef) → convert to a texture (GLuint) → apply the beauty filter to the texture → convert the filtered texture back to a CVPixelBufferRef → hand the result back to WebRTC for rendering, encoding, and transmission.

Synchronous processing

Internal processing usually runs on a synchronous queue, which guarantees that data flows through the pipeline linearly; compare this snippet from GPUImage:

runSynchronouslyOnVideoProcessingQueue(^{
    // beauty processing
});
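
Putting the whole per-frame pipeline on that queue might look like the sketch below. convertYUVPixelBufferToTexture: and convertTextureToPixelBuffer:textureSize: are implemented later in this article, while applyBeautyFilterToTexture:size: is a hypothetical stand-in for the GPUImage filter step:

- (CVPixelBufferRef)beautifyPixelBuffer:(CVPixelBufferRef)pixelBuffer {
    __block CVPixelBufferRef result = NULL;
    runSynchronouslyOnVideoProcessingQueue(^{
        CGSize size = CGSizeMake(CVPixelBufferGetWidth(pixelBuffer),
                                 CVPixelBufferGetHeight(pixelBuffer));
        // CVPixelBufferRef -> GLuint
        GLuint texture = [self convertYUVPixelBufferToTexture:pixelBuffer];
        // Run the beauty filter on the texture (hypothetical helper).
        GLuint filtered = [self applyBeautyFilterToTexture:texture size:size];
        // GLuint -> CVPixelBufferRef
        result = [self convertTextureToPixelBuffer:filtered textureSize:size];
    });
    return result; // hand back to WebRTC for rendering/encoding/transmission
}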

Converting CVPixelBufferRef data to a texture (GLuint)

Conversion for RGB-format buffers
  • Via the CoreVideo framework: create a CVOpenGLESTextureRef texture and obtain its texture id with CVOpenGLESTextureGetName(texture).

    - (GLuint)convertRGBPixelBufferToTexture:(CVPixelBufferRef)pixelBuffer {
        if (!pixelBuffer) {
            return 0;
        }
        CGSize textureSize = CGSizeMake(CVPixelBufferGetWidth(pixelBuffer),
                                        CVPixelBufferGetHeight(pixelBuffer));
        CVOpenGLESTextureRef texture = nil;
        CVReturn status = CVOpenGLESTextureCacheCreateTextureFromImage(nil,
                                                                       [[GPUImageContext sharedImageProcessingContext] coreVideoTextureCache],
                                                                       pixelBuffer,
                                                                       nil,
                                                                       GL_TEXTURE_2D,
                                                                       GL_RGBA,
                                                                       textureSize.width,
                                                                       textureSize.height,
                                                                       GL_BGRA,
                                                                       GL_UNSIGNED_BYTE,
                                                                       0,
                                                                       &texture);

        if (status != kCVReturnSuccess) {
            NSLog(@"Can't create texture");
        }
        // Keep a reference so the texture is not released while still in use.
        self.renderTexture = texture;
        return CVOpenGLESTextureGetName(texture);
    }
  • Via OpenGL: create a texture object and upload the image data from the CVPixelBufferRef into it with glTexImage2D.

        glBindTexture(GL_TEXTURE_2D, [outputFramebuffer texture]);
        glTexImage2D(GL_TEXTURE_2D, 0, _pixelFormat==GPUPixelFormatRGB ? GL_RGB : GL_RGBA, (int)uploadedImageSize.width, (int)uploadedImageSize.height, 0, (GLint)_pixelFormat, (GLenum)_pixelType, bytesToUpload);
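
The two lines above come from GPUImage's upload path. A more self-contained sketch of the same idea, assuming a BGRA pixel buffer with no row padding (on OpenGL ES, GL_BGRA uploads rely on the APPLE BGRA extension):

        CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
        GLubyte *bytes = (GLubyte *)CVPixelBufferGetBaseAddress(pixelBuffer);
        int width  = (int)CVPixelBufferGetWidth(pixelBuffer);
        int height = (int)CVPixelBufferGetHeight(pixelBuffer);

        glBindTexture(GL_TEXTURE_2D, textureID);
        // Note: if CVPixelBufferGetBytesPerRow != width * 4, the rows are padded and
        // must be uploaded line by line (or via GL_UNPACK_ROW_LENGTH on ES3).
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                     GL_BGRA, GL_UNSIGNED_BYTE, bytes);
        CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

Whichever path you take, remember that textures created through the CoreVideo texture cache stay alive until the CVOpenGLESTextureRef is released and CVOpenGLESTextureCacheFlush is called, so hold the reference only as long as the frame is being processed.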
Conversion for YUV-format buffers

- (GLuint)convertYUVPixelBufferToTexture:(CVPixelBufferRef)pixelBuffer {
    if (!pixelBuffer) {
        return 0;
    }
    
    CGSize textureSize = CGSizeMake(CVPixelBufferGetWidth(pixelBuffer),
                                    CVPixelBufferGetHeight(pixelBuffer));

    [EAGLContext setCurrentContext:self.context];
    
    GLuint frameBuffer;
    GLuint textureID;
    
    // FBO
    glGenFramebuffers(1, &frameBuffer);
    glBindFramebuffer(GL_FRAMEBUFFER, frameBuffer);
    
    // texture
    glGenTextures(1, &textureID);
    glBindTexture(GL_TEXTURE_2D, textureID);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, textureSize.width, textureSize.height, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, textureID, 0);
    
    glViewport(0, 0, textureSize.width, textureSize.height);
    
    // program
    glUseProgram(self.yuvConversionProgram);
    
    // texture
    CVOpenGLESTextureRef luminanceTextureRef = nil;
    CVOpenGLESTextureRef chrominanceTextureRef = nil;

    CVReturn status = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                                   self.textureCache,
                                                                   pixelBuffer,
                                                                   nil,
                                                                   GL_TEXTURE_2D,
                                                                   GL_LUMINANCE,
                                                                   textureSize.width,
                                                                   textureSize.height,
                                                                   GL_LUMINANCE,
                                                                   GL_UNSIGNED_BYTE,
                                                                   0,
                                                                   &luminanceTextureRef);
    if (status != kCVReturnSuccess) {
        NSLog(@"Can't create luminanceTexture");
    }
    
    status = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                          self.textureCache,
                                                          pixelBuffer,
                                                          nil,
                                                          GL_TEXTURE_2D,
                                                          GL_LUMINANCE_ALPHA,
                                                          textureSize.width / 2,
                                                          textureSize.height / 2,
                                                          GL_LUMINANCE_ALPHA,
                                                          GL_UNSIGNED_BYTE,
                                                          1,
                                                          &chrominanceTextureRef);
    
    if (status != kCVReturnSuccess) {
        NSLog(@"Can't create chrominanceTexture");
    }
    
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, CVOpenGLESTextureGetName(luminanceTextureRef));
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glUniform1i(glGetUniformLocation(self.yuvConversionProgram, "luminanceTexture"), 0);
    
    glActiveTexture(GL_TEXTURE1);
    glBindTexture(GL_TEXTURE_2D, CVOpenGLESTextureGetName(chrominanceTextureRef));
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glUniform1i(glGetUniformLocation(self.yuvConversionProgram, "chrominanceTexture"), 1);
    
    // BT.601 full-range YUV -> RGB conversion matrix.
    GLfloat kXDXPreViewColorConversion601FullRange[] = {
        1.0,    1.0,    1.0,
        0.0,    -0.343, 1.765,
        1.4,    -0.711, 0.0,
    };
    
    GLuint yuvConversionMatrixUniform = glGetUniformLocation(self.yuvConversionProgram, "colorConversionMatrix");
    glUniformMatrix3fv(yuvConversionMatrixUniform, 1, GL_FALSE, kXDXPreViewColorConversion601FullRange);
    
    // VBO
    glBindBuffer(GL_ARRAY_BUFFER, self.VBO);
    
    GLuint positionSlot = glGetAttribLocation(self.yuvConversionProgram, "position");
    glEnableVertexAttribArray(positionSlot);
    glVertexAttribPointer(positionSlot, 3, GL_FLOAT, GL_FALSE, 5 * sizeof(float), (void*)0);
    
    GLuint textureSlot = glGetAttribLocation(self.yuvConversionProgram, "inputTextureCoordinate");
    glEnableVertexAttribArray(textureSlot);
    glVertexAttribPointer(textureSlot, 2, GL_FLOAT, GL_FALSE, 5 * sizeof(float), (void*)(3* sizeof(float)));
    
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
    
    glDeleteFramebuffers(1, &frameBuffer);
    
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    
    glFlush();
    
    self.luminanceTexture = luminanceTextureRef;
    self.chrominanceTexture = chrominanceTextureRef;
    if (luminanceTextureRef) {
        CFRelease(luminanceTextureRef);
    }
    if (chrominanceTextureRef) {
        CFRelease(chrominanceTextureRef);
    }

    return textureID;
}
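
One caveat: the matrix above is for full-range BT.601 data (kCVPixelFormatType_420YpCbCr8BiPlanarFullRange). If the capture session delivers video-range buffers instead, the standard video-range coefficients are needed; shown here as an assumption based on the usual BT.601 definitions, not taken from the article's source:

// BT.601 video-range YUV -> RGB (for kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange).
GLfloat kColorConversion601VideoRange[] = {
    1.164,  1.164,  1.164,
    0.0,   -0.392,  2.017,
    1.596, -0.813,  0.0,
};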

Loading a filter with GPUImageTextureInput and outputting data with GPUImageTextureOutput

    [GPUImageContext setActiveShaderProgram:nil];
    GPUImageTextureInput *textureInput = [[GPUImageTextureInput alloc] initWithTexture:textureID size:size];
    GPUImageSmoothToonFilter *filter = [[GPUImageSmoothToonFilter alloc] init];
    [textureInput addTarget:filter];
    GPUImageTextureOutput *textureOutput = [[GPUImageTextureOutput alloc] init];
    [filter addTarget:textureOutput];
    [textureInput processTextureWithFrameTime:kCMTimeZero];

Once processTextureWithFrameTime: returns, textureOutput holds the filtered texture.
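
Reading the result back follows GPUImageTextureOutput's API; a brief sketch:

    GLuint filteredTextureID = textureOutput.texture; // the filtered texture id
    // ... convert it back to a CVPixelBufferRef (next step) ...
    [textureOutput doneWithTexture];                  // let GPUImage recycle the texture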

Converting the GPUImageTextureOutput texture back to CVPixelBufferRef data

- (CVPixelBufferRef)convertTextureToPixelBuffer:(GLuint)texture
                                    textureSize:(CGSize)textureSize {
    [EAGLContext setCurrentContext:self.context];
    
    CVPixelBufferRef pixelBuffer = [self createPixelBufferWithSize:textureSize];
    GLuint targetTextureID = [self convertRGBPixelBufferToTexture:pixelBuffer];
    
    GLuint frameBuffer;
    
    // FBO
    glGenFramebuffers(1, &frameBuffer);
    glBindFramebuffer(GL_FRAMEBUFFER, frameBuffer);
    
    // texture
    glBindTexture(GL_TEXTURE_2D, targetTextureID);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, textureSize.width, textureSize.height, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, targetTextureID, 0);
    
    glViewport(0, 0, textureSize.width, textureSize.height);
    
    // program
    glUseProgram(self.normalProgram);
    
    // texture
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glUniform1i(glGetUniformLocation(self.normalProgram, "renderTexture"), 0);
    
    // VBO
    glBindBuffer(GL_ARRAY_BUFFER, self.VBO);
    
    GLuint positionSlot = glGetAttribLocation(self.normalProgram, "position");
    glEnableVertexAttribArray(positionSlot);
    glVertexAttribPointer(positionSlot, 3, GL_FLOAT, GL_FALSE, 5 * sizeof(float), (void*)0);
    
    GLuint textureSlot = glGetAttribLocation(self.normalProgram, "inputTextureCoordinate");
    glEnableVertexAttribArray(textureSlot);
    glVertexAttribPointer(textureSlot, 2, GL_FLOAT, GL_FALSE, 5 * sizeof(float), (void*)(3* sizeof(float)));
    
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
    
    glDeleteFramebuffers(1, &frameBuffer);
    
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    
    glFlush();
    
    return pixelBuffer;
}
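
The createPixelBufferWithSize: helper is not shown in the original. A plausible implementation creates an IOSurface-backed BGRA buffer so OpenGL can render into it through the texture cache (a sketch; the method name is taken from the call above):

- (CVPixelBufferRef)createPixelBufferWithSize:(CGSize)size {
    NSDictionary *attributes = @{
        // IOSurface backing lets the GPU write into the buffer via the texture cache.
        (id)kCVPixelBufferIOSurfacePropertiesKey : @{},
        (id)kCVPixelBufferOpenGLESCompatibilityKey : @YES,
    };
    CVPixelBufferRef pixelBuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                          (size_t)size.width,
                                          (size_t)size.height,
                                          kCVPixelFormatType_32BGRA,
                                          (__bridge CFDictionaryRef)attributes,
                                          &pixelBuffer);
    if (status != kCVReturnSuccess) {
        NSLog(@"Can't create pixel buffer");
        return NULL;
    }
    return pixelBuffer;
}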

Finally, return the beautified CVPixelBufferRef synchronously to the SDK for rendering and transmission.

3. Summary

Beauty effects have become a standard feature of audio/video applications. Besides the two approaches above, you can also integrate a third-party beauty SDK: most audio/video vendors support custom (external) capture, and third-party beauty SDKs provide a capture-plus-beauty camera, so the two combine seamlessly. If your app's requirements are modest, the beauty features built into the audio/video SDK (whitening, smoothing, rosiness) are enough; for entertainment scenarios that also need face reshaping (face slimming, eye enlargement) and 2D/3D stickers, integrating a third-party beauty SDK is a must.
