Newer WebRTC source code no longer contains a VideoEngine counterpart to VoiceEngine; it has been replaced by MediaEngine. MediaEngine consists of the MediaEngineInterface interface and its implementation, CompositeMediaEngine. CompositeMediaEngine is itself a class template whose two template parameters are an audio engine and a video engine; its derived class WebRtcMediaEngine instantiates the template with WebRtcVoiceEngine and WebRtcVideoEngine2. In the figure above, the base directory contains the abstract classes and the engine directory contains their implementations; in practice you call the interfaces in the engine directory directly. WebRtcVoiceEngine is in fact another wrapper around VoiceEngine and uses VoiceEngine for audio processing.
Note the naming: WebRtcVideoEngine2 carries a "2", which unsurprisingly marks it as an upgraded VideoEngine; there is also an older WebRtcVideoEngine class.
WebRtcVideoEngine2 improves on WebRtcVideoEngine by splitting the video stream in two, a send stream (WebRtcVideoSendStream) and a receive stream (WebRtcVideoReceiveStream), which makes the structure more reasonable and the source code clearer.
This article's implementation mainly uses the WebRtcVideoCapturer class from WebRtcVideoEngine2.

1. Environment

See the previous article.

2. Implementation

Open WebRtcVideoCapturer's header, webrtcvideocapturer.h. The public functions are essentially implementations of the VideoCapturer base class in the base directory and are used to initialize the device and start capturing. The private functions OnIncomingCapturedFrame and OnCaptureDelayChanged are invoked as callbacks from the capture module, VideoCaptureModule: captured images are delivered to OnIncomingCapturedFrame, and changes in capture delay are delivered to OnCaptureDelayChanged.

WebRTC also implements a signal-and-slot mechanism similar to Qt's (see the earlier post). As mentioned there, the emit function name in sigslot.h clashes with Qt's emit macro, so I renamed emit to Emit in sigslot.h; after this change the rtc_base project must be rebuilt.

The VideoCapturer class exposes two signals: sigslot::signal2<VideoCapturer*, CaptureState> SignalStateChange and sigslot::signal2<VideoCapturer*, const CapturedFrame*, sigslot::multi_threaded_local> SignalFrameCaptured. As SignalFrameCaptured's parameters show, implementing a matching slot is all it takes to receive each CapturedFrame; the slot then converts the frame and displays it. SignalStateChange's CaptureState parameter is an enum identifying the capture state (stopped, starting, running, failed). SignalFrameCaptured is emitted from the OnIncomingCapturedFrame callback, which performs the emission asynchronously (see the earlier post).
mainwindow.h
```cpp
#ifndef MAINWINDOW_H
#define MAINWINDOW_H

#include <QMainWindow>
#include <QStringList>
#include <memory>

// Include paths follow the old WebRTC checkout layout and may differ
// between WebRTC versions.
#include "webrtc/media/engine/webrtcvideocapturer.h"
#include "webrtc/media/engine/webrtcvideoframe.h"

namespace Ui {
class MainWindow;
}

// Deriving from sigslot::has_slots<> lets WebRTC's signals connect to
// member functions of this class.
class MainWindow : public QMainWindow, public sigslot::has_slots<>
{
    Q_OBJECT

public:
    explicit MainWindow(QWidget *parent = 0);
    ~MainWindow();

    // Slots for WebRTC's sigslot signals (not Qt slots).
    void OnFrameCaptured(cricket::VideoCapturer* capturer,
                         const cricket::CapturedFrame* frame);
    void OnStateChange(cricket::VideoCapturer* capturer,
                       cricket::CaptureState state);

private slots:
    void on_pushButtonOpen_clicked();

private:
    void getDeviceList();

    Ui::MainWindow *ui;
    std::unique_ptr<cricket::WebRtcVideoCapturer> videoCapturer;
    std::unique_ptr<cricket::WebRtcVideoFrame> videoFrame;
    std::unique_ptr<uint8_t[]> videoImage;
    QStringList deviceNameList;
    QStringList deviceIDList;
};

#endif // MAINWINDOW_H
```

mainwindow.cpp
```cpp
#include "mainwindow.h"
#include "ui_mainwindow.h"

#include <QDebug>

MainWindow::MainWindow(QWidget *parent) :
    QMainWindow(parent),
    ui(new Ui::MainWindow),
    videoCapturer(new cricket::WebRtcVideoCapturer()),
    videoFrame(new cricket::WebRtcVideoFrame())
{
    ui->setupUi(this);
    getDeviceList();
}

MainWindow::~MainWindow()
{
    // Stop capture before tearing down the UI that the frame callback touches.
    videoCapturer->SignalFrameCaptured.disconnect(this);
    videoCapturer->SignalStateChange.disconnect(this);
    videoCapturer->Stop();
    delete ui;
}

void MainWindow::OnFrameCaptured(cricket::VideoCapturer* capturer,
                                 const cricket::CapturedFrame* frame)
{
    videoFrame->Init(frame, frame->width, frame->height, true);
    // Convert the captured frame to RGB:
    // buffer size = width * height * 4 bytes (32-bit ARGB),
    // row stride  = width * 4 bytes.
    videoFrame->ConvertToRgbBuffer(cricket::FOURCC_ARGB,
                                   videoImage.get(),
                                   videoFrame->width() * videoFrame->height() * 32 / 8,
                                   videoFrame->width() * 32 / 8);
    QImage image(videoImage.get(), videoFrame->width(), videoFrame->height(),
                 QImage::Format_RGB32);
    ui->label->setPixmap(QPixmap::fromImage(image));
}

void MainWindow::OnStateChange(cricket::VideoCapturer* capturer,
                               cricket::CaptureState state)
{
}

void MainWindow::getDeviceList()
{
    deviceNameList.clear();
    deviceIDList.clear();
    webrtc::VideoCaptureModule::DeviceInfo *info =
        webrtc::VideoCaptureFactory::CreateDeviceInfo(0);
    int deviceNum = info->NumberOfDevices();
    for (int i = 0; i < deviceNum; ++i) {
        const uint32_t kSize = 256;
        char name[kSize] = {0};
        char id[kSize] = {0};
        if (info->GetDeviceName(i, name, kSize, id, kSize) != -1) {
            deviceNameList.append(QString(name));
            deviceIDList.append(QString(id));
            ui->comboBoxDeviceList->addItem(QString(name));
        }
    }
    if (deviceNum == 0) {
        ui->pushButtonOpen->setEnabled(false);
    }
}

void MainWindow::on_pushButtonOpen_clicked()
{
    static bool flag = true;
    if (flag) {
        ui->pushButtonOpen->setText(QStringLiteral("Close"));
        const std::string kDeviceName =
            ui->comboBoxDeviceList->currentText().toStdString();
        const std::string kDeviceId =
            deviceIDList.at(ui->comboBoxDeviceList->currentIndex()).toStdString();
        videoCapturer->Init(cricket::Device(kDeviceName, kDeviceId));
        int width = videoCapturer->GetSupportedFormats()->at(0).width;
        int height = videoCapturer->GetSupportedFormats()->at(0).height;
        cricket::VideoFormat format(videoCapturer->GetSupportedFormats()->at(0));
        // Allocate the RGB buffer before any frame can reach OnFrameCaptured.
        videoImage.reset(new uint8_t[width * height * 32 / 8]);
        // Start capturing
        if (cricket::CS_STARTING == videoCapturer->Start(format)) {
            qDebug() << "Capture is started";
        }
        // Connect WebRTC's signals to our slots
        videoCapturer->SignalFrameCaptured.connect(this, &MainWindow::OnFrameCaptured);
        videoCapturer->SignalStateChange.connect(this, &MainWindow::OnStateChange);
        if (videoCapturer->IsRunning()) {
            qDebug() << "Capture is running";
        }
    } else {
        ui->pushButtonOpen->setText(QStringLiteral("Open"));
        // Connecting twice raises an error, so disconnect first to allow
        // reconnecting later.
        videoCapturer->SignalFrameCaptured.disconnect(this);
        videoCapturer->SignalStateChange.disconnect(this);
        videoCapturer->Stop();
        if (!videoCapturer->IsRunning()) {
            qDebug() << "Capture is stopped";
        }
        ui->label->clear();
    }
    flag = !flag;
}
```

main.cpp
Note how main() handles both the WebRTC and the Qt message loops: this is the key to capturing and displaying the camera with Qt and WebRTC.

```cpp
#include "mainwindow.h"
#include <QApplication>
// Header path for rtc::Thread varies across WebRTC versions.
#include "webrtc/base/thread.h"

int main(int argc, char *argv[])
{
    QApplication a(argc, argv);
    MainWindow w;
    w.show();
    while (true) {
        // Pump the WebRTC message loop
        rtc::Thread::Current()->ProcessMessages(0);
        rtc::Thread::Current()->SleepMs(1);
        // Pump the Qt event loop
        a.processEvents();
    }
}
```
3. Results