This article is meant as a starting point: a rough overview of the decoding and rendering parts of an RTMP/RTSP player on the Android platform (Github).
Decoding
When it comes to decoding, everyone knows about software and hardware decoding; some companies even consider hardware decoding universal enough to gradually drop software decoding. If device compatibility matters, supporting both is a good choice. With that in mind, 大牛直播SDK classifies decoding as follows:
1. Software decoding: the raw data is available after decoding, so subsequent operations such as raw-data callbacks and snapshots are possible;
2. Hardware decoding: the raw data is available after decoding, so subsequent operations such as raw-data callbacks and snapshots are possible;
3. Hardware decoding in surface mode: the decoder renders directly onto the configured surface; snapshots and post-decode data callbacks are not available.
You may wonder: with mode 2 available, why also support mode 3? What are the respective advantages of modes 2 and 3?
Hardware decoding straight to a surface generally has better chip support and better decoding compatibility, and it avoids data copies, so resource usage is lower. The drawback is that the decoded raw data cannot be accessed, making it more of a black-box operation. Mode 2 balances the lower resource usage of hardware decoding (relative to software decoding) with the ability to post-process the raw data (e.g. further processing of the decoded YUV/RGB frames); its decoding compatibility is slightly worse than mode 3, but data handling is more flexible.
Related interfaces:
/**
 * Set Video H.264 HW decoder
 *
 * @param handle: return value from SmartPlayerOpen()
 *
 * @param isHWDecoder: 0: software decoder; 1: hardware decoder.
 *
 * @return {0} if successful
 */
public native int SetSmartPlayerVideoHWDecoder(long handle, int isHWDecoder);

/**
 * Set Video H.265(hevc) HW decoder
 *
 * @param handle: return value from SmartPlayerOpen()
 *
 * @param isHevcHWDecoder: 0: software decoder; 1: hardware decoder.
 *
 * @return {0} if successful
 */
public native int SetSmartPlayerVideoHevcHWDecoder(long handle, int isHevcHWDecoder);

/**
 * Set Surface view.
 *
 * @param handle: return value from SmartPlayerOpen()
 *
 * @param surface: surface view
 *
 * <pre> NOTE: if not set or set surface with null, it will playback audio only. </pre>
 *
 * @return {0} if successful
 */
public native int SmartPlayerSetSurface(long handle, Object surface);
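As a minimal sketch of how these calls fit together (libPlayer, playerHandle and surfaceView are placeholder names on the caller's side, following the invocation style shown later in this article), mode 3 boils down to enabling hardware decoding and handing the SDK a render target:

// Mode 3 sketch: hardware decoding rendered straight onto a surface.
libPlayer.SetSmartPlayerVideoHWDecoder(playerHandle, 1);      // H.264 hardware decoding
libPlayer.SetSmartPlayerVideoHevcHWDecoder(playerHandle, 1);  // H.265 hardware decoding
libPlayer.SmartPlayerSetSurface(playerHandle, surfaceView);   // render target; unset or null means audio-only playback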
Since not every device supports hardware decoding, the design approach in 大牛直播SDK is to probe hardware decoding support first and switch straight to software decoding when it is not supported, as sketched below.
Rendering
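The SDK does not expose its internal probing logic, but a rough application-side equivalent (a minimal sketch assuming API level 21+; HwDecoderProbe is a hypothetical helper, not part of the SDK) is to walk MediaCodecList and treat the OMX.google.* codecs as Android's software implementations:

import android.media.MediaCodecInfo;
import android.media.MediaCodecList;

public class HwDecoderProbe {
    // Returns true if a non-software decoder is registered for the MIME type,
    // e.g. "video/avc" (H.264) or "video/hevc" (H.265).
    public static boolean hasHwDecoder(String mimeType) {
        MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
        for (MediaCodecInfo info : list.getCodecInfos()) {
            if (info.isEncoder())
                continue;
            for (String type : info.getSupportedTypes()) {
                if (type.equalsIgnoreCase(mimeType)
                        && !info.getName().toLowerCase().startsWith("omx.google.")) {
                    // "OMX.google.*" codecs are the platform's software codecs.
                    return true;
                }
            }
        }
        return false;
    }
}

A caller could then do something like libPlayer.SetSmartPlayerVideoHWDecoder(playerHandle, HwDecoderProbe.hasHwDecoder("video/avc") ? 1 : 0) before starting playback.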
For rendering, the RTMP/RTSP player in 大牛直播SDK supports two modes: a plain SurfaceView and a GLSurfaceView. The plain surface has better compatibility, while GLSurfaceView rendering is somewhat finer. In addition, the plain surface mode supports a few anti-aliasing settings. Both modes provide an option for the video fill mode (whether to keep the aspect ratio). The interfaces are designed as follows:
/**
 * Set the fill mode of the video picture, e.g. fill the whole view or fit the view
 * proportionally. If not set, the whole view is filled by default.
 *
 * @param handle: return value from SmartPlayerOpen()
 *
 * @param render_scale_mode 0: fill the whole view; 1: fit the view with the aspect ratio kept; default is 0
 *
 * @return {0} if successful
 */
public native int SmartPlayerSetRenderScaleMode(long handle, int render_scale_mode);

/**
 * Set the render format in SurfaceView mode (i.e. when the second parameter of
 * NTRenderer.CreateRenderer is false).
 *
 * @param handle: return value from SmartPlayerOpen()
 *
 * @param format: 0: RGB565, the default if not set; 1: ARGB8888
 *
 * @return {0} if successful
 */
public native int SmartPlayerSetSurfaceRenderFormat(long handle, int format);

/**
 * Enable anti-aliasing in SurfaceView mode (i.e. when the second parameter of
 * NTRenderer.CreateRenderer is false). NOTE: anti-aliasing may affect performance, use with caution.
 *
 * @param handle: return value from SmartPlayerOpen()
 *
 * @param isEnableAntiAlias: 0: anti-aliasing disabled, the default if not set; 1: anti-aliasing enabled
 *
 * @return {0} if successful
 */
public native int SmartPlayerSetSurfaceAntiAlias(long handle, int isEnableAntiAlias);
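A minimal usage sketch of these interfaces (again with the libPlayer/playerHandle placeholders; the last two calls only take effect in plain SurfaceView mode):

libPlayer.SmartPlayerSetRenderScaleMode(playerHandle, 1);     // keep the aspect ratio instead of filling the view
libPlayer.SmartPlayerSetSurfaceRenderFormat(playerHandle, 1); // ARGB8888 instead of the default RGB565
libPlayer.SmartPlayerSetSurfaceAntiAlias(playerHandle, 1);    // enable anti-aliasing, at some performance cost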
For audio output, both AudioTrack and OpenSL ES are worth considering. For generality, AudioTrack mode is a reasonable choice, but it is best to expose an option and let the user decide:
/**
 * Set AudioOutput Type
 *
 * @param handle: return value from SmartPlayerOpen()
 *
 * @param use_audiotrack:
 *
 * <pre> NOTE: if use_audiotrack with 0: it will use auto-select output devices; if with 1: will use audio-track mode. </pre>
 *
 * @return {0} if successful
 */
public native int SmartPlayerSetAudioOutputType(long handle, int use_audiotrack);
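For example (a sketch; 1 forces AudioTrack, 0 lets the SDK auto-select the output):

libPlayer.SmartPlayerSetAudioOutputType(playerHandle, 1); // force AudioTrack for maximum compatibility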
Video view flip/rotation
/**
 * Set vertical flip of the video
 *
 * @param handle: return value from SmartPlayerOpen()
 *
 * @param is_flip: 0: no flip, 1: flip
 *
 * @return {0} if successful
 */
public native int SmartPlayerSetFlipVertical(long handle, int is_flip);

/**
 * Set horizontal flip of the video
 *
 * @param handle: return value from SmartPlayerOpen()
 *
 * @param is_flip: 0: no flip, 1: flip
 *
 * @return {0} if successful
 */
public native int SmartPlayerSetFlipHorizontal(long handle, int is_flip);

/**
 * Set clockwise rotation. NOTE: any angle other than 0 degrees consumes extra performance.
 *
 * @param handle: return value from SmartPlayerOpen()
 *
 * @param degress: currently 0, 90, 180 and 270 degree rotations are supported
 *
 * @return {0} if successful
 */
public native int SmartPlayerSetRotation(long handle, int degress);
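For instance, a stream from a sideways-mounted, mirrored camera could be corrected like this (a sketch with the same placeholder names):

libPlayer.SmartPlayerSetRotation(playerHandle, 90);      // rotate 90 degrees clockwise (costs extra CPU/GPU)
libPlayer.SmartPlayerSetFlipHorizontal(playerHandle, 1); // mirror horizontally if the source is mirrored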
Raw data callback after decoding
In some scenarios, developers need to process the decoded YUV/RGB or PCM data themselves. In that case an interface model for post-decode data callbacks is needed:
/**
 * Set External Render (YUV/RGB data callback)
 *
 * @param handle: return value from SmartPlayerOpen()
 *
 * @param external_render: External Render
 *
 * @return {0} if successful
 */
public native int SmartPlayerSetExternalRender(long handle, Object external_render);

/**
 * Set External Audio Output (PCM data callback)
 *
 * @param handle: return value from SmartPlayerOpen()
 *
 * @param external_audio_output: External Audio Output
 *
 * @return {0} if successful
 */
public native int SmartPlayerSetExternalAudioOutput(long handle, Object external_audio_output);
A concrete invocation example:
//libPlayer.SmartPlayerSetExternalRender(playerHandle, new RGBAExternalRender());
//libPlayer.SmartPlayerSetExternalRender(playerHandle, new I420ExternalRender());
Once the raw data is obtained, it can be processed further (e.g. for face recognition):
class RGBAExternalRender implements NTExternalRender {
    // public static final int NT_FRAME_FORMAT_RGBA = 1;
    // public static final int NT_FRAME_FORMAT_ABGR = 2;
    // public static final int NT_FRAME_FORMAT_I420 = 3;

    private int width_ = 0;
    private int height_ = 0;
    private int row_bytes_ = 0;
    private ByteBuffer rgba_buffer_ = null;

    @Override
    public int getNTFrameFormat() {
        Log.i(TAG, "RGBAExternalRender::getNTFrameFormat return " + NT_FRAME_FORMAT_RGBA);
        return NT_FRAME_FORMAT_RGBA;
    }

    @Override
    public void onNTFrameSizeChanged(int width, int height) {
        width_ = width;
        height_ = height;
        row_bytes_ = width_ * 4;

        Log.i(TAG, "RGBAExternalRender::onNTFrameSizeChanged width_:" + width_ + " height_:" + height_);

        rgba_buffer_ = ByteBuffer.allocateDirect(row_bytes_ * height_);
    }

    @Override
    public ByteBuffer getNTPlaneByteBuffer(int index) {
        if (index == 0) {
            return rgba_buffer_;
        } else {
            Log.e(TAG, "RGBAExternalRender::getNTPlaneByteBuffer index error:" + index);
            return null;
        }
    }

    @Override
    public int getNTPlanePerRowBytes(int index) {
        if (index == 0) {
            return row_bytes_;
        } else {
            Log.e(TAG, "RGBAExternalRender::getNTPlanePerRowBytes index error:" + index);
            return 0;
        }
    }

    public void onNTRenderFrame(int width, int height, long timestamp) {
        if (rgba_buffer_ == null)
            return;

        rgba_buffer_.rewind();

        // copy buffer

        // test
        // byte[] test_buffer = new byte[16];
        // rgba_buffer_.get(test_buffer);

        Log.i(TAG, "RGBAExternalRender:onNTRenderFrame w=" + width + " h=" + height + " timestamp=" + timestamp);

        // Log.i(TAG, "RGBAExternalRender:onNTRenderFrame rgba:" + bytesToHexString(test_buffer));
    }
}

class I420ExternalRender implements NTExternalRender {
    // public static final int NT_FRAME_FORMAT_RGBA = 1;
    // public static final int NT_FRAME_FORMAT_ABGR = 2;
    // public static final int NT_FRAME_FORMAT_I420 = 3;

    private int width_ = 0;
    private int height_ = 0;
    private int y_row_bytes_ = 0;
    private int u_row_bytes_ = 0;
    private int v_row_bytes_ = 0;
    private ByteBuffer y_buffer_ = null;
    private ByteBuffer u_buffer_ = null;
    private ByteBuffer v_buffer_ = null;

    @Override
    public int getNTFrameFormat() {
        Log.i(TAG, "I420ExternalRender::getNTFrameFormat return " + NT_FRAME_FORMAT_I420);
        return NT_FRAME_FORMAT_I420;
    }

    @Override
    public void onNTFrameSizeChanged(int width, int height) {
        width_ = width;
        height_ = height;

        y_row_bytes_ = (width_ + 15) & (~15);
        u_row_bytes_ = ((width_ + 1) / 2 + 15) & (~15);
        v_row_bytes_ = ((width_ + 1) / 2 + 15) & (~15);

        y_buffer_ = ByteBuffer.allocateDirect(y_row_bytes_ * height_);
        u_buffer_ = ByteBuffer.allocateDirect(u_row_bytes_ * ((height_ + 1) / 2));
        v_buffer_ = ByteBuffer.allocateDirect(v_row_bytes_ * ((height_ + 1) / 2));

        Log.i(TAG, "I420ExternalRender::onNTFrameSizeChanged width_=" + width_ + " height_=" + height_
                + " y_row_bytes_=" + y_row_bytes_ + " u_row_bytes_=" + u_row_bytes_
                + " v_row_bytes_=" + v_row_bytes_);
    }

    @Override
    public ByteBuffer getNTPlaneByteBuffer(int index) {
        if (index == 0) {
            return y_buffer_;
        } else if (index == 1) {
            return u_buffer_;
        } else if (index == 2) {
            return v_buffer_;
        } else {
            Log.e(TAG, "I420ExternalRender::getNTPlaneByteBuffer index error:" + index);
            return null;
        }
    }

    @Override
    public int getNTPlanePerRowBytes(int index) {
        if (index == 0) {
            return y_row_bytes_;
        } else if (index == 1) {
            return u_row_bytes_;
        } else if (index == 2) {
            return v_row_bytes_;
        } else {
            Log.e(TAG, "I420ExternalRender::getNTPlanePerRowBytes index error:" + index);
            return 0;
        }
    }

    public void onNTRenderFrame(int width, int height, long timestamp) {
        if (y_buffer_ == null)
            return;
        if (u_buffer_ == null)
            return;
        if (v_buffer_ == null)
            return;

        y_buffer_.rewind();
        u_buffer_.rewind();
        v_buffer_.rewind();

        /*
        if (!is_saved_image) {
            is_saved_image = true;

            int y_len = y_row_bytes_ * height_;
            int u_len = u_row_bytes_ * ((height_ + 1) / 2);
            int v_len = v_row_bytes_ * ((height_ + 1) / 2);

            int data_len = y_len + (y_row_bytes_ * ((height_ + 1) / 2));

            byte[] nv21_data = new byte[data_len];
            byte[] u_data = new byte[u_len];
            byte[] v_data = new byte[v_len];

            y_buffer_.get(nv21_data, 0, y_len);
            u_buffer_.get(u_data, 0, u_len);
            v_buffer_.get(v_data, 0, v_len);

            int[] strides = new int[2];
            strides[0] = y_row_bytes_;
            strides[1] = y_row_bytes_;

            int loop_row_c = ((height_ + 1) / 2);
            int loop_c = ((width_ + 1) / 2);

            int dst_row = y_len;
            int src_v_row = 0;
            int src_u_row = 0;

            for (int i = 0; i < loop_row_c; ++i) {
                int dst_pos = dst_row;

                for (int j = 0; j < loop_c; ++j) {
                    nv21_data[dst_pos++] = v_data[src_v_row + j];
                    nv21_data[dst_pos++] = u_data[src_u_row + j];
                }

                dst_row += y_row_bytes_;
                src_v_row += v_row_bytes_;
                src_u_row += u_row_bytes_;
            }

            String imagePath = "/sdcard" + "/" + "testonv21" + ".jpeg";

            Log.e(TAG, "I420ExternalRender::begin test save image++ image_path:" + imagePath);

            try {
                File file = new File(imagePath);
                FileOutputStream image_os = new FileOutputStream(file);

                YuvImage image = new YuvImage(nv21_data, ImageFormat.NV21, width_, height_, strides);
                image.compressToJpeg(new android.graphics.Rect(0, 0, width_, height_), 50, image_os);

                image_os.flush();
                image_os.close();
            } catch (IOException e) {
                e.printStackTrace();
            }

            Log.e(TAG, "I420ExternalRender::begin test save image--");
        }
        */

        Log.i(TAG, "I420ExternalRender::onNTRenderFrame w=" + width + " h=" + height + " timestamp=" + timestamp);

        // copy buffer

        // test
        // byte[] test_buffer = new byte[16];
        // y_buffer_.get(test_buffer);
        // Log.i(TAG, "I420ExternalRender::onNTRenderFrame y data:" + bytesToHexString(test_buffer));
        // u_buffer_.get(test_buffer);
        // Log.i(TAG, "I420ExternalRender::onNTRenderFrame u data:" + bytesToHexString(test_buffer));
        // v_buffer_.get(test_buffer);
        // Log.i(TAG, "I420ExternalRender::onNTRenderFrame v data:" + bytesToHexString(test_buffer));
    }
}
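As a minimal sketch of such a second pass (not part of the SDK; the Bitmap conversion and the detector hand-off are assumptions for illustration), the onNTRenderFrame() of RGBAExternalRender above could wrap each frame in an android.graphics.Bitmap before feeding it to whatever analysis code is used:

// Hypothetical extension of RGBAExternalRender.onNTRenderFrame():
// wrap the RGBA plane in a Bitmap so it can be handed to an image-analysis
// pipeline (e.g. a face detector). Assumes row_bytes_ == width * 4 (no row padding).
public void onNTRenderFrame(int width, int height, long timestamp) {
    if (rgba_buffer_ == null)
        return;

    rgba_buffer_.rewind();

    // ARGB_8888 bitmaps store pixels as RGBA bytes, so the buffer can be copied directly.
    Bitmap frame = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
    frame.copyPixelsFromBuffer(rgba_buffer_);

    // ... pass 'frame' to the face-recognition / analysis code of your choice ...
}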
Summary
These are a few considerations on the decoding and rendering parts when developing an RTMP/RTSP player for the Android platform. It is only a starting point; interested developers may refer to it as they see fit.