
Decoding Video with FFmpeg and Rendering the Frames to the UI


Prerequisites

Before reading on, you may want to look at my earlier post

自己编译ffmpeg在Android平台实现转码功能 (building FFmpeg yourself to implement transcoding on Android)

which explains how to compile the FFmpeg .so libraries; this article won't repeat that.

Required tools

1. The compiled FFmpeg shared libraries (plus their include files)

2. libyuv (building it is covered below; it is also included in the source download at the end of this article)

3. Android Studio

4. The NDK

Building libyuv

Video decodes to YUV data; we need to convert it to RGB before it can be drawn to the UI.
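For intuition, here is roughly what that conversion amounts to per pixel, a minimal illustrative sketch using common BT.601 full-range coefficients (the helper name is mine; in the real project libyuv does this across whole planes with SIMD, which is why we build it instead of writing this by hand):

#include <stdint.h>

// Illustrative only: convert one YUV pixel to RGB using common
// BT.601 full-range coefficients.
static void yuv_to_rgb_pixel(uint8_t y, uint8_t u, uint8_t v,
                             uint8_t *r, uint8_t *g, uint8_t *b) {
    int c = y;
    int d = u - 128;
    int e = v - 128;
    int rv = c + (int) (1.402f * e);
    int gv = c - (int) (0.344f * d) - (int) (0.714f * e);
    int bv = c + (int) (1.772f * d);
    // clamp each channel to [0, 255]
    *r = (uint8_t) (rv < 0 ? 0 : rv > 255 ? 255 : rv);
    *g = (uint8_t) (gv < 0 ? 0 : gv > 255 ? 255 : gv);
    *b = (uint8_t) (bv < 0 ? 0 : bv > 255 ? 255 : bv);
}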

First, create a libyuv directory:

cd ~
mkdir libyuv
cd libyuv

Then download libyuv; it is hosted with git:

git clone https://chromium.googlesource.com/libyuv/libyuv
mv libyuv jni   # rename the checkout to jni so we can build it with the NDK

Edit the Android.mk in the jni directory and change its contents to the following:

# This is the Android makefile for libyuv for NDK.
LOCAL_PATH:= $(call my-dir)

include $(CLEAR_VARS)

LOCAL_CPP_EXTENSION := .cc

LOCAL_SRC_FILES := \
    source/compare.cc \
    source/compare_common.cc \
    source/compare_gcc.cc \
    source/compare_neon.cc \
    source/compare_neon64.cc \
    source/convert.cc \
    source/convert_argb.cc \
    source/convert_from.cc \
    source/convert_from_argb.cc \
    source/convert_to_argb.cc \
    source/convert_to_i420.cc \
    source/planar_functions.cc \
    source/rotate.cc \
    source/rotate_any.cc \
    source/rotate_argb.cc \
    source/rotate_common.cc \
    source/rotate_dspr2.cc \
    source/rotate_gcc.cc \
    source/rotate_msa.cc \
    source/rotate_neon.cc \
    source/rotate_neon64.cc \
    source/row_any.cc \
    source/row_common.cc \
    source/row_dspr2.cc \
    source/row_gcc.cc \
    source/row_msa.cc \
    source/row_neon.cc \
    source/row_neon64.cc \
    source/scale.cc \
    source/scale_any.cc \
    source/scale_argb.cc \
    source/scale_common.cc \
    source/scale_dspr2.cc \
    source/scale_gcc.cc \
    source/scale_msa.cc \
    source/scale_neon.cc \
    source/scale_neon64.cc \
    source/video_common.cc

#common_CFLAGS := -Wall -fexceptions
#ifneq ($(LIBYUV_DISABLE_JPEG), "yes")
#LOCAL_SRC_FILES += \
# source/convert_jpeg.cc \
# source/mjpeg_decoder.cc \
# source/mjpeg_validate.cc
#common_CFLAGS += -DHAVE_JPEG
#LOCAL_SHARED_LIBRARIES := libjpeg
#endif

#LOCAL_CFLAGS += $(common_CFLAGS)
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/include
LOCAL_C_INCLUDES += $(LOCAL_PATH)/include
LOCAL_EXPORT_C_INCLUDE_DIRS := $(LOCAL_PATH)/include

LOCAL_MODULE := libyuv
LOCAL_MODULE_TAGS := optional

include $(BUILD_SHARED_LIBRARY)

#include $(CLEAR_VARS)

#LOCAL_WHOLE_STATIC_LIBRARIES := libyuv_static
#LOCAL_MODULE := libyuv
#ifneq ($(LIBYUV_DISABLE_JPEG), "yes")
#LOCAL_SHARED_LIBRARIES := libjpeg
#endif

#include $(BUILD_SHARED_LIBRARY)

#include $(CLEAR_VARS)
#LOCAL_STATIC_LIBRARIES := libyuv_static
#LOCAL_SHARED_LIBRARIES := libjpeg
#LOCAL_MODULE_TAGS := tests
#LOCAL_CPP_EXTENSION := .cc
#LOCAL_C_INCLUDES += $(LOCAL_PATH)/include
#LOCAL_SRC_FILES := \
# unit_test/unit_test.cc \
# unit_test/basictypes_test.cc \
# unit_test/color_test.cc \
# unit_test/compare_test.cc \
# unit_test/convert_test.cc \
# unit_test/cpu_test.cc \
# unit_test/cpu_thread_test.cc \
# unit_test/math_test.cc \
# unit_test/planar_test.cc \
# unit_test/rotate_argb_test.cc \
# unit_test/rotate_test.cc \
# unit_test/scale_argb_test.cc \
# unit_test/scale_test.cc \
# unit_test/video_common_test.cc

#LOCAL_MODULE := libyuv_unittest
##include $(BUILD_NATIVE_TEST)

Create an Application.mk in the jni directory with the following content:

APP_ABI	:= armeabi

Then go back up to the libyuv directory we created and build:

ndk-build   # build with the NDK

The build produces libyuv.so in the generated libs/armeabi directory.

Together with the include directory inside jni, that is everything we need from libyuv.

Creating the Android project

Add this line to gradle.properties:

android.useDeprecatedNdk=true

Open the app module's build.gradle and add the following inside the defaultConfig block:

ndk {
    abiFilters "armeabi"
}

And inside the android block:

sourceSets.main {
    jni.srcDirs = []
    jniLibs.srcDirs = ['src/main/libs']
}

Create a jni directory under app/src/main, and an include directory inside it.

Rename libyuv's include directory to libyuv and copy it into jni/include.

Rename FFmpeg's include directory to ffmpeg and put it into jni/include as well.

Finally, put the FFmpeg and libyuv .so files directly into the jni directory.

Create a VideoUtils class with the following content:

package com.gloomyer.ffmepgplay;

import android.view.Surface;

public class VideoUtils {
    public static native void render(String videoPath, Surface surface);

    static {
        // The FFmpeg libraries must be loaded before our own
        // myffmpeg library, which links against them.
        System.loadLibrary("avutil-54");
        System.loadLibrary("swresample-1");
        System.loadLibrary("avcodec-56");
        System.loadLibrary("avformat-56");
        System.loadLibrary("swscale-3");
        System.loadLibrary("postproc-53");
        System.loadLibrary("avfilter-5");
        System.loadLibrary("avdevice-56");
        System.loadLibrary("myffmpeg");
    }
}

Then cd into app/src/main/java and generate the JNI header with javah:

javah -bootclasspath ~/Android/Sdk/platforms/android-?/android.jar com.gloomyer.ffmepgplay.VideoUtils

Copy the generated header into the jni directory.
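For reference, the header javah generates (com_gloomyer_ffmepgplay_VideoUtils.h) should look roughly like this; regenerate it rather than editing it by hand:

/* DO NOT EDIT THIS FILE - it is machine generated */
#include <jni.h>
/* Header for class com_gloomyer_ffmepgplay_VideoUtils */

#ifndef _Included_com_gloomyer_ffmepgplay_VideoUtils
#define _Included_com_gloomyer_ffmepgplay_VideoUtils
#ifdef __cplusplus
extern "C" {
#endif
/*
 * Class:     com_gloomyer_ffmepgplay_VideoUtils
 * Method:    render
 * Signature: (Ljava/lang/String;Landroid/view/Surface;)V
 */
JNIEXPORT void JNICALL Java_com_gloomyer_ffmepgplay_VideoUtils_render
  (JNIEnv *, jclass, jstring, jobject);

#ifdef __cplusplus
}
#endif
#endif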

Then create a myffmpeg.c file (we will fill it in shortly), plus an Android.mk with the following content:


LOCAL_PATH := $(call my-dir)

#ffmpeg lib
include $(CLEAR_VARS)
LOCAL_MODULE := avcodec
LOCAL_SRC_FILES := libavcodec-56.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := avdevice
LOCAL_SRC_FILES := libavdevice-56.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := avfilter
LOCAL_SRC_FILES := libavfilter-5.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := avformat
LOCAL_SRC_FILES := libavformat-56.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := avutil
LOCAL_SRC_FILES := libavutil-54.so
include $(PREBUILT_SHARED_LIBRARY)


include $(CLEAR_VARS)
LOCAL_MODULE := postproc
LOCAL_SRC_FILES := libpostproc-53.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := swresample
LOCAL_SRC_FILES := libswresample-1.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := swscale
LOCAL_SRC_FILES := libswscale-3.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := yuv
LOCAL_SRC_FILES := libyuv.so
include $(PREBUILT_SHARED_LIBRARY)

#myapp
include $(CLEAR_VARS)
LOCAL_MODULE := myffmpeg
LOCAL_SRC_FILES := myffmpeg.c
LOCAL_C_INCLUDES += $(LOCAL_PATH)/include/ffmpeg
LOCAL_C_INCLUDES += $(LOCAL_PATH)/include/libyuv
## -landroid is needed for the native window (ANativeWindow) APIs
LOCAL_LDLIBS := -llog -landroid
LOCAL_SHARED_LIBRARIES := avcodec avdevice avfilter avformat avutil postproc swresample swscale yuv
include $(BUILD_SHARED_LIBRARY)

And an Application.mk with the following content (android-9 is the lowest platform level that provides the ANativeWindow APIs):

APP_ABI := armeabi
APP_PLATFORM := android-9

At this point the jni directory contains the include folder (with ffmpeg and libyuv inside), all of the .so files, the generated JNI header, myffmpeg.c, Android.mk, and Application.mk.

Open myffmpeg.c and fill in the following:

#include "com_gloomyer_ffmepgplay_VideoUtils.h"
#include "include/ffmpeg/libavcodec/avcodec.h"
#include "include/ffmpeg/libavformat/avformat.h"
#include "include/ffmpeg/libswscale/swscale.h"
#include "include/libyuv/libyuv.h"
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <android/log.h>
#include <android/native_window.h>
#include <android/native_window_jni.h>

#define LOGI(FORMAT, ...) __android_log_print(ANDROID_LOG_INFO,"TAG",FORMAT,##__VA_ARGS__);
#define LOGE(FORMAT, ...) __android_log_print(ANDROID_LOG_ERROR,"TAG",FORMAT,##__VA_ARGS__);

JNIEXPORT void JNICALL Java_com_gloomyer_ffmepgplay_VideoUtils_render
(JNIEnv *env, jclass jcls, jstring input_jstr, jobject suface_jobj) {
const char *input_cstr = (*env)->GetStringUTFChars(env, input_jstr, NULL);
LOGE("%s", "开始执行jni代码");
av_register_all();//注册
LOGE("%s", "注册组件成功");
AVFormatContext *pFormatCtx = avformat_alloc_context();
LOGE("%s", "注册变量");
//2.打开输入视频文件
if (avformat_open_input(&pFormatCtx, input_cstr, NULL, NULL) != 0) {
LOGE("%s", "打开输入视频文件失败");
return;
} else {
LOGE("%s", "打开视频文件成功!");
}

//3.获取视频信息
if (avformat_find_stream_info(pFormatCtx, NULL) < 0) {
LOGE("%s", "获取视频信息失败");
return;
} else {
LOGE("%s", "获取视频信息成功!");
}

int index;
for (int i = 0; i < pFormatCtx->nb_streams; i++) {
if (pFormatCtx->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO) {
index = i;
break;
}
}

//4.获取视频解码器
AVCodecContext *pCodecCtx = pFormatCtx->streams[index]->codec;
AVCodec *pCodec = avcodec_find_decoder(pCodecCtx->codec_id);

if (pCodec == NULL) {
LOGE("%s", "无法解码");
return;
} else {
LOGE("%s", "可以正常解码");
}

//5.打开解码器
if (avcodec_open2(pCodecCtx, pCodec, NULL) < 0) {
LOGE("%s", "解码器无法打开");
return;
} else {
LOGE("%s", "解码器打开成功!");
}

//编码数据
AVPacket *pPacket = (AVPacket *) av_malloc(sizeof(AVPacket));

//像素数据(解码数据)
AVFrame *pYuvFrame = av_frame_alloc();
AVFrame *pRgbFrame = av_frame_alloc();

//native绘制
//窗体
ANativeWindow *pNativeWindow = ANativeWindow_fromSurface(env, suface_jobj);
//绘制时的缓冲区
ANativeWindow_Buffer outBuffer;
//6.一阵一阵读取压缩的视频数据AVPacket
int len, got_frame, framecount = 0;
LOGE("%s", "开始一帧一帧解码");
while (av_read_frame(pFormatCtx, pPacket) >= 0) {
//解码AVPacket->AVFrame
len = avcodec_decode_video2(pCodecCtx, pYuvFrame, &got_frame, pPacket);
LOGE("%s len=%d got_frame=%d", "尝试解码", len, got_frame);
if (got_frame) {
LOGI("解码%d帧", framecount++);
//设置缓冲区的属性(宽、高、像素格式)
ANativeWindow_setBuffersGeometry(pNativeWindow,
pCodecCtx->width,
pCodecCtx->height,
WINDOW_FORMAT_RGBA_8888);
//lock
ANativeWindow_lock(pNativeWindow, &outBuffer, NULL);

//设置rgb_frame的属性(像素格式、宽高)和缓冲区
//rgb_frame缓冲区与outBuffer.bits是同一块内存
avpicture_fill((AVPicture *) pRgbFrame,
outBuffer.bits,
PIX_FMT_RGBA,
pCodecCtx->width,
pCodecCtx->height);


I420ToARGB(pYuvFrame->data[0], pYuvFrame->linesize[0],
pYuvFrame->data[2], pYuvFrame->linesize[2],
pYuvFrame->data[1], pYuvFrame->linesize[1],
pRgbFrame->data[0], pRgbFrame->linesize[0],
pCodecCtx->width, pCodecCtx->height);

ANativeWindow_unlockAndPost(pNativeWindow);

usleep(1000 * 16);
}
av_free_packet(pPacket);
}

av_frame_free(&pYuvFrame);
av_frame_free(&pRgbFrame);
avcodec_close(pCodecCtx);
avformat_free_context(pFormatJNIEXPORT void JNICALL Java_com_gloomyer_ffmepgplay_VideoUtils_render
(JNIEnv *env, jclass jcls, jstring input_jstr, jobject suface_jobj) {
const char *input_cstr = (*env)->GetStringUTFChars(env, input_jstr, NULL);
LOGE("%s", "开始执行jni代码");
av_register_all();//注册
LOGE("%s", "注册组件成功");
AVFormatContext *pFormatCtx = avformat_alloc_context();
LOGE("%s", "注册变量");
//2.打开输入视频文件
if (avformat_open_input(&pFormatCtx, input_cstr, NULL, NULL) != 0) {
LOGE("%s", "打开输入视频文件失败");
return;
} else {
LOGE("%s", "打开视频文件成功!");
}

//3.获取视频信息
if (avformat_find_stream_info(pFormatCtx, NULL) < 0) {
LOGE("%s", "获取视频信息失败");
return;
} else {
LOGE("%s", "获取视频信息成功!");
}

int index;
for (int i = 0; i < pFormatCtx->nb_streams; i++) {
if (pFormatCtx->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO) {
index = i;
break;
}
}

//4.获取视频解码器
AVCodecContext *pCodecCtx = pFormatCtx->streams[index]->codec;
AVCodec *pCodec = avcodec_find_decoder(pCodecCtx->codec_id);

if (pCodec == NULL) {
LOGE("%s", "无法解码");
return;
} else {
LOGE("%s", "可以正常解码");
}

//5.打开解码器
if (avcodec_open2(pCodecCtx, pCodec, NULL) < 0) {
LOGE("%s", "解码器无法打开");
return;
} else {
LOGE("%s", "解码器打开成功!");
}

//编码数据
AVPacket *pPacket = (AVPacket *) av_malloc(sizeof(AVPacket));

//像素数据(解码数据)
AVFrame *pYuvFrame = av_frame_alloc();
AVFrame *pRgbFrame = av_frame_alloc();

//native绘制
//窗体
ANativeWindow *pNativeWindow = ANativeWindow_fromSurface(env, suface_jobj);
//绘制时的缓冲区
ANativeWindow_Buffer outBuffer;
//6.一阵一阵读取压缩的视频数据AVPacket
int len, got_frame, framecount = 0;
LOGE("%s", "开始一帧一帧解码");
while (av_read_frame(pFormatCtx, pPacket) >= 0) {
//解码AVPacket->AVFrame
len = avcodec_decode_video2(pCodecCtx, pYuvFrame, &got_frame, pPacket);
LOGE("%s len=%d got_frame=%d", "尝试解码", len, got_frame);
if (got_frame) {
LOGI("解码%d帧", framecount++);
//设置缓冲区的属性(宽、高、像素格式)
ANativeWindow_setBuffersGeometry(pNativeWindow,
pCodecCtx->width,
pCodecCtx->height,
WINDOW_FORMAT_RGBA_8888);
//lock
ANativeWindow_lock(pNativeWindow, &outBuffer, NULL);

//设置rgb_frame的属性(像素格式、宽高)和缓冲区
//rgb_frame缓冲区与outBuffer.bits是同一块内存
avpicture_fill((AVPicture *) pRgbFrame,
outBuffer.bits,
PIX_FMT_RGBA,
pCodecCtx->width,
pCodecCtx->height);


I420ToARGB(pYuvFrame->data[0], pYuvFrame->linesize[0],
pYuvFrame->data[2], pYuvFrame->linesize[2],
pYuvFrame->data[1], pYuvFrame->linesize[1],
pRgbFrame->data[0], pRgbFrame->linesize[0],
pCodecCtx->width, pCodecCtx->height);

ANativeWindow_unlockAndPost(pNativeWindow);

usleep(1000 * 16);
}
av_free_packet(pPacket);
}

av_frame_free(&pYuvFrame);
av_frame_free(&pRgbFrame);
avcodec_close(pCodecCtx);
avformat_free_context(pFormatCtx);
ANativeWindow_release(pNativeWindow);
(*env)->ReleaseStringUTFChars(env, input_jstr, input_cstr);JNIEXPORT void JNICALL Java_com_gloomyer_ffmepgplay_VideoUtils_render
(JNIEnv *env, jclass jcls, jstring input_jstr, jobject suface_jobj) {
const char *input_cstr = (*env)->GetStringUTFChars(env, input_jstr, NULL);
LOGE("%s", "开始执行jni代码");
av_register_all();//注册
LOGE("%s", "注册组件成功");
AVFormatContext *pFormatCtx = avformat_alloc_context();
LOGE("%s", "注册变量");
//2.打开输入视频文件
if (avformat_open_input(&pFormatCtx, input_cstr, NULL, NULL) != 0) {
LOGE("%s", "打开输入视频文件失败");
return;
} else {
LOGE("%s", "打开视频文件成功!");
}

//3.获取视频信息
if (avformat_find_stream_info(pFormatCtx, NULL) < 0) {
LOGE("%s", "获取视频信息失败");
return;
} else {
LOGE("%s", "获取视频信息成功!");
}

int index;
for (int i = 0; i < pFormatCtx->nb_streams; i++) {
if (pFormatCtx->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO) {
index = i;
break;
}
}

//4.获取视频解码器
AVCodecContext *pCodecCtx = pFormatCtx->streams[index]->codec;
AVCodec *pCodec = avcodec_find_decoder(pCodecCtx->codec_id);

if (pCodec == NULL) {
LOGE("%s", "无法解码");
return;
} else {
LOGE("%s", "可以正常解码");
}

//5.打开解码器
if (avcodec_open2(pCodecCtx, pCodec, NULL) < 0) {
LOGE("%s", "解码器无法打开");
return;
} else {
LOGE("%s", "解码器打开成功!");
}

//编码数据
AVPacket *pPacket = (AVPacket *) av_malloc(sizeof(AVPacket));

//像素数据(解码数据)
AVFrame *pYuvFrame = av_frame_alloc();
AVFrame *pRgbFrame = av_frame_alloc();

//native绘制
//窗体
ANativeWindow *pNativeWindow = ANativeWindow_fromSurface(env, suface_jobj);
//绘制时的缓冲区
ANativeWindow_Buffer outBuffer;
//6.一阵一阵读取压缩的视频数据AVPacket
int len, got_frame, framecount = 0;
LOGE("%s", "开始一帧一帧解码");
while (av_read_frame(pFormatCtx, pPacket) >= 0) {
//解码AVPacket->AVFrame
len = avcodec_decode_video2(pCodecCtx, pYuvFrame, &got_frame, pPacket);
LOGE("%s len=%d got_frame=%d", "尝试解码", len, got_frame);
if (got_frame) {
LOGI("解码%d帧", framecount++);
//设置缓冲区的属性(宽、高、像素格式)
ANativeWindow_setBuffersGeometry(pNativeWindow,
pCodecCtx->width,
pCodecCtx->height,
WINDOW_FORMAT_RGBA_8888);
//lock
ANativeWindow_lock(pNativeWindow, &outBuffer, NULL);

//设置rgb_frame的属性(像素格式、宽高)和缓冲区
//rgb_frame缓冲区与outBuffer.bits是同一块内存
avpicture_fill((AVPicture *) pRgbFrame,
outBuffer.bits,
PIX_FMT_RGBA,
pCodecCtx->width,
pCodecCtx->height);


I420ToARGB(pYuvFrame->data[0], pYuvFrame->linesize[0],
pYuvFrame->data[2], pYuvFrame->linesize[2],
pYuvFrame->data[1], pYuvFrame->linesize[1],
pRgbFrame->data[0], pRgbFrame->linesize[0],
pCodecCtx->width, pCodecCtx->height);

ANativeWindow_unlockAndPost(pNativeWindow);

usleep(1000 * 16);
}
av_free_packet(pPacket);
}

av_frame_free(&pYuvFrame);
av_frame_free(&pRgbFrame);
avcodec_close(pCodecCtx);
avformat_free_context(pFormatCtx);
ANativeWindow_release(pNativeWindow);
(*env)->ReleaseStringUTFChars(env, input_jstr, input_cstr);
}
}Ctx);
ANativeWindow_release(pNativeWindow);
(*env)->ReleaseStringUTFChars(env, input_jstr, input_cstr);
}
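One optional refinement, not part of the original code: the usleep(1000 * 16) hard-codes roughly 60 fps pacing. A hypothetical helper (my own sketch, using the standard AVStream.avg_frame_rate field) could derive the delay from the stream's reported frame rate instead:

#include <unistd.h>
#include "include/ffmpeg/libavformat/avformat.h"

//hypothetical helper: sleep for one frame's duration based on the stream's
//average frame rate, falling back to 40 ms (~25 fps) when none is reported
static void sleep_one_frame(AVFormatContext *fmt_ctx, int stream_index) {
    AVRational fr = fmt_ctx->streams[stream_index]->avg_frame_rate;
    unsigned int delay_us = 40000;
    if (fr.num > 0 && fr.den > 0) {
        delay_us = (unsigned int) (1000000LL * fr.den / fr.num);
    }
    usleep(delay_us);
}

Calling sleep_one_frame(pFormatCtx, index) in place of the usleep in the decode loop would keep slow (e.g. 24 fps) material from playing too fast.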

Then cd into app/src/main/ and run ndk-build to generate the .so libraries.

Back on the Java side, open MainActivity's layout:

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical">

    <Button
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:onClick="play"
        android:text="播放视频" />

    <SurfaceView
        android:id="@+id/mSurfaceView"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />
</LinearLayout>

And here is the MainActivity code:

package com.gloomyer.ffmepgplay;

import android.graphics.PixelFormat;
import android.os.Bundle;
import android.os.Environment;
import android.support.v7.app.AppCompatActivity;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;

import java.io.File;

public class MainActivity extends AppCompatActivity {

    SurfaceView mSurfaceView;
    private SurfaceHolder mHolder;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        mSurfaceView = (SurfaceView) findViewById(R.id.mSurfaceView);
        init();
    }

    private void init() {
        mHolder = mSurfaceView.getHolder();
        // match the RGBA_8888 format used on the native side
        mHolder.setFormat(PixelFormat.RGBA_8888);
    }

    public void play(View v) {
        final String input = new File(Environment.getExternalStorageDirectory(), "input.mp4").getAbsolutePath();
        // render() blocks while it decodes, so keep it off the UI thread
        new Thread() {
            @Override
            public void run() {
                VideoUtils.render(input, mHolder.getSurface());
            }
        }.start();
    }
}

Download an mp4, push it to the phone's SD card, and name it input.mp4. (The app also needs permission to read external storage.)

Then run the project and tap the play button: the video renders. (There is no sound, since we haven't implemented audio decoding and playback yet; that's for a follow-up post…)

Source download

[Download the packaged source](http://gloomyer.com/upload/ffmpeg_play_video.zip)