Rendering a Native Android View into an OpenGL Texture
OpenGL ES on Android, a simple practice (19): watermark recording (EGL + camera preview via GL_TEXTURE_EXTERNAL_OES)

0. Supplementing EglSurfaceBase

Using it in practice, I found that EglSurfaceBase was still missing management of the native surface, and the overall picture felt incomplete because of it. So WindowSurface is derived from EglSurfaceBase. The code is extremely simple, but for understanding it puts you on a whole different level.
```java
public class WindowSurface extends EglSurfaceBase {

    private Surface mSurface;
    private boolean bReleaseSurface;

    // Associate a native Surface with EGL
    public WindowSurface(EglCore eglCore, Surface surface, boolean isReleaseSurface) {
        super(eglCore);
        createWindowSurface(surface);
        mSurface = surface;
        bReleaseSurface = isReleaseSurface;
    }

    // Associate a SurfaceTexture with EGL
    protected WindowSurface(EglCore eglCore, SurfaceTexture surfaceTexture) {
        super(eglCore);
        createWindowSurface(surfaceTexture);
    }

    // Release the surface associated with the current EGL context
    public void release() {
        releaseEglSurface();
        if (mSurface != null && bReleaseSurface) {
            mSurface.release();
            mSurface = null;
        }
    }
    // That's all.
}
```

Next, we move quickly on to previewing the camera.
How to use TextureView

TextureView is a view widget on the Android platform that displays a renderable surface (video, an OpenGL scene, and so on) inside a custom view.

TextureView's main characteristics are:
1. It can present hardware-accelerated content, such as an OpenGL renderer, inside the View hierarchy.
2. It can provide high-performance, real-time video playback in a custom view.
3. It can be combined with the Camera API for a live camera preview.

Below, the usage is walked through step by step to help you understand and use it.
Step 1: add a TextureView to the layout file. It can stand alone or be part of another view.

```xml
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <TextureView
        android:id="@+id/textureView"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />
</RelativeLayout>
```

Step 2: look the TextureView up in the Activity or Fragment. In onCreate() or a related lifecycle method, obtain a reference to it with findViewById().
```java
TextureView textureView = (TextureView) findViewById(R.id.textureView);
```

Step 3: set a SurfaceTextureListener. Call setSurfaceTextureListener() on the TextureView so you can handle the SurfaceTexture-related events.
How TextureView drawing works

TextureView is a View that Android provides for showing video, images, and other graphical content in the UI. Unlike the commonly used SurfaceView, TextureView can be embedded directly into the layout hierarchy, positioned and sized through ordinary LayoutParams, and animated and transitioned together with other views. This section walks through how TextureView is drawn, step by step.

1. TextureView overview

TextureView is a subclass of View whose content is rendered through OpenGL ES 2.0. GLSurfaceView also uses OpenGL ES 2.0, but it encapsulates the rendering details and is simpler to use. TextureView offers a more flexible way of working, at the cost of managing yourself the rendering pipeline (threading and EGL setup) that GLSurfaceView would otherwise handle internally.
2. TextureView creation and initialization

When a TextureView is first added to the layout hierarchy, the following happens:
- measure(): the system calls measure() to measure the TextureView; at this point its size is still the default 0.
- onMeasure(): inside TextureView's onMeasure(), the final size is computed from the measure mode and the parent container's constraints.

3. TextureView's drawing flow

The drawing flow is comparatively involved. Roughly:
- onAttachedToWindow(): when the TextureView is attached to a window, its hardware layer and backing SurfaceTexture are set up.
- onDraw(): in onDraw(), the view hands the drawing of its SurfaceTexture content over to the render thread rather than drawing on the UI thread itself.
- onSurfaceTextureAvailable(): this callback of the SurfaceTextureListener fires once the SurfaceTexture is ready; a content producer can now create an EGLSurface (and, if needed, an EGLContext) targeting it.
Writing OpenGL ES programs on Android with EGL and C/C++

While writing an FFmpeg-based player, I needed to upload the decoded video frames to the GPU for rendered output, which makes it easy to apply filters to the video. There are two options for the output stage. The first is GLSurfaceView: the data decoded in native code is passed back up to the Java layer for rendering. The second is EGL + ANativeWindow, rendering on the GPU directly in the native layer. The first option keeps shuttling data between the native and Java layers, which is not a great approach; once multithreading is involved, the separate thread spaces of the Java and native layers make it even less attractive, so I personally don't recommend it.

Below is how to render with EGL in the native layer. If you don't know what EGL is, I suggest first reading my article introducing EGL and window initialization. Also, if you're interested, take a look at the GLSurfaceView source: GLSurfaceView itself works by spinning up a Looper thread (GLThread) that wraps EGL.

We will now implement a project that wraps EGL + ANativeWindow in the native layer; the project is on GitHub as EGLNativeRender.

Wrapping EGL

Because the EGL API is fairly cumbersome to use directly, we wrap it in a layer for convenience. For how the EGL API is used, see EglCore.java, EglSurfaceBase.java, WindowSurface.java, and OffscreenSurface.java in the Grafika project; my CainCamera project also uses Grafika's EGL wrapper classes, and they are convenient. Since Grafika's EglCore is a Java-layer wrapper, we now implement the equivalent EglCore, EglSurfaceBase, WindowSurface, and OffscreenSurface classes in the native layer.
Without further ado, the source. EglCore.h:

```cpp
#ifndef CAINCAMERA_EGLCORE_H
#define CAINCAMERA_EGLCORE_H

#include <android/native_window.h>
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <EGL/eglplatform.h>

/**
 * Constructor flag: surface must be recordable. This discourages EGL from using a
 * pixel format that cannot be converted efficiently to something usable by the video
 * encoder.
 */
#define FLAG_RECORDABLE 0x01

/**
 * Constructor flag: ask for GLES3, fall back to GLES2 if not available. Without this
 * flag, GLES2 is used.
 */
#define FLAG_TRY_GLES3 0x02

// Android-specific extension
#define EGL_RECORDABLE_ANDROID 0x3142

typedef EGLBoolean (EGLAPIENTRYP EGL_PRESENTATION_TIME_ANDROIDPROC)(EGLDisplay display, EGLSurface surface, khronos_stime_nanoseconds_t time);

class EglCore {
private:
    EGLDisplay mEGLDisplay = EGL_NO_DISPLAY;
    EGLConfig mEGLConfig = NULL;
    EGLContext mEGLContext = EGL_NO_CONTEXT;
    // Pointer used to set the presentation timestamp
    EGL_PRESENTATION_TIME_ANDROIDPROC eglPresentationTimeANDROID = NULL;
    int mGlVersion = -1;
    // Find a suitable EGLConfig
    EGLConfig getConfig(int flags, int version);

public:
    EglCore();
    ~EglCore();
    EglCore(EGLContext sharedContext, int flags);
    bool init(EGLContext sharedContext, int flags);
    // Release resources
    void release();
    // Get the EGLContext
    EGLContext getEGLContext();
    // Destroy an EGLSurface
    void releaseSurface(EGLSurface eglSurface);
    // Create a window EGLSurface
    EGLSurface createWindowSurface(ANativeWindow *surface);
    // Create an offscreen EGLSurface
    EGLSurface createOffscreenSurface(int width, int height);
    // Make the surface current
    void makeCurrent(EGLSurface eglSurface);
    // Make a draw/read surface pair current
    void makeCurrent(EGLSurface drawSurface, EGLSurface readSurface);
    // Detach the current context
    void makeNothingCurrent();
    // Swap buffers to display
    bool swapBuffers(EGLSurface eglSurface);
    // Set the presentation timestamp (pts)
    void setPresentationTime(EGLSurface eglSurface, long nsecs);
    // Whether the surface belongs to the current context
    bool isCurrent(EGLSurface eglSurface);
    // Query a surface attribute
    int querySurface(EGLSurface eglSurface, int what);
    // Query an EGL string
    const char *queryString(int what);
    // Get the current GLES version
    int getGlVersion();
    // Check for EGL errors
    void checkEglError(const char *msg);
};

#endif //CAINCAMERA_EGLCORE_H
```

EglCore.cpp:

```cpp
#include <assert.h>
#include "../common/native_log.h"
#include "EglCore.h"

EglCore::EglCore() {
    init(NULL, 0);
}

EglCore::~EglCore() {
    release();
}

/**
 * Constructor with a shared context
 * @param sharedContext
 * @param flags
 */
EglCore::EglCore(EGLContext sharedContext, int flags) {
    init(sharedContext, flags);
}

/**
 * Initialization
 * @param sharedContext
 * @param flags
 * @return
 */
bool EglCore::init(EGLContext sharedContext, int flags) {
    assert(mEGLDisplay == EGL_NO_DISPLAY);
    if (mEGLDisplay != EGL_NO_DISPLAY) {
        ALOGE("EGL already set up");
        return false;
    }
    if (sharedContext == NULL) {
        sharedContext = EGL_NO_CONTEXT;
    }
    mEGLDisplay = eglGetDisplay(EGL_DEFAULT_DISPLAY);
    assert(mEGLDisplay != EGL_NO_DISPLAY);
    if (mEGLDisplay == EGL_NO_DISPLAY) {
        ALOGE("unable to get EGL14 display.\n");
        return false;
    }
    if (!eglInitialize(mEGLDisplay, 0, 0)) {
        mEGLDisplay = EGL_NO_DISPLAY;
        ALOGE("unable to initialize EGL14");
        return false;
    }
    // Try GLES3 first
    if ((flags & FLAG_TRY_GLES3) != 0) {
        EGLConfig config = getConfig(flags, 3);
        if (config != NULL) {
            int attrib3_list[] = {
                    EGL_CONTEXT_CLIENT_VERSION, 3,
                    EGL_NONE
            };
            EGLContext context = eglCreateContext(mEGLDisplay, config, sharedContext, attrib3_list);
            checkEglError("eglCreateContext");
            if (eglGetError() == EGL_SUCCESS) {
                mEGLConfig = config;
                mEGLContext = context;
                mGlVersion = 3;
            }
        }
    }
    // If GLES3 was not available, fall back to GLES2
    if (mEGLContext == EGL_NO_CONTEXT) {
        EGLConfig config = getConfig(flags, 2);
        assert(config != NULL);
        int attrib2_list[] = {
                EGL_CONTEXT_CLIENT_VERSION, 2,
                EGL_NONE
        };
        EGLContext context = eglCreateContext(mEGLDisplay, config, sharedContext, attrib2_list);
        checkEglError("eglCreateContext");
        if (eglGetError() == EGL_SUCCESS) {
            mEGLConfig = config;
            mEGLContext = context;
            mGlVersion = 2;
        }
    }
    // Look up the address of eglPresentationTimeANDROID
    eglPresentationTimeANDROID = (EGL_PRESENTATION_TIME_ANDROIDPROC) eglGetProcAddress("eglPresentationTimeANDROID");
    if (!eglPresentationTimeANDROID) {
        ALOGE("eglPresentationTimeANDROID is not available!");
    }
    int values[1] = {0};
    eglQueryContext(mEGLDisplay, mEGLContext, EGL_CONTEXT_CLIENT_VERSION, values);
    ALOGD("EGLContext created, client version %d", values[0]);
    return true;
}

/**
 * Find a suitable EGLConfig
 * @param flags
 * @param version
 * @return
 */
EGLConfig EglCore::getConfig(int flags, int version) {
    int renderableType = EGL_OPENGL_ES2_BIT;
    if (version >= 3) {
        renderableType |= EGL_OPENGL_ES3_BIT_KHR;
    }
    int attribList[] = {
            EGL_RED_SIZE, 8,
            EGL_GREEN_SIZE, 8,
            EGL_BLUE_SIZE, 8,
            EGL_ALPHA_SIZE, 8,
            //EGL_DEPTH_SIZE, 16,
            //EGL_STENCIL_SIZE, 8,
            EGL_RENDERABLE_TYPE, renderableType,
            EGL_NONE, 0,    // placeholder for recordable [@-3]
            EGL_NONE
    };
    int length = sizeof(attribList) / sizeof(attribList[0]);
    if ((flags & FLAG_RECORDABLE) != 0) {
        attribList[length - 3] = EGL_RECORDABLE_ANDROID;
        attribList[length - 2] = 1;
    }
    EGLConfig configs = NULL;
    int numConfigs;
    if (!eglChooseConfig(mEGLDisplay, attribList, &configs, 1, &numConfigs)) {
        ALOGW("unable to find RGB8888 / %d EGLConfig", version);
        return NULL;
    }
    return configs;
}

/**
 * Release resources
 */
void EglCore::release() {
    if (mEGLDisplay != EGL_NO_DISPLAY) {
        eglMakeCurrent(mEGLDisplay, EGL_NO_SURFACE, EGL_NO_SURFACE, EGL_NO_CONTEXT);
        eglDestroyContext(mEGLDisplay, mEGLContext);
        eglReleaseThread();
        eglTerminate(mEGLDisplay);
    }
    mEGLDisplay = EGL_NO_DISPLAY;
    mEGLContext = EGL_NO_CONTEXT;
    mEGLConfig = NULL;
}

// Get the EGLContext
EGLContext EglCore::getEGLContext() {
    return mEGLContext;
}

// Destroy an EGLSurface
void EglCore::releaseSurface(EGLSurface eglSurface) {
    eglDestroySurface(mEGLDisplay, eglSurface);
}

/**
 * Create a window EGLSurface
 * @param surface
 * @return
 */
EGLSurface EglCore::createWindowSurface(ANativeWindow *surface) {
    assert(surface != NULL);
    if (surface == NULL) {
        ALOGE("ANativeWindow is NULL!");
        return NULL;
    }
    int surfaceAttribs[] = {EGL_NONE};
    ALOGD("eglCreateWindowSurface start");
    EGLSurface eglSurface = eglCreateWindowSurface(mEGLDisplay, mEGLConfig, surface, surfaceAttribs);
    checkEglError("eglCreateWindowSurface");
    assert(eglSurface != EGL_NO_SURFACE);
    if (eglSurface == EGL_NO_SURFACE) {
        ALOGE("EGLSurface is NULL!");
        return NULL;
    }
    return eglSurface;
}

/**
 * Create an offscreen (pbuffer) EGLSurface
 * @param width
 * @param height
 * @return
 */
EGLSurface EglCore::createOffscreenSurface(int width, int height) {
    int surfaceAttribs[] = {
            EGL_WIDTH, width,
            EGL_HEIGHT, height,
            EGL_NONE
    };
    EGLSurface eglSurface = eglCreatePbufferSurface(mEGLDisplay, mEGLConfig, surfaceAttribs);
    assert(eglSurface != EGL_NO_SURFACE);
    if (eglSurface == EGL_NO_SURFACE) {
        ALOGE("Surface was null");
        return NULL;
    }
    return eglSurface;
}

/**
 * Make the surface current
 * @param eglSurface
 */
void EglCore::makeCurrent(EGLSurface eglSurface) {
    if (mEGLDisplay == EGL_NO_DISPLAY) {
        ALOGD("Note: makeCurrent w/o display.\n");
    }
    if (!eglMakeCurrent(mEGLDisplay, eglSurface, eglSurface, mEGLContext)) {
        // TODO throw an exception
    }
}

/**
 * Make a draw/read surface pair current
 * @param drawSurface
 * @param readSurface
 */
void EglCore::makeCurrent(EGLSurface drawSurface, EGLSurface readSurface) {
    if (mEGLDisplay == EGL_NO_DISPLAY) {
        ALOGD("Note: makeCurrent w/o display.\n");
    }
    if (!eglMakeCurrent(mEGLDisplay, drawSurface, readSurface, mEGLContext)) {
        // TODO throw an exception
    }
}

// Detach the current context
void EglCore::makeNothingCurrent() {
    if (!eglMakeCurrent(mEGLDisplay, EGL_NO_SURFACE, EGL_NO_SURFACE, EGL_NO_CONTEXT)) {
        // TODO throw an exception
    }
}

// Swap buffers to display
bool EglCore::swapBuffers(EGLSurface eglSurface) {
    return eglSwapBuffers(mEGLDisplay, eglSurface);
}

// Set the presentation timestamp (pts)
void EglCore::setPresentationTime(EGLSurface eglSurface, long nsecs) {
    eglPresentationTimeANDROID(mEGLDisplay, eglSurface, nsecs);
}

// Whether the surface belongs to the current context
bool EglCore::isCurrent(EGLSurface eglSurface) {
    return mEGLContext == eglGetCurrentContext() &&
           eglSurface == eglGetCurrentSurface(EGL_DRAW);
}

// Query a surface attribute (note: eglQuerySurface takes the display, not the context)
int EglCore::querySurface(EGLSurface eglSurface, int what) {
    int value;
    eglQuerySurface(mEGLDisplay, eglSurface, what, &value);
    return value;
}

// Query an EGL string
const char *EglCore::queryString(int what) {
    return eglQueryString(mEGLDisplay, what);
}

// Get the GLES version
int EglCore::getGlVersion() {
    return mGlVersion;
}

// Check for EGL errors
void EglCore::checkEglError(const char *msg) {
    int error;
    if ((error = eglGetError()) != EGL_SUCCESS) {
        // TODO throw an exception
        ALOGE("%s: EGL error: %x", msg, error);
    }
}
```

With the EglCore core wrapped up, we next wrap EGLSurface. EglSurfaceBase.h:

```cpp
#ifndef CAINCAMERA_EGLSURFACEBASE_H
#define CAINCAMERA_EGLSURFACEBASE_H

#include "EglCore.h"
#include "../common/native_log.h"

class EglSurfaceBase {
public:
    EglSurfaceBase(EglCore *eglCore);
    // Create a window surface
    void createWindowSurface(ANativeWindow *nativeWindow);
    // Create an offscreen surface
    void createOffscreenSurface(int width, int height);
    // Get the width
    int getWidth();
    // Get the height
    int getHeight();
    // Release the EGLSurface
    void releaseEglSurface();
    // Make this surface current
    void makeCurrent();
    // Swap buffers and display the image
    bool swapBuffers();
    // Set the presentation timestamp
    void setPresentationTime(long nsecs);
    // Read back the current frame
    char *getCurrentFrame();

protected:
    EglCore *mEglCore;
    EGLSurface mEglSurface;
    int mWidth;
    int mHeight;
};

#endif //CAINCAMERA_EGLSURFACEBASE_H
```

EglSurfaceBase.cpp:

```cpp
#include <assert.h>
#include <GLES2/gl2.h>
#include "EglSurfaceBase.h"

EglSurfaceBase::EglSurfaceBase(EglCore *eglCore) : mEglCore(eglCore) {
    mEglSurface = EGL_NO_SURFACE;
    mWidth = mHeight = -1;  // so getWidth()/getHeight() fall back to querySurface()
}

/**
 * Create the on-screen surface
 * @param nativeWindow
 */
void EglSurfaceBase::createWindowSurface(ANativeWindow *nativeWindow) {
    assert(mEglSurface == EGL_NO_SURFACE);
    if (mEglSurface != EGL_NO_SURFACE) {
        ALOGE("surface already created\n");
        return;
    }
    mEglSurface = mEglCore->createWindowSurface(nativeWindow);
}

/**
 * Create an offscreen surface
 * @param width
 * @param height
 */
void EglSurfaceBase::createOffscreenSurface(int width, int height) {
    assert(mEglSurface == EGL_NO_SURFACE);
    if (mEglSurface != EGL_NO_SURFACE) {
        ALOGE("surface already created\n");
        return;
    }
    mEglSurface = mEglCore->createOffscreenSurface(width, height);
    mWidth = width;
    mHeight = height;
}

// Get the width
int EglSurfaceBase::getWidth() {
    if (mWidth < 0) {
        return mEglCore->querySurface(mEglSurface, EGL_WIDTH);
    } else {
        return mWidth;
    }
}

// Get the height
int EglSurfaceBase::getHeight() {
    if (mHeight < 0) {
        return mEglCore->querySurface(mEglSurface, EGL_HEIGHT);
    } else {
        return mHeight;
    }
}

// Release the EGLSurface
void EglSurfaceBase::releaseEglSurface() {
    mEglCore->releaseSurface(mEglSurface);
    mEglSurface = EGL_NO_SURFACE;
    mWidth = mHeight = -1;
}

// Make this surface's EGLContext current
void EglSurfaceBase::makeCurrent() {
    mEglCore->makeCurrent(mEglSurface);
}

// Swap to the front for display
bool EglSurfaceBase::swapBuffers() {
    bool result = mEglCore->swapBuffers(mEglSurface);
    if (!result) {
        ALOGD("WARNING: swapBuffers() failed");
    }
    return result;
}

// Set the presentation timestamp
void EglSurfaceBase::setPresentationTime(long nsecs) {
    mEglCore->setPresentationTime(mEglSurface, nsecs);
}

/**
 * Read back the current frame's pixels (RGBA, 4 bytes per pixel).
 * The caller owns the returned buffer and must delete[] it.
 * @return
 */
char *EglSurfaceBase::getCurrentFrame() {
    char *pixels = new char[getWidth() * getHeight() * 4];
    glReadPixels(0, 0, getWidth(), getHeight(), GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    return pixels;
}
```

With the EGLSurface base class wrapped, we now implement a WindowSurface for display and an OffscreenSurface for offscreen rendering. WindowSurface.h:

```cpp
#ifndef CAINCAMERA_WINDOWSURFACE_H
#define CAINCAMERA_WINDOWSURFACE_H

#include "EglSurfaceBase.h"

class WindowSurface : public EglSurfaceBase {
public:
    WindowSurface(EglCore *eglCore, ANativeWindow *window, bool releaseSurface);
    WindowSurface(EglCore *eglCore, ANativeWindow *window);
    // Release resources
    void release();
    // Recreate against a new EglCore
    void recreate(EglCore *eglCore);

private:
    ANativeWindow *mSurface;
    bool mReleaseSurface;
};

#endif //CAINCAMERA_WINDOWSURFACE_H
```

WindowSurface.cpp:

```cpp
#include <assert.h>
#include "WindowSurface.h"

WindowSurface::WindowSurface(EglCore *eglCore, ANativeWindow *window, bool releaseSurface)
        : EglSurfaceBase(eglCore) {
    mSurface = window;
    createWindowSurface(mSurface);
    mReleaseSurface = releaseSurface;
}

WindowSurface::WindowSurface(EglCore *eglCore, ANativeWindow *window)
        : EglSurfaceBase(eglCore) {
    createWindowSurface(window);
    mSurface = window;
    mReleaseSurface = true;
}

void WindowSurface::release() {
    releaseEglSurface();
    // Only release the ANativeWindow if we own it
    if (mSurface != NULL && mReleaseSurface) {
        ANativeWindow_release(mSurface);
        mSurface = NULL;
    }
}

void WindowSurface::recreate(EglCore *eglCore) {
    assert(mSurface != NULL);
    if (mSurface == NULL) {
        ALOGE("not yet implemented ANativeWindow");
        return;
    }
    mEglCore = eglCore;
    createWindowSurface(mSurface);
}
```

OffscreenSurface.h:

```cpp
#ifndef CAINCAMERA_OFFSCREENSURFACE_H
#define CAINCAMERA_OFFSCREENSURFACE_H

#include "EglSurfaceBase.h"

class OffscreenSurface : public EglSurfaceBase {
public:
    OffscreenSurface(EglCore *eglCore, int width, int height);
    void release();
};

#endif //CAINCAMERA_OFFSCREENSURFACE_H
```

OffscreenSurface.cpp:

```cpp
#include "OffscreenSurface.h"

OffscreenSurface::OffscreenSurface(EglCore *eglCore, int width, int height)
        : EglSurfaceBase(eglCore) {
    createOffscreenSurface(width, height);
}

void OffscreenSurface::release() {
    releaseEglSurface();
}
```

At this point we have reproduced Grafika's Java-layer EGL wrapper in the native layer.
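The FLAG_RECORDABLE handling in getConfig() patches the attribute array in place: the `EGL_NONE, 0` placeholder sitting two slots before the terminating `EGL_NONE` is overwritten with `EGL_RECORDABLE_ANDROID, 1`. That patching is plain array arithmetic and can be sketched and checked without any EGL at hand; the constants below mirror the EGL headers, while the class and method names are illustrative only.

```java
// Sketch of the attribute-list patching performed by EglCore::getConfig().
// EGL_NONE and EGL_RECORDABLE_ANDROID mirror the EGL headers; the rest is
// a hypothetical helper for illustration.
public class EglAttribPatch {
    static final int EGL_NONE = 0x3038;
    static final int EGL_RECORDABLE_ANDROID = 0x3142; // Android-specific extension
    static final int FLAG_RECORDABLE = 0x01;

    static int[] patch(int[] attribList, int flags) {
        int length = attribList.length;
        if ((flags & FLAG_RECORDABLE) != 0) {
            // Overwrite the "EGL_NONE, 0" placeholder just before the terminator.
            attribList[length - 3] = EGL_RECORDABLE_ANDROID;
            attribList[length - 2] = 1;
        }
        return attribList;
    }

    public static void main(String[] args) {
        int[] list = {EGL_NONE, 0, EGL_NONE};   // placeholder pair + terminator
        int[] out = patch(list, FLAG_RECORDABLE);
        assert out[0] == EGL_RECORDABLE_ANDROID;
        assert out[1] == 1;
        assert out[2] == EGL_NONE;              // terminator is untouched
        System.out.println("ok");
    }
}
```

Keeping the placeholder at a fixed offset from the end is what lets the flag be applied without rebuilding the whole list.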
How Android RenderEffect works

Android RenderEffect is a graphics-rendering feature that lets developers implement a range of visual effects on the Android platform. RenderEffect processes images with GPU acceleration, providing more efficient rendering and animated effects. At its core, RenderEffect renders through OpenGL ES: it uses the graphics processing unit (GPU) for high-performance computation and rendering, exploiting the GPU's parallelism to improve both speed and visual quality.

RenderEffect works in roughly these steps:
1. Create a RenderEffect object: use one of the RenderEffect factory methods, such as RenderEffect.createBlurEffect() or RenderEffect.createColorFilterEffect(). Available effect types include blur, color filters, brightness adjustments, and so on.
2. Apply the RenderEffect: call View.setRenderEffect() to attach the created RenderEffect to a specific view. The effect becomes associated with the view's underlying RenderNode.
3. Draw the RenderNode: while the view hierarchy is drawn, the view's RenderNode is processed with the RenderEffect applied, transforming the view's rendered output.
4. Render the result: the system renders the RenderNode on the GPU via OpenGL ES and displays the result on screen. Thanks to the GPU's compute throughput, this is very fast and efficient.

With RenderEffect, developers can implement visual effects such as blurred backgrounds, shadows, and color filters, improving the app's user experience and visual polish.

In summary, Android RenderEffect is GPU-accelerated rendering: the effect is hooked into the draw pass of a specific view, rendered on the GPU through OpenGL ES, and the result is shown on screen.
Displaying YUV images on Android with OpenGL

Displaying YUV images smoothly requires using OpenGL to tap GPU resources; material on this is scarce online, but Android already provides the relevant OpenGL APIs. The overall flow:
1. Create a GLSurfaceView.
2. Create a Renderer.
3. Set your Renderer on the GLSurfaceView and set the render mode to RENDERMODE_WHEN_DIRTY.
4. Push data into the Renderer.

The Renderer must implement three methods:
1. void onSurfaceCreated(GL10 gl, EGLConfig config)
2. void onDrawFrame(GL10 gl)
3. void onSurfaceChanged(GL10 gl, int width, int height)

onSurfaceCreated does the initialization work (the concrete GL call is given in parentheses): generate the texture IDs (glGenTextures) → bind the texture to operate on (glBindTexture) → set its parameters (glTexParameterf) → repeat for the next texture → create a shader (glCreateShader) → load the shader source string (glShaderSource) → compile it (glCompileShader) → check the compile status (glGetShaderiv) → repeat for the next shader → create a program (glCreateProgram) → attach the shaders (glAttachShader) → bind the attributes (glBindAttribLocation) → link the program (glLinkProgram) → check the link status (glGetProgramiv) → fetch the attribute and uniform handles → tell OpenGL to use this program when rendering (glUseProgram).

onDrawFrame runs whenever the view refreshes: bind the texture to operate on (glBindTexture) → upload the texture contents (glTexImage2D) → repeat for the next texture → clear the buffers → specify the vertex attribute arrays (glVertexAttribPointer) → enable the vertex arrays (glEnableVertexAttribArray) → activate a texture unit (glActiveTexture) → bind the texture to operate on (glBindTexture) → set the sampler uniform for that unit (glUniform1i) → repeat for the next texture → draw (glDrawArrays).

Once the data has been prepared, calling requestRender() triggers onDrawFrame.
Android OpenGL primer, example 12: multi-texture mapping

This chapter sums up the previous two and is also an example written in preparation for a fog effect. Without textures, fog drawn with color alone looks fake and washed-out. Drawing polygons with multiple textures is not complicated, and enabling blending is simple too; you will see quite a nice result.
RTFC:

```java
public class DrawTextureCube extends Activity implements GLSurfaceView.Renderer {
    Bitmap[] bitmaps;
    TextureCube myCube;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        GLSurfaceView myView = new GLSurfaceView(this);
        myView.setRenderer(this);
        setContentView(myView);
        bitmaps = new Bitmap[6];
        for (int i = 0; i < bitmaps.length; i++) {
            bitmaps[i] = BitmapFactory.decodeResource(getResources(), R.drawable.glass);
        }
        myCube = new TextureCube(bitmaps);
    }

    public void onDrawFrame(GL10 gl) {
        gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
        gl.glLoadIdentity();
        GLU.gluLookAt(gl, 1, 1, 3, 0, 0, 0, 0, 1, 0);
        myCube.draw(gl);
    }

    public void onSurfaceChanged(GL10 gl, int width, int height) {
        gl.glViewport(0, 0, width, height);
        gl.glMatrixMode(GL10.GL_PROJECTION);
        gl.glLoadIdentity();
        GLU.gluPerspective(gl, 45.0f, (float) width / (float) height, 0.1f, 100.0f);
        myCube.init(gl);
        myCube.LoadTextures(gl);
        gl.glMatrixMode(GL10.GL_MODELVIEW);
        gl.glLoadIdentity();
    }

    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
    }
}

class TextureCube {
    Bitmap[] mbitmaps;
    int[] textures;

    float box[] = new float[] {
            // FRONT
            -0.5f, -0.5f, 0.5f,
            0.5f, -0.5f, 0.5f,
            -0.5f, 0.5f, 0.5f,
            0.5f, 0.5f, 0.5f,
            // BACK
            -0.5f, -0.5f, -0.5f,
            -0.5f, 0.5f, -0.5f,
            0.5f, -0.5f, -0.5f,
            0.5f, 0.5f, -0.5f,
            // LEFT
            -0.5f, -0.5f, 0.5f,
            -0.5f, 0.5f, 0.5f,
            -0.5f, -0.5f, -0.5f,
            -0.5f, 0.5f, -0.5f,
            // RIGHT
            0.5f, -0.5f, -0.5f,
            0.5f, 0.5f, -0.5f,
            0.5f, -0.5f, 0.5f,
            0.5f, 0.5f, 0.5f,
            // TOP
            -0.5f, 0.5f, 0.5f,
            0.5f, 0.5f, 0.5f,
            -0.5f, 0.5f, -0.5f,
            0.5f, 0.5f, -0.5f,
            // BOTTOM
            -0.5f, -0.5f, 0.5f,
            -0.5f, -0.5f, -0.5f,
            0.5f, -0.5f, 0.5f,
            0.5f, -0.5f, -0.5f,
    };

    float textureCoordinates[] = {
            // FRONT
            0.0f, 1.0f, 0.0f, 0.0f, 1.0f, 1.0f, 1.0f, 0.0f,
            // BACK
            1.0f, 0.0f, 1.0f, 1.0f, 0.0f, 0.0f, 0.0f, 1.0f,
            // LEFT
            1.0f, 0.0f, 1.0f, 1.0f, 0.0f, 0.0f, 0.0f, 1.0f,
            // RIGHT
            1.0f, 0.0f, 1.0f, 1.0f, 0.0f, 0.0f, 0.0f, 1.0f,
            // TOP
            0.0f, 0.0f, 1.0f, 0.0f, 0.0f, 1.0f, 1.0f, 1.0f,
            // BOTTOM
            1.0f, 0.0f, 1.0f, 1.0f, 0.0f, 0.0f, 0.0f, 1.0f,
    };

    float[][] normals = {
            { 0.0f, 0.0f, 1.0f },
            { 0.0f, 0.0f, -1.0f },
            { -1.0f, 0.0f, 0.0f },
            { 1.0f, 0.0f, 0.0f },
            { 0.0f, -1.0f, 0.0f },
            { 0.0f, 1.0f, 0.0f }
    };

    FloatBuffer textureBuffer;
    FloatBuffer cubeBuff;

    public TextureCube(Bitmap[] bitmaps) {
        mbitmaps = bitmaps;
    }

    protected void init(GL10 gl) {
        cubeBuff = makeFloatBuffer(box);
        textureBuffer = makeFloatBuffer(textureCoordinates);

        gl.glEnable(GL10.GL_BLEND);
        gl.glDisable(GL10.GL_DEPTH_TEST);
        gl.glBlendFunc(GL10.GL_ONE, GL10.GL_ONE);

        gl.glEnable(GL10.GL_TEXTURE_2D);
        gl.glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
        gl.glClearDepthf(1.0f);

        gl.glVertexPointer(3, GL10.GL_FLOAT, 0, cubeBuff);
        gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
        gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, textureBuffer);
        gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);

        gl.glShadeModel(GL10.GL_SMOOTH);
    }

    public void draw(GL10 gl) {
        for (int i = 0; i < textures.length; i++) {
            // Bind each face's texture before drawing that face
            gl.glBindTexture(GL10.GL_TEXTURE_2D, textures[i]);
            gl.glNormal3f(normals[i][0], normals[i][1], normals[i][2]);
            gl.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 4 * i, 4);
        }
    }

    private FloatBuffer makeFloatBuffer(float[] arr) {
        ByteBuffer bb = ByteBuffer.allocateDirect(arr.length * 4);
        bb.order(ByteOrder.nativeOrder());
        FloatBuffer fb = bb.asFloatBuffer();
        fb.put(arr);
        fb.position(0);
        return fb;
    }

    public void LoadTextures(GL10 gl) {
        textures = new int[6];
        gl.glGenTextures(6, textures, 0);
        for (int i = 0; i < textures.length; i++) {
            gl.glBindTexture(GL10.GL_TEXTURE_2D, textures[i]);
            gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER,
                    GL10.GL_LINEAR);
            gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER,
                    GL10.GL_LINEAR);
            gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_S,
                    GL10.GL_CLAMP_TO_EDGE);
            gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_T,
                    GL10.GL_REPEAT);
            GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, mbitmaps[i], 0);
            mbitmaps[i].recycle();
        }
        mbitmaps = null;
    }
}
```

Important: if you want the cube to be transparent, disable DEPTH_TEST. Key code:

```java
gl.glEnable(GL10.GL_BLEND);
gl.glDisable(GL10.GL_DEPTH_TEST);
gl.glBlendFunc(GL10.GL_ONE, GL10.GL_ONE);
```
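The makeFloatBuffer() helper above is the standard recipe for getting a Java float[] into a direct, native-ordered buffer that GL can read. It is plain java.nio and can be verified without any GL context; the class name below is just for this demo.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class FloatBufferDemo {
    // Same recipe as makeFloatBuffer() in the cube example above.
    static FloatBuffer makeFloatBuffer(float[] arr) {
        ByteBuffer bb = ByteBuffer.allocateDirect(arr.length * 4); // 4 bytes per float
        bb.order(ByteOrder.nativeOrder());                         // GL expects native byte order
        FloatBuffer fb = bb.asFloatBuffer();
        fb.put(arr);
        fb.position(0);                                            // rewind for the GL upload
        return fb;
    }

    public static void main(String[] args) {
        FloatBuffer fb = makeFloatBuffer(new float[]{-0.5f, 0.5f, 1.0f});
        assert fb.isDirect();           // direct memory, accessible from native GL
        assert fb.remaining() == 3;     // rewound, all three floats visible
        assert fb.get(1) == 0.5f;       // data round-trips intact
        System.out.println("ok");
    }
}
```

Forgetting the position(0) rewind is a classic bug: GL would then read from the end of the buffer and draw nothing.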
Playing a YUV data stream with OpenGL shaders on Android (part one)

This differs slightly from Windows and iOS; the steps are written down below for the record.

1. Add the GLSurfaceView used for display to activity_main.xml:

```xml
<android.opengl.GLSurfaceView
    android:id="@+id/lvsPlaySurfaceView"
    android:layout_width="match_parent"
    android:layout_height="400dp" />
```

2. Get hold of the GLSurfaceView in code:

```java
// Obtain the surfaceView used for OpenGL rendering
openglsurfaceView = (GLSurfaceView) findViewById(R.id.lvsPlaySurfaceView);
```

3. Declare the required features in AndroidManifest.xml:

```xml
<!-- To use the OpenGL ES 2.0 API, you must declare: -->
<uses-feature android:glEsVersion="0x00020000" android:required="true" />
<!-- If your app uses texture compression, also declare which formats the device must support -->
<supports-gl-texture android:name="GL_OES_compressed_ETC1_RGB8_texture" />
<supports-gl-texture android:name="GL_OES_compressed_paletted_texture" />
```

4. The concrete implementation (the source truncates after the field declarations):

```java
package com.example.zhuweigang.lvsandroidplay;

import android.content.Context;
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;
import android.util.AttributeSet;
import android.util.Log;
import android.view.SurfaceHolder;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
import java.nio.Buffer;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

/**
 * Created by zhuweigang on 2016/12/26.
 */
public class Lvs_OpenGl_Interface_Android implements GLSurfaceView.Renderer {
    private static final String TAG = "lvs_OpenGL";

    // Vertex array (object coordinates range -1..1; order: bottom-left,
    // bottom-right, top-left, top-right)
    private static float[] vertexVertices = {
            -1.0f, -1.0f,
            1.0f, -1.0f,
            -1.0f, 1.0f,
            1.0f, 1.0f,
    };

    // Texture array (texture coordinates range 0..1 with the origin at the
    // bottom-left; order: top-left, top-right, bottom-left, bottom-right --
    // starting with bottom-left would flip the image)
    private static float[] textureVertices = {
            0.0f, 1.0f,
            1.0f, 1.0f,
            0.0f, 0.0f,
            1.0f, 0.0f,
    };

    // Vertex shader (vsh) source string
    private static final String vertexShaderString =
            "attribute vec4 vertexIn;" +
            "attribute vec2 textureIn;" +
            "varying vec2 textureOut;" +
            "void main() {" +
            "gl_Position = vertexIn;" +
            "textureOut = textureIn;" +
            "}";

    // Fragment shader (fsh) source string: samples the Y, U and V planes and
    // converts to RGB
    private static final String yuvFragmentShaderString =
            "precision mediump float;" +
            "uniform sampler2D tex_y;" +
            "uniform sampler2D tex_u;" +
            "uniform sampler2D tex_v;" +
            "varying vec2 textureOut;" +
            "void main() {" +
            "vec4 c = vec4((texture2D(tex_y, textureOut).r - 16./255.) * 1.164);" +
            "vec4 U = vec4(texture2D(tex_u, textureOut).r - 128./255.);" +
            "vec4 V = vec4(texture2D(tex_v, textureOut).r - 128./255.);" +
            "c += V * vec4(1.596, -0.813, 0, 0);" +
            "c += U * vec4(0, -0.392, 2.017, 0);" +
            "c.a = 1.0;" +
            "gl_FragColor = c;" +
            "}";

    // Vertex attribute index used by the shader (a position is 3 components: x, y, z)
    public int ATTRIB_VERTEX = 0;
    // Texture attribute index used by the shader (a color is 4 components: r, g, b, a)
    public int ATTRIB_TEXTURE = 0;
    private GLSurfaceView mTargetSurface;       // GLSurfaceView passed in from outside
    public int p = 0;                           // id of the shader program
    ByteBuffer vertexVertices_buffer = null;    // vertex buffer
    ByteBuffer textureVertices_buffer = null;   // texture-coordinate buffer
    public int m_IsInitShaders = 0;             // whether InitShaders has run (onSurfaceCreated)
    public Lvs_Play_Interface_Sdk_Android.OpenGl_DisplayCallBackInterface m_displaydatack = null; // display callback (data plus timestamp)
    public byte m_yuvbuf[] = new byte[640 * 480 * 3]; // YUV data buffer (allocated by the caller)
    public ByteBuffer yuvplaner_y = null;       // Y plane used for rendering
    public ByteBuffer yuvplaner_u = null;       // U plane used for rendering
    public ByteBuffer yuvplaner_v = null;       // V plane used for rendering
    public int[] m_millis_realtime = new int[1]; // live timestamp, updated on every callback
    public int m_yuvdata_width = 0;             // data width
    public int m_yuvdata_height = 0;            // data height
    public int m_frameBuffer = 0;               // framebuffer
    public int m_renderBuffer = 0;              // renderbuffer
    public int m_textureid_y, m_textureid_u, m_textureid_v; // texture names; a name must not be reused elsewhere in the app
```
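The fragment shader above is a standard BT.601 video-range YUV-to-RGB conversion. Running the same arithmetic on the CPU, per 8-bit sample (values scaled by 255 relative to the shader's normalized floats), makes the constants easy to sanity-check; the class and method names here are for the demo only.

```java
public class Yuv2Rgb {
    // CPU version of the fragment shader's math (BT.601, video range:
    // Y in 16..235, U/V centered on 128).
    static int[] toRgb(int y, int u, int v) {
        float c = 1.164f * (y - 16);
        float d = u - 128;
        float e = v - 128;
        int r = clamp(Math.round(c + 1.596f * e));
        int g = clamp(Math.round(c - 0.813f * e - 0.392f * d));
        int b = clamp(Math.round(c + 2.017f * d));
        return new int[]{r, g, b};
    }

    static int clamp(int x) { return Math.max(0, Math.min(255, x)); }

    public static void main(String[] args) {
        int[] black = toRgb(16, 128, 128);   // video-range black
        assert black[0] == 0 && black[1] == 0 && black[2] == 0;
        int[] white = toRgb(235, 128, 128);  // video-range white
        assert white[0] == 255 && white[1] == 255 && white[2] == 255;
        System.out.println("ok");
    }
}
```

The 16/235 offsets are what distinguish video-range (the usual output of hardware decoders) from full-range YUV; using full-range constants on video-range data washes the image out.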
OpenGL ES on Android, a simple practice (18): a custom Android EGL

1. Pin down the requirements

This project's requirements boil down to: a camera preview screen, a button that triggers screen recording, and a watermark on the recorded video.
1. Camera preview
2. Screen recording
3. A watermark at a specified position in the recorded video

With the requirements settled, we analyze the modules one by one and build them. So, talk is cheap, let me show the code!

2. EGL + Surface = EGLSurface

To add a watermark (filter) effect to the preview, we need an EGL environment plus shader-based filters. So we start simple: the first requirement is to create EGL and get the phone's camera displaying normally. First, create this round's test Activity, ContinuousRecordActivity, and read the SurfaceView from the layout; using a SurfaceView makes it easy to obtain the Surface rendering target directly.
```java
public class ContinuousRecordActivity extends Activity implements SurfaceHolder.Callback {

    public static final String TAG = "ContinuousRecord";

    SurfaceView sv;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.continuous_record);
        sv = (SurfaceView) findViewById(R.id.continuousCapture_surfaceView);
        SurfaceHolder sh = sv.getHolder();
        sh.addCallback(this);
    }

    @Override
    public void surfaceCreated(SurfaceHolder surfaceHolder) {
        Log.d(TAG, "surfaceCreated holder=" + surfaceHolder);
        // First, a description of what is about to happen here:
        // the surface-created callback reaches us, then we create an EGL context
        // and assemble the EGLSurface we need:
        //   EglCore = new EglCore();
        // combine EGL with the native surface into an EglSurface, and keep it.
    }
}
```
Summary of the differences between TextureView and SurfaceView on Android

SurfaceView and TextureView both descend from android.view.View. Unlike other views, both can be drawn and rendered on an independent thread, and rendering on a dedicated GPU thread greatly improves performance.

1. SurfaceView provides a drawing surface embedded in the view hierarchy. The developer controls the surface's form, such as its size, and the surface is guaranteed to be placed correctly on screen. It has limitations, though: because it lives on its own layer, more like an independent window, it cannot be animated, translated, or scaled, and two SurfaceViews cannot overlap each other.

2. TextureView behaves more like an ordinary view: it can be scaled and translated like a TextView, and animated as well. TextureView can only be used inside a hardware-accelerated window, consumes more memory than SurfaceView, and adds a latency of one to three frames.

3. Both extend View, but TextureView is only available in the APIs from Android 4.0 onward. SurfaceView can update the UI from a worker thread via SurfaceHolder.addCallback; TextureView can do the same via TextureView.setSurfaceTextureListener. Being able to update the UI off the main thread is, in my view, the biggest advantage these two have over a plain View. They do update the screen somewhat differently, however: SurfaceView's double buffering makes playback run more smoothly, but because of its holder there can be gaps between frame updates (hard to put into words; see the figure). Also because of the holder, SurfaceView does not support View-style setAlpha and setRotation; still, for games such as Tank Battle that must redraw the canvas constantly and rapidly, SurfaceView is an excellent choice. For something like a video player or a camera app, TextureView is the better fit.
Rendering a native Android View into an OpenGL texture

Recently I needed to render a native Android View into an OpenGL GLSurfaceView. My first thought was screenshots: capture the View into a Bitmap, then render the Bitmap in OpenGL. But that approach is clearly unworkable: faced with views that update rapidly, you would have to screenshot the view continuously to reproduce its appearance. After a great deal of googling I finally found a precedent on a foreign site. In testing, however, that approach only renders views whose direct parent class is View, i.e. a single layer (such as a ProgressBar; views that cannot have children added). When the native view contains many child views (its root being a FrameLayout, LinearLayout, or the like), changes to the view cannot be observed in real time, so OpenGL has to re-render the view constantly to reproduce its appearance. Rendering indiscriminately like that wastes a lot of resources and drags down the application's efficiency. Ideally, rendering would happen only when the View's content, or a child's content, actually changes (for example, text scrolling inside the view).
After nearly two weeks of work I finally got this effect working nicely. Since the result builds on someone else's foundation, the method ought to be shared, hence this article. Note that it only supports API 15 and above.

Step one: override the root View

1. Make the View draw itself:

```java
setWillNotDraw(false);
```

2. Listen for changes to the View. Override the View and use a ViewTreeObserver to listen, like so:

```java
private void addOnPreDrawListener() {
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.HONEYCOMB) {
        final ViewTreeObserver mObserver = getViewTreeObserver();
        if (mObserver != null) {
            mObserver.addOnPreDrawListener(new OnPreDrawListener() {
                @Override
                public boolean onPreDraw() {
                    if (isDirty()) {
                        // The view or one of its children changed
                        invalidate();
                    }
                    return true;
                }
            });
        }
    }
}
```

3. Override the View's onDraw method:

```java
@Override
protected void onDraw(Canvas canvas) {
    try {
        if (mSurface != null) {
            Canvas surfaceCanvas = mSurface.lockCanvas(null);
            super.dispatchDraw(surfaceCanvas);
            mSurface.unlockCanvasAndPost(surfaceCanvas);
            mSurface.release();
            mSurface = null;
            mSurface = new Surface(mSurfaceTexture);
        }
    } catch (OutOfResourcesException e) {
        e.printStackTrace();
    }
}
```

Step two: the GLSurfaceView.Renderer

```java
class CustomRenderer implements GLSurfaceView.Renderer {
    int glSurfaceTex;
    private final int GL_TEXTURE_EXTERNAL_OES = 0x8D65;
    long currentTime;
    long previousTime;
    boolean b = false;
    int frameCount = 0;
    DirectDrawer mDirectDrawer;
    ActivityManager activityManager;
    MemoryInfo _memoryInfo;
    // Fixed values
    private int TEXTURE_WIDTH = 360;
    private int TEXTURE_HEIGHT = 360;
    Context context;
    private LauncherAppWidgetHostView addedWidgetView;
    private SurfaceTexture surfaceTexture = null;
    private Surface surface;
    float fps;

    public CustomRenderer(Context context, LauncherAppWidgetHostView addedWidgetView, Display mDisplay) {
        this.context = context;
        this.addedWidgetView = addedWidgetView;
        TEXTURE_WIDTH = mDisplay.getWidth();
        TEXTURE_HEIGHT = mDisplay.getHeight();
        _memoryInfo = new MemoryInfo();
        activityManager = (ActivityManager) context.getApplicationContext().getSystemService(Context.ACTIVITY_SERVICE);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        synchronized (this) {
            surfaceTexture.updateTexImage();
        }
        activityManager.getMemoryInfo(_memoryInfo);
        GLES20.glClearColor(0.0f, 0.0f, 1.0f, 1.0f);
        GLES20.glClear(GLES20.GL_DEPTH_BUFFER_BIT | GLES20.GL_COLOR_BUFFER_BIT);
        GLES20.glEnable(GLES20.GL_BLEND);
        GLES20.glBlendFunc(GLES20.GL_ONE, GLES20.GL_ONE_MINUS_SRC_ALPHA);
        float[] mtx = new float[16];
        surfaceTexture.getTransformMatrix(mtx);
        mDirectDrawer.draw(mtx);
        calculateFps();
        //getAppMemorySize();
        //getRunningAppProcessInfo();
        //Log.v("onDrawFrame", "FPS: " + Math.round(fps) + ", availMem: " + Math.round(_memoryInfo.availMem / 1048576) + "MB");
    }

    private void getAppMemorySize() {
        ActivityManager mActivityManager = (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
        android.os.Debug.MemoryInfo[] memoryInfos = mActivityManager.getProcessMemoryInfo(new int[]{android.os.Process.myPid()});
        int size = memoryInfos[0].dalvikPrivateDirty;
        Log.w("getAppMemorySize", size / 1024 + " MB");
    }

    private void getRunningAppProcessInfo() {
        ActivityManager mActivityManager = (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
        // All processes currently running on the system
        List<RunningAppProcessInfo> runningAppProcessesList = mActivityManager.getRunningAppProcesses();
        for (RunningAppProcessInfo runningAppProcessInfo : runningAppProcessesList) {
            // Process id
            int pid = runningAppProcessInfo.pid;
            // User id
            int uid = runningAppProcessInfo.uid;
            // Process name
            String processName = runningAppProcessInfo.processName;
            // Memory used
            int[] pids = new int[]{pid};
            Debug.MemoryInfo[] memoryInfo = mActivityManager.getProcessMemoryInfo(pids);
            int memorySize = memoryInfo[0].dalvikPrivateDirty;
            System.out.println("processName=" + processName + ",currentPid: " + "pid= "
                    + android.os.Process.myPid() + "----------->" + pid + ",uid=" + uid
                    + ",memorySize=" + memorySize + "kb");
        }
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        surface = null;
        surfaceTexture = null;
        glSurfaceTex = Engine_CreateSurfaceTexture(TEXTURE_WIDTH, TEXTURE_HEIGHT);
        Log.d("GLES20Ext", "glSurfaceTex" + glSurfaceTex);
        if (glSurfaceTex > 0) {
            surfaceTexture = new SurfaceTexture(glSurfaceTex);
            surfaceTexture.setDefaultBufferSize(TEXTURE_WIDTH, TEXTURE_HEIGHT);
            surface = new Surface(surfaceTexture);
            addedWidgetView.setSurface(surface);
            addedWidgetView.setSurfaceTexture(surfaceTexture);
            mDirectDrawer = new DirectDrawer(glSurfaceTex);
        }
    }

    float calculateFps() {
        frameCount++;
        if (!b) {
            b = true;
            previousTime = System.currentTimeMillis();
        }
        long intervalTime = System.currentTimeMillis() - previousTime;
        if (intervalTime >= 1000) {
            b = false;
            fps = frameCount / (intervalTime / 1000f);
            frameCount = 0;
            Log.w("calculateFps", "FPS: " + fps);
        }
        return fps;
    }

    int Engine_CreateSurfaceTexture(int width, int height) {
        /*
         * Create our texture. This has to be done each time the surface is
         * created.
         */
        int[] textures = new int[1];
        GLES20.glGenTextures(1, textures, 0);
        glSurfaceTex = textures[0];
        if (glSurfaceTex > 0) {
            GLES20.glBindTexture(GL_TEXTURE_EXTERNAL_OES, glSurfaceTex);
            // Notice the use of GL_TEXTURE_2D for texture creation
            GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGB, width, height, 0,
                    GLES20.GL_RGB, GLES20.GL_UNSIGNED_BYTE, null);
            GLES20.glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
            GLES20.glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);
            GLES20.glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
            GLES20.glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        }
        return glSurfaceTex;
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
    }
}
```

```java
public class DirectDrawer {
    private final String vertexShaderCode =
            "attribute vec4 vPosition;" +
            "attribute vec2 inputTextureCoordinate;" +
            "varying vec2 textureCoordinate;" +
            "void main()" +
            "{" +
            "gl_Position = vPosition;" +
            "textureCoordinate = inputTextureCoordinate;" +
            "}";

    private final String fragmentShaderCode =
            "#extension GL_OES_EGL_image_external : require\n" +
            "precision mediump float;" +
            "varying vec2 textureCoordinate;\n" +
            "uniform samplerExternalOES s_texture;\n" +
            "void main() {" +
            "  gl_FragColor = texture2D( s_texture, textureCoordinate );\n" +
            "}";

    private FloatBuffer vertexBuffer, textureVerticesBuffer;
    private ShortBuffer drawListBuffer;
    private final int mProgram;
    private int mPositionHandle;
    private int mTextureCoordHandle;
    private short drawOrder[] = {0, 1, 2, 0, 2, 3}; // order to draw vertices
    // number of coordinates per vertex in this array
    private static final int COORDS_PER_VERTEX = 2;
    private final int vertexStride = COORDS_PER_VERTEX * 4; // 4 bytes per vertex
    static float squareCoords[] = {
            -1.0f, 0.0f,
            -1.0f, -2.2f,
            1.0f, -2.2f,
            1.0f, 0.0f,
    };
    static float textureVertices[] = {
            0f, 0f,
            0f, 1f,
            1f, 1f,
            1f, 0f,
    };
    private int texture;

    public DirectDrawer(int texture) {
        this.texture = texture;
        // initialize vertex byte buffer for shape coordinates
        ByteBuffer bb = ByteBuffer.allocateDirect(squareCoords.length * 4);
        bb.order(ByteOrder.nativeOrder());
        vertexBuffer = bb.asFloatBuffer();
        vertexBuffer.put(squareCoords);
        vertexBuffer.position(0);
        // initialize byte buffer for the draw list
        ByteBuffer dlb = ByteBuffer.allocateDirect(drawOrder.length * 2);
        dlb.order(ByteOrder.nativeOrder());
        drawListBuffer = dlb.asShortBuffer();
        drawListBuffer.put(drawOrder);
        drawListBuffer.position(0);
        ByteBuffer bb2 = ByteBuffer.allocateDirect(textureVertices.length * 4);
        bb2.order(ByteOrder.nativeOrder());
        textureVerticesBuffer = bb2.asFloatBuffer();
        textureVerticesBuffer.put(textureVertices);
        textureVerticesBuffer.position(0);
        int vertexShader = loadShader(GLES20.GL_VERTEX_SHADER, vertexShaderCode);
        int fragmentShader = loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentShaderCode);
        mProgram = GLES20.glCreateProgram();             // create empty OpenGL ES Program
        GLES20.glAttachShader(mProgram, vertexShader);   // add the vertex shader to program
        GLES20.glAttachShader(mProgram, fragmentShader); // add the fragment shader to program
        GLES20.glLinkProgram(mProgram);                  // creates OpenGL ES program executables
    }
}
```
program executables}public void draw(float[] mtx){GLES20.glUseProgram(mProgram);GLES20.glActiveTexture(GLES20.GL_TEXTURE0);GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, texture);// get handle to vertex shader's vPosition membermPositionHandle = GLES20.glGetAttribLocation(mProgram, "vPosition");// Enable a handle to the triangle verticesGLES20.glEnableVertexAttribArray(mPositionHandle);// Prepare the <insert shape here> coordinate dataGLES20.glVertexAttribPointer(mPositionHandle,COORDS_PER_VERTEX, GLES20.GL_FLOAT, false, vertexStride, vertexBuffer);mTextureCoordHandle = GLES20.glGetAttribLocation(mProgram, "inputTextureCoordinate");GLES20.glEnableVertexAttribArray(mTextureCoordHandle);// textureVerticesBuffer.clear();// textureVerticesBuffer.put( transformTextureCoordinates( // textureVertices, mtx ));// textureVerticesBuffer.position(0);GLES20.glVertexAttribPointer(mTextureCoordHandle, COORDS_PER_VERTEX, GLES20.GL_FLOAT, false, vertexStride, textureVerticesBuffer);GLES20.glDrawElements(GLES20.GL_TRIANGLES,drawOrder.length, GLES20.GL_UNSIGNED_SHORT, drawListBuffer);// Disable vertex arrayGLES20.glDisableVertexAttribArray(mPositionHandle);GLES20.glDisableVertexAttribArray(mTextureCoordHandle);}private int loadShader(int type, String shaderCode){// create a vertex shader type (GLES20.GL_VERTEX_SHADER)// or a fragment shader type (GLES20.GL_FRAGMENT_SHADER)int shader = GLES20.glCreateShader(type);// add the source code to the shader and compile itGLES20.glShaderSource(shader, shaderCode);GLES20.glCompileShader(shader);return shader;}private float[] transformTextureCoordinates( float[] coords, float[] matrix){float[] result = new float[ coords.length ];float[] vt = new float[4];for ( int i = 0 ; i < coords.length ; i += 2 ) {float[] v = { coords[i], coords[i+1], 0 , 1 };Matrix.multiplyMV(vt, 0, matrix, 0, v, 0);result[i] = vt[0];result[i+1] = vt[1];}return result;}}步骤三:配置GLSurfaceView:[java] view plain copy 在CODE上查看代码片派生到我的代码片GLSurfaceView glSurfaceView = 
newGLSurfaceView(getApplicationContext());// Setup the surface view for drawing toglSurfaceView.setEGLContextClientVersion(2);glSurfaceView.setEGLConfigChooser(8, 8, 8, 8, 16, 0);glSurfaceView.setRenderer(renderer);//glSurfaceView.setZOrderOnTop(true);// Add our WebView to the Android View hierarchyglSurfaceView.setLayoutParams(newyoutParams(youtParams.MATCH_PAREN T, youtParams.MATCH_PARENT));<>。
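The direct-buffer setup in the DirectDrawer constructor is plain java.nio and can be exercised off-device. A minimal sketch of that pattern (the `GlBuffers` class and its method names are illustrative, not from the article):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.nio.ShortBuffer;

// OpenGL ES reads vertex data from direct, native-order NIO buffers, so
// plain float[]/short[] arrays must be copied in and rewound before use.
public class GlBuffers {

    // 4 bytes per float: allocate a direct buffer, fill it, rewind to 0
    public static FloatBuffer asFloatBuffer(float[] data) {
        ByteBuffer bb = ByteBuffer.allocateDirect(data.length * 4);
        bb.order(ByteOrder.nativeOrder()); // GL expects the platform's byte order
        FloatBuffer fb = bb.asFloatBuffer();
        fb.put(data);
        fb.position(0); // rewind so GL reads from the start
        return fb;
    }

    // 2 bytes per short: same pattern for the element (draw-order) buffer
    public static ShortBuffer asShortBuffer(short[] data) {
        ByteBuffer bb = ByteBuffer.allocateDirect(data.length * 2);
        bb.order(ByteOrder.nativeOrder());
        ShortBuffer sb = bb.asShortBuffer();
        sb.put(data);
        sb.position(0);
        return sb;
    }
}
```

Forgetting the `position(0)` rewind is a classic bug here: `glVertexAttribPointer` would then read past the data that was just written.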
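`transformTextureCoordinates` above depends on `android.opengl.Matrix.multiplyMV`; off-device the same math can be written by hand. A sketch, where `multiplyMV` is a hand-rolled stand-in for the Android API (assumed column-major, the OpenGL convention that `SurfaceTexture.getTransformMatrix` also uses), and the class name is hypothetical:

```java
// Applies a SurfaceTexture-style 4x4 transform to 2D texture coordinates.
public class TexCoordTransform {

    // Stand-in for android.opengl.Matrix.multiplyMV: res = m * v,
    // with m a 16-element column-major 4x4 matrix.
    static void multiplyMV(float[] res, float[] m, float[] v) {
        for (int i = 0; i < 4; i++) {
            res[i] = m[i] * v[0] + m[4 + i] * v[1] + m[8 + i] * v[2] + m[12 + i] * v[3];
        }
    }

    // Same logic as the article's transformTextureCoordinates: each (s, t)
    // pair is lifted to (s, t, 0, 1), multiplied by the transform, and the
    // x/y of the result is kept.
    public static float[] transform(float[] coords, float[] matrix) {
        float[] result = new float[coords.length];
        float[] vt = new float[4];
        for (int i = 0; i < coords.length; i += 2) {
            float[] v = { coords[i], coords[i + 1], 0f, 1f };
            multiplyMV(vt, matrix, v);
            result[i] = vt[0];
            result[i + 1] = vt[1];
        }
        return result;
    }
}
```

A typical transform reported by `SurfaceTexture` is a vertical flip (scale t by -1, translate by 1), which maps (s, 0) to (s, 1) and vice versa.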
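The `calculateFps` helper above mixes timing and state, which makes it hard to check. A clock-injected variant (a hypothetical refactor with the same averaging logic, not the article's code) lets the once-per-second window be verified with synthetic timestamps:

```java
// Testable variant of the article's calculateFps: the caller supplies
// timestamps, so no real clock is needed to verify the averaging.
public class FpsCounter {
    private long previousTime;
    private boolean started = false;
    private int frameCount = 0;
    private float fps;

    // Call once per rendered frame; fps is recomputed once >= 1000 ms
    // has elapsed since the window started, then the window restarts.
    public float frame(long nowMillis) {
        frameCount++;
        if (!started) {
            started = true;
            previousTime = nowMillis;
        }
        long interval = nowMillis - previousTime;
        if (interval >= 1000) {
            started = false; // restart the measurement window
            fps = frameCount / (interval / 1000f);
            frameCount = 0;
        }
        return fps;
    }
}
```

Like the original, this counts the frame that opens the window, so eleven frames spaced 100 ms apart report 11 fps rather than 10; the original's behavior is kept on purpose.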