YUV to RGB conversion by fragment shader

I don’t know if you solved your problem, but I used your code and solved it this way. public class MyRenderer implements Renderer{ public static final int recWidth = Costanti.recWidth; public static final int recHeight = Costanti.recHeight; private static final int U_INDEX = recWidth*recHeight; private static final int V_INDEX = recWidth*recHeight*5/4; private static final int … Read more
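
For context, U_INDEX and V_INDEX mark where the chroma planes start in a planar YUV420 buffer: Y occupies the first recWidth*recHeight bytes, and each chroma plane is a quarter of that, putting V at recWidth*recHeight*5/4. Once the planes are uploaded as single-channel textures, the conversion itself is a small fragment shader. A minimal sketch, assuming three GL_LUMINANCE textures and BT.601 coefficients (the uniform names are illustrative):

```glsl
// Planar YUV420 (I420) to RGB fragment shader sketch.
precision mediump float;
uniform sampler2D yTexture;
uniform sampler2D uTexture;
uniform sampler2D vTexture;
varying vec2 vTexCoord;

void main(void)
{
    float y = texture2D(yTexture, vTexCoord).r;
    float u = texture2D(uTexture, vTexCoord).r - 0.5; // chroma is biased by 0.5
    float v = texture2D(vTexture, vTexCoord).r - 0.5;

    // BT.601 conversion coefficients (assumed here)
    gl_FragColor = vec4(y + 1.402 * v,
                        y - 0.344 * u - 0.714 * v,
                        y + 1.772 * u,
                        1.0);
}
```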

What is state-of-the-art for text rendering in OpenGL as of version 4.1? [closed]

Rendering outlines, unless you render only a dozen characters total, remains a “no go” due to the number of vertices needed per character to approximate the curvature. Though there have been approaches to evaluate Bézier curves in the pixel shader instead, these suffer from not being easily antialiased, which is trivial using a distance-map-textured quad, and … Read more
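
The distance-map technique mentioned above needs only a few shader lines to antialias. A minimal sketch in GLSL 4.10, assuming a single-channel distance texture in which the value 0.5 marks the glyph edge (names are illustrative):

```glsl
#version 410 core
// Distance-field glyph sampling with analytic antialiasing (sketch).
uniform sampler2D distanceMap; // assumed: distance field, 0.5 = glyph edge
in vec2 vTexCoord;
out vec4 fragColor;

void main()
{
    float dist  = texture(distanceMap, vTexCoord).r;
    float width = fwidth(dist); // screen-space footprint of one step in the field
    float alpha = smoothstep(0.5 - width, 0.5 + width, dist);
    fragColor   = vec4(vec3(1.0), alpha);
}
```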

Replicating MeshLambertMaterial Using ShaderMaterial ignores textures

three.js was designed to be easy to use, not easy to modify. This may change in the future… You need to set the material.defines like so: var defines = {}; defines[ "USE_MAP" ] = "";. Then specify defines in the material constructor. var material = new THREE.ShaderMaterial({ name: "TerrainShader", defines : defines, uniforms : shaderUniforms, … Read more
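
For context, three.js prepends each key of material.defines as a #define at the top of the compiled shader source, so the flag simply toggles preprocessor blocks. An illustrative GLSL excerpt (not the actual three.js shader chunk):

```glsl
// three.js emits "#define USE_MAP" from material.defines (illustrative).
#define USE_MAP

#ifdef USE_MAP
uniform sampler2D map;
varying vec2 vUv;
#endif

void main()
{
    vec4 diffuseColor = vec4(1.0);
#ifdef USE_MAP
    // Without the define, this texture path is compiled out entirely,
    // which is why the map appears to be "ignored".
    diffuseColor *= texture2D(map, vUv);
#endif
    gl_FragColor = diffuseColor;
}
```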

In a fragment shader, why can’t I use a flat input integer to index a uniform array of sampler2D?

[…] but can’t use it to index an array of samplers as expected because the compiler sees it as “non-constant” […] In GLSL up to version 3.30, and likewise in GLSL ES up to version 3.00, the index of an array of texture samplers has to be a constant expression: GLSL 3.30 Specification – 4.1.7 Samplers (page 21) … Read more
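
A portable workaround under those versions is to branch to constant indices. A sketch, assuming a four-element array (names are illustrative):

```glsl
#version 300 es
// Dynamic sampler indexing is not a constant expression here,
// so branch to constant indices instead (sketch).
precision mediump float;
uniform sampler2D uTextures[4];
flat in int vTexIndex; // integer inputs must be flat-qualified
in vec2 vTexCoord;
out vec4 fragColor;

void main()
{
    if      (vTexIndex == 0) fragColor = texture(uTextures[0], vTexCoord);
    else if (vTexIndex == 1) fragColor = texture(uTextures[1], vTexCoord);
    else if (vTexIndex == 2) fragColor = texture(uTextures[2], vTexCoord);
    else                     fragColor = texture(uTextures[3], vTexCoord);
}
```

If the textures share a size and format, packing them into a single sampler2DArray sidesteps the restriction entirely, since the layer coordinate may be dynamic.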

How to render Android’s YUV-NV21 camera image on the background in libgdx with OpenGLES 2.0 in real-time?

The short answer is to load the camera image channels (Y, UV) into textures and draw these textures onto a Mesh using a custom fragment shader that does the color space conversion for us. Since this shader runs on the GPU, it will be much faster than the CPU and certainly much, much faster … Read more
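
A sketch of such a conversion shader for NV21, assuming the Y plane is bound as a full-resolution GL_LUMINANCE texture and the interleaved VU plane as a half-resolution GL_LUMINANCE_ALPHA texture (names, channel mapping and BT.601 coefficients are assumptions):

```glsl
// NV21 (semi-planar) to RGB fragment shader sketch.
precision mediump float;
uniform sampler2D yTexture;   // full-resolution luma
uniform sampler2D uvTexture;  // half-resolution interleaved V,U plane
varying vec2 vTexCoord;

void main(void)
{
    float y = texture2D(yTexture, vTexCoord).r;
    // NV21 stores V before U, so V lands in the luminance (.r) channel
    // and U in the alpha (.a) channel of a LUMINANCE_ALPHA texture.
    float v = texture2D(uvTexture, vTexCoord).r - 0.5;
    float u = texture2D(uvTexture, vTexCoord).a - 0.5;

    gl_FragColor = vec4(y + 1.402 * v,
                        y - 0.344 * u - 0.714 * v,
                        y + 1.772 * u,
                        1.0);
}
```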

How to calculate Tangent and Binormal?

The relevant input data for your problem are the texture coordinates. Tangent and Binormal are vectors locally parallel to the object’s surface, and in the case of normal mapping they describe the local orientation of the normal texture. So you have to calculate the direction (in the model’s space) in which the texturing vectors point. … Read more
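
Concretely, for each triangle you solve the two edge equations e1 = du1·T + dv1·B and e2 = du2·T + dv2·B for the tangent T and binormal B, where e1, e2 are the position edges and (du, dv) the matching texture-coordinate deltas. A sketch of that solve (function and variable names are illustrative):

```glsl
// Per-triangle tangent/binormal from positions and UVs (sketch).
// Solves  e1 = dUV1.x*T + dUV1.y*B  and  e2 = dUV2.x*T + dUV2.y*B.
void computeTangentBasis(vec3 p0, vec3 p1, vec3 p2,
                         vec2 uv0, vec2 uv1, vec2 uv2,
                         out vec3 T, out vec3 B)
{
    vec3 e1   = p1 - p0;   // triangle edges in model space
    vec3 e2   = p2 - p0;
    vec2 dUV1 = uv1 - uv0; // matching edges in texture space
    vec2 dUV2 = uv2 - uv0;

    float r = 1.0 / (dUV1.x * dUV2.y - dUV2.x * dUV1.y); // inverse determinant
    T = normalize((e1 * dUV2.y - e2 * dUV1.y) * r);      // follows the u direction
    B = normalize((e2 * dUV1.x - e1 * dUV2.x) * r);      // follows the v direction
}
```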

Getting the true z value from the depth buffer

From http://web.archive.org/web/20130416194336/http://olivers.posterous.com/linear-depth-in-glsl-for-real:

```glsl
// == Post-process frag shader ===========================================
uniform sampler2D depthBuffTex;
uniform float zNear;
uniform float zFar;
varying vec2 vTexCoord;

void main(void)
{
    float z_b = texture2D(depthBuffTex, vTexCoord).x;
    float z_n = 2.0 * z_b - 1.0;
    float z_e = 2.0 * zNear * zFar / (zFar + zNear - z_n * (zFar - zNear));
}
```

… Read more
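
The last line is the actual recovery step. It follows from inverting the depth term of the standard OpenGL perspective projection: with z_b the raw buffer value in [0, 1],

```latex
z_n = 2 z_b - 1, \qquad
z_n = \frac{z_{far} + z_{near}}{z_{far} - z_{near}}
    - \frac{2\, z_{far}\, z_{near}}{(z_{far} - z_{near})\, z_e}
\;\Longrightarrow\;
z_e = \frac{2\, z_{near}\, z_{far}}{z_{far} + z_{near} - z_n\,(z_{far} - z_{near})}
```

where z_e is the positive eye-space distance produced by the shader.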

How to recover view space position given view space depth value and ndc xy

3 solutions to recover the view space position in perspective projection: The projection matrix describes the mapping from the 3D points of a scene to the 2D points of the viewport. It transforms from view (eye) space to clip space, and the coordinates in clip space are transformed to normalized device coordinates (NDC) by dividing … Read more
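
The most direct of those solutions simply runs the pipeline backwards: combine the NDC xy with the NDC depth, apply the inverse projection matrix, and undo the perspective divide. A sketch (the uniform name is an assumption):

```glsl
// Recover the view-space position from full NDC coordinates (sketch).
uniform mat4 inverseProjection; // assumed: inverse of the projection matrix

vec3 viewPosFromNDC(vec3 ndc)   // ndc.xyz each in [-1, 1]
{
    vec4 p = inverseProjection * vec4(ndc, 1.0);
    return p.xyz / p.w;         // undo the perspective divide
}
```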