How to set OpenGL coordinates? - visual-c++

With the following code I get my triangle in the top right quadrant, which tells me that (0, 0) is in the center of the window. What should I do to move the origin to a corner of the window, i.e. bottom left?
#include <GL/glut.h>

void displayCube()
{
    glClear(GL_COLOR_BUFFER_BIT);
    glColor3f(1.0, 1.0, 1.0);
    glBegin(GL_TRIANGLES);
    glVertex3f(0, 0, 0);
    glVertex3f(0.5, 0, 0);
    glVertex3f(0.25, 0.25, 0);
    glEnd();
    glFlush();
}

int main(int argc, char *argv[]) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_SINGLE);
    glutInitWindowSize(500, 500);
    glutInitWindowPosition(0, 0);
    glutCreateWindow("Cube");
    glutDisplayFunc(displayCube);
    glutMainLoop();
    return 0;
}

OpenGL uses a set of matrix transformations to move from the original model space to screen/window space.
In your example the projection is the default identity matrix, so you are 'moving' inside a box from -1 to 1 in each direction.
The point (0, 0, 0) is in the centre; (-1, 0, 0) is on the left edge, (1, 0, 0) on the right, and (0, 1, 0) at the top.
Try to figure out the rest :)
http://www.songho.ca/opengl/gl_transform.html
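If you want (0, 0) in the bottom-left corner instead, you can replace the default projection with an orthographic one that maps 0..1 across the window. A minimal sketch against the code above (setting the projection every frame is wasteful but keeps the example short; a reshape callback is the usual home for it):

void displayCube()
{
    // Map x and y to [0, 1], with (0, 0) at the bottom-left corner
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0.0, 1.0, 0.0, 1.0, -1.0, 1.0); // left, right, bottom, top, near, far
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    glClear(GL_COLOR_BUFFER_BIT);
    glColor3f(1.0, 1.0, 1.0);
    glBegin(GL_TRIANGLES);
    glVertex3f(0, 0, 0);       // now the bottom-left corner of the window
    glVertex3f(0.5, 0, 0);
    glVertex3f(0.25, 0.25, 0);
    glEnd();
    glFlush();
}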

Related

How to make elevated parts of a steep plane seem darker than the lower surface?

I made a plane in THREE.js using Mesh, PlaneGeometry and ShaderMaterial. It's a very simple/basic form.
I applied a simple formula to make the plane steeper. Now I'm trying to make the lower surface darker than the higher surface. Here is what I tried.
Vertex shader:
varying vec3 test;

void main(void) {
    float amp = 2.5;
    float z = amp * sin(position.x * 0.2) * cos(position.y * 0.5); // this makes the surface steeper
    test = vec3(1, 1, -z); // this goes to the fragment shader
    //test = vec3(698.0, 400.0, -z); I tried this; the first coordinates here are to normalize the vector
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position.x, position.y, z, 1.0);
}
Fragment shader:
precision mediump float;
varying vec3 test;

void main(void) {
    vec3 st = gl_FragCoord.xyz / test;
    gl_FragColor = vec4(st.xyz, 1.0);
}
Result:
This result is not desirable, since the contrast between top and bottom is too aggressive and I'd like the lower surface to be less white. What do I have to change to accomplish this?
If you want to create a brightness based on the height of the waves, then you'll need to use only the test.z value, since test.xy aren't really doing anything. The problem is that brightness needs a value in [0, 1], and due to the amplitude multiplication you're getting a value in the [-2.5, 2.5] range.
precision mediump float;
varying vec3 test;

void main(void) {
    float amp = 2.5;
    // Extract brightness from test.z
    float brightness = test.z;
    // Convert brightness from [-2.5, 2.5] to [0.0, 1.0] range
    brightness = (brightness / amp) * 0.5 + 0.5;
    vec3 yellow = vec3(1.0, 1.0, 0.0);
    // Multiply final color by brightness (0 brightness = black)
    vec3 st = yellow * brightness;
    gl_FragColor = vec4(st.xyz, 1.0);
}
That should give you a smoother transition from full yellow to black.
As an aside, to help me visualize the values I'm getting from GLSL functions, I like to use the Graphtoy tool. I recommend you give it a shot to help you write shaders!

How to change the bar color of FL_HOR_NICE_SLIDER in fltk?

I want to change the bar color of FL_HOR_NICE_SLIDER to green, so I tried the following code:
#include <FL/Fl.H>
#include <FL/Fl_Window.H>
#include <FL/Fl_Slider.H>

int main(int argc, char **argv) {
    Fl::scheme("GTK+");
    Fl::background(50, 50, 50);
    Fl::background2(90, 90, 90);
    Fl::foreground(255, 255, 255);
    Fl_Window *window = new Fl_Window(400, 60);
    Fl_Slider *slider = new Fl_Slider(20, 20, 300, 20);
    slider->type(FL_HOR_NICE_SLIDER);
    slider->box(FL_FLAT_BOX);
    slider->color(0x00DD0000);
    slider->color2(0xDDDDDD00);
    window->end();
    window->show(argc, argv);
    return Fl::run();
}
The result of this code is shown below. The bar color remains white, but the "area" color changes to green, which is not my desired result.
What I would like to achieve is the following result:
(I use Fl::foreground(0, 255, 0); and delete slider->color(0x00DD0000); to get the result above, but I don't want to change the foreground color because this will change other colors as well, for example the default font color).
How can I achieve the expected result?
After checking the FLTK source code, I was disappointed to discover that the bar color can only be changed by changing the foreground color.
The source code that draws the bar of FL_HOR_NICE_SLIDER is shown below:
void Fl_Slider::draw_bg(int X, int Y, int W, int H) {
    fl_push_clip(X, Y, W, H);
    draw_box();
    fl_pop_clip();
    Fl_Color black = active_r() ? FL_FOREGROUND_COLOR : FL_INACTIVE_COLOR;
    if (type() == FL_VERT_NICE_SLIDER) {
        draw_box(FL_THIN_DOWN_BOX, X+W/2-2, Y, 4, H, black);
    } else if (type() == FL_HOR_NICE_SLIDER) {
        draw_box(FL_THIN_DOWN_BOX, X, Y+H/2-2, W, 4, black);
    }
}
Note that the color is either FL_FOREGROUND_COLOR or FL_INACTIVE_COLOR, depending on whether the slider is active or not.
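Given that, one possible workaround (an untested sketch; GreenBarSlider is a made-up name, not an FLTK class) is to subclass Fl_Slider and swap the FL_FOREGROUND_COLOR map entry only while this one widget paints itself, so the rest of the UI keeps the normal foreground:

#include <FL/Fl.H>
#include <FL/Fl_Slider.H>

// Sketch: draw the bar green by overriding FL_FOREGROUND_COLOR
// just for the duration of this widget's draw().
class GreenBarSlider : public Fl_Slider {
public:
    GreenBarSlider(int x, int y, int w, int h) : Fl_Slider(x, y, w, h) {}

protected:
    void draw() {
        uchar r, g, b;
        Fl::get_color(FL_FOREGROUND_COLOR, r, g, b);   // remember the real foreground
        Fl::set_color(FL_FOREGROUND_COLOR, 0, 255, 0); // green, read by draw_bg() for the bar
        Fl_Slider::draw();
        Fl::set_color(FL_FOREGROUND_COLOR, r, g, b);   // restore for every other widget
    }
};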

Why isn't my OpenGL "hello world" rendering?

I've been hitting my head against the wall for two days on this. I'm trying to distill the simplest possible OpenGL Core ~2.0-3.2 drawing sequence so that I can build code off of it and really understand the API. The problem I'm running into is that the tutorials never seem to come with a helpful tag for what version of OpenGL they're using, and it's purely by luck I happened across documentation on how to even request a particular version from my context.
I'm certain that I have 3.2 core enabled now, as immediate mode drawing throws errors (that's a good thing! I want to leave immediate mode behind!), and I've tried to strip out anything fancy like coordinate transforms or triangle winding that might screw up my display. The problem is, I can't get anything to appear on-screen.
In prior iterations of this program, I did manage to get a white triangle on-screen sometimes, using random coordinates, but it seems to me like the vertices aren't getting set properly, and strange bit combinations produce strange results. Sign did not matter in where the triangles appeared - therefore my theory is that either the vertex information is not being transferred properly to the vertex shader, or the shader is mangling it. The problem is, I'm checking all the results and logs I can find, and the shader compiles and links beautifully.
I will provide links and code below, but in addition to just getting the triangle on-screen I'm wondering, can I get the shader program to spit text and/or diagnostic values out to its shaderInfoLog? That would simplify the debugging process immensely.
The various tutorials I'm consulting are...
http://arcsynthesis.org/gltut/Basics/Tutorial%2001.html
http://en.wikibooks.org/wiki/OpenGL_Programming/Modern_OpenGL_Introduction
https://en.wikipedia.org/wiki/Vertex_Buffer_Object
http://www.opengl.org/wiki/Tutorial2:_VAOs,_VBOs,_Vertex_and_Fragment_Shaders_(C_/_SDL)
http://antongerdelan.net/opengl/hellotriangle.html
http://lwjgl.org/wiki/index.php?title=The_Quad_with_DrawArrays
http://lwjgl.org/wiki/index.php?title=Using_Vertex_Buffer_Objects_(VBO)
http://www.opengl.org/wiki/Vertex_Rendering
http://www.opengl.org/wiki/Layout_Qualifier_(GLSL) (not present in provided code, but something I tried was #version 420 with layout qualifiers 0 (in_Position) and 1 (in_Color))
Code (LWJGL + Groovy)
package com.thoughtcomplex.gwdg.core
import org.lwjgl.input.Keyboard
import org.lwjgl.opengl.ContextAttribs
import org.lwjgl.opengl.Display
import org.lwjgl.opengl.GL11
import org.lwjgl.opengl.GL15
import org.lwjgl.opengl.GL20
import org.lwjgl.opengl.GL30
import org.lwjgl.opengl.PixelFormat
import org.lwjgl.util.glu.GLU
import java.nio.ByteBuffer
import java.nio.FloatBuffer
/**
* Created by Falkreon on 5/21/2014.
*/
class GwDG {
    static final String vertexShader = """
        #version 150
        in vec2 in_Position;
        in vec3 in_Color;
        smooth out vec3 ex_Color;
        void main(void) {
            gl_Position = vec4(in_Position, 0.0, 1.0);
            ex_Color = in_Color;
        }
    """;

    static final String fragmentShader = """
        #version 150
        smooth in vec3 ex_Color;
        out vec4 fragColor;
        void main(void) {
            //fragColor = vec4(ex_Color, 1.0);
            fragColor = vec4(1.0, 1.0, 1.0, 1.0);
        }
    """;

    static int vaoHandle = -1;
    static int vboHandle = -1;
    protected static int colorHandle = -1;
    static int vertexShaderHandle = -1;
    static int fragmentShaderHandle = -1;
    static int shaderProgram = -1;

    protected static FloatBuffer vboBuffer = ByteBuffer.allocateDirect(6*4).asFloatBuffer();
    protected static FloatBuffer colorBuffer = ByteBuffer.allocateDirect(9*4).asFloatBuffer();

    public static void main(String[] args) {
        //Quick and dirty hack to get something on the screen; this *works* for immediate mode drawing
        System.setProperty("org.lwjgl.librarypath", "C:\\Users\\Falkreon\\IdeaProjects\\GwDG\\native\\windows");

        Display.setTitle("Test");
        ContextAttribs attribs = new ContextAttribs();
        attribs.profileCompatibility = false;
        attribs.profileCore = true;
        attribs.majorVersion = 3;
        attribs.minorVersion = 2;
        Display.create( new PixelFormat().withDepthBits(24).withSamples(4).withSRGB(true), attribs );

        //Kill any possible winding error
        GL11.glDisable(GL11.GL_CULL_FACE);

        vaoHandle = GL30.glGenVertexArrays();
        GL30.glBindVertexArray(vaoHandle);
        reportErrors("VERTEX_ARRAY");

        vboHandle = GL15.glGenBuffers();
        colorHandle = GL15.glGenBuffers();
        vertexShaderHandle = GL20.glCreateShader(GL20.GL_VERTEX_SHADER);
        fragmentShaderHandle = GL20.glCreateShader(GL20.GL_FRAGMENT_SHADER);
        reportErrors("CREATE_SHADER");

        GL20.glShaderSource( vertexShaderHandle, vertexShader );
        GL20.glShaderSource( fragmentShaderHandle, fragmentShader );
        GL20.glCompileShader( vertexShaderHandle );
        String vertexResult = GL20.glGetShaderInfoLog( vertexShaderHandle, 700 );
        if (!vertexResult.isEmpty()) System.out.println("Vertex result: "+vertexResult);
        GL20.glCompileShader( fragmentShaderHandle );
        String fragmentResult = GL20.glGetShaderInfoLog( fragmentShaderHandle, 700 );
        if (!fragmentResult.isEmpty()) System.out.println("Fragment result: "+fragmentResult);

        shaderProgram = GL20.glCreateProgram();
        reportErrors("CREATE_PROGRAM");
        GL20.glAttachShader( shaderProgram, vertexShaderHandle );
        GL20.glAttachShader( shaderProgram, fragmentShaderHandle );
        GL20.glLinkProgram(shaderProgram);
        int result = GL20.glGetProgrami( shaderProgram, GL20.GL_LINK_STATUS );
        if (result!=1) System.out.println("LINK STATUS: "+result);
        reportErrors("SHADER_LINK");

        //Attribs
        int vertexParamID = GL20.glGetAttribLocation(shaderProgram, "in_Position");
        int colorParamID = GL20.glGetAttribLocation(shaderProgram, "in_Color");

        while (!Keyboard.isKeyDown(Keyboard.KEY_ESCAPE)) {
            //Intentional flicker so I can see if something I did freezes or lags the program
            GL11.glClearColor(Math.random()/6 as Float, Math.random()/8 as Float, (Math.random()/8)+0.4 as Float, 1.0f);
            GL11.glClear(GL11.GL_COLOR_BUFFER_BIT | GL11.GL_DEPTH_BUFFER_BIT );

            float[] coords = [
                 0.0f,  0.8f,
                -0.8f, -0.8f,
                 0.8f, -0.8f
            ];
            vboBuffer.clear();
            coords.each {
                vboBuffer.put it;
            }
            vboBuffer.flip();

            float[] colors = [
                1.0f, 0.0f, 0.0f,
                0.0f, 1.0f, 0.0f,
                0.0f, 0.0f, 1.0f
            ];
            colorBuffer.clear();
            colors.each {
                colorBuffer.put it;
            }
            colorBuffer.flip();

            //System.out.println(dump(vboBuffer));
            reportErrors("SETUP_TRIANGLE_DATA");

            GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, vboHandle);
            GL15.glBufferData(GL15.GL_ARRAY_BUFFER, vboBuffer, GL15.GL_STATIC_DRAW);
            reportErrors("BIND_VBO_AND_FILL_DATA");
            GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, colorHandle);
            GL15.glBufferData(GL15.GL_ARRAY_BUFFER, colorBuffer, GL15.GL_STATIC_DRAW);
            reportErrors("BIND_COLOR_BUFFER_AND_FILL_DATA");

            GL20.glEnableVertexAttribArray( vertexParamID );
            GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, vboHandle);
            GL20.glVertexAttribPointer(
                vertexParamID, 2, GL11.GL_FLOAT, false, 0, 0
            );
            GL20.glEnableVertexAttribArray( colorParamID );
            GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, colorHandle);
            GL20.glVertexAttribPointer(
                colorParamID, 3, GL11.GL_FLOAT, false, 0, 0
            );
            reportErrors("VERTEX_ATTRIB_POINTERS");

            GL20.glUseProgram( shaderProgram );
            GL11.glDrawArrays( GL11.GL_TRIANGLES, 0, 3 );
            reportErrors("POST_RENDER");

            Display.update(true);
            Thread.sleep(12);
            Keyboard.poll();
        }
        Display.destroy();
    }

    private static String dump(FloatBuffer f) {
        String result = "[ ";
        f.position(0);
        //f.rewind();
        for(it in 0..<f.limit()) {
            result += f.get(it);
            if (it!=f.limit()-1) result += ", ";
        }
        result += " ]";
        f.position(0);
        result;
    }

    private static void reportErrors(String prefix) {
        int err = GL11.glGetError();
        if (err!=0) System.out.println("["+prefix + "]: "+GLU.gluErrorString(err)+" ("+err+")");
    }
}
Not that it matters, but the card is an ATI Radeon HD 8550G (part of an A8 APU) with support for GL4.
I'll update with more information at request, I just don't know what else might be helpful in diagnosing this.
Edit: I've updated the code above to reflect changes suggested by Reto Koradi. I've also got a variant of the code running with an alternate vertex declaration:
float[] coords = [
     0.0f,  0.8f,
    -0.8f, -0.8f,
    Math.random(), Math.random(),
    //0.8f, -0.8f,
];
This does actually produce something rasterized on the screen, but it is not at all what I would expect. Rather than simply relocating the bottom-right (top-right?) point, it flips between nothing, completely white, and the following two shapes:
If I replace the second or third vertex, this happens. If I replace the first vertex, nothing appears on-screen. So, to check my assumptions about which vertex is actually appearing in the center of the window, I tried the following:
static final String vertexShader = """
    #version 150
    in vec2 in_Position;
    in vec3 in_Color;
    smooth out vec3 ex_Color;
    void main(void) {
        gl_Position = vec4(in_Position, 0.0, 1.0);
        ex_Color = vec3(1.0, 1.0, 1.0);
        if (gl_VertexID==0) ex_Color = vec3(1.0, 0.0, 0.0);
        if (gl_VertexID==1) ex_Color = vec3(0.0, 1.0, 0.0);
        if (gl_VertexID==2) ex_Color = vec3(0.0, 0.0, 1.0);
        //ex_Color = in_Color;
    }
""";

static final String fragmentShader = """
    #version 150
    smooth in vec3 ex_Color;
    out vec4 fragColor;
    void main(void) {
        fragColor = vec4(ex_Color, 1.0);
        //fragColor = vec4(1.0, 1.0, 1.0, 1.0);
    }
""";
Simple, right? The vertex in the middle should probably be the first, "red" vertex, since it's the non-optional vertex without which I can't seem to draw anything on the screen. That is not actually the case. The half-screen blocks are always red as expected, but the left-facing triangle shape is always the color of whatever vertex I replace - replacing the second vertex makes it green, replacing the third vertex makes it blue. It definitely seems like both "-0.8, -0.8" and "0.8, -0.8" are so far off-screen that the triangle sections visible are effectively an infinitely thin line. But I don't think this is due to a transform - this behaves more like an alignment problem, with its arbitrary threshold around 0.9 that sends coordinates shooting off into the farlands. Like perhaps the significand of a value in the vertex buffer is winding up in the exponent of in_Position values.
Just to keep drilling down, I increased the amount of hardcoded GLSL to ignore the buffers completely -
static final String vertexShader = """
    #version 150
    in vec2 in_Position;
    in vec3 in_Color;
    smooth out vec3 ex_Color;
    void main(void) {
        gl_Position = vec4(in_Position, 0.0, 1.0);
        ex_Color = vec3(1.0, 1.0, 1.0);
        if (gl_VertexID==0) {
            ex_Color = vec3(1.0, 0.0, 0.0);
            gl_Position = vec4(0.0, 0.8, 0.0, 1.0);
        }
        if (gl_VertexID==1) {
            ex_Color = vec3(0.0, 1.0, 0.0);
            gl_Position = vec4(-0.8, -0.8, 0.0, 1.0);
        }
        if (gl_VertexID==2) {
            ex_Color = vec3(0.0, 0.0, 1.0);
            gl_Position = vec4(0.8, -0.8, 0.0, 1.0);
        }
        //ex_Color = in_Color;
    }
""";
This produces the desired result, a nice big triangle with a different color on each vertex. Obviously I want to get this same triangle out of the vertex buffers but it's a really good start - with two vertices I can tell at least what direction the final vertex is shooting off in. In the case of the first vertex, it's definitely down.
I also figured out how to enable debug mode in the profile, and it's spitting color buffer errors at me. Good! That's a start. Now why isn't it throwing massive amounts of VBO errors?
Your code is not compatible with the core profile, so your reportErrors() should actually fire. In the core profile you have to use vertex array objects (VAOs) for your vertex setup: generate one with glGenVertexArrays() and bind it with glBindVertexArray() before setting up your vertex state.
The use of gl_FragColor in the fragment shader is also deprecated. You need to declare your own out variable in the core profile, for example:
out vec4 FragColor;
...
FragColor = ...
The answer finally came in a flash of inspiration from
http://lwjgl.org/forum/index.php?topic=5171.0
That post is itself fairly wrong, so let me explain what finally went right. My dev environment is Java on an Intel chip. Java's ByteBuffers default to big-endian byte order regardless of the platform, and a FloatBuffer view created from one inherits that order. I was mystified that the exponent of my floats seemed to be winding up in the significand, but seeing that post it finally struck me: the easiest way that happens is if the endianness is flipped! The likelihood that OpenGL runs in anything besides native byte order is pretty much zero. Either none of the LWJGL resources I consulted mentioned this quirk, or I missed them.
The correct initializer for the ByteBuffers in this program is:
protected static FloatBuffer vboBuffer = ByteBuffer.allocateDirect(6*4).order(ByteOrder.nativeOrder()).asFloatBuffer();
protected static FloatBuffer colorBuffer = ByteBuffer.allocateDirect(9*4).order(ByteOrder.nativeOrder()).asFloatBuffer();
The important part is the ByteOrder.nativeOrder() call.
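To see why flipped byte order produces exactly the "significand winding up in the exponent" symptom described earlier, here is a small standalone C++ sketch (purely illustrative, not part of the original program) that reinterprets one float's bytes in the opposite order:

#include <cstdint>
#include <cstdio>
#include <cstring>

int main() {
    float f = 0.8f; // one of the triangle coordinates
    uint32_t bits;
    std::memcpy(&bits, &f, sizeof bits);

    // Swap the four bytes, as happens when a big-endian buffer
    // is consumed by a little-endian GPU/driver.
    uint32_t swapped = ((bits & 0x000000FFu) << 24) |
                       ((bits & 0x0000FF00u) << 8)  |
                       ((bits & 0x00FF0000u) >> 8)  |
                       ((bits & 0xFF000000u) >> 24);

    float g;
    std::memcpy(&g, &swapped, sizeof g);
    // 0.8f (0x3F4CCCCD) becomes 0xCDCC4C3F, roughly -4.3e8: the
    // fraction bytes land in the sign/exponent, so the sign and
    // magnitude of the input barely matter, just as observed.
    std::printf("0.8f reread with swapped bytes: %g\n", g);
    return 0;
}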

Is there a faked antialiasing algorithm using the depth buffer?

Lately I implemented the FXAA algorithm in my OpenGL application. I don't completely understand the algorithm yet, but I know that it uses contrast data of the final image to selectively apply blurring. As a post-processing effect that makes sense. But since I use deferred shading in my application, I already have a depth texture of the scene. Using that, it might be much easier and more precise to find edges and apply blur there.
So is there a known antialiasing algorithm that uses the depth texture instead of the final image to find the edges? By faked I mean an antialiasing algorithm that works on a per-pixel basis instead of a per-vertex basis.
After some research I found out that my idea is already widely used in deferred renderers. I decided to post this answer because I came up with my own implementation, which I want to share with the community.
Based on the gradient changes of the depth and the angle changes of the normals, blur is applied to the pixel.
// GLSL fragment shader
#version 330

in vec2 coord;
out vec4 image;

uniform sampler2D image_tex;
uniform sampler2D position_tex;
uniform sampler2D normal_tex;
uniform vec2 frameBufSize;

void depth(out float value, in vec2 offset)
{
    value = texture2D(position_tex, coord + offset / frameBufSize).z / 1000.0f;
}

void normal(out vec3 value, in vec2 offset)
{
    value = texture2D(normal_tex, coord + offset / frameBufSize).xyz;
}

void main()
{
    // depth
    float dc, dn, ds, de, dw;
    depth(dc, vec2( 0,  0));
    depth(dn, vec2( 0, +1));
    depth(ds, vec2( 0, -1));
    depth(de, vec2(+1,  0));
    depth(dw, vec2(-1,  0));

    float dvertical   = abs(dc - ((dn + ds) / 2));
    float dhorizontal = abs(dc - ((de + dw) / 2));
    float damount     = 1000 * (dvertical + dhorizontal);

    // normals
    vec3 nc, nn, ns, ne, nw;
    normal(nc, vec2( 0,  0));
    normal(nn, vec2( 0, +1));
    normal(ns, vec2( 0, -1));
    normal(ne, vec2(+1,  0));
    normal(nw, vec2(-1,  0));

    float nvertical   = dot(vec3(1), abs(nc - ((nn + ns) / 2.0)));
    float nhorizontal = dot(vec3(1), abs(nc - ((ne + nw) / 2.0)));
    float namount     = 50 * (nvertical + nhorizontal);

    // blur
    const int radius = 1;
    vec3 blur = vec3(0);
    int n = 0;
    for(float u = -radius; u <= +radius; ++u)
        for(float v = -radius; v <= +radius; ++v)
        {
            blur += texture2D(image_tex, coord + vec2(u, v) / frameBufSize).rgb;
            n++;
        }
    blur /= n;

    // result
    float amount = mix(damount, namount, 0.5);
    vec3 color = texture2D(image_tex, coord).rgb;
    image = vec4(mix(color, blur, min(amount, 0.75)), 1.0);
}
For comparison, this is the scene without any anti-aliasing.
This is the result with anti-aliasing applied.
You may need to view the images at their full resolution to judge the effect. In my view the result is adequate for the simple implementation. The best thing is that there are nearly no jagged artifacts when the camera moves.

Why is my texture not applying?

I have a cube with a texture. When I changed my data arrays to a VBO (glGen*, etc.), my cube renders with a grey color. But if I use something like glVertexAttribPointer(_positionSlot, 3, GL_FLOAT, GL_FALSE, 0, myBuf); everything is fine. What's the problem? Please help me.
- (void)render:(CADisplayLink*)displayLink {
    glClearColor(0.0f, 1.0f, 0.5f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glEnable(GL_DEPTH_TEST);

    CC3GLMatrix *projection = [CC3GLMatrix matrix];
    float h = 4.0f * self.frame.size.height / self.frame.size.width;
    [projection populateFromFrustumLeft:-2 andRight:2 andBottom:-h/2 andTop:h/2 andNear:2 andFar:4];
    glUniformMatrix4fv(_projectionUniform, 1, 0, projection.glMatrix);

    CC3GLMatrix *modelView = [CC3GLMatrix matrix];
    [modelView populateFromTranslation:CC3VectorMake(sin(CACurrentMediaTime()), 0, -2)];
    _currentRotation += displayLink.duration * 90;
    [modelView rotateBy:CC3VectorMake(_currentRotation, _currentRotation, -1)];
    glUniformMatrix4fv(_modelViewUniform, 1, 0, modelView.glMatrix);

    // 1
    glViewport(0, 0, self.frame.size.width, self.frame.size.height);

    // 2
    glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, texture[0].texID);
    glUniform1i(uniformTexture, 0);
    glVertexAttribPointer(_positionSlot, 3, GL_FLOAT, GL_FALSE, 0, 0);
    glEnableVertexAttribArray(_positionSlot);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, cubeIndexes);
    glVertexAttribPointer(ATTRIB_TEXCOORD, 2, GL_FLOAT, 0, 0, texCoord);
    glEnableVertexAttribArray(ATTRIB_TEXCOORD);
    glDrawElements(GL_TRIANGLES, 36, GL_UNSIGNED_SHORT, (void*)0);
    //glBindTexture(GL_TEXTURE_2D, 0);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);

    [_context presentRenderbuffer:GL_RENDERBUFFER];
}
You need to use client states to apply details to your vertex buffer; see the glEnableClientState and glClientActiveTexture manual entries.
Try changing your code this way.
glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
glEnableClientState(GL_VERTEX_ARRAY);  // NEW! we need to enable the client state.
glClientActiveTexture(GL_TEXTURE0);    // CHANGED! We need to use textures in the client state.
See also How to call glDrawElements with static TexCoords and Dynamic Vertices
What version of OpenGL are you using? I hope OpenGL ES 2.0
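If it is ES 2.0 (the shaders and glVertexAttribPointer calls suggest so), note that glEnableClientState does not exist there. A different fix worth considering for the code above is to stop mixing a bound VBO with a client-side texCoord pointer: once a GL_ARRAY_BUFFER is bound, the last argument of glVertexAttribPointer is treated as a byte offset into that buffer, not a pointer. A sketch under that assumption (texCoordBuffer is a hypothetical handle I introduce here; texCoord is assumed to be a plain array in scope):

// One-time setup: upload the texture coordinates into their own VBO.
GLuint texCoordBuffer;
glGenBuffers(1, &texCoordBuffer);
glBindBuffer(GL_ARRAY_BUFFER, texCoordBuffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(texCoord), texCoord, GL_STATIC_DRAW);

// Per frame, instead of passing the texCoord pointer directly:
glBindBuffer(GL_ARRAY_BUFFER, texCoordBuffer);
glVertexAttribPointer(ATTRIB_TEXCOORD, 2, GL_FLOAT, GL_FALSE, 0, 0); // offset 0 into the bound VBO
glEnableVertexAttribArray(ATTRIB_TEXCOORD);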
