I'm trying to use an element buffer object to render a simple rectangle in Golang using the go-gl bindings for OpenGL. The following is the main code:
package main

import (
    "gogame/shaders"
    "runtime"

    "github.com/go-gl/gl/v4.1-core/gl"
    "github.com/go-gl/glfw/v3.2/glfw"
    log "github.com/sirupsen/logrus"
)

var vertices = []float32{
    -0.5, -0.5, 0.0,
    -0.5, 0.5, 0.0,
    0.5, 0.5, 0.0,
    0.5, -0.5, 0.0,
}

var rectangle = []uint{
    0, 1, 2,
    2, 3, 0,
}

func init() {
    runtime.LockOSThread()
}

func main() {
    window := initGLFW()
    defer glfw.Terminate()
    program := initOpenGL()

    var vertexBuffer uint32
    var elementBuffer uint32
    var vertexArray uint32
    gl.GenBuffers(1, &vertexBuffer)
    gl.GenBuffers(1, &elementBuffer)
    gl.GenVertexArrays(1, &vertexArray)

    gl.BindVertexArray(vertexArray)
    gl.BindBuffer(gl.ARRAY_BUFFER, vertexBuffer)
    gl.BufferData(gl.ARRAY_BUFFER, 4*len(vertices), gl.Ptr(vertices), gl.STATIC_DRAW)
    gl.BindBuffer(gl.ELEMENT_ARRAY_BUFFER, elementBuffer)
    gl.BufferData(gl.ELEMENT_ARRAY_BUFFER, 4*len(rectangle), gl.Ptr(rectangle), gl.STATIC_DRAW)
    gl.VertexAttribPointer(0, 3, gl.FLOAT, false, 0, nil)
    gl.EnableVertexAttribArray(0)
    gl.BindBuffer(gl.ARRAY_BUFFER, 0)
    gl.BindVertexArray(0)

    if err := gl.GetError(); err != 0 {
        log.Error(err)
    }

    for !window.ShouldClose() {
        gl.ClearColor(0.5, 0.5, 0.5, 0.5)
        gl.Clear(gl.COLOR_BUFFER_BIT)
        gl.Clear(gl.DEPTH_BUFFER_BIT)

        gl.UseProgram(program)
        gl.BindVertexArray(vertexArray)
        gl.DrawElements(gl.TRIANGLES, 6, gl.UNSIGNED_INT, gl.PtrOffset(0))
        //gl.DrawArrays(gl.TRIANGLES, 0, 3)

        glfw.PollEvents()
        window.SwapBuffers()

        if err := gl.GetError(); err != 0 {
            log.Error(err)
        }
    }

    glfw.Terminate()
}
Theoretically, this should draw a rectangle. However, this is what I get when running it in wireframe mode:
The code I left out is just to compile very basic shaders and initialize GLFW and OpenGL.
You are probably using a 64-bit operating system. On a 64-bit operating system the data type uint
has a size of 64 bits. See Go language data types or A Tour of Go - Basic types.
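A quick way to confirm this on your machine is to print the sizes with unsafe.Sizeof (a minimal standalone sketch, not part of the question's program):

package main

import (
    "fmt"
    "unsafe"
)

func main() {
    fmt.Println(unsafe.Sizeof(uint(0)))   // 8 on 64-bit platforms (platform-dependent)
    fmt.Println(unsafe.Sizeof(uint32(0))) // always 4
}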
The coordinates of the rectangle are specified like this:
1: -0.5,  0.5     2: 0.5,  0.5
   x-----------x
   |           |
   |           |
   |           |
   |           |
   x-----------x
0: -0.5, -0.5     3: 0.5, -0.5
The indices array is an array of 64-bit integer values:
var rectangle = []uint{
    0, 1, 2,
    2, 3, 0,
}
But it is treated as an array of 32-bit integers when the geometry is drawn, because the index type passed to gl.DrawElements is gl.UNSIGNED_INT:

gl.DrawElements(gl.TRIANGLES, 6, gl.UNSIGNED_INT, gl.PtrOffset(0))
This causes each 64-bit index in the array to be split into 2 indices of 32 bits each, where the 1st value is the original index and the 2nd is 0 (on a little-endian system):

[0, 0, 1, 0, 2, 0, 2, 0, 3, 0, 0, 0]
So the first 2 triangles (the first 6 indices) are
0 - 0 - 1
0 - 2 - 0
In the image you can see these 2 triangles, which collapse into 2 lines, because 2 points of each triangle are identical.
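You can reproduce this reinterpretation directly in Go. A small sketch (it assumes a little-endian 64-bit platform and Go 1.17+ for unsafe.Slice):

package main

import (
    "fmt"
    "unsafe"
)

func main() {
    indices := []uint{0, 1, 2, 2, 3, 0}
    // View the backing memory of the 64-bit slice as 32-bit values,
    // which is effectively how the driver reads it for gl.UNSIGNED_INT.
    asUint32 := unsafe.Slice((*uint32)(unsafe.Pointer(&indices[0])), len(indices)*2)
    fmt.Println(asUint32) // [0 0 1 0 2 0 2 0 3 0 0 0]
}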
Use the data type uint32 to solve the issue:
var rectangle = []uint32{
    0, 1, 2,
    2, 3, 0,
}
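With uint32 indices, the 4*len(rectangle) size already passed to gl.BufferData matches the actual data size, so the rest of the question's code can stay as it is. A sketch of how the upload and the draw call then fit together (reusing the elementBuffer variable from the question):

// 6 indices, 4 bytes each
gl.BindBuffer(gl.ELEMENT_ARRAY_BUFFER, elementBuffer)
gl.BufferData(gl.ELEMENT_ARRAY_BUFFER, 4*len(rectangle), gl.Ptr(rectangle), gl.STATIC_DRAW)

// count derived from the slice instead of the hard-coded 6
gl.DrawElements(gl.TRIANGLES, int32(len(rectangle)), gl.UNSIGNED_INT, gl.PtrOffset(0))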