Generate interleaved Mesh/VertexBuffer


Is there a way to create an interleaved VertexBuffer from a non-interleaved one by script? I’ve searched the forum and github but didn’t find anything related.

Generating a mesh, setting positions and normals, and then reading its vertexBuffer naturally produces a non-interleaved vertex buffer.

This is relevant in the context of using VertexBuffer with TransformFeedback, which only accepts interleaved vertex buffers when more than one attribute is used (i.e. more than just positions).

It would be great if VertexBuffer had a method to interleave its data when called.

Thanks for your help

Hi @movAX13h,

Calling @mvaligursky for this.

There is, but not using the Mesh API directly; that one creates non-interleaved buffers only.
You’d need to create and fill in the vertex buffer manually, something like this:
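A minimal sketch of the idea: interleave the separate position and normal arrays on the CPU, then hand the combined data to a vertex buffer. The `interleave` helper below is hypothetical (not engine API), and the commented-out PlayCanvas calls assume the `pc.VertexFormat` / `pc.VertexBuffer` constructors as found in the engine; check the current API docs for exact signatures.

```javascript
// Interleave per-vertex positions and normals (3 floats each) into one
// tightly packed array: [px, py, pz, nx, ny, nz, px, py, pz, ...]
function interleave(positions, normals) {
    const vertexCount = positions.length / 3;
    const out = new Float32Array(vertexCount * 6); // 6 floats per vertex
    for (let i = 0; i < vertexCount; i++) {
        out.set(positions.subarray(i * 3, i * 3 + 3), i * 6);     // position
        out.set(normals.subarray(i * 3, i * 3 + 3), i * 6 + 3);   // normal
    }
    return out;
}

// Assumed PlayCanvas usage (a VertexFormat with multiple elements
// describes an interleaved layout):
//
// const format = new pc.VertexFormat(device, [
//     { semantic: pc.SEMANTIC_POSITION, components: 3, type: pc.TYPE_FLOAT32 },
//     { semantic: pc.SEMANTIC_NORMAL,   components: 3, type: pc.TYPE_FLOAT32 }
// ]);
// const vb = new pc.VertexBuffer(device, format, vertexCount,
//                                pc.BUFFER_STATIC, interleaved.buffer);
```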

Here’s a more complex example, setting up a mesh as well:

(search for ‘_vertexFormat’ and perhaps ‘SEMANTIC_POSITION’)

But as a more flexible solution (and one compatible with WebGPU in the future), my recommendation would be to consider rendering to texture to process data on the GPU - we use it in a few places in the engine, such as morph target updates and the particle system.


Thank you! I’ll create the VertexBuffer manually then.

Regarding your recommendation: Do you mean rendering to a texture in a feedback loop instead of using transform feedback with attributes?

Transform feedback allows you to transform data stored on the GPU into another GPU buffer, which you can then use as an input to a vertex shader. For example, this is often used to do skinning once and then render the mesh as non-skinned.

You can do the same thing using textures (especially float textures): you store some data in a texture and render it to another texture, often of the same resolution, with a fragment shader doing the processing.

For example we do morphing this way here:

We store vertex offsets for morph targets in individual textures, and when we need to morph the mesh:

  1. use a few of these morph target textures as inputs to a shader
  2. render them into a target texture - the fragment shader blends them based on weights
  3. at a later stage, the final texture is sampled in the vertex shader and the morph offsets are added to the vertex positions.
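The blend in step 2 is just a weighted sum of the per-vertex offsets. Here is an illustrative CPU version of that math (in the engine this runs in a fragment shader over float textures; `blendMorphOffsets` is a hypothetical name, not engine API):

```javascript
// Weighted sum of morph target offsets.
// targets: array of Float32Array, each holding one target's vertex offsets
// weights: one blend weight per target
function blendMorphOffsets(targets, weights) {
    const out = new Float32Array(targets[0].length);
    for (let t = 0; t < targets.length; t++) {
        const w = weights[t];
        if (w === 0) continue; // skip inactive targets
        for (let i = 0; i < out.length; i++) {
            out[i] += targets[t][i] * w;
        }
    }
    return out; // later added to base vertex positions in the vertex shader
}
```

Doing this per-texel in a fragment shader means the cost is independent of mesh topology and the result stays on the GPU, ready for the vertex shader to sample.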