Not sure about getting line rendering from FBX, but you can certainly build a procedural mesh from vertices and then draw it with a different primitive mode, in particular gl.LINES. That will render your vertices as pairs with a line between each pair. You could use an index buffer to link the vertices, which lets you get away with roughly half as many vertices as well.
The second link is not meant to be performant or used in real games (it is just for drawing debug gizmos), while the first link shows how to create a custom model.
Then you can create a pc.MeshInstance with that mesh, create a pc.Model, and attach it to an entity.
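To make the index-buffer point concrete, here is a minimal sketch (my own illustration, not code from the links above) that builds the position and index arrays for a poly-line drawn as a line list. Each interior point is stored once and referenced by two segments instead of being duplicated; in PlayCanvas you would then feed these arrays into a mesh with the lines primitive type.

```javascript
// Build the buffers for a poly-line rendered with gl.LINES.
// With an index buffer, N points give N-1 segments (2*(N-1) indices)
// while only N vertices are stored.
function buildLineMesh(points) {
  // Flat position buffer: x, y, z per point.
  const positions = new Float32Array(points.length * 3);
  points.forEach((p, i) => positions.set(p, i * 3));

  // Index buffer: segment i connects point i to point i + 1.
  const indices = new Uint16Array((points.length - 1) * 2);
  for (let i = 0; i < points.length - 1; i++) {
    indices[i * 2] = i;
    indices[i * 2 + 1] = i + 1;
  }
  return { positions, indices };
}

// 4 points -> 3 segments -> 6 indices, but only 4 stored vertices.
const mesh = buildLineMesh([[0, 0, 0], [1, 1, 0], [2, 0, 0], [3, 1, 0]]);
console.log(mesh.positions.length / 3, mesh.indices.length); // 4 6
```

Without the index buffer you would have to emit both endpoints of every segment, duplicating each interior point.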
I don't know the specifics of your needs, but you could go even further and modify the vertex positions dynamically in the vertex shader; that would make the curves very fast to animate and let you change them dynamically with little effort.
I have an object that is built as a curves object in Houdini.
At the moment I have to convert it to polys, but I would love to be able to export it directly as a spline,
and render it directly as a spline.
The point is that those curves are animated, so exporting an animated mesh of them would be very heavy.
In a perfect world, something like the Alembic file format would be perfect.
With .abc you can store:
- per-point/vertex animation in the object itself
- per-point attributes like color/transparency
and connect those attributes to a shader.
I know FBX can store per-point attributes,
but I am not sure it can store per-vertex animation. UE4 and Unity, for example, don't offer this feature.
It's more of a VFX workflow than a game workflow, I guess.
A good curve workflow allows you to keep things very light by avoiding the poly-convert step.
If you have any more suggestions, I would be happy to study anything that points me in the right direction.
Looking at the scene you've posted, it looks like a landscape where lines are drawn only at certain elevation levels?
What would the animation be?
As for what you have now, I can suggest this: if you actually have a normal landscape (full triangles, not lines), then in the pixel shader you can, based on the Y position of each pixel, either fill it with color or discard it (transparent). That would give you those lines with far less geometry. Animation-wise, if you have a landscape that changes, that can be tricky to store in a file, but it can be done by procedurally generating a height map and then moving the Y of the landscape vertices based on that height map.
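Here is the discard test written out in plain JS so the math is easy to read (my own sketch; the spacing and thickness parameters are illustrative). In the actual fragment shader you would run the same test on the interpolated world-space Y and call discard when it fails, keeping only the pixels that sit within a thin band around each contour level.

```javascript
// Contour-line test: keep a pixel only when its world-space height is
// within `thickness` of the nearest contour level (levels are every
// `spacing` units). In GLSL this decides between coloring and discard.
function onContourLine(worldY, spacing, thickness) {
  // Distance from this height to the nearest contour level.
  const d = Math.abs(worldY - Math.round(worldY / spacing) * spacing);
  return d < thickness * 0.5; // true -> draw line pixel, false -> discard
}

console.log(onContourLine(2.0, 1.0, 0.1)); // exactly on a level -> true
console.log(onContourLine(2.4, 1.0, 0.1)); // between levels -> false
```

The whole landscape stays a single triangle mesh; the lines cost nothing extra in geometry.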
That gives you procedural landscape animation, and you can drive the height-map generation parameters from some input, like music. Then you get the line rendering too, via the pixel shader.
This could be an alternative direction to go. I don't know what you are trying to achieve, so it is hard to suggest directions; there are always creative ways to achieve different things, and sometimes a little detail or feature can change the approach a lot.
Uploading a buffer from the CPU to the GPU every frame is expensive, and very expensive if you have loads of vertices.
But it is cheap to do the work on the GPU only, for example using textures or calculating noise on the GPU in real time.
So start small: learn to write your own shader, or use shader chunks to override the vertex-positioning part of the standard shader. That will give you some idea of what's involved.
A vertex shader is essentially a conveyor that iterates over every vertex; it lets you manipulate the vertex position and pass data on to the fragment shader.
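As a hedged sketch of what that conveyor looks like, here is a minimal GLSL vertex shader held in a JS string. It runs once per vertex: it displaces the vertex on the GPU and hands a value to the fragment shader through a varying. The uniform and attribute names are illustrative, not a specific PlayCanvas shader chunk.

```javascript
// Illustrative vertex shader source: per-vertex animation on the GPU,
// no per-frame CPU buffer upload needed.
const vertexShader = `
attribute vec3 aPosition;            // per-vertex input from the buffer
uniform mat4 matrix_model;           // object -> world
uniform mat4 matrix_viewProjection;  // world -> clip space
uniform float uTime;                 // driven from JS every frame
varying float vHeight;               // handed on to the fragment shader

void main() {
  vec4 worldPos = matrix_model * vec4(aPosition, 1.0);
  worldPos.y += sin(worldPos.x + uTime) * 0.25; // animate on the GPU
  vHeight = worldPos.y;
  gl_Position = matrix_viewProjection * worldPos;
}`;
```

Animating here means the CPU only updates one uniform (uTime) per frame instead of rewriting the whole vertex buffer.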
Work in stages, and it will get you pretty close to what you need.