PBR Workflow Rundown/Tutorial?

Is there a workflow rundown or tutorial regarding PBR? I’ve forked the PBR testbed, but my models/textures look nothing like they do in the DCC app I authored them in.

Thanks.

Ah, this is a start.

What app did you use to author the model? Can you show a comparison?

In this screenshot you can see Toolbag on the left and PlayCanvas on the right. I converted the HDR to a vertical layout with HDRShop and then used ModifiedCubeMapGen to render out the mipmaps. I saved them as PNGs, just as in the example, and the PNGs look normal. Maybe a gamma issue? Is there somewhere to control exposure?

Thanks!

I just noticed the example PNGs are 8-bit RGBA, while mine are 16-bit RGB. I can’t seem to get that format out of ModifiedCubeMapGen correctly, though. Could you provide a screenshot of the settings?

Thanks!

[technical info]
PlayCanvas doesn’t just use LDR PNGs for cubemaps; it encodes HDR data using a modified RGBM scheme: http://graphicrants.blogspot.ru/2009/04/rgbm-color-encoding.html
That’s why the alpha channel is used.
The modification is simply replacing the multiplier 6 with 8 and applying pow(0.5) before writing, to get better range/precision after gamma correction.
[/technical info]
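
If it helps to see the math written out, here’s a minimal sketch of that modified RGBM encode/decode in TypeScript. Only the multiplier 8 and the pow(0.5) step come from the description above; the quantization and clamping details are just an illustration, not the engine’s actual code.

```typescript
// Modified RGBM: linear HDR colour -> pow(0.5) -> packed into 8-bit RGBA,
// with a shared multiplier stored in the alpha channel.
const RGBM_RANGE = 8;

function encodeRGBM(r: number, g: number, b: number): [number, number, number, number] {
    // pow(0.5) before writing, then scale into the 0..1 range covered by the multiplier
    const sr = Math.sqrt(r) / RGBM_RANGE;
    const sg = Math.sqrt(g) / RGBM_RANGE;
    const sb = Math.sqrt(b) / RGBM_RANGE;

    // Shared multiplier goes into alpha, rounded up so the RGB channels stay <= 1
    let m = Math.min(Math.max(sr, sg, sb, 1e-6), 1);
    m = Math.ceil(m * 255) / 255;

    return [
        Math.round(Math.min(sr / m, 1) * 255),
        Math.round(Math.min(sg / m, 1) * 255),
        Math.round(Math.min(sb / m, 1) * 255),
        Math.round(m * 255)
    ];
}

function decodeRGBM(r8: number, g8: number, b8: number, a8: number): [number, number, number] {
    // Mirror of the encode: unpack, multiply by alpha * 8, then square to undo pow(0.5)
    const m = (a8 / 255) * RGBM_RANGE;
    const r = (r8 / 255) * m;
    const g = (g8 / 255) * m;
    const b = (b8 / 255) * m;
    return [r * r, g * g, b * b];
}

// An HDR value well above 1.0 survives the 8-bit round trip approximately:
console.log(decodeRGBM(...encodeRGBM(4.0, 1.0, 0.25))); // ~[4, 1, 0.25]
```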

So… you might be better off waiting a little until all the encoding/prefiltering is integrated directly into the editor (hopefully it’ll be done this month).

Is there somewhere to control Exposure?

Yes, it’s in the scene settings.
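
If you want to set it from script as well, the scene object has an exposure property; assuming a recent engine build, something like this should work:

```typescript
declare const pc: any; // PlayCanvas global (types omitted for brevity)

// Exposure is a scene-level setting, so it can also be driven from code:
const app = pc.Application.getApplication();
app.scene.exposure = 1.5; // > 1 brightens, < 1 darkens the tonemapped output
```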

Interesting! Can I write RGBM files from ModifiedCubeMapGen?

No. But if you really want to try it right now and don’t want to wait a couple of weeks, and you’re also OK with downloading exes made by some forum guy, you can use this command-line tool: http://geom.io/pc/hdrdds2rgbmpng.zip
Usage is: hdrdds2rgbmpng RGBA32F_DDS_FILE OUTPUT_PNG_FILE
(existing files are not overwritten).
So you can export a cubemap with a full mip chain from CubemapGen as separate RGBA32F (floating point per channel) DDS files and then write a small script to batch process them and convert them to PNGs. For simplicity I’ve also added genRGBMPNG.exe, which scans the current directory and runs hdrdds2rgbmpng on every DDS it finds.
The files you get will be ready to use.
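
In case it saves someone a few minutes, here’s a rough sketch of such a batch script written for Node.js instead of as a .bat file. The hdrdds2rgbmpng name and its two-argument usage come from the post above; everything else is just one way to wire it up:

```typescript
// Run hdrdds2rgbmpng on every .dds in the current directory,
// writing a .png next to each one. Assumes hdrdds2rgbmpng(.exe) is on the PATH.
import { execFileSync } from "child_process";
import { existsSync, readdirSync } from "fs";
import * as path from "path";

const dir = process.cwd();

for (const file of readdirSync(dir)) {
    if (path.extname(file).toLowerCase() !== ".dds") continue;

    const dds = path.join(dir, file);
    const png = dds.replace(/\.dds$/i, ".png");

    // The tool doesn't overwrite existing files, so skip them here as well
    if (existsSync(png)) continue;

    // Usage from the post: hdrdds2rgbmpng RGBA32F_DDS_FILE OUTPUT_PNG_FILE
    execFileSync("hdrdds2rgbmpng", [dds, png], { stdio: "inherit" });
}
```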

Pardon my ignorance, but does this https://github.com/playcanvas/engine/pull/141 mean I can now just load DDS files?

Almost. You can load it, but material shaders will still try to decode it as RGBM instead of using the raw HDR data, so it’s pointless for now.
But it’s already fixed and will be pushed very soon (maybe tomorrow).

(There’s also a small bug with an offset in that commit, which is fixed too.)

Now? https://github.com/playcanvas/engine/pull/143

How do I load HDR/DDS files in the editor/Designer? Do I just add them as assets and they get converted? Apparently not, since uploading a .DDS (RGBA32) gives ‘Invalid asset type.’

Thx!

:wink:

Designer isn’t patched to accept DDS files yet, but the engine is. So if, for example, you download the engine and use it from code, you’ll be able to load such DDS files.
Keep in mind, though, that you need the OES_texture_float_linear extension to load RGBA32F files. Mobile devices sometimes lack support for it; PCs are fine. So the idea is to load RGBA32F files at development time, filter them with the new utility, and get RGBM-encoded results that you can use on a wide range of devices.
Hopefully Designer will be patched soon.
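
As a rough sketch of that development-time path: the OES_texture_float_linear name is the real WebGL extension, but the engine calls below (getApplication, graphicsDevice.gl, assets.loadFromUrl) are just an assumption about how you’d wire it up from code on a current build.

```typescript
declare const pc: any; // PlayCanvas global (types omitted for brevity)

const app = pc.Application.getApplication();
const gl: WebGLRenderingContext = app.graphicsDevice.gl;

if (gl.getExtension("OES_texture_float_linear")) {
    // Float filtering is available: load the raw RGBA32F cubemap (development-time path)
    app.assets.loadFromUrl("cubemap_rgba32f.dds", "texture", (err: any, asset: any) => {
        if (err) {
            console.error(err);
            return;
        }
        console.log("Loaded HDR cubemap:", asset.resource);
    });
} else {
    // Common on mobile: fall back to the prefiltered RGBM-encoded PNGs instead
    console.warn("OES_texture_float_linear not supported; use the RGBM PNGs.");
}
```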

I see. So what I want to do is use prefiltered RGBM files anyway, for the sake of compatibility. Well, like I said initially, a workflow tutorial would be nice once everything is implemented and patched in. Thx!