Hi, sorry, didn’t see the notifications! I hope this is not too late
The canvas+base64 thing is just a bit of a hack, to be honest haha.
I use it to turn the text in my WYSIWYG editor, which lives outside my PlayCanvas canvas, into an image and then into a base64 string. This way I can send the data to the canvas through the window message event (postMessage/onmessage).
I have to do this to get around the cross-origin protection.
Inside my PC canvas, I have an IMG tag. I set my base64 string as this IMG's src attribute, and then I use its load event.
form.querySelector("img").src = data.b64;
img.addEventListener("load", function (event) {
// 1- I first make an HTML canvas, and draw my image inside it
// 2- I edit all the pixels to make it grayscale and transparent (white pixel -> alpha = 0)
// 3- I turn this canvas into a base64 string again! (with canvas.toDataURL() )
// This string is stored for the next step below
});
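Step 2 above, as a rough sketch: a hypothetical helper (not my exact code) that works on the raw RGBA array you get from ctx.getImageData, making everything grayscale and turning light pixels transparent:

```javascript
// Turn an RGBA pixel buffer grayscale, and make light pixels transparent.
// `pixels` is the flat array from ctx.getImageData(0, 0, w, h).data.
function grayscaleWithAlpha(pixels) {
    for (var i = 0; i < pixels.length; i += 4) {
        var r = pixels[i], g = pixels[i + 1], b = pixels[i + 2];
        // standard luminance weights
        var gray = Math.round(0.299 * r + 0.587 * g + 0.114 * b);
        pixels[i] = pixels[i + 1] = pixels[i + 2] = gray;
        // the lighter the pixel, the more transparent (white -> alpha = 0)
        pixels[i + 3] = 255 - gray;
    }
    return pixels;
}
```

After running this on the image data, you put it back with ctx.putImageData and call canvas.toDataURL() as in step 3.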
Now that I’ve written this, it might seem a bit crazy haha, but hey… it works for me ¯\\_(ツ)_/¯
But all this is irrelevant to your question. The part that could interest you is what happens next 
This is a heavily edited part of my code to extract the basic process
Ui.prototype.updateBladeTextureWithUserFile = function () {
    // My image comes from outside my PC project; you won't need this loading part if it is already in your editor
    var file = {
        url: myBase64Image, // placeholder: my edited base64 image, can also be the path to your image
        filename: 'something.png'
    };
    var asset = new pc.Asset('my image', 'texture', file);
    pc.app.assets.add(asset);
    asset.ready(function (asset) {
        // The texture is now correctly loaded and ready to be used
        asset.resource.addressU = pc.ADDRESS_CLAMP_TO_EDGE;
        asset.resource.addressV = pc.ADDRESS_CLAMP_TO_EDGE;
        var lame = pc.app.root.findByName("Lame"); // This is my model. Lame = blade in French, it is not "uninspiring" :)
        var lameMesh = lame.model.meshInstances[0];
        var mat = lameMesh.material;
        mat.diffuseDetailMap = asset.resource;
        mat.diffuseDetailMapChannel = "r";
        // mat.diffuseDetailMode = pc.DETAILMODE_ADD; // often useful, but I kept the default pc.DETAILMODE_MUL
        mat.update();
    });
    pc.app.assets.load(asset);
};
I hope this can help…
The hardest part is putting the text texture at the right place and finding the right blend mode.
You can use a second UV map covering only a part of your robot’s head, and set diffuseDetailMapUv = 1.
Or you can use the same UV map as your main texture, and place your diffuseDetailMap texture by changing the tiling/offset/rotation: diffuseDetailMapTiling.set(x, y)
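For the tiling/offset route, here is a small sketch of the math — a hypothetical helper (not from my project), assuming the usual UV transform uv' = uv * tiling + offset, which computes the values needed to make the detail map cover only a given UV rectangle:

```javascript
// Compute tiling and offset so the detail map spans only the UV rectangle
// starting at (u0, v0) with size (w, h).
// Assumes the map is sampled as uv' = uv * tiling + offset.
function detailMapPlacement(u0, v0, w, h) {
    return {
        tiling: { x: 1 / w, y: 1 / h },
        offset: { x: -u0 / w, y: -v0 / h }
    };
}

// Usage on a StandardMaterial (with CLAMP_TO_EDGE so the map doesn't repeat):
// var p = detailMapPlacement(0.25, 0.25, 0.5, 0.5);
// mat.diffuseDetailMapTiling.set(p.tiling.x, p.tiling.y);
// mat.diffuseDetailMapOffset.set(p.offset.x, p.offset.y);
// mat.update();
```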