I was thinking I could use the render buffers to drive the material color, but not much luck so far. I just discovered the Store Extra Buffers node.
I can't thank you enough for your help, Denis; I am using your nodes left, right, and center on this project. That link was helpful, but I haven't quite solved my issue. I am trying to use an object with UVs and an image texture, sample from that image, and have my instances inherit colour from it. My initial tests showed I had to use a projection. That is not going to work on a human/complex object.
Do you have any other suggestions, or am I blind and the answer is in that thread? I saw the blue and orange chess board, but it is flat and can conveniently be planar-projected onto.
I was thinking I might be able to use the buffers/filters pack and grab colour off the model that way.
You mean something like this, kfinla?
In this case all the little cubes that make up Basil are instances of one little cube.
I've been scratching my head all day over how to sample the colour of a surface and pass that on to an instance based on instance location. The kicker being a UV'd object.
I would love to know. If you only wanna tell me, that's fine too. LOL
Not a problem mate ... & apologies to anyone that's been waiting for that follow-up. I've always meant to share it, but wanted to push the whole idea into something a bit more polished (my little corny proof of concept attached - Basil makes an appearance later on) before I got onto explaining what the heck I did.
I got about 1/3 of the way there before some urgent external & internal projects totally knocked my time out (still not done with those).
I'll get onto putting something proper & hopefully simple together in the next week or so for you & anyone else who is interested. In a nutshell it's one cube, instanced onto a point grid, where a mesh's proximity to that grid (Basil in this case) drives the location. The hard part is controlling the knock-out point, where the new cube mesh overlaps the underlying mesh, & texturing the sucker.
This is using instancing, DP's Nearest Point & a few simple texture projection techniques, using the idea in that thread that Denis posted.
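Outside LightWave, the proximity idea described above could be sketched roughly like this. A toy Python illustration only, with the mesh stood in by a handful of sample points; none of these names come from LW or DP Kit:

```python
# Hypothetical sketch of proximity-driven instancing: keep a grid
# point (i.e. instance a cube there) only if the mesh is close
# enough; "knock out" everything beyond the threshold.
import math

def nearest_distance(point, mesh_points):
    """Distance from a grid point to the closest mesh sample point."""
    return min(math.dist(point, m) for m in mesh_points)

def instance_positions(grid, mesh_points, threshold):
    """Keep only the grid cells within `threshold` of the mesh."""
    return [p for p in grid if nearest_distance(p, mesh_points) <= threshold]

# Toy example: a "mesh" sampled as two points, and a 1D grid of cells.
mesh = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
grid = [(x * 0.5, 0.0, 0.0) for x in range(-4, 5)]
kept = instance_positions(grid, mesh, threshold=0.6)
```

In a real scene the grid would be 3D and the distance query would hit the mesh surface (as DP's Nearest Point does), not a point list, but the keep/knock-out logic is the same.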
... here's a basic explanation if anyone else is interested.
Nothing to do with Denis's Extra Buffer nodes (sorry for the side track), but it does use his Replace Spot & Nearest Point nodes. Would be impossible without those. Cheers Denis!
Hey guys, thanks for all your help. So I am still trying to work around the "Replace Spot" node not liking (or overwriting) UV position data.
I have a complexly UV'd object, so using a traditional projection type to transfer colour to my instances will not cut it. I have successful examples on cubes, etc.
I wanted to ask: is it possible to sample the render buffer colour, raw colour, etc., and pass that along to the Replace Spot node in a surface node graph?
So far I have had no luck with the Store Extra Buffer and Get Extra Buffer nodes, but I also have no previous experience (or success) with them. I am under the impression the render buffer node only works in the image filter node graph, not in the surfacing graph where I want it.
Should I be extracting the render buffer / Store Extra Buffer data from the pixel filter graph, and then piping it into my material surface graph with the Get Extra Buffer node?
Basically, I want the final colour of my object on a per-pixel basis, and I want my instances to inherit that colour.
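For what it's worth, the lookup being asked for here can be sketched outside LightWave as: find the mesh vertex nearest the instance, take its UV, and sample the texture there. This is a crude nearest-vertex stand-in (a proper version would find the nearest surface point and interpolate UVs with barycentric weights), and none of these names come from LW or DP Kit:

```python
# Hypothetical sketch: colour an instance from a UV'd mesh by
# sampling a texture at the UV of the closest mesh vertex.
import math

def sample_texture(image, u, v):
    """Nearest-neighbour sample from image[row][col], UVs in [0, 1]."""
    h, w = len(image), len(image[0])
    col = min(int(u * w), w - 1)
    row = min(int((1.0 - v) * h), h - 1)  # V runs bottom-up
    return image[row][col]

def instance_colour(instance_pos, vertices, uvs, image):
    """The UV of the nearest vertex drives the sampled colour."""
    i = min(range(len(vertices)),
            key=lambda k: math.dist(instance_pos, vertices[k]))
    u, v = uvs[i]
    return sample_texture(image, u, v)

# Toy data: a 2x2 texture and two vertices at opposite UV corners.
image = [[(255, 0, 0), (0, 255, 0)],
         [(0, 0, 255), (255, 255, 0)]]
vertices = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
uvs = [(0.0, 0.0), (1.0, 1.0)]
colour = instance_colour((1.9, 0.0, 0.0), vertices, uvs, image)
```

Because the lookup goes through the mesh's own UV map, it works on an arbitrarily complex UV layout where a planar projection would not.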
I don't have tons of time to sort this out, but I'm also curious whether, long term, the "Replace Spot" node could have a reference-object drop-down from which to optionally use UV data?
because UV coordinates are not really present or transmitted in Surfacing; they are only accessible inside a node, during its internal evaluation.
Surfaces are evaluated first; then the buffers are filled and surfacing is no longer evaluated. The DP Filter Pixel Filter node editor allows a kind of post-process rendering, but its surfacing is set up in the pixel context. What you describe looks like a job for compositing; pasting one surface over another is not "replacing" a spot at the "Surfacing" level.
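A rough illustration of the evaluation order described here, in hypothetical Python (the function names are made up; the point is that the surface stage runs before any buffer exists, so only a later pixel-filter stage can read the final buffers):

```python
# Hypothetical render-order sketch: surfacing -> buffers -> pixel filter.
def evaluate_surface(spot):
    """Surface graph: runs per spot, before any buffer is written."""
    return {"color": spot["base_color"]}

def fill_buffers(surface_results):
    """Buffers are filled only after all surfacing is done."""
    return {"color_buffer": [r["color"] for r in surface_results]}

def pixel_filter(buffers):
    """Post-process stage: the only one that can read final buffers."""
    return [tuple(c * 0.5 for c in px) for px in buffers["color_buffer"]]

spots = [{"base_color": (1.0, 0.0, 0.0)}, {"base_color": (0.0, 1.0, 0.0)}]
filtered = pixel_filter(fill_buffers([evaluate_surface(s) for s in spots]))
```

This is why a Get Extra Buffer call inside the surface graph cannot see the final per-pixel colour: that data does not exist yet when the surface is evaluated.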
Would it be possible to do for MBlur what has been done for DOF? LW has motion-vector buffers, and I guess an MBlur node would have to be written to do the magic, but... would it be possible?
DP Filter has been updated for versions 9.6, 10, and 11 (Win32 only for now). In the Image Filter node editor, with the "Replace Buffer" Alpha mode, the root Color input is no longer rescaled by the Alpha input.
Histogram, Limit, and Linear Tone Map nodes are in the "Filter" folder; Vector Blur, Camera Response, Grain, and Vignetting nodes are in "Effect".
Thanks to Gerardo Estrada for his intensive testing, advice, and great suggestions.
AWESOME filters, Denis!
Thank you very much!
Wow, thanks so much Denis (and Gerardo)!