
Arion - First Hybrid Unbiased Renderer in the World!



InfoCentral
03-18-2010, 06:32 AM
First out the door with full GPU acceleration was the low-cost Octane Render (http://www.refractivesoftware.com/). Now we have another unbiased render engine, Arion (http://www.randomcontrol.com/arion), which is CUDA-accelerated, uses both the CPU and GPU, and comes from the same company that produces Fryrender.

:beerchug:

cresshead
03-18-2010, 06:48 AM
meh.....

pricing is less than inviting..
I would sooner spend the cash on Vray or Final Render..

it looks like FPrime but more expensive and 5 years late.

biliousfrog
03-18-2010, 07:34 AM
meh.....

pricing is less than inviting..
I would sooner spend the cash on Vray or Final Render..

it looks like FPrime but more expensive and 5 years late.

FPrime isn't GPU-aware or unbiased. This is basically FryRender (which IMO is a more user-friendly version of Maxwell) but with an interactive preview render. The ability to use multiple GPUs and the CPU to get physically accurate renders is very impressive...even more so when you can render over a LAN and utilize every GPU and CPU on the network.

The big drawback with many of the unbiased renderers is the time it takes to get a render...having interactive previews is something that I'm very excited about. I've been considering Fry for a while now, along with Kray and Modo...looks like my decision has just been swayed a bit more. Octane looks very promising but it's still a way off being production-ready.

It's also impressive that you can preview all the various buffers such as material IDs, depth, AO, transparency, etc. http://www.randomcontrol.com/downloads/showreels/arion/arion_tech_demo_003.mov

Amurrell
03-18-2010, 02:06 PM
Plus there are SmallLUX GPU and SmallLUXcpt in development, OpenCL-driven unbiased render engines (CPU+GPU, and they even work over a network) from the developers/contributors of LuxRender... and they're free. Of course there is no exporter for LightWave, but maybe that will change with CORE; until then there are exporters for Maya, XSI, C4D, 3DS Max, Blender (2.4x and earlier) and Google SketchUp. It's fast and was developed using the tools from ATI Stream technology, but it works with CUDA as well... Another one in development using the LuxRender engine is CUDAray.

Tartiflette
03-18-2010, 02:25 PM
For me CUDA = forced to use nVidia-based cards = no choice = trash...
When they come out with an OpenCL-based application, then I'll be the first one to applaud and be interested.


Cheers,
Laurent aka Tartiflette :)

InfoCentral
03-18-2010, 02:31 PM
When they come out with an OpenCL-based application, then I'll be the first one to applaud and be interested.

LuxRenderGPU (http://www.luxrender.net/wiki/index.php?title=Luxrender_and_OpenCL) will be a hybrid OpenCL renderer. It's currently in development, with no completion date expected any time soon. Everything else is CUDA-based.

Tartiflette
03-18-2010, 03:16 PM
LuxRenderGPU (http://www.luxrender.net/wiki/index.php?title=Luxrender_and_OpenCL) will be a hybrid OpenCL renderer. It's currently in development, with no completion date expected any time soon. Everything else is CUDA-based.
Yeah, I know. :D
In fact I'm using LuxRender from time to time, and I even "beta-test" the weekly builds for Mac OS X. :)

It's still missing a LightWave exporter, but I quite like using Blender, so that's not a big deal for me.

And SmallLuxGPU looks really interesting, as the developer has already understood everything by providing a hybrid solution (CPU+GPU), all with open solutions. :thumbsup:

Let's see where it's heading in the future.


Cheers,
Laurent aka Tartiflette :)

biliousfrog
03-19-2010, 03:31 AM
For me CUDA = forced to use nVidia-based cards = no choice = trash...
When they come out with an OpenCL-based application, then I'll be the first one to applaud and be interested.


Cheers,
Laurent aka Tartiflette :)

Look at it from a developer's point of view though: ATI's drivers have been plagued with problems for years, and it's almost universally accepted that you shouldn't rely on ATI cards in most 3D applications. LightWave is very forgiving, but CG forums are overflowing with display issues from using ATI cards.

Also consider that Nvidia created one of the first GPU-accelerated GI renderers, 'Gelato', several years ago; they know how to utilize the tech.

novawave
03-19-2010, 04:53 AM
Kinda off-topic... but what's the point of using an external rendering engine at this moment for LightWavers,
considering the blazingly fast and accurate 9.6 renderer and support for the impressive SG_Ambocc node? :D

biliousfrog
03-19-2010, 05:16 AM
Kinda off-topic... but what's the point of using an external rendering engine at this moment for LightWavers,
considering the blazingly fast and accurate 9.6 renderer and support for the impressive SG_Ambocc node? :D

LightWave's animated GI is flaky, instancing requires external plugins, a linear workflow requires hacks and work-arounds, rendering passes is tedious without external plugins, and unbiased renderers such as Maxwell and Fryrender offer a level of realism that is difficult to achieve without a lot of post-processing.

LightWave's renderer is great but lacks a lot of very useful features.

Tartiflette
03-19-2010, 04:07 PM
Look at it from a developer's point of view though: ATI's drivers have been plagued with problems for years, and it's almost universally accepted that you shouldn't rely on ATI cards in most 3D applications. LightWave is very forgiving, but CG forums are overflowing with display issues from using ATI cards.

Also consider that Nvidia created one of the first GPU-accelerated GI renderers, 'Gelato', several years ago; they know how to utilize the tech.
Let's say I forgot to mention I'm on a Mac! :D
The other funny thing is that although nVidia has always marketed its GPU line as computer-graphics friendly, etc., etc., the raw power of ATI's cards is far better! :eek:

But I'm with you when you say that ATI is almost always referenced as being problematic with a lot of 3D apps, LightWave being one of the rare exceptions though. :)
(and that's fortunate, as it's the application I'm using! :D)


Cheers,
Laurent aka Tartiflette :)

Nicolas Jordan
03-19-2010, 04:22 PM
I might have considered downloading it if it were free, but even then I doubt I would take the time and effort to even look at it, since what I already have with LightWave and FPrime works great for what I need. This is yet another rendering program swimming in a sea of them.

Intuition
03-19-2010, 06:30 PM
I would sooner spend the cash on Vray or Final render..



I thought you already used Vray, Cress?

If not... join the dark side. :devil: ;D

Lamont
03-19-2010, 06:35 PM
Holy price... but if it links well with LW.

novawave
03-19-2010, 11:13 PM
LightWave's animated GI is flaky, instancing requires external plugins, a linear workflow requires hacks and work-arounds, rendering passes is tedious without external plugins, and unbiased renderers such as Maxwell and Fryrender offer a level of realism that is difficult to achieve without a lot of post-processing.

LightWave's renderer is great but lacks a lot of very useful features.

That's true, I guess... but I always look at Mr. Iain's pure LightWave renders, and the work of Mr. Gerard Strada, as well as Mr. Excerpt :D. With all of LightWave's limitations compared to those external renderers, somehow that flaky GI and tedious stuff still turn out impressive render results, and fast, in the right hands :D. So at the moment, before buying an external engine, all I want to do is maximize LightWave's pure capabilities, like those respected LightWave artists. :thumbsup:

InfoCentral
03-20-2010, 12:49 PM
Holy price... but if it links well with LW.

Yeah, I hear you, but is it high-priced? The problem is: compared to what? Since this is a new, emerging technology, there aren't a lot of comparisons to make. If we compare it to Octane Render, then one would have to conclude that it is very expensive at 10X the cost.

OTOH, if we compare it to StudioGPU (http://www.studiogpu.com/), running at $5000/seat, then one would have to conclude that it is relatively inexpensive. If Arion has the same or better features and output as StudioGPU, then in the eyes of those who contemplated purchasing StudioGPU it would be a bargain, and those who have already purchased StudioGPU would be envious. But this is comparatively speaking...