Unity 3d, a few questions from a newb



Elmar Moelzer
10-07-2010, 02:01 PM
Hello Unity users!
I know there are few among you here. So please raise your hands! ;)
Anyway, I got around to looking at Unity for a bit today and also tried importing a few objects from LW.
I still have a few old game assets lying around, so I wanted to give them a shot and see how they would turn out in Unity.
So far so good...
I managed to get my object exported from LW as an FBX file and it also imported into Unity.
Now there were a few minor problems:
1. The default Main color is black. I can readjust this, no problem, but it would be nice if I did not have to do that for every surface in every file that I import. Is there something I can do to make my exported FBX files import into Unity with the correct Main Color already set? If I wanted to share my FBX files with other users, this would be especially relevant.
2. I assume that in order to get hard edges / break the smoothing in a spot, I have to double the vertices there, right? I guess there is no way around that? Unity seems to only allow a single smoothing setting for the entire object, not per material (pity).
3. Is there a way to export just the object, without the lights and other scene items, from Layout? I heard that exporting FBX from Layout works better than from Modeler, so that is what I am using.
4. When I export the asset from Unity, it generates an archive with a .unitypackage extension. I can open it if I rename it. It contains another .tar file that WinRAR will open, and inside that is a group of folders with cryptic names containing other files. I am not sure, but I guess they are my FBX and BMP files, just without the extensions. Is anybody here a bit smarter about that than I am? Maybe there is a way to export something a little less complicated?
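For what it's worth, a .unitypackage is typically just a gzip-compressed tar archive: each cryptically named folder inside is an asset's GUID, containing the raw file ("asset") plus a "pathname" file holding the asset's original project path. A minimal sketch of that layout (the GUID and paths below are made up for illustration):

```python
# A .unitypackage is a gzip-compressed tar archive. Each top-level
# folder is named after an asset's GUID and holds the raw file
# ("asset") plus a "pathname" file with the original project path,
# which explains the cryptic folder names.
import io
import tarfile

# Build a tiny fake package the way Unity lays it out
# (GUID and path are invented for this example):
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w:gz") as tar:
    for name, data in [
        ("0123456789abcdef0123456789abcdef/pathname", b"Assets/Models/ship.fbx"),
        ("0123456789abcdef0123456789abcdef/asset", b"fake fbx bytes"),
    ]:
        info = tarfile.TarInfo(name)
        info.size = len(data)
        tar.addfile(info, io.BytesIO(data))

# "Unpacking": list each GUID folder with its original asset path,
# no renaming to .rar needed - tar handles the gzip layer directly.
buf.seek(0)
with tarfile.open(fileobj=buf, mode="r:gz") as tar:
    for member in tar.getmembers():
        if member.name.endswith("/pathname"):
            path = tar.extractfile(member).read().decode()
            print(member.name.split("/")[0], "->", path)
```

So the files inside really are the FBX/BMP assets, just stored under GUIDs with the extensions recorded in the "pathname" files instead.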
Thanks!

Red_Oddity
10-07-2010, 02:26 PM
Not sure; the FBX exporter in LW has very few options for deciding what to export and what not to.
Doing the export from Modeler might fix the light and camera problem, or exporting as ASCII and editing the file in a text editor would probably work.

As for the smoothing with hard edges, we've done tests using normal maps which allows us to work around this problem pretty nicely.

I wish Unity would read LWO natively. If you've worked with Maya and Unity, you can see what a breeze doing assets can be (any change I make in Maya gets picked up by Unity immediately after saving; it's one of those things that just works like it is supposed to, which is quite amazing, considering it's Maya we're talking about :) )

Elmar Moelzer
10-07-2010, 02:36 PM
I guess I can always edit things in Unity and save them as an asset?
My goal is to make a small collection of objects for Unity users. Of course, it would be even nicer if I could just make it an FBX file and people could use it wherever they want to (other engines support FBX as well).
LWO support in Unity would of course be best, but you can't have it all, I suppose, and FBX will at least allow others to make use of it too.

geo_n
10-07-2010, 02:56 PM
This thread might help. I followed ramen sama's tips. He also posted a sample file. Disregard the YouTube video on that thread, though.
http://forum.unity3d.com/threads/43652-Lightwave-axis-problem-tutorial

Elmar Moelzer
10-07-2010, 03:37 PM
So far I have not had any problems with the orientation, because I have only been doing static objects so far, but thanks.
I will check the rest of the thread for other hints that might answer some of my questions.

Elmar Moelzer
10-08-2010, 09:06 AM
Red, my previous game development experience was 10+ years ago, so I am wondering: do you normal-map just about everything these days? How about on "slower" platforms like iOS?

Tzan
10-08-2010, 09:55 AM
I export from Modeler, so I don't get the camera and lights. This works fine for static objects, which is all I have done so far. I don't like having to clean up all that imported mess.

When importing into Unity, I think the default FBXImporter > Scale Factor was set to 0.01; I change this to 1. I think this happens when you export from Modeler.

I figured out how to set up a second UV set in code. It's very easy; you just need to know what UV values you need and set them in the code. Because I'm dealing with very simple objects, this isn't a problem.

I also created a checklist of all the steps from importing to setting up the prefab, so I don't have to think about it and don't skip a step.

Elmar Moelzer
10-08-2010, 10:30 AM
I export from Modeler, so I don't get the camera and lights. This works fine for static objects, which is all I have done so far. I don't like having to clean up all that imported mess.

Yeah, that sums up my experiences so far too.
So I guess I am not missing much.


When importing into Unity, I think the default FBXImporter > Scale Factor was set to 0.01; I change this to 1. I think this happens when you export from Modeler.

I noticed that as well. Setting the scale multiplier to 100 in the exporter does not help, unfortunately (I guess it is simply not working?).
For reasons of compatibility with other engines, I would love my objects to have the correct size right away (among other things). So yeah, that kinda sucks. I hope someone writes a fixed exporter soon ;)


I figured out how to set up a second UV set in code. It's very easy; you just need to know what UV values you need and set them in the code. Because I'm dealing with very simple objects, this isn't a problem.
Hmm, do you still need a second UV map, now that Unity has Beast lightmapping (which, from what I gather, sets up its own UV maps for that)?

What settings are you using for Normals and Tangents in the importer?
I think that (with my limited tests, anyway) "Calculate" for both looks best.
I use "Per Material" for "Material Generation". I guess this is better than the "Per Texture" setting, since it is closer to what LW does? But what do more experienced Unity users think about that? What are you using? I mean, if I want to pass my prefabs on to others, I want them to have the most ideal settings, not just the settings that I prefer.

Can anybody tell me what exactly "Split Tangents" does? I don't seem to see a difference with my example object.
I am sorry for asking such silly questions. I guess I should just go and RTFM ;)

Red_Oddity
10-08-2010, 10:45 AM
I haven't done any iOS game development yet. We have a couple of iPads lying around, but no iPhone, only Android and Symbian phones around here, and I've just recently started dabbling in Android development.
I'm no full-time developer; I just 'muck about' with code, basically. That's why Unity is such a great tool to me.

I am waiting for Unity for Android to become available to everyone else (1000 euros is a bit much to spend on testing game development for Android).

Elmar Moelzer
10-08-2010, 02:28 PM
Is it normal for the Unity Forum activation email to take so long to arrive?
I registered this morning (Austrian time) and have not received an email yet.

warmiak
10-08-2010, 11:57 PM
Can anybody tell me what exactly "Split Tangents" does? I dont seem to see a differene with my example object.


It is a fairly technical issue, but in essence, sometimes tangent values for a vertex will end up facing in opposite directions, and the only way to deal with it is to split the vertex in two, duplicating all the other values but giving each copy its own tangent value.

This is somewhat similar to the issues with discontinuous UVs and smoothing angles: there is only one vertex as far as LightWave is concerned, but it has two separate UV or normal values (depending on which polygon the vertex is considered part of).
A typical graphics card used in gaming PCs can only deal with uniform vertices, i.e. a vertex always contains position, normal, UV, tangent, etc. So the only way to deal with this when exporting LightWave models to DirectX- or OpenGL-friendly formats is to split the vertex in question into two vertices, duplicating all the other attributes (like position) while substituting the separate UVs or tangents.

This also means that the way you create UVs for your models can have significant performance implications in a game engine.
For instance, even if LightWave reports your model as having 10,000 vertices, if your UV maps contain a lot of discontinuous UVs, the actual vertex count as used within the game engine can be significantly higher.
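To make that concrete, here is a small sketch (with hypothetical data, not tied to any real exporter) of how the GPU-side vertex count can be estimated: the card needs one uniform vertex per unique (position, normal, UV) combination, so every discontinuous-UV seam forces extra vertices.

```python
# Estimate the GPU vertex count of a mesh whose attributes are stored
# per polygon corner. A graphics card needs one uniform vertex per
# unique (position, normal, uv) combination, so a vertex that carries
# two different UVs (a seam) must be split into two GPU vertices.

def gpu_vertex_count(corners):
    """corners: iterable of (position, normal, uv) tuples,
    one entry per polygon corner."""
    return len(set(corners))

# One shared position and normal, two different UVs -> 2 GPU vertices:
corners = [
    ((0, 0, 0), (0, 0, 1), (0.0, 0.0)),  # corner as seen by polygon A
    ((0, 0, 0), (0, 0, 1), (1.0, 0.0)),  # same point, different UV (seam)
]
print(gpu_vertex_count(corners))  # 2
```

LightWave would report one vertex here; the engine ends up with two.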

Elmar Moelzer
10-09-2010, 03:56 AM
Thanks, Warmiak. So "Split Tangents" will basically allow me to have double vertices in the same position? I noticed that Unity would automatically merge them on import. Does Split Tangents prevent that from happening? Or does it split all tangents?

warmiak
10-09-2010, 11:36 AM
Thanks, Warmiak. So "Split Tangents" will basically allow me to have double vertices in the same position? I noticed that Unity would automatically merge them on import. Does Split Tangents prevent that from happening? Or does it split all tangents?

It is not about allowing anything... the way graphics cards work, sometimes you need to duplicate vertices to keep things working, and "Split Tangents" essentially does that for you. It doesn't mean it will split all of them, just the vertices that require it.

Imagine this:

Vertex1 - position X1, normal X1, tangent = ?

Vertex1 belongs to, say, 2 polygons: polygon A and polygon B.

The exporter calculates tangents and comes up with 2 different tangent values for Vertex1: tangentA for polygon A and tangentB for polygon B.
Now, you can only attach one tangent value to a vertex, so what they do is this:


Vertex1 - position X1, normal X1, tangentA
Vertex2 - position X1, normal X1, tangentB

They create an additional vertex (Vertex2) which has the same position, normal, etc.; the only difference is its tangent value.
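The same idea can be sketched in a few lines (hypothetical data and function name, purely for illustration): only vertices whose polygon corners disagree on the tangent get duplicated; everything else stays merged.

```python
# Sketch of what a "Split Tangents" pass does: duplicate a vertex only
# when its corners carry different tangent values.
def split_tangents(corner_list):
    """corner_list: list of (vertex_id, position, normal, tangent),
    one entry per polygon corner.
    Returns the final vertex list after splitting."""
    seen = {}
    out = []
    for vid, pos, nrm, tan in corner_list:
        key = (vid, tan)          # same vertex + same tangent -> reuse
        if key not in seen:
            seen[key] = len(out)
            out.append((pos, nrm, tan))
    return out

corner_list = [
    (1, (0, 0, 0), (0, 0, 1), "tangentA"),  # Vertex1 in polygon A
    (1, (0, 0, 0), (0, 0, 1), "tangentB"),  # Vertex1 in polygon B -> split
    (2, (1, 0, 0), (0, 0, 1), "tangentA"),  # Vertex2, used twice with
    (2, (1, 0, 0), (0, 0, 1), "tangentA"),  # identical tangents -> no split
]
print(len(split_tangents(corner_list)))  # 3
```

Two input vertices become three output vertices: only Vertex1 is duplicated.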

Elmar Moelzer
10-09-2010, 04:04 PM
Yeah, that is what I meant. Thanks for explaining it again though.

Elmar Moelzer
10-11-2010, 07:54 AM
Another question (since I still have not gotten access to the Unity forums, I have to post it here; what else can I do?).
Is it better to have (a few) more polygons, or would it be better to add a normal map? I mean, which does Unity deal with better? I have a couple of places where I might be able to remove two or four polys or so (and adjust the UV maps accordingly), but that will only look good with a normal map. So the question is: textures or polys?

warmiak
10-11-2010, 09:58 AM
Another question (since I still have not gotten access to the Unity forums, I have to post it here; what else can I do?).
Is it better to have (a few) more polygons, or would it be better to add a normal map? I mean, which does Unity deal with better? I have a couple of places where I might be able to remove two or four polys or so (and adjust the UV maps accordingly), but that will only look good with a normal map. So the question is: textures or polys?

Which devices (i.e. PCs/Macs, iOS devices, etc.) are you coding for?

Elmar Moelzer
10-11-2010, 10:10 AM
None in particular right now, but like the devices, engines can have a preference too; at least that is the way it used to be the last time I concerned myself with it.
I would probably be going for either the iPhone 3GS and later, the iPad, or the PC, though.

Tzan
10-11-2010, 12:05 PM
Hmm, do you still need a second UV map, now that Unity has Beast lightmapping (which, from what I gather, sets up its own UV maps for that)?

I'm using the second UV set for something else.

I'm making a Lego-style game, so the terrain pieces can have different colors.

In LW I make a model; UV1 is a grayscale texture.
I made a Diffuse/Detail/Specular shader in Unity.

UV1 is set in LW: the Detail lightens or darkens the Diffuse; gray 128 does nothing. The alpha in the detail texture holds my spec data.

UV2 I create in code, for the Diffuse image. For that image I have an array of 10x10 color swatches, so I can have 100 different colors and still combine the meshes, because they use the same material. For the UV coords I just do some math to calculate the location of the color I want and set all the verts to that UV coord.
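That UV math might look something like this (a sketch under the stated 10x10-swatch assumption; the function name is made up): pick a color index 0..99 and map every vertex's UV2 to the center of that swatch cell.

```python
# Map a color index into a 10x10 swatch atlas: compute the UV
# coordinate of the swatch cell's center in 0..1 texture space.
def swatch_uv(index, grid=10):
    col = index % grid
    row = index // grid
    u = (col + 0.5) / grid   # center of the cell horizontally
    v = (row + 0.5) / grid   # center of the cell vertically
    return (u, v)

print(swatch_uv(0))   # (0.05, 0.05) - first swatch, bottom-left corner
print(swatch_uv(99))  # (0.95, 0.95) - last swatch, opposite corner
```

Setting every vertex of a piece to one such coordinate tints the whole piece with that swatch's color, while all pieces keep sharing one material and can be mesh-combined.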



What settings are you using for Normals and Tangents in the Importer?

I just use the defaults:
Tangents: Generate All
Material Generation: Per Texture

I'm just making simple quads and boxes now, so I don't think it makes any difference.

warmiak
10-11-2010, 07:57 PM
None in particular right now, but like the devices, engines can have a preference too; at least that is the way it used to be the last time I concerned myself with it.
I would probably be going for either the iPhone 3GS and later, the iPad, or the PC, though.

Engines do indeed have their own preferences, but in this case it is the actual device that matters.

If you are going for iOS devices and shooting for a real game and not a demo, then you can forget about true dynamic per-pixel lighting, normal maps, etc.
Even the latest iPhones with their latest GPU are about 300-400 times slower than an average PC GPU, so you are going to have to cheat your way to decent rendering quality.

In other words, there is hardly any fillrate available on these devices (at least by PC standards), and thus pixel shaders which do more than read a few textures (and have decent screen coverage) will instantly bring your frame rate down to almost nothing (especially on the iPad and iPhone 4).

BTW, you can't go wild with geometry either, as SGX-based devices will let you render about 60-70K polys at around 30 fps.

In general, your best bet is precalculated static lighting (static lightmaps, semi-dynamic lightmaps, etc.). You can do per-vertex lighting, but in general it doesn't look too good, and semi-dynamic lightmaps will look much better.

Elmar Moelzer
10-12-2010, 01:21 AM
In general, your best bet is precalculated static lighting (static lightmaps, semi-dynamic lightmaps, etc.). You can do per-vertex lighting, but in general it doesn't look too good, and semi-dynamic lightmaps will look much better.

Yeah, that is what I thought. I would use vertex lighting for moving objects only, and everything else with static lightmaps.

Elmar Moelzer
10-14-2010, 12:26 PM
I have posted a little test object in the Unity Showcase. I am giving it away in return for some good C&C.
Maybe one of the Unity artists here could have a look?
Thanks!