7/29/2012

Object Replacer Editor Script

In my last posts I talked about several asset workflow issues we encountered in our projects. Although Unity is quite mature when it comes to asset (import) handling, I showed some existing limitations and presented editor script examples to work around them. In this post I want to address a "special case" asset handling problem which in the past has been dealt with manually or with a quickly-put-together single-purpose script.

As it happens we have to deal with huge assets from time to time which consist of dozens of child objects. Consider, for example, a huge star destroyer carrying dozens or hundreds of elements attached to its hull: different laser cannons, antennas, cargo elements and so on. In our project these elements are individual prefabs (one per type) while the star destroyer asset lacks all of them on import. We handle the elements as individual prefabs to benefit from Unity's prefab instancing mechanism, which allows us to change object parameters of all instances centrally. It also follows the divide-and-conquer principle.
The star destroyer asset has gizmos (e.g. empty object nodes or primitives) attached where a certain element should be placed. This is done during the design phase so that the designer has full control over the placement. These gizmos need to be removed from the final star destroyer in Unity and replaced by a new instance of the actual element prefab.
After doing this manually or with quickly hacked scripts for some time, I felt it was time to create a general "Object Replacer" script to manage workflows like these. Hence I wrote an editor script which works as follows:
  1. The user opens the editor script window and selects a prefab (or scene instance) containing child objects which should be replaced.
  2. In a list below, object transitions can be defined which specify:
    • an object name, supporting wildcards to automatically handle typical pre-/postfixes added along the pipeline
    • the prefab which should replace the gizmo
    • the transformation which should be applied, namely the *source* (gizmo) transform, the *target* (prefab) transform or the combination of both
Additionally the user can preview all matching objects of the current selection in a list on the right side. Although this might be misleading at times, because the actual replacement list can differ (see "Known limitations"), it helps to quickly check whether the intended objects are covered.
Finally, following the workflow of the "Material Replacer" script, the user can save and load the transition list to/from an XML file to streamline the process for recurring tasks.
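The wildcard name matching described above can be sketched in a few lines. This is a hypothetical C++ illustration of the idea; the function name and the regex-based approach are my assumptions, the actual editor script may work differently:

```cpp
#include <regex>
#include <string>

// Converts a simple wildcard pattern (e.g. "*laser_cannon*") into a
// regular expression and tests whether a full object name matches it.
// Hypothetical helper, not the actual editor script code.
bool MatchesWildcard(const std::string& pattern, const std::string& name)
{
    const std::string meta = "\\.^$|()[]{}+?";
    std::string rx;
    for (std::string::size_type i = 0; i < pattern.size(); ++i)
    {
        const char c = pattern[i];
        if (c == '*')
            rx += ".*";        // '*' matches any character sequence
        else if (meta.find(c) != std::string::npos)
        {
            rx += '\\';        // escape regex metacharacters
            rx += c;
        }
        else
            rx += c;           // everything else matches literally
    }
    // regex_match requires the whole name to match, so patterns
    // without wildcards behave like exact name comparisons
    return std::regex_match(name, std::regex(rx));
}
```

This way a single transition entry like "*laser_cannon*" covers names such as "sm_laser_cannon_01" regardless of the pre-/postfixes added along the pipeline.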

I will upload the code to UnifyCommunity as usual as soon as it is online again (the link will be updated). I think this class will come in handy in cases like the one described above.

Edit: The code is now located at UnifyCommunity.

Known limitations
  • The script applies the transitions one after another from top to bottom. If an object's name matches several transitions, only the uppermost transition in the list is applied.
  • If a replacement attached for a matching object is itself matched by a later transition, it will be replaced again.
The solution for both is to use clearly distinguishable names for your source and target object(s).
  • If a matching object has children, they are removed implicitly and ignored for further processing.
The solution here is to keep the hierarchy simple, e.g. by replacing leaf nodes only.
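The top-to-bottom, first-match-wins behaviour from the first limitation can be illustrated with a minimal sketch (hypothetical types and names; exact string matching is used here instead of wildcards for brevity):

```cpp
#include <string>
#include <vector>

// One row of the transition list: which names to replace, and with what.
struct Transition
{
    std::string pattern;      // object name to match
    std::string replacement;  // prefab name to instantiate instead
};

// Walks the transition list top to bottom and returns the replacement
// of the first (uppermost) matching entry; later matches are ignored.
// An empty string means "no transition matched".
std::string FirstMatch(const std::vector<Transition>& transitions,
                       const std::string& name)
{
    for (const Transition& t : transitions)
        if (t.pattern == name)
            return t.replacement;  // first match wins
    return std::string();
}
```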

7/24/2012

Material Scripts Revisited

Since we use the material scripts (MaterialAnalyzer and MaterialReplacer) in our production pipeline, I've received a few notes regarding their current limitations.

As it happens, the MaterialAnalyzer was recently used with a prefab containing missing components. Although it can be argued that something is substantially wrong with such input, I decided to add a quick check to handle these components gracefully: missing components are simply excluded from the analysis. The changes have been added to the UnifyCommunity article.

A more interesting issue came up with the MaterialReplacer script. I've just added support for Allegorithmic Substances, which was missing in the initial version. It is now possible to replace a material with a substance by a name defined in the material transition file. This feature was not as straightforward to implement as the standard material approach since substances are structurally different. As the documentation describes, an individual substance file ("*.sbsar") does not represent an individual substance (in terms of Unity materials) but is actually an archive containing one or more ProceduralMaterials. You can think of it as a class-to-instance relationship: a substance defines a class describing the resources and algorithms used as well as the public parameters, while a procedural material acts as an instance of the substance with individual parameter settings which eventually define the look of the material in Unity.
To retrieve all procedural materials in the project's asset folder it is therefore not enough to just find all "*.sbsar" files, as the script does with material files; it is necessary to actually open all substances and retrieve the contained instances. This is not as efficient as it should be, but I have not found a better solution yet. Anyway, with around 40 substances in the project this took about 2 extra seconds on my laptop, and the parsing only happens when loading material transitions from file.
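The class-to-instance relationship and the extra collection step can be sketched as plain data structures (hypothetical types for illustration, not the actual Unity API):

```cpp
#include <string>
#include <vector>

// An instance of a substance with its own parameter settings.
struct ProceduralMaterial
{
    std::string name;
};

// One "*.sbsar" file: an archive (the "class") containing
// one or more procedural material instances.
struct SubstanceArchive
{
    std::string file;
    std::vector<ProceduralMaterial> instances;
};

// Gathering all procedural materials means opening every archive and
// collecting the contained instances, instead of treating each file
// as a single material.
std::vector<ProceduralMaterial> CollectMaterials(
    const std::vector<SubstanceArchive>& archives)
{
    std::vector<ProceduralMaterial> all;
    for (const SubstanceArchive& a : archives)
        all.insert(all.end(), a.instances.begin(), a.instances.end());
    return all;
}
```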
The changes have been applied to UnifyCommunity. Drop me a message if you encounter problems with the script or if something could be improved.

7/11/2012

Material Replacer

During application development with Unity we often receive intermediate or dummy 3d models which evolve over time with the rest of the application. This basically means that every 3d model usually ends up being imported several times until the final version is done.
Having looked at asset pipelines a lot recently (see previous posts), I found one general issue with this task.

Whenever a 3d model is updated in the Unity project, updating material references can become a hassle. If you are smart, the correct Unity materials have been defined once and for all with the first import of the model. But if the material setup changes in the source file, the material assignments have to be updated after import as well. Unity is flexible enough to search for materials with the same name on import before it decides to create a new one, but if the names are not identical Unity cannot do anything about it.

To avoid manually replacing materials one after another in arbitrary 3d models, I felt that a more general approach would help in many cases to avoid such work and streamline the import. To achieve that, it is necessary (and actually highly recommended anyway) to define a material name mapping which can be applied to a 3d model in Unity automatically. E.g. your designer creates a house which uses a grey_concrete and a red_roof material. He probably also creates the corresponding materials in Unity, which then use special shaders, bump maps etc. These materials probably have more general names since there might be a lot of other materials with similar properties in your project; e.g. the corresponding materials could be fine_concrete_bright and roof_clapboard_red.

To make the mapping from source to target materials as easy as possible (without changing names), I have created an editor script which receives a prefab as input and shows a list of all materials used. The list has an object field per material where the replacement material can be assigned. As soon as all material transitions have been defined, pressing "Apply" replaces the materials in the prefab itself. To automate this process it is possible to save the defined transitions to a simple XML file once, which can be loaded next time. With the transitions set up carefully, correct material replacement is only one click away. :-)
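At its core the transition table is a simple name mapping. A minimal sketch of the replacement step (hypothetical helper; the real script works on Unity material references and stores the table as XML):

```cpp
#include <map>
#include <string>
#include <vector>

// A material transition maps a source material name to a target name.
typedef std::map<std::string, std::string> TransitionMap;

// Replaces every material name that has a transition defined;
// names without a mapping are kept unchanged.
std::vector<std::string> ApplyTransitions(const TransitionMap& transitions,
                                          std::vector<std::string> materials)
{
    for (std::string& name : materials)
    {
        TransitionMap::const_iterator it = transitions.find(name);
        if (it != transitions.end())
            name = it->second;  // replace with the target material name
    }
    return materials;
}
```

With the house example above, the table would map grey_concrete to fine_concrete_bright and red_roof to roof_clapboard_red.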

Although merging individual transition files is not supported by the script, it can easily be done manually if necessary. This would e.g. allow storing "all" material transitions from the modeling world to the Unity world in one file.

To make this point clear: I think the smoothest strategy is to use the same material names in your modeling tools as in Unity. But where this is not possible (e.g. external designers, different naming policies), this script approach comes in handy.

I've placed the script at UnifyCommunity with detailed instructions on how to get started. I hope it saves others some time to focus on making better applications ;-)

PS: I'm wondering about adding an XML file location to the Model Import Preset script to automatically replace materials on import. The only issue is that it would involve the automatic creation of a prefab, which might be unwanted in some cases...


EDIT: On request I have added the feature to select scene objects (instances of prefabs or not) as input to perform the replacement only on these objects "locally". The script at UnifyCommunity has been updated.

7/03/2012

Model Import Presets

Working with Unity3d is typically a pleasure, but even after years of development some design quirks remain (besides some technical bugs, but that is another story). One of them is the way Unity imports 3d models (typically in ".fbx" format). After the file is added to the project, Unity imports the model using default import settings. Only after the object has been imported is it possible to alter the import settings, e.g. by changing the import scale, mesh optimization etc.

The problem is that you first have to import the whole model before applying the preferred import settings. One typical issue, for example, is Unity's strategy to choose the import scale based on the modelling tool used. If this doesn't match the developer's/designer's expectations, the import settings have to be adjusted and applied.
Importing huge models (e.g. urban environments) can take a few minutes in Unity. Adjusting and applying custom import settings can then take another couple of minutes.

My solution for this problem is to make use of Unity's ability to intercept model imports with a custom AssetPostprocessor script. Such a script has direct access to the import parameters and can alter them before import. Unfortunately this script itself cannot be configured, so I implemented an editor script where the developer/designer can adjust the model import settings. These settings are stored via Unity's EditorPrefs interface, which keeps them in the system registry. The AssetPostprocessor script reads the settings from there and applies them on import.

To make this process more flexible, I finally added preset support which allows defining different configurations for different types of assets. I'm pretty happy with the result so far, although not exceptionally happy with the visual appearance of the UI. One caveat: since the presets are stored in the local registry, this data is not shared in a team working on the same project. An improvement could therefore be to put the preset data into a file instead, which could be synchronized with repositories and shipped with a project.
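The preset storage boils down to prefixed key/value pairs. A small sketch of the idea, using a plain map as a stand-in for the registry (the key layout and names are my assumptions, not the actual script's scheme):

```cpp
#include <map>
#include <string>

// Stand-in for the key/value store that EditorPrefs provides.
typedef std::map<std::string, std::string> SettingsStore;

// Builds a unique key per preset and setting, so several presets
// can coexist in one flat key/value store.
std::string PresetKey(const std::string& preset, const std::string& setting)
{
    return "ModelImportPresets/" + preset + "/" + setting;
}

// Writes one preset setting; the import hook later reads it back
// using the same key scheme before the model is imported.
void SaveSetting(SettingsStore& store, const std::string& preset,
                 const std::string& setting, const std::string& value)
{
    store[PresetKey(preset, setting)] = value;
}
```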

I've uploaded the final code to Unify Community for all. I hope it helps to streamline the import process for others as well.

Material Analyzer

Working with Unity3D, I'm confronted from time to time with 3D models (usually in ".fbx" format) which haven't been optimized for real-time simulation environments. Although the triangle count is not that high, I often find a substantial number of materials applied to these models (typically resulting in a spike of draw calls whenever the object is visible).

Recently I came across a car model which originated from a Sketchup pipeline. This single car had a few thousand triangles, which was fine for the setup, but performance dropped significantly after adding it to a test environment. Since it consisted of about twenty different game objects (some of them just empty nodes), it was hard to get an overview of how many materials were used by the model, and which ones.

I finally decided to write an editor script to get a better understanding. It now does the following:
  • it checks every selected game object and its children for MeshRenderer and ParticleRenderer components
  • it lists all materials used by these components alphabetically within the editor window, together with their shaders
  • clicking on one or more list items selects the game objects using the material(s)
  • clicking on a small button beside each item selects the material asset to view it in the inspector
  • the "Dump Hierarchy" button dumps the hierarchy of the analyzed selection to a text file, including the materials used by each game object
  • the "Dump List" button dumps the current list of materials to a text file with the list of game objects using each material
So far this script does everything I need to analyze the materials used by objects. I've implemented the dump options especially for logging, e.g. if other people in a team need to become aware of certain configuration issues.
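The material collection step behind the list can be sketched as a recursive traversal (hypothetical stand-in types; the actual script queries Unity's MeshRenderer/ParticleRenderer components):

```cpp
#include <map>
#include <set>
#include <string>
#include <vector>

// Minimal stand-in for a game object with materials and children.
struct Node
{
    std::string name;
    std::vector<std::string> materials;
    std::vector<Node> children;
};

// Recursively records which game objects use which material. Keying
// the map by material name yields the alphabetical listing for free,
// and the per-material object sets drive the "Dump List" output.
void Collect(const Node& node,
             std::map<std::string, std::set<std::string>>& usage)
{
    for (const std::string& m : node.materials)
        usage[m].insert(node.name);
    for (const Node& child : node.children)
        Collect(child, usage);
}
```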

I've uploaded the "MaterialAnalyzer" editor script to Unify Community to help other developers struggling with similar issues. I hope it helps. :-)

Mirko

6/30/2012

Runtime Dependency Management

Some time ago I developed a core framework with a friend of mine which provides a plugin system, where every plugin provides a set of extensions and interfaces for available extension points. The whole system follows the Eclipse plugin system, which gives plugin writers a lot of flexibility. Because plugins may depend on other plugins to work correctly, the resulting plugin structure can become a complex tree (actually a DAG) at runtime. Because it is possible to register and unregister plugins at runtime, a lot of effort went into managing the availability of a plugin depending on its dependencies. I decided to use shared pointers to manage the lifetime of all kinds of resources, including the plugin and extension instances themselves. This decision led to two major design issues which I'd like to share with you, together with my solutions:

1. Keeping shared pointers to direct runtime dependencies

If object A depends (directly) on objects B and C, it seems straightforward that A holds shared pointers to B and C during its lifetime to ensure that these dependencies exist. It doesn't matter whether these pointers are used by A to access its dependencies or not. In the latter case it is even possible to store all dependency shared pointers in an array of their base class type; if they don't share a common base class, wrapping them into boost::any objects is a safe alternative. Since they are never accessed, it is only important that they are stored somehow. Now, as long as A exists, all dependencies exist as well because there is at least one handle left pointing at them. When A is destroyed, all dependency shared pointers are released, which might trigger the actual destruction of a dependency if no further handle is left. A nice and clean general solution, eh?

No, not if one of the dependencies is an instance managing a dynamically linked library which actually implements A. In my case A is a plugin instance which holds a dependency instance managing the load/unload status of the plugin DLL. Of course it is important that the DLL stays loaded during the lifetime of A, since the whole implementation of A is located there. The problem with this design occurs during the destruction of A, if and only if A holds the last pointer to its DLL. During the destruction of A the last pointer to the DLL is destroyed, and this corrupts the call stack even if this instruction is (implicitly) the last one in the destruction process of A. As soon as the next call is executed, a strange error occurs, e.g. telling you that some virtual functions could not be called (on Windows). What a pity! It could have been so elegant... Wait, there is a solution: the trick is to defer the destruction of all dependencies until after the destruction of A.
OK, that's obvious, but how can we apply it without introducing another manager tracking the destruction of A and its dependencies from outside? Answer: using a custom deleter. I use the boost::shared_ptr implementation, which offers an interface to define a custom deleter which then takes care of deleting the dependencies after the destruction of A. The following snippet shows what such a class looks like:


// Type definition for a list of boost::any instances.
typedef std::vector<boost::any> DependencyList;

// Deletes the managed object first, then releases all dependencies.
template <typename T>
struct DependencyDeleter
{
    // Constructor is initialized with the list of dependencies
    DependencyDeleter(DependencyList dependencies) : mDependencies(dependencies)
    {}

    // Is called on destruction of the last shared pointer
    void operator()(T* ptr)
    {
        // delete our pointer first
        delete ptr;

        // release the dependencies afterwards
        while (!mDependencies.empty())
            mDependencies.pop_back();
    }

    // holds all dependencies
    DependencyList mDependencies;
};


The usage on creation of the object looks like this:

// create the dependency list
DependencyList dependencyList;

// (...gather all required dependencies and add them to the list...)

// create the plugin instance 
Plugin* plugin = createPlugin();

// create a shared pointer managing the lifetime of the plugin and its dependencies
// (PluginPtr is a typedef for boost::shared_ptr<Plugin>)
PluginPtr pluginPtr(plugin, DependencyDeleter<Plugin>(dependencyList));
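The same pattern can be reproduced with std::shared_ptr. Here is a self-contained sketch which records the destruction order, showing that the managed object dies before its dependencies are released (the Plugin/Library types are illustrative stand-ins):

```cpp
#include <memory>
#include <string>
#include <vector>

// Records destruction order so the deleter behaviour can be observed.
static std::vector<std::string> gDestroyed;

struct Library { ~Library() { gDestroyed.push_back("library"); } };
struct Plugin  { ~Plugin()  { gDestroyed.push_back("plugin"); } };

// Custom deleter: destroys the plugin first, then releases the
// dependency handles it kept alive.
struct DependencyDeleter
{
    std::vector<std::shared_ptr<void>> mDependencies;

    void operator()(Plugin* ptr)
    {
        delete ptr;            // destroy the plugin first...
        mDependencies.clear(); // ...then release its dependencies
    }
};

// Creates a plugin whose lifetime keeps the library alive, and whose
// destruction releases the library only afterwards.
std::shared_ptr<Plugin> MakePlugin(std::shared_ptr<Library> lib)
{
    DependencyDeleter deleter;
    deleter.mDependencies.push_back(lib);
    return std::shared_ptr<Plugin>(new Plugin(), deleter);
}
```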


2. Distributing a shared pointer as "self" pointer

It happens that a shared object is a factory (or manager) which is able to create instances. A problem with the shared pointer concept arises if these instances should also know their creator via a shared pointer, e.g. provided as a parameter on construction. Why? Because that implies that the factory knows its own shared pointer in order to provide it as a parameter. Obviously it is not clever for the factory to create a new shared pointer of itself (this would mean having several shared pointers which want to manage the same object's lifetime independently, which is very bad!). The solution is to pass the shared pointer to the object right after it has been constructed (see snippet above), like this:

plugin->_setSelfHandle(pluginPtr);

It would also be possible for the constructor to receive an empty shared pointer reference which is initialised with "self" in the constructor and stored within the object, but that is a matter of taste... This concept only makes sense if the shared pointer implementation supports weak pointers: it is mandatory that the pointer stored in the object itself is a weak pointer, which does not count as a handle and therefore does not prevent destruction as soon as the last external pointer has been released. Otherwise the object won't be destroyed and you have created a nice memory leak. Eventually the whole concept helps a lot in managing resources and their dependencies elegantly, once the described issues are solved...
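A minimal sketch of the "self" handle pattern with std::weak_ptr (hypothetical class names; the original uses boost, but the mechanics are identical):

```cpp
#include <memory>

class Factory;
typedef std::shared_ptr<Factory> FactoryPtr;

// An instance that knows its creator via a weak pointer, so the
// handle does not keep the factory alive.
class Product
{
public:
    explicit Product(std::weak_ptr<Factory> creator) : mCreator(creator) {}
    bool creatorAlive() const { return !mCreator.expired(); }
private:
    std::weak_ptr<Factory> mCreator;  // does not count as an owning handle
};

class Factory
{
public:
    // called right after the owning shared pointer has been created,
    // mirroring plugin->_setSelfHandle(pluginPtr) above
    void _setSelfHandle(const FactoryPtr& self) { mSelf = self; }

    std::shared_ptr<Product> createProduct()
    {
        return std::make_shared<Product>(mSelf);  // hand out the weak self pointer
    }
private:
    std::weak_ptr<Factory> mSelf;  // weak, otherwise the factory leaks
};
```

Because the stored handle is weak, the factory is destroyed as soon as the last external shared pointer is released, and its products can detect that via expired().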

Greetings

Greetings,

So this is eventually my attempt to get into the Blogger world. Let's see how this turns out. I plan to keep this blog focused on programming, writing articles about different topics in the areas of software architecture, computer graphics and, where it fits, detailed programming. Based on my current activities, I will focus on real-time 3D simulation.

I have a lot of interest in and a few years of experience in these areas. I was an Ogre child for more than two years, until new projects brought me to the Unity3D game engine, where I am now.