
Thread: Usefulness

  1. #11
    Senior Member | Join Date: Aug 2004 | Location: California | Posts: 771

    Thank you, Gregg, and I appreciate your attention to this area of COLLADA development. We have several engineers at various companies around the world implementing exporters, importers, and viewers, but not nearly as many take an interest in the technical documents.

    I look forward to your contributions to the specification!

  2. #12
    Guest

    Re-importing Useful?

    Hey all,

    Just spent the morning going through all the documentation I could find on COLLADA. It's a nice design, quite similar in a couple of ways to our current format but with lots of design improvements. We've been discussing moving to an XML-based format ourselves, so it's very interesting to encounter plans for a universal format.

    Reading through Gregg's comments did spark some concerns about the actual usefulness of having DCC-package developers create the export and import tools.

    A large part of content creation is indeed defining what you export and how. "Baking" or "flattening" the complex scene structures and animation set-ups of Maya is simply a necessary optimization step.

    Just as Gregg described, our own Maya set-up is full of flags like "collapse this hierarchy to one mesh", "bake the animation of these bones", "convert the hierarchical animation of these objects to a single skinned mesh", and so on.

    These things are often quite specific to how the engine is constructed: our engine is much quicker at dealing with one very large skinned mesh than with 50 objects that each have their own transformation (most engines are; state changes are bad).
    Of course we can start defining these "baking" preferences, but I expect it will be a lot harder to agree on such workflow-related matters than to agree on the specification of a mesh description.
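
    To make that concrete, here is a minimal sketch of what such per-node export preferences might look like in code; the structure and field names are invented for this example, not taken from any spec or shipping exporter.

    Code :
    #include <cstdio>
    #include <string>
    #include <vector>

    // Hypothetical per-node export preferences, of the kind a studio attaches
    // to its Maya scene as custom attributes or export flags.
    struct ExportPrefs
    {
        bool collapseHierarchyToOneMesh;  // "collapse this hierarchy to one mesh"
        bool bakeBoneAnimation;           // "bake the animation of these bones"
        bool convertToSingleSkinnedMesh;  // "convert hierarchical animation to one skinned mesh"
        int  bakeSampleRate;              // samples per second used when baking
    };

    // A scene node as an exporter might see it; names are purely illustrative.
    struct SceneNode
    {
        std::string            name;
        ExportPrefs            prefs;
        std::vector<SceneNode> children;
    };

    int main()
    {
        SceneNode hero{ "hero_rig", { true, true, true, 30 }, {} };
        if (hero.prefs.collapseHierarchyToOneMesh)
            std::printf("%s: collapse hierarchy to one mesh on export\n", hero.name.c_str());
        return 0;
    }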

    In practical terms, I fear it will probably come down to taking one look at the source code of the DCC developer's exporter/importer and then rewriting it to suit the game developer's own personal quirks and habits. Each revision of the COLLADA spec would then require a possible revision of that code, or run the risk of drifting out of sync with the COLLADA standard and losing the described benefits.
    To make this process as easy as possible, there is a dependency not only on the COLLADA format but also on agreement about how the DCC developers set up their source code and libraries. Otherwise, maintaining a plug-in for Max and Maya will be a potential nightmare.

    We wrote an importer for our own data format once. We used two tools, an in-house CSG-based editor and Maya. At a certain point we migrated to Maya for all content and needed to import the editor-created data. The export format of a CSG editor is of course just a big polymesh; there is no CSG information left in it, so all of that was lost in Maya. And vice versa: importing a Maya polymesh into the editor is equally useless.
    The loss of construction history on import is even more crippling when dealing with animation, though. "Baked" bone animation bears no relation to the original data in Maya, which was most likely created with a very complex animation rig.
    In my experience, an importer is often limited to being a one-way content-migration tool where you have to accept the loss of construction history, a way of shifting around static content (like simple mesh structures), or a last-resort tool for resurrecting content whose original source file was corrupted or lost.

    Jan-Bart van Beek | Guerrilla Games | Killzone | Lead Artist

  3. #13
    Member | Join Date: Aug 2004 | Location: SCEJ - Tokyo, Japan | Posts: 34

    Re: Re-importing Useful?

    Quote Originally Posted by flake
    Just as Gregg described, our own Maya set-up is full of flags like "collapse this hierarchy to one mesh", "bake the animation of these bones", "convert the hierarchical animation of these objects to a single skinned mesh", and so on.

    These things are often quite specific to how the engine is constructed: our engine is much quicker at dealing with one very large skinned mesh than with 50 objects that each have their own transformation (most engines are; state changes are bad).
    Of course we can start defining these "baking" preferences, but I expect it will be a lot harder to agree on such workflow-related matters than to agree on the specification of a mesh description.
    Assuming COLLADA is a "middle" format, all I need to continue my current pipeline and still use COLLADA is for COLLADA to export 100% of the data in Maya. That way I can read through the notes, blind data, plug-in data, and custom attributes that I currently use, and use that data to decide how to collapse things into something for my game engine.

    So, in other words, we don't need to agree on all "baking" preferences as long as we can get consistent data from all packages. The typical pipeline would be:

    DCC TOOL (DCC-provided exporter) -> COLLADA File -> Custom Conversion Tool -> Game-Ready File

    In your custom conversion tool you would read options that you had either embedded in the original DCC scene or kept in a separate file, and then, for example, collapse multiple objects into one object.
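
    As an illustration of that conversion stage, here is a rough skeleton of such a custom tool. Every function here is a placeholder invented for the example, marking where a team's own code would go; none of it is part of COLLADA or any existing library.

    Code :
    #include <cstdio>
    #include <string>

    struct Scene {};    // placeholder for the imported COLLADA scene
    struct Options {};  // placeholder for per-team baking/collapse options

    static Scene   LoadColladaScene(const std::string& path)              { (void)path; return Scene{}; }
    static Options LoadOptions(const std::string& path)                   { (void)path; return Options{}; }
    static void    CollapseObjects(Scene& s, const Options& o)            { (void)s; (void)o; }
    static void    WriteGameFile(const Scene& s, const std::string& path) { (void)s; (void)path; }

    int main(int argc, char** argv)
    {
        if (argc < 4)
        {
            std::printf("usage: convert <scene.dae> <options.xml> <out.bin>\n");
            return 1;
        }

        Scene   scene   = LoadColladaScene(argv[1]);  // written by the DCC-provided exporter
        Options options = LoadOptions(argv[2]);       // embedded in the DCC scene or kept in a separate file
        CollapseObjects(scene, options);              // e.g. merge multiple objects into one mesh
        WriteGameFile(scene, argv[3]);                // engine-specific, game-ready output
        return 0;
    }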

    My concern with baking is mostly for animation. Specifically: (1) function curves do not cover many cases of animation, so if all I get is function curves my artists will be limited in what they can do with COLLADA; (2) every package computes things differently. Exporting baked animation data means I let the package do its voodoo math and I get the result, and that result is guaranteed to reproduce the same image in my pipeline. If instead COLLADA just lets each package store its fcurves and does not provide baked data, then my pipeline will have to check each file for the original DCC tool and attempt to do its calculations the same way as that specific tool, as in:

    Code :
    // Hypothetical per-tool animation evaluation that would be needed without
    // baked data; the tool constants are illustrative, since names like
    // "3dsmax5_0" are not valid identifiers.
    if (tool == Tool_3dsMax5_0)
    {
        CalcStuffLike3DSMax5_0();
    }
    else if (tool == Tool_3dsMax5_1)
    {
        CalcStuffLike3DSMax5_1();
    }
    else if (tool == Tool_Maya5_0)
    {
        CalcStuffLikeMaya5_0();
    }

    etc. Asking for baked data means I never have to worry about either of those cases.
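
    To show the contrast, consuming baked data reduces to uniform sampling and interpolation with no tool-specific math. A minimal sketch, assuming a flat array of samples at a fixed rate (the channel layout is an assumption for the example, not something COLLADA specifies):

    Code :
    #include <cmath>
    #include <cstddef>
    #include <cstdio>
    #include <vector>

    // Interpolate a baked channel (one float per sample, uniform sample rate).
    // No per-DCC-tool evaluation is needed; the samples already are the result.
    static float SampleBakedChannel(const std::vector<float>& samples,
                                    float sampleRate, float time)
    {
        float       pos = time * sampleRate;
        std::size_t i0  = static_cast<std::size_t>(std::floor(pos));
        if (i0 + 1 >= samples.size())
            return samples.back();
        float t = pos - static_cast<float>(i0);
        return samples[i0] * (1.0f - t) + samples[i0 + 1] * t;  // linear blend
    }

    int main()
    {
        std::vector<float> rotX = { 0.0f, 10.0f, 25.0f, 30.0f };  // baked at 30 samples/sec
        std::printf("rotX at t = 0.05s: %.2f\n", SampleBakedChannel(rotX, 30.0f, 0.05f));
        return 0;
    }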

    As for construction history and the like: COLLADA is pushing to be a source format as well as a common format, meaning that each DCC vendor is supposed to define its own custom profile to store all the data it currently stores in its native files, in addition to storing as much as possible in the common profile. That custom area means you could load the file back into that DCC tool and get exactly the same thing back (construction history and all).

    So far no DCC vendor has taken this goal seriously, but unfortunately it is REQUIRED for COLLADA to be useful. Why? Precisely because, as you said, every team uses different features. There is no way COLLADA's common bits will cover 100% of every DCC tool's data (where is the Paint Effects part, the metaballs part, the data from custom plug-ins, etc.?), but teams do use that data. COLLADA becomes useful by providing a way to get most data, the common part, from all packages, while still providing a way to get the unique data that each team needs. Only by being a source format will that unique data make it out.
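
    As a sketch of what reading that vendor-specific data could look like on the consumer side, here is a small example using the tinyxml2 library that walks <extra>/<technique> blocks and reports their profiles. The element layout follows the general COLLADA <extra> pattern; where such extras actually appear and what the profile names are depends entirely on the exporter, so treat those details as assumptions.

    Code :
    #include <cstdio>
    #include "tinyxml2.h"   // generic XML parser; any XML library would do

    int main(int argc, char** argv)
    {
        if (argc < 2) { std::printf("usage: extras <file.dae>\n"); return 1; }

        tinyxml2::XMLDocument doc;
        if (doc.LoadFile(argv[1]) != tinyxml2::XML_SUCCESS)
        {
            std::printf("failed to load %s\n", argv[1]);
            return 1;
        }

        tinyxml2::XMLElement* root = doc.FirstChildElement("COLLADA");
        if (!root) { std::printf("no <COLLADA> root element\n"); return 1; }

        // List every <technique> profile found in root-level <extra> blocks.
        // A team's converter would branch here: common profile vs. vendor profile.
        for (tinyxml2::XMLElement* extra = root->FirstChildElement("extra");
             extra != 0; extra = extra->NextSiblingElement("extra"))
        {
            for (tinyxml2::XMLElement* tech = extra->FirstChildElement("technique");
                 tech != 0; tech = tech->NextSiblingElement("technique"))
            {
                const char* profile = tech->Attribute("profile");
                std::printf("found <technique> with profile: %s\n",
                            profile ? profile : "(none)");
            }
        }
        return 0;
    }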

    Hopefully the DCC vendors will understand this and we'll see that reflected in their exporters.
