Pyblish Magenta

From the pull request.

I’ve rushed into this chain of plug-ins, assuming a few too many things were obvious, but they are not.

There are a few things happening here that make this whole process work smoothly, which I will now try to illustrate.

TL;DR: it’s about collecting via constructionHistory=True and extracting via constructionHistory=False. Together they provide symmetry and consistency.

Problem

Let’s first nail down the problem.

import pyblish.api
from maya import cmds

cmds.polyCube(name="myCube")
cmds.polySphere(name="mySphere")

cmds.group("myCube", name="cube_GRP")

# myCube now *depends* on mySphere
cmds.connectAttr("mySphere.outMesh", "myCube.inMesh", force=True)

instance = pyblish.api.Instance("MyInstance")
instance.add("cube_GRP")

def validate(nodes):
    """Only meshes and transforms are valid"""
    for node in nodes:
        if cmds.nodeType(node) not in ("transform", "mesh"):
            raise Exception()

try:
    validate(instance)
except Exception:
    raise Exception("Validation failed before extraction!")

# Now we can safely export!
cmds.select(instance)
preview = cmds.file(exportSelected=True,
                    preview=True,
                    force=True,
                    constructionHistory=True)

# Make sure exported nodes are still valid
try:
    validate(preview)
except Exception:
    raise Exception("Validation failed after extraction!")
finally:
    cmds.file(new=True, force=True)

And here’s the result.

# Validation failed after extraction!

Take a moment to consider what is happening here. Nodes are added ad-hoc to an instance based on what we think will be included during extraction, but in reality they are not.

We would of course like to catch the problem before extraction.

Solution

Ensure relevant nodes are really collected, and don’t allow Maya to collect anything else during export.

import pyblish.api
from maya import cmds

cmds.polyCube(name="myCube")
cmds.polySphere(name="mySphere")

cmds.group("myCube", name="cube_GRP")

# myCube now *depends* on mySphere
cmds.connectAttr("mySphere.outMesh", "myCube.inMesh", force=True)

# Collect exactly what Maya would export for this selection
cmds.select("cube_GRP")
nodes = cmds.file(exportSelected=True,
                  preview=True,
                  force=True,
                  constructionHistory=True)

instance = pyblish.api.Instance("MyInstance")
instance[:] = nodes

def validate(nodes):
    """Only meshes and transforms are valid"""
    for node in nodes:
        if cmds.nodeType(node) not in ("transform", "mesh"):
            raise Exception()

try:
    validate(instance)
except Exception:
    raise Exception("Validation failed before extraction!")

# Now we can safely export!
cmds.select(instance)
preview = cmds.file(exportSelected=True,
                    preview=True,
                    force=True,
                    constructionHistory=False)

# Make sure exported nodes are still valid
try:
    validate(preview)
except Exception:
    raise Exception("Validation failed after extraction!")
finally:
    cmds.file(new=True, force=True)

And here’s the new result.

# Validation failed before extraction!

Ah, I actually missed that you were collecting the rig with constructionHistory=True!

In this case I think it would be great to change collect_model so that it also collects without construction history. What do you think? We could even skip validating for construction history, since the output will never have it!
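A minimal sketch of the kind of validator that would then become redundant; the class and check here are hypothetical, not part of Magenta:

import pyblish.api


@pyblish.api.log
class ValidateNoConstructionHistory(pyblish.api.Validator):
    """Hypothetical: fail when a node carries construction history.

    If collection already previews with constructionHistory=False,
    this can never fail and could be skipped entirely.
    """

    families = ["model"]
    hosts = ["maya"]

    def process(self, instance):
        from maya import cmds
        for node in instance:
            # pruneDagObjects leaves only the non-DAG history nodes
            history = cmds.listHistory(node, pruneDagObjects=True) or []
            assert not history, "%s has construction history" % node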

The collector in your current pull request has another issue: because it filters to shapes, it doesn’t include all nodes in the instance that contribute to the output. For example, the output of the model still includes the hierarchy of nodes even though the instance doesn’t include those nodes.

This would already be a much cleaner way:

import os
import pyblish.api
import pyblish_maya


@pyblish.api.log
class CollectModel(pyblish.api.Collector):
    """Inject all models from the scene into the context"""

    hosts = ["maya"]

    def process(self, context):
        from maya import cmds

        if os.environ["TASK"] != "modeling":
            return self.log.info("No model found")

        name = os.environ["ITEM"]

        # Get the root transform
        self.log.info("Model found: %s" % name)
        assembly = "|{name}_GRP".format(name=name)

        assert cmds.objExists(assembly), (
            "Model did not have an appropriate assembly: %s" % assembly)

        self.log.info("Capturing instance contents: %s" % assembly)

        shapes_in_assembly = cmds.ls(assembly,
                                     noIntermediate=True,
                                     exactType=("mesh", "nurbsCurve", "nurbsSurface"),
                                     long=True,
                                     dag=True,
                                     leaf=True)

        assert shapes_in_assembly, "Assembly did not have any shapes"

        with pyblish_maya.maintained_selection():
            cmds.select(shapes_in_assembly)
            nodes = cmds.file(exportSelected=True,
                              preview=True,
                              constructionHistory=False,
                              force=True)

        instance = context.create_instance(name=name, family="model")
        instance[:] = nodes

        self.log.info("Successfully collected %s" % name)

Note that this still has the issue that it will collect shaders, materials, groupId, displayLayers, expressions, lightLinkers into the instance. All elements unrelated to the Model data and what we want to extract.

Here’s one that excludes most of that:

import os
import pyblish.api
import pyblish_maya


@pyblish.api.log
class CollectModel(pyblish.api.Collector):
    """Inject all models from the scene into the context"""

    hosts = ["maya"]

    def process(self, context):
        from maya import cmds

        if os.environ["TASK"] != "modeling":
            return self.log.info("No model found")

        name = os.environ["ITEM"]

        # Get the root transform
        self.log.info("Model found: %s" % name)
        assembly = "|{name}_GRP".format(name=name)

        assert cmds.objExists(assembly), (
            "Model did not have an appropriate assembly: %s" % assembly)

        self.log.info("Capturing instance contents: %s" % assembly)

        shapes_in_assembly = cmds.ls(assembly,
                                     noIntermediate=True,
                                     exactType=("mesh", "nurbsCurve", "nurbsSurface"),
                                     long=True,
                                     dag=True,
                                     leaf=True)

        assert shapes_in_assembly, "Assembly did not have any shapes"

        with pyblish_maya.maintained_selection():
            cmds.select(shapes_in_assembly)
            nodes = cmds.file(exportSelected=True,
                              preview=True,
                              constructionHistory=False,
                              constraints=False,
                              expressions=False,
                              channels=False,
                              shader=False,
                              force=True)

        instance = context.create_instance(name=name, family="model")
        instance[:] = nodes

        self.log.info("Successfully collected %s" % name)

Though this still includes the displayLayer… unfortunately.

Actually… watch this:

nodes = cmds.ls(type='mesh', noIntermediate=True)

with pyblish_maya.maintained_selection():
    cmds.select(nodes)
    nodes = cmds.file(exportSelected=True,
                      preview=True,
                      constructionHistory=False,
                      constraints=False,
                      expressions=False,
                      channels=False,
                      shader=False,
                      force=True)
                      
print(nodes)
# [u'myTesting_GEOShape', u'myTesting_GEO', u'test_GRP', u'oh_no']

with pyblish_maya.maintained_selection():
    cmds.select(nodes)
    nodes = cmds.file(exportSelected=True,
                      preview=True,
                      constructionHistory=False,
                      constraints=False,
                      expressions=False,
                      channels=False,
                      shader=False,
                      force=True)
print(nodes)
# [u'myTesting_GEOShape', u'myTesting_GEO', u'test_GRP', u'polySurfaceShape1', u'oh_my', u'so', u'nasty_', u'oh_no']

Doesn’t help much either.

This would actually be much more successful with the wrapper we had around the exporter. :slight_smile: I quickly added a preview argument for it in my latest commit, so it behaves much like cmds.file.

A test-run:

import pyblish_magenta.utils.maya.exporter as exporter

nodes = cmds.ls(type='mesh', noIntermediate=True)

nodes = exporter.MayaExporter.export(nodes=nodes,
                                     constructionHistory=False,
                                     expressions=False,
                                     channels=False,
                                     constraints=False,
                                     displayLayers=False,
                                     objectSets=False,
                                     shader=False,
                                     includeChildren=False,
                                     preview=True)
print(nodes)
# [u'myTesting_GEOShape', u'myTesting_GEO', u'test_GRP', u'polySurfaceShape1']

nodes = exporter.MayaExporter.export(nodes=nodes,
                                     constructionHistory=False,
                                     expressions=False,
                                     channels=False,
                                     constraints=False,
                                     displayLayers=False,
                                     objectSets=False,
                                     shader=False,
                                     includeChildren=False,
                                     preview=True)
print(nodes)
# [u'myTesting_GEOShape', u'myTesting_GEO', u'test_GRP', u'polySurfaceShape1']

What do you think?

We might even give it an argument like neverExpand, soloNodes or whatever you would call it, to make it never export anything outside the given set of nodes, basically overriding the other arguments of the function! This could make the final extractor of the scene:

exporter.MayaExporter.export(path='C:/test.ma',
                             nodes=nodes,
                             neverExpand=True)

It’s an exporter that gives us much more control over the extraction, and now with the same feedback as the regular cmds.file command.

Just checking: would you still be able to disable versioning up on a new publish? Or have control over which version is your output?

Under tight deadlines we publish models and rigs like crazy; actually, any department does that from time to time. Rigs especially tend to get published with features going from rough to draft to working in a matter of hours. Sometimes minutes. :wink:

For example, today a character needed a specific hair rig implemented and the animator needed to have a go with it asap. So the first push was the working controls without good weight-painting, basically so he could start timing simultaneously. The next version had better weight-painting. Then the next had the specialized rig connected to the variations of hair styles, so the rig neatly disables its visibility when another hairstyle is chosen. In a matter of an hour or so it underwent around 4-5 publishes. To me, as the rigging artist for this particular quick rig, it was clear that the draft publishes were just test versions of the final output and wouldn’t mess anything up if an animated rig were automatically updated with those updated weights.

I’ve seen similar publish speeds with animations, where data sometimes grows into caches of gigabytes. It’s inconvenient to have it grow that fast if we end up with 25 versions for one shot in a week or so, especially if all shots should be on the latest version anyway.

Not sure if related, but here are also some publish statistics from @mkolar:

Just to give you some more numbers. We are averaging at around 8-10
publishes a day, in crunch we can easily hit 15-20 (simple comps based
on templates, animation combined from library of movements etc.)

Note also how @mkolar mentions having the published version depend on the work file version, so the artist remains in control of whether what he publishes is supposed to be a new version. This would be similar to treating a new version as a release and intermittent updates as commits or bugfixes. You could even go back to an older version and implement a minor bug-fix!

What do you think?

I think that sounds good, but can’t be entirely sure. Let’s try it.

You’re right, it would need one additional pass through cmds.file(preview=True) which would re-add the relevant hierarchy.
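A minimal sketch of that extra pass, reusing the names from the collector above (assumed, not the final implementation):

with pyblish_maya.maintained_selection():
    cmds.select(shapes_in_assembly)
    # Preview with the same flags as the real export; Maya re-adds the
    # hierarchy and whatever else it would actually write to disk.
    nodes = cmds.file(exportSelected=True,
                      preview=True,
                      constructionHistory=False,
                      force=True)

instance[:] = nodes  # the instance now mirrors the true output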

About this…

shapes_in_assembly = cmds.ls(assembly,
                             noIntermediate=True,
                             exactType=("mesh", "nurbsCurve", "nurbsSurface"),
                             long=True,
                             dag=True,
                             leaf=True)

We could do that, but again, this can only consider the immediate hierarchy, whereas cmds.file will consider connections and will reflect the true output from Maya. This is the important bit.

I took a longer look at what your Exporter class actually does and I think you’re right. But I would suggest we refactor it, because (1) it doesn’t need to be a class (which suggests it’s an object of some kind, when really it’s just a function), and (2) it does things other than export, which isn’t apparent from just calling export().

When I first looked at it, aside from the astounding number of PEP8 warnings…

…it struck me as an over-engineered version of simply calling cmds.file, which turns out not to be the case. I think we could make the fact that it disconnects things more explicit, possibly something along these lines.

with disconnect(include="constraints displayLayers"):
    # Disconnect everything, except constraints and display layers
    cmds.file(exportSelected=True)

with disconnect(exclude="shader channels"):
    # Disconnect only shaders and channels
    cmds.file(exportSelected=True)

It makes it more obvious what is happening, and doesn’t pull the rug out from under maintainers, such as myself looking at your code, by still using cmds.file explicitly.
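For what it’s worth, a rough sketch of how such a context manager could work; the nodes and types parameters here are placeholders, not the include/exclude interface suggested above:

import contextlib

from maya import cmds


@contextlib.contextmanager
def disconnect(nodes, types=("displayLayer",)):
    """Temporarily break incoming connections from nodes of given types."""
    broken = []
    try:
        for node in nodes:
            # With connections=True the result alternates between the
            # plug on `node` and the plug it is connected to.
            pairs = cmds.listConnections(node,
                                         source=True,
                                         destination=False,
                                         connections=True,
                                         plugs=True) or []
            for dst, src in zip(pairs[::2], pairs[1::2]):
                if cmds.nodeType(src) in types:
                    cmds.disconnectAttr(src, dst)
                    broken.append((src, dst))
        yield
    finally:
        # Restore whatever we broke, even when the export errors out
        for src, dst in broken:
            cmds.connectAttr(src, dst, force=True)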

As for the fact that it disconnects at all: that’s very interesting and opens up a few doors. How much have you worked with this previously? Have you ever encountered a situation where something could be disconnected, but not reconnected?

I’m thinking a similar strategy could be used to bake keys.
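For example, a hypothetical sketch along those lines: bake into an undo chunk, export, then undo the bake afterwards.

import contextlib

from maya import cmds


@contextlib.contextmanager
def baked_keys(nodes, start, end):
    """Hypothetical: bake animation for the duration of an export,
    then undo the bake once done."""
    cmds.undoInfo(openChunk=True)
    try:
        cmds.bakeResults(nodes, time=(start, end), simulation=True)
        yield
    finally:
        cmds.undoInfo(closeChunk=True)
        cmds.undo()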

I see how you mean, but this implies a “push” versioning system, as opposed to what we do now, which is “pull”.

Either is fine; neither escapes the fact that versioning is hard. I’ll leave it to you to decide which route to take, and now is a good time to make this decision.

Let’s not go with Milan’s workflow. Not that there’s anything wrong with it, but because it is ad-hoc and unfamiliar. Not something I think we should push onto anyone via Magenta.

Just a tiny note on this: we actually discourage doing this a lot. It is technically possible and very rarely happens, but the artist has to jump through hoops on purpose (clearing the versions from the database before publishing again, for instance, with validators telling him it’s probably not the best idea) to make sure he realises what he is doing.


Thanks for hopping in and contributing! I assumed as much. I guess the artist really is only in control of when to increment versions then?

Not really. It’s more a mixture of both, and you could argue it has the downsides of both. :wink: It’s just that 8 out of 10 publishes we do are supposed to be a mere replacement of whatever content is already out there, rather than a new revision that becomes backwards incompatible or where we want to keep the previous asset for some shots. Maybe it should automatically increment unless the artist knows what he’s doing and disables it during publishing?

For simplicity let’s give it a go with always incrementing (like it’s implemented now) and see where we end up?

This isn’t a problem of publishing; it’s a problem of staying up to date with the latest version.

You could, for example, automatically up the version on scene-open, which would give you many of the same pros and cons as a pull-based system. But then the responsibility falls on staying up to date, not on producing versions to pull.

I think we might benefit from sticking to one concrete approach to publishing, and handling version management separately. They are both very important and difficult problems to solve.

Auto-updating should be avoided. It’s critical that there’s a choice about whether to update, while ensuring the artist is aware when assets in his scene are out of date and require updating. So yes, we’ll have to make it simple for artists in Magenta to become aware and update whenever they choose to do so.

We could rewrite it towards something like that, so it becomes solely a stack of context managers. Note that sometimes the order is important, plus there’s the extension of the nodes list passed to the cmds.file command. For example, we always include the parent hierarchy even if only shape names are provided. Having it as a complete wrapper does give us more control.

For example we also ensure that an output directory exists if preview=False and createFolder=True.
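Both of those could look something like this; a sketch where prepare_export and its arguments mirror the wrapper’s assumed signature:

import os

from maya import cmds


def prepare_export(shapes, path, preview=False, createFolder=True):
    """Sketch of the assumed wrapper behaviour described above."""
    # Always include the parent transforms, even when only shape
    # names are given, so cmds.file receives the full hierarchy.
    transforms = cmds.listRelatives(shapes, parent=True, fullPath=True) or []
    nodes = list(set(shapes) | set(transforms))

    # Ensure the output directory exists for a real (non-preview) export.
    if not preview and createFolder:
        directory = os.path.dirname(path)
        if not os.path.isdir(directory):
            os.makedirs(directory)

    return nodes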

Sorry about that. This must be the messiest code I’ve ever written. I took some context managers from my own library of tools, wrote some new ones, and basically left it as it was once it worked during some quick testing in Magenta. I’ve cleaned it up now.

I think the most visually annoying thing in the code now is still the # TODO and other comments, but I think they really help in understanding the choices made once you start having to change or update the code.

But have a look at my latest commits for the exporter.


Actually, this is an important step towards becoming explicit about what we want to export. In a perfect world our export wrapper only exports the nodes that are explicitly in the node list and never expands it further, so re-running a preview on the nodes list will never grow it. This would mean we can specify exactly which contents end up in the exported file. And I think we’re already a lot further than we’d get with maya.cmds.file alone.
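A quick sanity check for that property would be an idempotency test, assuming the exporter interface from the test-run above: previewing the result of a preview should be a fixed point.

from maya import cmds

import pyblish_magenta.utils.maya.exporter as exporter

nodes = cmds.ls(type="mesh", noIntermediate=True)

# If the wrapper never expands beyond the given nodes, a second
# preview over the first result must return the same set.
first = exporter.MayaExporter.export(nodes=nodes, preview=True)
second = exporter.MayaExporter.export(nodes=first, preview=True)
assert set(first) == set(second), "Export would pull in extra nodes"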

I use some of these context managers on a daily basis (and some were implemented based on the needs of Magenta) and I’ve always been happy with the experience. Things become tricky when referenced files get involved, because they can’t be unparented or allowed to break connections, so we would have to capture that as a clear error. Currently it will just flat-out spam a Maya error, which isn’t always the clearest among stack traces. :slight_smile:
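Capturing that as a clear error could be as simple as a guard like this (a sketch; the function name and message are made up):

from maya import cmds


def assert_no_references(nodes):
    """Fail early with a clear message instead of Maya's own error."""
    referenced = [node for node in nodes
                  if cmds.referenceQuery(node, isNodeReferenced=True)]
    if referenced:
        raise RuntimeError("Cannot disconnect or unparent referenced "
                           "nodes: %s" % ", ".join(referenced))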

This is the thing. In a push-system, auto-updating is implied.

It’s not uncommon, and it does work. It just shuffles the problem around a bit and needs solving in a different manner. I wouldn’t look the other way on it; there are as many benefits to it as there are disadvantages, and some people swear by it, Disney for example.

Ah, that’s a good point.

Spontaneously I was thinking of utilising the undo feature here: importing a reference, making the disconnects, and then undoing once finished. I noticed you were doing some of that as well, but I’m not sure to what extent.

Importing a reference in Maya can’t be undone. :slight_smile:

Ah, well then. :slight_smile:

But in our case our Validators are already catching references for model. And I assume the same is a nice fit for a rig?

The exporter should work fine with references. It’s just more complex when you’re trying to break up the hierarchy of a reference (e.g. including only some of the reference’s nodes instead of all), since in the ‘context’ of a reference it’s not possible to break it up like that.

It also depends on how you want the output. If the command is used to export with preserveReferences=True, then altering the hierarchy is a no-go either way, since it wouldn’t be valid!

Yeah, that sounds about right. Anything referenced should already have passed through validation anyway and should probably be assumed to be valid.

I would argue that there is never a need to preserve references for published assets except for space concerns, which is an optimisation. Do you, or when do you, usually prefer to preserve them?

We use it for building sets/collections, which are collections of props. For example, when we build a bookshelf filled with books and the books are individually important assets, we load in those props as references.

We do something like:

# props
prop/bookShelve
prop/book01
prop/book02
prop/book03

# collections
collection/bookShelveFilled01  # references bookShelve, book01, book02, book03
collection/bookShelveFilled02  # references bookShelve, book01, book02, book03

Similarly, we build a complete set (like a film set) if it’s to be re-used throughout the film during multiple sequences.
For example, we have the environment, which is the bare bones of an interior or exterior without any props in it.

Something like:

# props
prop/newspaper
prop/tree01
prop/tree02
prop/tree03

# environment
environment/gasStation

# sets (like a film-set, they are the filled environments)
set/gasStation_normal
set/gasStation_afterApocalypse

Naming conventions used here are a bit random, but they show the concept. Basically these collections are supposed to be pure collections of references and shouldn’t introduce new nodes to the scene.
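Extracting such a collection then boils down to a preserveReferences export, roughly like this (the node and file names are made up):

from maya import cmds

cmds.select("bookShelveFilled01_GRP")  # hypothetical assembly
cmds.file("C:/publish/bookShelveFilled01.ma",
          exportSelected=True,
          type="mayaAscii",
          preserveReferences=True,
          force=True)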

That still looks like an optimisation to me. Couldn’t you just as easily not preserve references, and get the exact same behaviour?

For us it’s perfect because of the non-destructive behavior it exhibits. Whenever a prop gets updated, it propagates through the collection as well. This also ensures the filled bookShelve and sets don’t have to go through lookdev, since they preserve their references to the original assets.

It’s the first step towards having something only add data relevant to the publish. It doesn’t care which assets are contained, only that it collects them together at a certain position in the scene, in its own hierarchy (just to keep it tidy).

Once it becomes its own asset without those references, it would need its own published lookdev and objectIds (not currently used in Magenta, but we likely will use them once we get to publishing lookdev).

I suspected you’d say that. :smile:

I thought we just agreed that automatic updates were a bad thing? This is “push” behaviour at its finest. We can go this route, but I’d suggest we stick to one or the other.