Workflow for handling paths within a scene

But what about non-textures, like the model or asset being published? The point is that everything is extracted to the same temporary directory, and that the same can also apply to textures.

Yes I think this goes without saying. This is why the remapping happens in the stage directory, before being moved.

The same applies. The extractor would just pass on where it extracted to and store it in the same list. (Just treat this list as if it were the result of os.listdir, only with the files more scattered.)
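For instance, here is a minimal sketch of that shared list outside of any particular extractor (the tempfile-based stage directory and file names are placeholders, not part of the example below):

import os
import tempfile

# Pretend two extractors each wrote their output somewhere under a
# shared staging area, and each reported the resulting path.
stagedir = tempfile.mkdtemp(prefix="stage_")

files = []  # i.e. instance.data["files"]

# Extractor A writes the scene file
scene = os.path.join(stagedir, "wholeScene.ma")
open(scene, "w").close()
files.append(scene)

# Extractor B writes a cache into a sub-directory; the list does not
# care that the files are scattered across directories.
cachedir = os.path.join(stagedir, "cache")
os.makedirs(cachedir)
cache = os.path.join(cachedir, "wholeScene.abc")
open(cache, "w").close()
files.append(cache)

# The integrator later iterates `files`, much like it would iterate
# the result of os.listdir(), only with absolute paths.
for path in files:
    print(path)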


@marcus isn’t your example integrator IntegrateAssets doing it wrong? It’s remapping the paths using the resources data that was created in the Extractor. That would imply that the Extractor already knew the destination path. Shouldn’t the destination path instead be defined completely by the Integrator, rather than just read from the extraction data?

Yes, but you get the idea. The final path is known in the integrator where the remapping happens.

Sorry I can’t wrap my head around what you mean. Maybe post an updated example that does what you mean?

Like this?

from pyblish import api


class CollectModels(api.ContextPlugin):
    order = api.CollectorOrder

    def process(self, context):
        from maya import cmds
        instance = context.create_instance("wholeScene")
        instance.context.data["workspaceDir"] = "C:/"
        instance[:] = cmds.ls()


class ExtractModels(api.InstancePlugin):
    order = api.ExtractorOrder

    def process(self, instance):
        import os
        from maya import cmds

        stagedir = os.path.join(instance.context.data["workspaceDir"], "stage")

        try:
            os.makedirs(stagedir)
        except OSError:
            pass

        fname = "{name}.ma".format(**instance.data)
        path = os.path.join(stagedir, fname)

        cmds.select(instance, replace=True)
        cmds.file(path, typ="mayaAscii", force=True, exportSelected=True)
        
        # Append the extracted file to the results
        # Instead of remembering the stagedir
        files = instance.data.get("files", [])
        files.append(path)
        instance.data["files"] = files


class ExtractResources(api.InstancePlugin):
    order = api.ExtractorOrder

    def process(self, instance):
        """Store the linked resources"""
        from maya import cmds

        resources = instance.data.get("resources", [])
        for node in cmds.ls(type="file"):
            path = cmds.getAttr(node + ".fileTextureName")
            resources.append(path)

        instance.data["resources"] = resources


class IntegrateAssets(api.InstancePlugin):
    order = api.IntegratorOrder

    def process(self, instance):
        import os
        import shutil

        files = instance.data.get("files")
        resources = instance.data.get("resources")
        
        # Define final location
        versiondir = os.path.join(
            instance.context.data["workspaceDir"], "v001")
        
        def _to_destination(source):
            """Source file to destination path"""
            fname = os.path.basename(source)
            return os.path.join(versiondir, fname)
        
        # Define resources mapping (source -> destination)
        mapping = dict()
        for src in resources:
            mapping[src] = _to_destination(src)

        # Update .ma files with remapped resources
        for abspath in instance.data.get("files", []):

            self.log.info("Looking at '%s'.." % abspath)
            if abspath.endswith(".ma"):
                self.log.info("Updating..")

                new_file = list()
                with open(abspath) as f:
                    for line in f:
                        # Replace every known source path with its destination
                        for src, dst in mapping.items():
                            if src in line:
                                self.log.info("Replacing '%s' with '%s'" % (
                                    src, dst))
                                line = line.replace(src, dst)
                        new_file.append(line)

                # Update file
                with open(abspath, "w") as f:
                    f.write("".join(new_file))

                self.log.info("Updated '%s'." % abspath)

        # Write to final destination
        try:
            # Overwrite the version, remove me
            shutil.rmtree(versiondir)
        except OSError:
            pass
        
        # Create the folder
        os.makedirs(versiondir)

        # Copy resources (those that are remapped)
        for src, dest in mapping.items():
            shutil.copy(src, dest)
            
        # Copy files
        for src in files:
            shutil.copy(src, _to_destination(src))


api.deregister_all_plugins()
api.deregister_all_paths()
api.register_plugin(CollectModels)
api.register_plugin(ExtractModels)
api.register_plugin(ExtractResources)
api.register_plugin(IntegrateAssets)

# Setup scene
from maya import cmds
cmds.file(new=True, force=True)
fnode = cmds.createNode("file")
resource = r"C:\temp.png"
cmds.setAttr(fnode + ".fileTextureName", resource, type="string")

This currently does not clean up the staging directory that was used for the .ma extraction.
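A trailing plug-in could take care of that clean-up once integration has finished; something like this sketch (the plug-in name and the order offset are my own, not part of the example above):

from pyblish import api


class CleanupStage(api.InstancePlugin):
    """Remove the staging directory once integration is done"""

    # Offset the order so it runs after IntegrateAssets
    order = api.IntegratorOrder + 0.1

    def process(self, instance):
        import os
        import shutil

        stagedir = os.path.join(
            instance.context.data["workspaceDir"], "stage")
        shutil.rmtree(stagedir, ignore_errors=True)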

Also the remapped path is actually incorrect (ha!), because os.path.join inserts a backslash on Windows and the \t from \temp.png then gets interpreted as a tab. The resulting file path therefore becomes C:\v001 emp.png (with a literal tab) when it should’ve been C:\v001\temp.png.
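One way around that is to make sure the path written into the .ma file only ever uses forward slashes, which Maya accepts on every platform. As a sketch, the _to_destination helper from the integrator above could be rewritten to avoid os.path.join for the final string (the hard-coded versiondir here is just for illustration):

import os

versiondir = "C:/v001"  # as computed in IntegrateAssets above


def _to_destination(source):
    """Source file to destination path, using forward slashes only"""
    # Normalise any backslashes coming from the scene first
    fname = os.path.basename(source.replace("\\", "/"))
    # Avoid os.path.join here; on Windows it inserts a backslash,
    # which Maya then parses as an escape sequence (e.g. \t -> tab)
    return "/".join([versiondir.rstrip("/"), fname])


print(_to_destination(r"C:\temp.png"))  # -> C:/v001/temp.png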

Aren’t you still writing to a stage directory? I don’t see how that changes anything, especially not with regard to managing paths in a scene. My example is simplified in order to highlight a technique; there are better ways of doing each step, but this is not the place for that.

The technique being:

  1. Yes, you can extract resources used in Maya and still conform to CVEI
  2. Yes, you can post-process a Maya (or any app) scene file without relying on anything but Python

This is not making an intermediate copy of the textures. Of course the Maya scene itself needs to be written somewhere, but the textures already exist and are transferred directly from their source location, rather than first being copied into the staging directory where they are never actually used. That’s all I was pointing out. :slight_smile:

Just wanted to point out another flaw in this particular workflow.

This would not allow validating whether different source files would end up overwriting each other on the other end (in Integration/Extraction), since those locations are not known during Validation. Which makes me believe the best place for this “remapping” to occur would be during Collection as well?
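As a rough sketch of what that could look like, assuming the resources list is gathered during Collection instead of Extraction (as suggested), a validator could then at least catch two source files that would collide on their destination file name, given that IntegrateAssets above derives the destination purely from the basename (the plug-in name is my own):

from pyblish import api


class ValidateResourceCollisions(api.InstancePlugin):
    """Fail when two resources would share a destination file name

    Assumes the destination is derived from os.path.basename(),
    as in the IntegrateAssets example above.
    """

    order = api.ValidatorOrder

    def process(self, instance):
        import os

        seen = dict()
        for src in instance.data.get("resources", []):
            fname = os.path.basename(src)
            if fname in seen and seen[fname] != src:
                raise RuntimeError(
                    "'%s' and '%s' would overwrite each other "
                    "on integration" % (src, seen[fname]))
            seen[fname] = src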

The more pressing matters to me are these:

  1. Each publish includes a duplicate of the same resource.
  2. Updating a resource means re-publishing whatever it was published with just to pick up the new version, resulting in a duplicate of the model alongside the updated texture.

Overall it sounds like you are making life immediately easier, while making it exponentially more difficult later on. Rather than fighting the publish of individual textures, I would find a way to make that route easier, even if it means polling a folder for updates and automatically publishing downloaded files, or right-clicking in the explorer to fast-publish things without a GUI. Whatever mechanism enables control over updates to files that change independently, like models and textures.