# Pyblish workflow manifest

I used the holidays to work on the Pyblish plugin manager and finished the MVP.
Using it is simple:

# 1. Create a pipeline config

Create a JSON file mapping plugin names to their settings: {plugin_name: plugin_settings, …}
Or you can use the GUI tool I made for this.
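A minimal sketch of what such a config file might contain. The plugin names and settings below are made up for illustration; the real schema is whatever your plugins expose:

```python
import json

# Hypothetical pipeline config: plugin names map to the settings
# we want to override on each plugin.
config = {
    "ValidateNames": {"prefix": "FF_", "active": True},
    "CollectFiles": {"path": "/projects/ff/assets"},
}

# Write it out the way the GUI tool might, then read it back.
with open("sample_config.json", "w") as f:
    json.dump(config, f, indent=4)

with open("sample_config.json") as f:
    loaded = json.load(f)

print(loaded["ValidateNames"]["prefix"])  # FF_
```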

# 2. Apply a pipeline config

Register plugins as normal, then register the config (which is applied as a Pyblish plugin filter):

    from pyblish_config.config import register_config_filter
    register_config_filter('sample_config_folder.json')

    import pyblish_lite
    pyblish_lite.show()


Your plugin settings are now applied, explicitly.

The implementation uses Pyblish filters and complements all current Pyblish features;
no re-architecting is needed.

Most of the time was spent on making the config creator intuitive and user-friendly.
The code currently lives outside the pyblish-base repo; we could bring the config-loading function (a single function) into pyblish-base.

All other code is related to the config creator, which, like any Pyblish UI, has its own repo.

There will likely be overlap between the config creator and the plugin manager (think plugin marketplace, where you read descriptions of public plugins).
For now I want to focus solely on explicit plugin management, so pretend the marketplace doesn't exist yet.

Hello, I run an imaginary studio working on two game projects:

• BF (battling field)
• FF (final fantasia)

My team made a lot of plugins (two!) for the first project, FF:

• a collector: collects files in a (hardcoded) path
• a validator: ensures files start with the project's (hardcoded) prefix, FF

Now that I have started on my new game, BF, my studio would like to save time and reuse the plugins from the previous project, FF.
But it takes so much time to recode said plugins. (Just pretend…)

When we run the plugins, we collect and validate files:

Option A:
recode all plugins; maybe make them load from a config file, or inherit the plugin and overwrite the attribute in a child class.

Option B:
overwrite plugin settings externally, without changing plugin code, using the new Pyblish filters feature.
Advantage: the same plugins can be shared between studios or projects, and we just change settings.
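Option B can be sketched in plain Python without Pyblish itself. The filter hook is assumed to hand us each plugin class before publishing; the plugin and config names are made up:

```python
# A plugin written for FF, with its setting exposed as a public class variable.
class ValidateNames:
    """Ensure file names start with the project prefix."""
    prefix = "FF_"

    def process(self, name):
        assert name.startswith(self.prefix), "wrong prefix"

# Settings loaded from a (hypothetical) pipeline config for the BF project:
config = {"ValidateNames": {"prefix": "BF_"}}

def apply_config(plugin, config):
    """Overwrite public plugin settings from the config, keyed by class name."""
    for key, value in config.get(plugin.__name__, {}).items():
        setattr(plugin, key, value)

apply_config(ValidateNames, config)
ValidateNames().process("BF_tank.ma")  # the FF plugin now validates BF names
```

The plugin code itself is untouched; only the class attribute changes, which is what lets the same plugin serve both projects.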

Next steps:

• clean up the repo (it's a bit messy at the moment)
• decide how we want to move forward (implement the config-load method into pyblish-base)
• create Pyblish docs on the explicit vs. data-driven approach
• work on more plugin-sharing functionality, working towards our goal of a plugin marketplace

Happy new year.

I’ll play the opposing side and ask the difficult questions to hopefully solidify this concept.

1. What makes the plug-ins for FF specific to FF? Could they be made more general?
2. If they are so specific to FF, can they even be shared with BF without a rewrite?
3. If they can indeed be shared, then could BF not simply add the path of FF's plug-ins to itself directly?

For example, an FF plug-in might look like this.

    class ValidateNames(...):
        def process(...):
            assert name.startswith("ff_"), "This wasn't an FF name"


Which would make sense to keep limited to FF, maybe even given an FF-based family, such as FF_validateNames. But this could not be shared with BF, since BF would likely have a different naming scheme.

On the other hand…

    class ValidateNames(...):
        families = ["anyAsset"]

        def process(...):
            assert name.startswith("ast_"), "This isn't an asset"


This would already belong in a global repository of plug-ins and could be applied to any and all projects, without needing any explicit sharing or config files.

1. Is this a scenario you are looking to solve?
2. What does your proposed solution do differently/better/worse?

Re-watching your GIF: it is very long and hard to follow. An MP4 would work better here, so one can pause, rewind, and get a sense of where in the video you are chronologically.

You seem to tackle this issue with what I gather is the unique selling point: that the name isn't actually in the plug-in but in this separate config file? If so, that's a good idea!

Looking back at the original problem, I’d imagine we can re-reflect with this new (to me) information in mind.

The config-file proposal also uses/requires a helper tool.

For these final three, could they be solved by having plug-ins reach out to a config file explicitly?

    class ValidateName(...):
        def process(...):
            with open("c:\path\to\config.json") as f:
                config = json.load(f)

            assert name.startswith(config["nameSuffix"])


Where the path to your config could be coming from an environment variable e.g. CURRENT_CONFIG=c:\path\to\config.

If a studio already has a database, like Mongo, the config could even be an address, e.g. CURRENT_CONFIG=mongo://this/config, whereby the interface to edit it is whatever interface you already use to manipulate a database, like the Shotgun UI.

The advantage being that it’s natively Pyblish with no extras, and made explicit within the plug-in itself.

That's right.

##### 1. Set plugin settings externally from a config (avoid changing plugins/code)

It makes reusing plugins easy.

The aim is to download plugins from somewhere and set their settings externally.
Plugin authors can expose settings as public variables;
the plugin itself can be a black box.

Limitations:

• it only works if the plugin exposes settings in a public variable
• it does not work with your hardcoded example, assert name.startswith("ff_")

Making settings a public class variable is Pythonic, and therefore a realistic expectation of the plugin author. The Pyblish docs should encourage this.

##### 2. Explicitly register plugins for certain workflows

Several people have asked for this on the forums or created their own solutions, bootstrappers, etc.
(myself included), highlighting the need for explicit plugin registration.

An example from my own experience:

I want to run a certain collector on several scenes,
but there is no guarantee the scenes are clean and free of other data that other collectors might pick up!

• The data-driven approach would be: clean up your scenes!
• The explicit approach allows you to continue without doing so, a necessary evil when working with thousands of scenes made by others without any validation tools.

Retrofitting Pyblish onto an existing project (explicit), vs. having it available from the start of the project (data-driven).

Yes, this is one way to solve it, but the plugin author is now required to add support for this in their plugin.

Instead of just writing a simple validator plugin:

    assert name.startswith(self.prefix)


you suggest hardcoding the config-file handling in every plugin, if I understand correctly:

    with open("c:\path\to\config.json") as f:
        config = json.load(f)

    assert name.startswith(config["nameSuffix"])


So now, for example, we can't change the settings of these plugins without recoding them: Collection of 15 reusable plugins for Maya validation
(I'm assuming these have their settings publicly exposed.)

The smaller and simpler a plugin, the better, it seems to me: the quicker to write one, and the fewer things to break.

Ok, I see what you mean. So instead of the data being explicitly created and accessed, it relies on data being created and accessed the way family, label, and other class variables already are.

And then this tool would load the plug-in, overwrite those values…

    ValidateFilePaths.pattern_name = "Something_else*.ma"


…followed by re-loading the plug-in instance directly via e.g. api.register_plugin(modified_plugin)

Personally, I think we could take it one step further.

    class ValidateFilePaths(...):
        label = "check FileNames starts with"
        order = pyblish.api.ValidatorOrder
        families = ["paths"]

        options = [
            pyblish.api.String("patternName"),
        ]

        def process(...):
            assert name.startswith(self.option("patternName"))


In this way, it could be made explicit what is meant for outside access, and each member can carry additional information suitable for authoring/editing via a UI.

    options = [
        api.String("patternName", label="File Prefix", help="A regular expression"),
        api.Boolean("mustExist", default=False, help="Whether or not a file may not exist"),
        ...
    ]
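A runnable sketch of this option idea. Note that pyblish.api.String and a Plugin.option() helper do not exist today; everything here is a hypothetical stand-in:

```python
# Stand-in for a hypothetical pyblish.api.String option type.
class String:
    def __init__(self, name, default="", label=None, help=""):
        self.name = name
        self.value = default      # current value, editable by a UI
        self.label = label or name
        self.help = help          # tooltip text for a generated widget

class ValidateFilePaths:
    label = "check FileNames starts with"
    families = ["paths"]

    # Explicitly declared as meant for outside access, with UI metadata.
    options = [
        String("patternName", default="FF_", label="File Prefix",
               help="Names must start with this prefix"),
    ]

    def option(self, name):
        """Look up an option's current value by name."""
        for opt in self.options:
            if opt.name == name:
                return opt.value
        raise KeyError(name)

    def process(self, name):
        assert name.startswith(self.option("patternName"))

ValidateFilePaths().process("FF_scene.ma")  # passes with the default prefix
```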


A UI such as the one you've built could then generate suitable widgets with tooltips and whatnot to aid in manipulating it. I've built something like this a few times in the past and it's worked pretty well so far, such as for cmdx.

    import cmdx

    class MyNode(cmdx.DgNode):
        name = "myNode"
        typeid = cmdx.TypeId(0x85006)

        attributes = [
            cmdx.String("myString"),
            cmdx.Message("myMessage"),
            cmdx.Matrix("myMatrix"),
            cmdx.Time("myTime", default=0.0),
        ]

        affects = [
            ("myString", "myMatrix"),
            ("myMessage", "myMatrix"),
            ("myTime", "myMatrix"),
        ]


And qargparse.py

    import qargparse
    parser = qargparse.QArgumentParser([
    ])


A pattern that can be extended to external config files like what I do for Ragdoll.

Which ends up generating things like this.

Yes, that is correct.

Great suggestion! And ironic that my current UI is data-driven while your suggested UI is explicit.

• The current UI has implicit tooltips; it gets them from the plugin's docstring.
• It has an implicit type for each attribute setting, based on the current value. This currently breaks when the value is None, or when it is 1 (int) but should also support 1.1 (double).
  e.g. pattern_name = 'FF_*.ma', type is string.

Your suggested UI has:

• explicit tooltips
• explicit types

The disadvantage: slightly more work for the plugin author.
Instead of just making a variable public, they need to read the Pyblish docs and understand how options work.

Questions:

• How do we set options for plugins that do not have custom options, e.g. default Pyblish options such as active, family, …?
Can I assume we have those default widgets stored in the Pyblish plugin base class?

• How would these options work with command-line access?
Ideally we want something like this:

    ValidateFilePaths.options.pattern_name = "Something_else*.ma"


But this would overwrite the api.String stored in there:

    api.String("patternName", label="File Prefix", help="A regular expression")


And having to do this seems quite cumbersome:

    ValidateFilePaths.options.pattern_name = api.String("Something_else*.ma", label="File Prefix", help="A regular expression")


Probably something like this, which still feels slightly unpythonic to me:

    ValidateFilePaths.options.pattern_name.setValue("Something_else*.ma")


If a setting is found in options, use its metadata; otherwise fall back to the implicit behaviour:

    class ValidateFilePaths(...):
        label = "check FileNames starts with"
        order = pyblish.api.ValidatorOrder
        families = ["paths"]

        patternName = "FF_*.ma"

        options = [
            pyblish.api.String("patternName", "patternName tooltip"),
        ]

        def process(...):
            assert name.startswith(self.patternName)


An option named patternName is detected, a matching variable of the plugin is found, and its tooltip and type are used in the UI widget.
When accessing through the command line, we can simply do self.patternName = "Something_else*.ma".
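This hybrid idea can be sketched as follows: the option carries UI metadata, while the value itself stays a plain class variable so command-line use remains simple. The Option class and widget_metadata helper are hypothetical:

```python
# Hypothetical option type carrying only metadata, no value.
class Option:
    def __init__(self, name, tooltip=""):
        self.name = name
        self.tooltip = tooltip

class ValidateFilePaths:
    patternName = "FF_"  # plain variable, easy to set from the command line
    options = [Option("patternName", tooltip="names must start with this")]

    def process(self, name):
        assert name.startswith(self.patternName)

def widget_metadata(plugin):
    """For each option, find the matching class variable and pair the
    current value with tooltip and type info for a UI widget."""
    meta = {}
    for opt in plugin.options:
        if hasattr(plugin, opt.name):
            value = getattr(plugin, opt.name)
            meta[opt.name] = {"value": value,
                              "type": type(value).__name__,
                              "tooltip": opt.tooltip}
    return meta

# Command-line style override, no option objects involved:
ValidateFilePaths.patternName = "Something_else"
meta = widget_metadata(ValidateFilePaths)
```

A UI would read meta to build its widgets, while scripts keep assigning to the plain attribute.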

PS: this discussion is quite similar to Python's Annotated type feature (Python 3.9+), which allows you to attach type and docstring metadata per variable: https://stackoverflow.com/questions/8820276/docstring-for-variable/8820636

An example of when you want to change default plugin options:

One plugin runs on the meshes family, another plugin runs on the models family.
Both plugins are written in PyMEL and would be compatible if they had the same family.
We can externally change families to meshes on both plugins; they are now compatible.

In the above example, it could look like this.

    ValidateFilePaths.options = [
        api.String("patternName", label="File Prefix", help="A regular expression")
    ]
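The family-remapping example above can be sketched in a few lines; the plugin names and families are made up for illustration:

```python
# Two compatible plugins written against different family names.
class ValidateMeshNames:
    families = ["meshes"]

class ValidateModelHierarchy:
    families = ["models"]

# External remap, no plugin code touched: both now run on "meshes".
for plugin in (ValidateMeshNames, ValidateModelHierarchy):
    plugin.families = ["meshes"]
```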


The problem I have with overwriting class members is that it's not clear which are meant to be overridden and which are not, along with there not being enough information to implicitly derive metadata from just a single value. For example: should that string be a regex? Can it contain Unicode? And numbers especially often have min and max limits.

It also opens up for the user using a variable name that Pyblish already uses, like families. Or vice versa, when Pyblish expands its use of variables, suddenly breaking plug-ins that worked before. And it’s the worst kind of breakage, since there’s no way for a user to know what variables may eventually be used. Rez had/has this problem, where user and internal variables share the same scope. It works on day 1, but is a guaranteed problem in the future.


That's a good example; I hadn't considered that.

So before moving on, can I confirm:

• The config-loading prototype I made makes sense to you, and we now want to expand on it by making it explicit.
• To expand on this, we add data types to Pyblish, with data to create tooltips and widgets.
• You have no suggested changes to the config-file loading itself.

If that sounds okay, how about I start on a PR for just the string type, to flesh this out?

I'm on board with the overall idea of having plug-ins draw configuration from an external source; whether that's a file, a database, or environment variables doesn't matter much and should probably be up to the user.

Having these new String and Bool types be part of pyblish_base seems appropriate. UIs can then be built similar to your prototype, or even Lite and QML, and/or integrated into those too.

The more we can avoid enforcing a particular syntax or schema the better. We can’t fit everyone into one schema anyway. Perhaps the best approach is having the config come from a function and/or plug-in itself. Something that can be programmatically generated, such that those who want a file in some format like JSON can load that file in that function/plug-in.
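A provider-based config could be sketched like this: the core calls a registered function to get settings, so JSON, YAML, or a database all look the same to it. The register/lookup API here is hypothetical, not part of Pyblish today:

```python
import json
import os

_config_provider = None

def register_config_provider(func):
    """Register a callable that returns the pipeline config as a dict."""
    global _config_provider
    _config_provider = func

def get_config():
    """What the core would call; it never knows where the data came from."""
    return _config_provider() if _config_provider else {}

# A JSON-backed provider is then just one small function:
def json_provider(path="pipeline_config.json"):
    if not os.path.exists(path):
        return {}
    with open(path) as f:
        return json.load(f)

register_config_provider(json_provider)
config = get_config()
```

A database-backed studio would register a provider that queries Mongo instead, with no change to the core.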

@BigRoy mentioned OpenPype has a preset system for plugins and plugin attributes,
very similar to what's being developed and discussed in this thread.

# Completed: created test plugins

A varied collection of generic Pyblish plugins, across DCCs.
Great for testing several generic setups, and useful for starting some kind of community pack for each DCC.

Some are custom context-based, some instance-based.

• Instance-based plugins can more easily be made to work with plugins from another author, using converters in between when the data types differ.
• Custom context-based ones are more like traditional data-driven Pyblish behaviour, and are very much locked into their own setup.

Did some more work on the manager too:

# Completed: pipeline configs

• added drag-and-drop arranging to order plugins
• delete to remove plugins
• now works both inside and outside Maya

It's now very easy to make pipeline workflows and customise plugin settings if needed.
I've been testing this with some of the plugin collections I made, and it's working quite well.

# Next step: converters

Say a validator needs a PyMEL mesh, but the instance contains a mesh name;
the converter handles the in-between.
I'll try to make this a plugin for now.

This would be the main thing needed to make community plugins compatible with each other.
I've got some PyMEL and some mesh-name plugins that I want to make compatible with each other without simply recoding them.
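The converter idea can be sketched as a small registry keyed by (source, target) data type. The type names are illustrative strings; real converters would wrap PyMEL or maya.cmds calls, which aren't available outside Maya, so the conversion below is a stand-in:

```python
# Registry of converter functions keyed by (source_type, target_type).
converters = {}

def register_converter(src, dst, func):
    converters[(src, dst)] = func

def convert(value, src, dst):
    """Convert a value between data types, or pass it through unchanged."""
    if src == dst:
        return value
    return converters[(src, dst)](value)

# A trivial stand-in for "meshname -> pymel mesh": here we just wrap the
# name, since PyMEL itself only exists inside Maya.
register_converter("meshname", "pymesh", lambda name: {"pynode": name})

instance_data = "pCube1"                               # what the collector produced
needed = convert(instance_data, "meshname", "pymesh")  # what the validator wants
print(needed)
```

Each community converter would register one (src, dst) pair, and the manager would look up a chain between what an instance holds and what a plugin expects.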

# Cool to have: actions browser

It would be cool to be able to add actions to a plugin when creating your workflow.
This would allow actions to be shared in the community/project.
An action could then, just like a plugin, be discovered and registered.

• But input from actions would need to go through a converter.
• Actions can't be saved in JSON since they are classes, so we would need to get them from a registered_actions list by name, which would need to be coded.
• And pyblish-base would need to add register_actions functionality.

It's a lot of steps and not as important right now; let's get workflows working first.

• To avoid locking ourselves into JSON, and to support database, file, YAML, …, we can keep the whole JSON pipeline config outside of Pyblish.
TAs and artists can then use some kind of starter pack from another repo which includes this JSON pipeline tool.

• The custom GUI for each plugin is great, but adds extra hassle for the plugin developer.
Ideally, writing a plugin is as simple as possible:
UI for custom options should be optional, not required.
And the biggest reason: 99% of the time we just want to change basic parameters, an int or a string,
and we already know the type of all default plugin attributes.

• Adding the option to adjust external plugin parameters in QML turns it into an asset-validation tool for individual artists,
but for now I'm focusing on the pipeline aspect for bigger teams.
Options are set in a pipeline config and then consumed by the user/artist.
(I know it sounds like default Pyblish; the main difference is that plugins are not unique to a project/studio, promoting reusability and lowering the barrier to entry.)

## TODO

• We need a separate options attribute for plugins.
This is something that ideally goes into pyblish-base; then in the docs we tell users to put exposed attributes there.

Could do:

    import types

    # a bare object() rejects new attributes, so use SimpleNamespace
    plugin.options = types.SimpleNamespace()
    plugin.options.prefix = "GEO_"


Great work. I'm cautiously optimistic about the idea of "converters"; it does seem like a steep hill to climb. Are node paths the only thing of interest? What about PyMEL attributes, or data types like matrices, etc.? I've even heard of some using the MASH Python library, which is apparently a thing.

That's why, for now, converters are plugins that can be made by the user.
Want to support a new data type? Create your own converter.
We can have a few basic ones for the most common scenarios; that covers 90% of all plugins, and users can make the remaining ones themselves.

Easy ones for Maya:

• meshname -> pymel mesh
• transform -> shape
• meshname -> dagpath
• short -> long meshname

If there are any interesting checks on GitHub that are worth creating plugins for, let me know. It would be fun to test weird things such as MASH, but I think the boring basics take priority for now.

Most of what I find is either mesh names (cmds), OpenMaya, or PyMEL, which are easy to convert between.