Pyblish for Deadline

Oo, that is very interesting.

Would it be possible to have one of your excellent YouTube videos made of this workflow? :blush:

Sure, I’ll do that when it’s ready. At the moment it’s just experiments outside of Pyblish.

So I got something working with Pyblish and submitting remotely.

I’m currently working with a config.json file next to the plugin. The main reason for this is that I can be sure I don’t accidentally upload it.

What I’m thinking for a default behaviour is that the extension will try to submit remotely first, if the config.json file exists. If this fails, it will try to submit locally.
There is also the option of having two plugins that the user can toggle. If the config.json file doesn’t exist, the plugin would just log a warning.
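
Roughly, the fallback could look like this; submit_remote and submit_local are hypothetical stand-ins for the actual submission code:

import json
import os

CONFIG = os.path.join(os.path.dirname(__file__), "config.json")

def submit_remote(job_data, config):
    """Hypothetical stand-in for a call to Deadline's web service."""

def submit_local(job_data):
    """Hypothetical stand-in for a local deadlinecommand call."""

def submit(job_data):
    # Remote first, if config.json exists; local otherwise,
    # or if the remote submission fails.
    if os.path.exists(CONFIG):
        with open(CONFIG) as f:
            config = json.load(f)
        try:
            return submit_remote(job_data, config)
        except Exception:
            pass  # remote failed, fall through to local
    return submit_local(job_data)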

Ohh, and I’m using requests for making the remote calls. What would be the best way to package that with the extension?
Just tell the users to install it?

That’s great, sounds exciting!

About requests, you can bundle it.

The way other Pyblish projects do it is by making a child package of the main package, called vendor, that contains requests.

▾ pyblish_deadline
  ▾ vendor
    ▾ requests

And then in code, you import it like this.

# Ideally
from .vendor import requests

# Or
from pyblish_deadline.vendor import requests

# Not
import requests

The last one is problematic, as it could accidentally import a locally installed version, whereas the top one guarantees you’ll get the bundled one.

Here’s an example from Pyblish QML.

So you don’t use submodules instead of copying the files?

Submodules are only for Git repositories I think.

There’s nothing wrong with keeping a local copy with your project; it ensures you’ve always got the same working version for everyone using it, sidestepping a lot of nasty bugs.

I might look into the submodules, similar to how pyblish-win is working. Could just point to: https://github.com/kennethreitz/requests

Sure, should prove a good experience.

I’d imagine you’ll need to go ad hoc on how you actually use it in Python, as the top-level directory would need to remain a Git repository, and not a Python package. You may need to modify sys.path, though hopefully not, as it’s global and might affect other modules.
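
Something along these lines, perhaps, assuming the submodule is cloned to vendor/requests inside the package:

import os
import sys

# The submodule checkout is a Git repository whose root holds the
# actual "requests" package one level down, so the checkout root is
# what goes on sys.path.
_repo = os.path.join(os.path.dirname(__file__), "vendor", "requests")
if _repo not in sys.path:
    sys.path.insert(0, _repo)

import requests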

Wanted to note this down, while thinking about it.

I have a vision for this extension to have a tighter integration with Deadline. Currently we rely heavily on Deadline to take over responsibilities from Pyblish when submitting, which means you have to have a couple of custom event plugins to integrate with the pipeline.
I envision that we “just” send signals, with data, about the state of a render job from Deadline to Pyblish. The signals would be emitted with an event plugin for Deadline. This does involve an extraction queue system for Pyblish.

You could say that we would just be shifting the responsibility for pipeline integration from Deadline back to Pyblish, but this would mean your pipeline code would be more centralized and sharable. It would treat Deadline as just another tool/host that Pyblish integrates with.

I’m not quite sure that I understand what situations you’d like this to be used in.

Could you provide some use case, or simple example of when this would be useful? Just so I can wrap my head around your thinking here.

The situation would be:

  1. User validates the scene, and it passes.
  2. The “job” gets added to the Pyblish extraction queue, which submits it to Deadline.
  3. Deadline renders the job, and signals back to Pyblish.
  • This is done with a single generic event plugin that just informs Pyblish about successes or failures.
  4. Pyblish gets the signal that the render/extraction is complete, and continues the extraction queue on to integration with Ftrack.

So the amount of code for Deadline is just the event plugin that sends the data (sketched below). The integration with the asset management, Ftrack, is held within the Pyblish scope and can be used whether you are using Deadline or not, basically eliminating studio-specific code from Deadline.
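
For illustration, the event plugin’s side of it could be as small as this; the endpoint and payload keys are made up, since the transport depends on how the extraction queue ends up being designed:

import json

import requests

def signal_pyblish(job_id, status):
    """Tell the Pyblish side that a render job changed state."""
    requests.post(
        "http://pyblish-queue:8080/jobs",  # hypothetical endpoint
        data=json.dumps({"job": job_id, "status": status}),
        headers={"Content-Type": "application/json"},
    )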

Does that make sense?

Ah I hear you.

Could be very useful indeed; however, it has a caveat. I assume that pyblish-tray would have to be running on the artist’s computer. If the artist turns the machine off at the end of the day, or it gets restarted, you’d most likely lose this connection, meaning that nothing would get published.

Or am I getting it wrong?

Could be pyblish-tray, but could also be something else.

Nope, that is very true, and one of the design issues with the extraction queue being artist-based. It could also be more centralized processing. Don’t know at the moment :)

But maybe those discussions should happen here: Extraction Queue

I think it might be time to get some proper integration for Deadline.

I’d suggest that we have an event plugin that calls pyblish.util.publish(). There are five stages where you could call Pyblish: OnJobSubmitted, OnJobStarted, OnJobFinished, OnJobRequeued and OnJobFailed.

I would leave it to the user when to call Pyblish, via the event plugin’s settings. Also in the settings, the user could specify the install location of pyblish-base (maybe Python as well), and the PYBLISHPLUGINPATH for each of the five stages.
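
As a rough sketch, following Deadline’s event listener pattern (the exact callback wiring varies between Deadline versions, and both setting names are assumptions standing in for the settings described above), with just one of the five stages shown:

import os
import sys

from Deadline.Events import DeadlineEventListener

def GetDeadlineEventListener():
    return PyblishEventListener()

def CleanupDeadlineEventListener(listener):
    listener.Cleanup()

class PyblishEventListener(DeadlineEventListener):
    def __init__(self):
        # The other four stages would be hooked up the same way.
        self.OnJobFinishedCallback += self.OnJobFinished

    def Cleanup(self):
        del self.OnJobFinishedCallback

    def OnJobFinished(self, job):
        # Point Python at pyblish-base and at the plug-ins the user
        # configured for this stage, then publish.
        sys.path.append(self.GetConfigEntry("PyblishBaseLocation"))
        os.environ["PYBLISHPLUGINPATH"] = self.GetConfigEntry(
            "OnJobFinishedPluginPath")

        import pyblish.util
        pyblish.util.publish()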

The workflow would be for the submission in the hosts to append any data needed to the job info for pyblish-deadline. This means we can leave any output settings contained within the scene file, and basically continue the integration when the render has finished.
In theory you could do other validations and extractions, if you wanted to.

What do you guys think?

I like it.

Spontaneously, I would attempt having a matching but separate series of plug-ins on PYBLISHPLUGINPATH server-side, compatible with the families from the submitter side.

  1. The submitter collects instances based on data in the scene, and runs through all the way to integration.
  2. An integration plug-in triggers a remote operation (sketched after this list).
  3. The remote operation runs a set of different, server-side plug-ins, collects instances based on what was extracted during submission, and completes the publish with its own integrators.
  4. Publishing finishes, and whatever happened appears in something like ftrack (or Deadline?) or wherever notifications are made.
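
Step 2 might look something like this as a plug-in; trigger_remote() and the “extractedScene” key are hypothetical, standing in for whatever transport and data layout get chosen:

import pyblish.api

def trigger_remote(path):
    """Hypothetical: kick off the server-side publish for path."""

class IntegrateSubmitRemote(pyblish.api.InstancePlugin):
    order = pyblish.api.IntegratorOrder
    families = ["render"]  # assumed submitter-side family

    def process(self, instance):
        # "extractedScene" assumed to be written by the extractor.
        trigger_remote(instance.data["extractedScene"])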

That was difficult to articulate… does that make sense?

Not entirely making sense to me :)

Where does the farm processing come in?

It kinda sounds to me like the submitter machine has a connection with a farm machine?

Farm processing would be the “remote” part.

 _____________                 __________
|             |               |          |
|  Submitter  |-------------->|  Remote  |
|_____________|               |__________|
  
  o- collectorA                o- collectorC
  o- collectorB                o- integratorB
  o- validatorA                o- integratorC
  o- extractorA
  o- integratorA (submit to remote)

In the case of rendering, the extracted file would probably be a Maya scene, and the collector on the remote end would be set up to collect a different family than what was collected during discovery; at that point, a different set of integrators (or extractors?) would get run, those that perform the actual rendering.
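
A sketch of that remote-end collector, with made-up family names to show the switch:

import pyblish.api

class CollectRemoteRender(pyblish.api.ContextPlugin):
    order = pyblish.api.CollectorOrder

    def process(self, context):
        instance = context.create_instance("renderSceneA")
        # A different family than the submitter-side "render", so
        # only the server-side plug-ins match it.
        instance.data["family"] = "render.remote"
        # The extracted Maya scene would be located via the job;
        # the path here is just a placeholder.
        instance.data["extractedScene"] = "/server/path/to/scene.ma"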

I might still not be getting exactly what you mean, but are you saying that the pyblish plugins are doing the actual rendering?