Hello, I was wondering who I should talk to if I want to start working on 3D (with Cycles rasterization perhaps) and distributed (network distributed rendering) node graph support for Natron. These are 2 separate things so maybe different people are working on it… Thanks. Vincent
I don’t know anything about coding, but I’d love to test
Hi! Glad to hear that. For the distributed processing, do you have more than one computer, ideally on 10G ethernet, that you could test on? If so, what OS/hardware configuration? I need to hear from the Natron devs whether they think distributed processing or a 3D node graph is a good idea at all, because Nuke, for instance, had to develop a completely separate, non-OFX API for its 3D operations: https://www.thefoundry.co.uk/products/nuke/developers/105/ndkdevguide/3d/architecture.html
And I do not know if INRIA would be into it…
I’ll let you know once I know more about how this development could take place in this project.
I don’t have multiple computers at home, but here at work we have a farm of 10 that we use for rendering Blender animations. I’d have to ask, but I don’t think it’d be a problem to test.
We already have plans for implementing 3D efficiently, but we have not had time yet. The thing is, the internal engine work to support 3D is not that long to do and actually follows the 2D architecture very closely.
What will require work is the GUI to work with 3D elements and the 3D operators themselves.
What we are going to work on first is extending OpenFX with a few actions to support 3D effects. Mostly these effects will be able to read/write geometry and apply shaders as well.
We have no timeline yet on when we are going to do this, because we are busy on other things that need polishing before actually diving in the 3D stuff.
Here are the elements that need to be implemented for 3D as a first pass:
- Metadata support via OpenFX
- 3D viewer
- 3D cards (OpenFX plug-in)
- Camera support (via OpenFX) (using metadata)
From there we can then think about projection mapping and camera tracking (using libmv that we already use for the 2D tracker).
Then we can support reading geometry with alembic via an OpenFX plug-in and add support for basic shaders and a render node based on Cycles.
A useful tool would also be the implementation of Cryptomatte into [Cycles](https://developer.blender.org/D2106).
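For reference, Cryptomatte derives its per-object IDs by hashing the object name with MurmurHash3-32 and reinterpreting the bit pattern as a 32-bit float, nudging the exponent so the result is never NaN, infinite, or denormal. A minimal Python sketch of that ID scheme, following the public Psyop spec (the exact Cycles integration in that patch may differ):

```python
# Sketch of the Cryptomatte ID scheme (per the public spec):
# object name -> MurmurHash3-32 -> bits reinterpreted as float32.
import struct

def murmur3_32(data: bytes, seed: int = 0) -> int:
    """Pure-Python MurmurHash3 (32-bit, little-endian)."""
    c1, c2 = 0xcc9e2d51, 0x1b873593
    h, n = seed, len(data)
    for i in range(0, n - n % 4, 4):          # 4-byte body blocks
        k = int.from_bytes(data[i:i + 4], "little")
        k = (k * c1) & 0xffffffff
        k = ((k << 15) | (k >> 17)) & 0xffffffff
        k = (k * c2) & 0xffffffff
        h ^= k
        h = ((h << 13) | (h >> 19)) & 0xffffffff
        h = (h * 5 + 0xe6546b64) & 0xffffffff
    k = 0
    for byte in reversed(data[n - n % 4:]):   # 1-3 byte tail
        k = (k << 8) | byte
    if n % 4:
        k = (k * c1) & 0xffffffff
        k = ((k << 15) | (k >> 17)) & 0xffffffff
        h ^= (k * c2) & 0xffffffff
    h ^= n                                    # finalization mix
    h ^= h >> 16
    h = (h * 0x85ebca6b) & 0xffffffff
    h ^= h >> 13
    h = (h * 0xc2b2ae35) & 0xffffffff
    return h ^ (h >> 16)

def name_to_id(name: str) -> float:
    """Map an object name to the float ID stored in the matte channels."""
    bits = murmur3_32(name.encode("utf-8"))
    exponent = (bits >> 23) & 0xff
    if exponent in (0, 255):   # avoid denormal/NaN/inf bit patterns
        bits ^= 1 << 23
    return struct.unpack("<f", struct.pack("<I", bits))[0]
```

Whatever renderer writes the matte, a compositor-side picker only needs this hash to match an object name against the IDs found in the Cryptomatte layers.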
Anyway, all of this is quite tightly tied to the Natron Engine, and unfortunately, unless you master the Natron architecture and code-base, I doubt you can implement the core Engine. But you can contribute on the external stuff and plug-ins, even by coding an experimental 3D viewer that we could then integrate into Natron.
Regarding network distribution, we do not work on that because third-party tools such as CGRU/Afanasy already handle it. Basically you only need to use the Python API to support that.
Supporting rendering of a single frame across several computers would require a lot of work. It could be worth thinking about eventually, even though some operators would most likely run much faster locally than over the network (which would require insane bandwidth, by the way).
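To put a rough number on that bandwidth concern, here is a back-of-envelope estimate (the 4096×2160 resolution and float RGBA format are assumptions for illustration):

```python
# Back-of-envelope: cost of shipping one intermediate image between nodes.
# Assumed: 4096x2160 frame, 4 channels, 32-bit float per channel.
width, height, channels, bytes_per_channel = 4096, 2160, 4, 4
frame_bytes = width * height * channels * bytes_per_channel
frame_mb = frame_bytes / 2**20
print(f"{frame_mb:.0f} MiB per intermediate image")

# A 10 GbE link moves at most ~1.25 GB/s, so every node-to-node hop
# in the graph costs on the order of a tenth of a second:
link_bytes_per_s = 10e9 / 8
seconds_per_hop = frame_bytes / link_bytes_per_s
print(f"{seconds_per_hop * 1000:.0f} ms per hop at 10 GbE line rate")
```

That is roughly 135 MiB and ~113 ms per hop, per frame, before any computation happens; a graph with dozens of nodes split across machines would saturate even a 10G link.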
Hello, for contacting the devs have you tried this : https://natron.fr/contact-us/
For distributed rendering, there is already Natron support for the CGRU/Afanasy render farm; that’s the best way to render across several computers. Splitting the node-graph calculation across different computers may not be a good option. Rather, it’s better to calculate different frames on different PCs.
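That frame-splitting approach can even be scripted without a farm manager: NatronRenderer can render a writer over a frame range from the command line. A hypothetical dispatcher (the host names and project path are placeholders, and the `-w <writer> <first-last>` syntax should be checked against your Natron version):

```python
# Sketch: split a frame range into chunks, one NatronRenderer job per host.
import subprocess

def chunk_ranges(first, last, n_chunks):
    """Split [first, last] into n_chunks contiguous (start, end) ranges."""
    total = last - first + 1
    size, rem = divmod(total, n_chunks)
    ranges, start = [], first
    for i in range(n_chunks):
        end = start + size - 1 + (1 if i < rem else 0)
        ranges.append((start, end))
        start = end + 1
    return ranges

def dispatch(project, writer, first, last, hosts):
    """Launch one remote NatronRenderer per host via ssh (fire and forget).
    The command-line syntax is assumed; verify against your install."""
    procs = []
    for host, (a, b) in zip(hosts, chunk_ranges(first, last, len(hosts))):
        cmd = ["ssh", host, "NatronRenderer", "-w", writer, f"{a}-{b}", project]
        procs.append(subprocess.Popen(cmd))
    return procs
```

A farm manager like Afanasy does the same chunking, plus queueing, retries, and monitoring, which is why it remains the better option in production.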
For Cycles integration: having Cycles could be great, but maybe it’s not the best way to bring 3D into Natron, as it opens the door to Natron doing the same job as a 3D package, whereas compositing’s first purpose is to do things much faster than a 3D application.
Maybe some OpenGL rendering would be enough?
IMO the basic needs for 3D in a compositing application are support for 3D planes with no shading, and good camera support (being able to match a real-world camera).
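Matching a real-world camera mostly means respecting its physical parameters. As a minimal illustration (pinhole model, no lens distortion; the focal length and sensor width are the standard “filmback” values a matchmove would provide), projecting a camera-space 3D point to pixels:

```python
# Pinhole projection sketch: camera at the origin looking down -Z,
# focal length and sensor ("filmback") width in millimetres.
def project(point, focal_mm=35.0, sensor_mm=36.0, width=1920, height=1080):
    """Project a camera-space 3D point to pixel coordinates (no distortion)."""
    x, y, z = point
    if z >= 0:
        raise ValueError("point must be in front of the camera (z < 0)")
    # Perspective divide onto the sensor plane, in millimetres.
    sx = focal_mm * x / -z
    sy = focal_mm * y / -z
    # Sensor millimetres -> pixels (square pixels assumed).
    px_per_mm = width / sensor_mm
    u = width / 2 + sx * px_per_mm
    v = height / 2 - sy * px_per_mm   # image y grows downwards
    return u, v
```

Getting this mapping (and its inverse) exactly right is what makes rendered cards line up with live-action plates shot on a known lens.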
An efficient viewport is also very important.
Being able to render 3D objects is great too, but generally you also want to render them without shading (camera mapping) and output passes like Z-depth, normals, motion vectors, a position pass, etc.
If you are planning to extend the OFX spec to support 3D operators/nodes, then yes, that should come first, I guess. Are there some draft API aspects I could read? Or, if I could talk to you and the others who have been thinking about this and write it down for you, that would be a great way for me to get up to speed. Textures applied on cards or other geo should indeed be the entry point from the existing 2D pipelines into a 3D scene, and AOVs (including Cryptomatte) plus a standard beauty pass should be the way back into the 2D compositing “context”. I will enquire with Lukas about what is missing from that Cycles patch (Brecht mentioned something about the code being “only a prototype”), and whatever else is needed to “wrap” it in an OFX 3D operator (for which I would also need to see a spec).
For distributed processing, I was thinking more about automatic distribution of an interactive session rather than async batch render orchestration, but I think I would rather see more 3D in Natron first and then think about how to optimize rendering on multiple machines.
So please let me know if you can spare some time to talk to me (Skype in French, maybe) so I can start drafting on the OFX 3D spec.
So, while I’m waiting for the Cycles dependencies to compile so I can look at that Cryptomatte patch, I will propose this: how about I write a “normal” OFX plugin that dynamically sets its raster input slots based on a Cycles XML scene file? Its inputs would be determined by the content of the XML file, which would be the only parameter of the plugin. My question is: do the current OFX spec and the current Natron implementation support a plugin dynamically changing its inputs (count, labels, etc.)?
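To make the idea concrete, here is roughly how the plugin could discover its raster inputs from the scene file. I’m not certain of the exact Cycles standalone XML node names, so this sketch just collects any element carrying a filename-like attribute (the `image_texture` elements below are illustrative, not the real schema):

```python
# Sketch: derive OFX input-clip names from a Cycles-style XML scene.
# The element/attribute names are illustrative -- adapt to the real
# Cycles standalone scene format.
import xml.etree.ElementTree as ET

def scene_inputs(xml_text):
    """Return the image file references found in the scene, in order."""
    root = ET.fromstring(xml_text)
    inputs = []
    for elem in root.iter():
        for attr in ("filename", "src"):
            if attr in elem.attrib and elem.attrib[attr] not in inputs:
                inputs.append(elem.attrib[attr])
    return inputs

scene = """
<cycles>
  <shader name="card_mat">
    <image_texture name="tex1" filename="plate_a.exr"/>
    <image_texture name="tex2" filename="plate_b.exr"/>
  </shader>
</cycles>
"""
print(scene_inputs(scene))  # -> ['plate_a.exr', 'plate_b.exr']
```

Each discovered reference would become one input clip, so upstream 2D branches of the graph could feed textures straight into the Cycles scene.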
Once I have that running, I would probably want to use Natron’s private APIs to have a node graph load as a subnet of the main compositing graph to edit the XML scene when double-clicking on the Cycles OFX plugin instance node UI. Would that be possible?
We can chat by e-mail a little bit more in-depth if you want.
I would strongly suggest that you get a good grasp of how OpenFX works currently with 2D before you consider 3D.
OpenFX has limitations, which can of course be bypassed by proposing enhancements to the standard. But these enhancements are no longer standard when only a single piece of software supports them.
Natron is 100% OpenFX, and OpenFX is the only plug-in format it supports, besides built-in nodes.
For instance, OpenFX does not support dynamic inputs at all, but there is a proposal for a standard change.
Natron supports it internally; however, it might need some internal changes if we want to support it on the OpenFX side.
The next major version has a revamped Engine, which should hopefully make it a lot easier to integrate a 3D engine into it.
For now, it’s not out yet, so we cannot start drafting anything while it is still unstable.
I also strongly suggest you take a look at how Fusion and Nuke handle 3D.
I cannot give you any time estimate, but we are going to work on it. “When” will be determined by our clients’ needs. It might be that we have to implement Deep imaging first.
I don’t know Fusion, but I know Nuke very well: I have worked with it extensively in both 2D and 3D, and even recompiled its FFmpeg reader node to bypass its (then) blacklisting of some codecs.
I also know enough of OpenFX to make a crude Cycles “generator” plugin with no input. But I want to have a node-based GUI to edit its scene description format (for now, only XML).
I don’t want to turn Natron into a 3D package like Houdini, but I think that a 3D environment would bring it closer to something like Clarisse and other look-dev packages, where you bring in all the Alembic and OpenVDB animations, OSL shaders, textures, etc. within the compositing session.
I am also interested in Deep so I wouldn’t want to slow you down or anything, but I still think I would like to have a look at compiling Natron myself and implement Cycles as a built-in node until you bring OpenFX dynamic inputs and the revamped Engine in the open…
And I proposed Skype only because a voice conversation is faster than text but it might not be necessary as it stands.
Hey mate, I think it’s great that you want to integrate 3D.
In case you don’t already know, Blender has new viewport dev work going on which is already very promising.
V-Ray exists for Nuke, so why not have Cycles for Natron?
I can think of two devs you could talk to, who are both French.
These two fellas will know some people to help get you moving in the right direction, and you will be able to actually talk to them.
On a personal level I would say just go for it, even if there is a bit of a cold front here.
There will definitely be a lot of people willing to use and trial whatever you create.
Also, introducing 3D sooner rather than later is only going to encourage larger studios to start picking up Natron and giving back to it in dev work; this will only speed dev work up, making Natron better.
So if you’re looking for permission, then you have it from me.
Go for it!
Remember, it’s not where you start, it’s where you end up, brother.
Merry Christmas, and go kick ass, rock star, because you’re awesome.
I wanted to wait to talk to Alexandre before writing back on this thread. I skyped with him the day before yesterday and I think what came out of this conversation is good, so here is a small heads up.
I made my point that there is a balance between architectural elegance, the economic considerations the Natron team owes its paying clients, the features requested by non-paying Natron users, and those of unpaid actual or would-be Natron developers (who are also non-paying users). So for me, being the latter, it really comes down to whether or not the features I want (and am able to develop on an independent Natron fork if need be) are aligned with the long-term architectural views of the Natron team.
And it turns out they are. It also came to light that we agree on the first steps, both for them and for me. For them, it’s that their beta code for Natron 3.0, currently in the tls-wip branch of the Git repository, will see the light of day in the next couple of weeks. This Natron 3.0 architecture contains the necessary abstractions to further develop the features that you and I want. The first step for me will be to understand, document, and present those 3.0 changes to the OpenFX community, in the form of wiki RFCs and official proposals for changes to the OpenFX spec itself. Also, since I am from Montreal and the VFX community is very much alive here, I want to do a presentation aimed at R&D leaders in the various studios on why Natron 3.0 is the best platform for developing all those in-house projects. Then it will make sense for me to publish my code contributions; the first one will probably be a 3D framework with Cycles as the rasterizer.
So here it is. Hope it kind of makes sense. Stay tuned. I will talk to the Natron team in the next couple of days and “interview” them about Natron 3.0 and will publish all of this as I go.
Thanks. Happy new year!
That sounds so positive, and good luck with the R&D meet-ups, because that sounds like a bloody great idea.
Seriously, well done; can’t wait to hear more about this in the future.
I will be letting people know as well. Maybe you can do a YouTube live.
I’m looking forward to seeing how this goes. Like I’ve said, I can’t code, but I’m always willing to test