Disclaimer: This is an internal pipeline experiment, built with AI-assisted Python scripting. It’s a work-in-progress for testing my own productions — not a finished or production-ready tool.


08-15-2025
While updating my Blender add-ons in preparation for a new practice scene, I saw a big update for Shot Manager and decided to experiment with a pipeline that included its new features along with Blender 4.4.
I’ve been a big fan of this add-on for a while, and I highly recommend trying it out and purchasing it if it fits your workflow. My goal for this project was both to sharpen my cinematic design skills and to broaden my collaboration skillset with tech art, specifically by building tools to smooth out my cinematic workflows.

The initial challenge
As I explored Shot Manager’s latest updates, I realized I wanted to export specific layers from Blender and bring them into After Effects for compositing.
These layers included:
- Characters
- Ships
- Grease pencil VFX (smoke, trails)
- Background smoke and back trails
Having these on separate layers would let me add targeted filters and effects in AE.
But before I could get there, I needed to set some naming conventions and organization rules.
This was my starting point:



The growing problem
As the project grew, so did the folders. My process at the time was slow:
1. Go into each folder manually.
2. Select a PNG sequence.
3. Check and adjust the import settings.
Things also got more complicated when switching machines: I discovered my default AE ingest framerate was different on each computer. Files were importing at the wrong speed, forcing me to go back and fix them one by one.

Planning for scalability
This made me realize:
- Framerate management needed to be baked into the pipeline (a rough sketch of that idea follows after this list).
- I wanted the system to handle multiple scenes or even an ongoing series.
- I needed a process that would catch errors early instead of fixing them later.
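As a rough illustration of the first point (not the actual tool), here is a minimal sketch of what baking the framerate into the pipeline could look like: a tiny shared config that every script reads, plus a check that fails fast instead of silently importing at the wrong speed. The filename, keys, and the 24 fps default below are assumptions for the example.

```python
# shot_lode_config.py -- hypothetical sketch of a shared pipeline config.
# The filename, keys, and the 24 fps default are placeholders, not the real tool.
import json
from pathlib import Path

DEFAULT_CONFIG = {
    "framerate": 24,              # single source of truth for AE ingest speed
    "project": "practice_scene",  # placeholder project id
}

def load_config(path: Path = Path("shot_lode_config.json")) -> dict:
    """Read the shared config, creating it with defaults on first run."""
    if not path.exists():
        path.write_text(json.dumps(DEFAULT_CONFIG, indent=2))
    return json.loads(path.read_text())

def require_framerate(config: dict, expected: float) -> None:
    """Fail fast if this machine's config drifts from the expected framerate."""
    if config["framerate"] != expected:
        raise ValueError(
            f"Framerate mismatch: config says {config['framerate']} fps, "
            f"pipeline expects {expected} fps."
        )

# Example usage:
# cfg = load_config()
# require_framerate(cfg, expected=24)
```

With something like this in place, a machine whose config has drifted raises an error before any sequences are imported at the wrong speed.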
I started documenting my pain points, mistakes, and ideas for standardizing the process.

A naming epiphany
While figuring out how to script the first batches of shots, I started playing with terminology to describe these collections of files. I kept circling back to the word lode:
Lode: a vein of metal ore in the earth.
The analogy felt right; these batches were like valuable veins of data to be mined and processed. I landed on Shot Lode as the name.
Naming may seem minor, but having a clear, logical, and evocative naming scheme early on helps avoid confusion when expanding a toolset.

Next steps
With ChatGPT, I began mocking up early versions of a Shot Lode Loader pipeline tool. The scope was clear:
- Locate exported PNG sequences from Blender’s Shot Manager output.
- Automatically structure them for After Effects ingestion (a rough sketch of these first two steps follows this list).
- Provide room for user adjustments before the final export.
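To make that scope concrete, here is a minimal, hypothetical sketch of the locate-and-structure steps: walk the render output folder, group frames into sequences by base name, and write a manifest that an AE ingest script (or a human) can review before the final export. The folder layout, frame-number pattern, and manifest filename are assumptions, not the actual Shot Lode Loader code.

```python
# Hypothetical sketch: group exported PNG frames into sequences and
# write a manifest for later AE ingestion. Paths and naming are assumptions.
import json
import re
from collections import defaultdict
from pathlib import Path

# Matches names like "shot_001_FG.0001.png" or "smoke_0234.png".
FRAME_RE = re.compile(r"^(?P<base>.+?)[._](?P<frame>\d{3,})\.png$", re.IGNORECASE)

def find_sequences(render_root: Path) -> dict[str, list[str]]:
    """Map each sequence base name to its sorted list of frame files."""
    sequences: dict[str, list[str]] = defaultdict(list)
    for png in render_root.rglob("*.png"):
        match = FRAME_RE.match(png.name)
        if match:
            key = str(png.parent / match.group("base"))
            sequences[key].append(png.name)
    return {base: sorted(frames) for base, frames in sequences.items()}

def write_manifest(render_root: Path, out_file: Path) -> None:
    """Dump the discovered sequences so a user can review/adjust before export."""
    out_file.write_text(json.dumps(find_sequences(render_root), indent=2))

# Example usage (paths are placeholders):
# write_manifest(Path("renders/shot_manager_output"), Path("shot_lode_manifest.json"))
```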
This was the start of my journey toward a unified media file manager that can handle both the creative and technical demands of cinematic design.
Images below show progress comparisons from start to current build.
Future updates coming soon.
Shot Lode Builder.
Shot Lode Loader/Inspector.




One feature I was able to build into the Shot Lode Loader is the ability to set the order of the layers so they are set up properly for my custom layer needs in Adobe After Effects.
In the example below, this order ensures that the Foreground (FG) is the top layer comp, the midground (MG) is the center layer comp, and the background (BG) is the bottom layer comp inside a new custom Main Comp that the AE ingest script creates on run for each of the numbered main layers in the viewer.
For example, 001, 002, and 003 each have their own folder (as seen below), along with a comp for the _prime layer, plus a new prime_main comp built from the FG, MG, and BG comps, which should match the _prime comp if you brought them into a composition together. The _prime layer is the exported composite of all layers from Blender; the others (FG, MG, BG) are each a sub-layer with an alpha, meant to be stacked and given their own effects and adjustments per layer.
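As a hedged illustration of how that ordering could be recorded (not the actual Shot Lode Loader code), the tool could write a small per-shot manifest that the AE ingest script reads when it builds each prime_main comp. The manifest shape and helper names below are assumptions; only the FG/MG/BG order and the _prime / prime_main naming come from the description above.

```python
# Hypothetical sketch of a per-shot layer-order manifest. An AE ingest
# script would read this and stack the comps top-to-bottom in the main comp.
import json
from pathlib import Path

# Top-to-bottom stacking order for each generated *_prime_main comp.
LAYER_ORDER = ["FG", "MG", "BG"]  # Foreground on top, background on the bottom

def build_shot_entry(shot_id: str) -> dict:
    """Describe one numbered shot folder (e.g. '001') for the ingest script."""
    return {
        "shot": shot_id,
        "layers": LAYER_ORDER,                 # order the AE script should preserve
        "prime_comp": f"{shot_id}_prime",      # Blender's full composite, for reference
        "main_comp": f"{shot_id}_prime_main",  # comp rebuilt from the FG/MG/BG comps
    }

def write_layer_manifest(shot_ids: list[str], out_file: Path) -> None:
    """Write the ordering manifest the AE ingest script would consume."""
    out_file.write_text(json.dumps([build_shot_entry(s) for s in shot_ids], indent=2))

# Example usage (shot ids and filename are placeholders):
# write_layer_manifest(["001", "002", "003"], Path("layer_order_manifest.json"))
```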



Disclaimer: This post describes the creation of an internal tool I am developing for my own cinematic pipeline experiments. It is both a practical exercise in systematic problem-solving and a way of exploring how large language models (LLMs) like ChatGPT can assist with Python scripting for technical art workflows. This tool is actively being tested only on my own productions, and I make no claims about its completeness or reliability. The work shared here is meant as an exploration of what is possible for non-expert coders when using AI to simulate a tech-art dialogue, rather than as a finished or production-ready solution.