Why My AI Workflow Starts on the Page, Not the Prompt
The AI conversation in filmmaking has become almost theological—salvation or damnation, depending on who’s preaching. But for creators actually building films, that debate often skips the most practical question:
Who’s steering the ship?
There’s a misconception that using AI means surrendering creative control—that you push a button and the machine “dreams” for you. My process is the exact opposite. I don’t ask software to invent concepts for me. I build constraints, then I force the tools to execute a vision that already exists.
That’s why I treat the screenplay as the source code.
I wanted to share my personal pipeline for a recent mood reel to make one thing clear for fellow filmmakers: AI is not a slot machine. It’s a high-fidelity brush. To get a result that feels cinematic rather than synthetic, authorship has to come first.
The Pipeline: From Graphite to Generative
My workflow respects the traditional hierarchy of filmmaking, but adapts it for this new era. The tools change, but the order of operations does not. Intent first. Execution second.
The Analog Origin (The Sketch / Previsualization)
Every shot begins as a hand-drawn frame. Composition, silhouette, and emotional weight are decided in graphite before a single GPU is engaged. This isn’t nostalgia. It’s control.
When you commit to a visual decision on paper, you create an anchor the algorithm cannot casually override. The soul of the shot exists before it becomes data.
The Script (The Blueprint and Constraints)
That visual concept is then codified into the screenplay. Every beat in the 50-second reel was written before it was rendered. I pull descriptive language straight from the page—texture, atmosphere, tempo—and translate it into shot constraints.
I write shots as rules: what must not change (silhouette, mood, camera intent) and what may vary (micro-texture, lighting nuance, atmospheric noise). The AI isn’t hallucinating a world. It’s following a shot list.
The Execution (The Hybrid Finish / Look Development)
Next, the drawing and script constraints move into the digital space. This is where I work like a cinematographer. I refine lighting, control specular highlights, adjust fog density, and tune depth and material behavior until the image matches the promise of the page.
I’m not discovering the image here. I’m finishing it.
The Motion (The Performance Pass)
Once the frame is solid, I introduce motion through generative video tools. But motion, like blocking, is dictated by the script. If the scene calls for creeping, heavy dread, I don’t accept default animation. I iterate until the physics, pace, and weight feel emotionally correct.
Movement is treated as performance, not decoration.
The Finish (Editorial Is the Final Author)
After the renders and motion passes, the real finishing begins.
AI can generate clips, but it cannot assemble meaning. That last mile is editorial, and it’s entirely human.
I select takes, build the sequence, time the cuts, and shape transitions so the reel breathes like a single thought rather than a stack of outputs. I’m cutting for rhythm, not novelty.
I also do what I call “pitch-to-zoom” finishing: subtle reframing, gentle crops, and small keyframed movements that guide the eye and control tempo. Sometimes the camera leans forward. Sometimes it holds still. Sometimes it needs to listen.
This is where authorship becomes undeniable. The final piece isn’t the model’s taste. It’s my taste, applied frame by frame, in silence.
Vision as the Ultimate Guide
The result can look expensive and vast, but the real success is that it feels intentional.
If you see a specific creature silhouette, a deliberate camera drift, or a moment of unsettling stillness, you’re not seeing a random algorithmic decision. You’re seeing a directorial choice made on the page and in the sketchbook long before the render finished.
The Future of Independent Authorship
When AI is used without authorship, it produces content. It feels like a dream without a dreamer.
But when AI is used to execute a screenplay you’ve bled over—when it becomes the lens rather than the eye—it turns into the most powerful tool for independent world-building we’ve ever had.
What this era demands isn’t better software. It’s better authorship. The machine can render pixels, but only you can render intent.
Yikes! I didn't think about anyone using AI to review. The good news is that the agent who finally read my script gave me excellent notes. I do like Patrick's suggestion, which would give Stage 32 screenwriters the opportunity to score the reviewer. After putting our hearts and souls into our writing, I think we all want our work to be taken seriously.
Greenlight is actually pretty decent coverage. I wouldn't quote from it, but its reviews have been fairly accurate and have helped me with professional reviews like the Black List.
Lori Jones: Getting AI to review your work is feedback as good as many, at no cost.
I will try AI for that. However, since I don't know anyone in the business, when I reach out to industry professionals for a review, I'm also looking for contacts or interest in my screenplay.
"I'm also looking for contacts or interest in my screenplay." Getting a review gone, does not gain a contact, or interest in your work. Two different processes. You make contacts by chatting to people. (Like this)