Today we want to talk about EbSynth V2, a tool that has fascinated us ever since its first version was released for free a few years ago. The original won us over immediately with its ability to apply interpolations and creative transformations to our footage.
In some ways, EbSynth anticipated the world of generative AI applied to video. The software now returns in Version 2, which introduces a redesigned interface and, above all, the ability to manage the entire workflow online through a dedicated platform.
With the launch of EbSynth V2, creative video editing takes a leap forward. This software, developed by Šárka Sochorová and Ondřej Jamriška, allows you to transform an entire clip by modifying just a single frame.
What is EbSynth?
In the landscape of creative VFX tools, EbSynth occupies a fascinating niche: it allows you to transform entire videos starting from the modification of a single keyframe. You load a video, paint or retouch one of its frames, and let EbSynth propagate that aesthetic across the entire sequence.
The software has also become known through productions such as Undone and Wednesday, and it is even mentioned in connection with Doctor Who.
What’s New in EbSynth V2
Interface with real-time preview
One of the most noticeable improvements in EbSynth V2 is the real-time preview: you see the result of your edits immediately, without waiting through long render times.
Use Cases & Practical Examples
- Undone: the “painted” effect is achieved by intervening on keyframes and propagating the style with EbSynth.
- VFX and motion design studios use it for local retouches, selective color grading, cleanup, and removal of unwanted elements—working only on keyframes and letting the rest follow.
- 3D renders can shed their plasticky look: add hand-painted textures in an artistic style and propagate them with EbSynth for a more organic result.
Timeline, layers and brushes
The new interface includes a timeline for navigating frames, layer support for non-destructive edits, and brush tools that work directly on the video. This means you don't need to prepare external masks: you can retouch from within the application itself.
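In practice this is classic non-destructive layer compositing: each retouch lives on its own layer with an alpha mask, and the original frame stays untouched underneath. As a rough illustration of the idea (our sketch, not EbSynth's internal code), a standard "over" composite looks like this:

```python
import numpy as np

def composite_over(base: np.ndarray, layer_rgb: np.ndarray,
                   alpha: np.ndarray) -> np.ndarray:
    """Standard 'over' alpha compositing. The base frame is never
    modified, so deleting or re-editing the layer is always reversible
    (the essence of non-destructive editing)."""
    a = alpha[..., None].astype(np.float32)  # HxW mask in [0, 1] -> HxWx1
    blended = layer_rgb.astype(np.float32) * a + base.astype(np.float32) * (1.0 - a)
    return blended.astype(base.dtype)
```

Because the base pixels are never overwritten, toggling a layer on and off or repainting it costs nothing and loses nothing.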
“Non-AI” algorithm: example-based texture synthesis
It’s important to emphasize that EbSynth is not based on pre-trained generative models. Frame propagation uses an example-based texture synthesis algorithm, preserving detail and structure from the original material without arbitrary “inventions.”
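The core idea, described in the authors' published work on stylization, is patch-based: for each location in the new frame, find the best-matching patch in the keyframe's guide image and copy the corresponding stylized pixels. A deliberately naive brute-force sketch of that matching step follows (toy-sized grayscale guides only; the real algorithm uses PatchMatch-style acceleration and multiple guide channels):

```python
import numpy as np

def example_based_synthesis(style: np.ndarray, guide_src: np.ndarray,
                            guide_tgt: np.ndarray, patch: int = 5) -> np.ndarray:
    """For every patch in the target guide, find the most similar patch in
    the source guide and copy the matching pixel from the stylized exemplar.
    Brute force, O(n^2) -- for illustration on tiny images only."""
    r = patch // 2
    h, w = guide_tgt.shape
    out = np.zeros((h, w) + style.shape[2:], dtype=style.dtype)
    # Flatten every source patch once, remembering its center coordinate.
    coords = [(y, x) for y in range(r, guide_src.shape[0] - r)
                      for x in range(r, guide_src.shape[1] - r)]
    src = np.stack([guide_src[y - r:y + r + 1, x - r:x + r + 1].ravel()
                    for y, x in coords]).astype(np.float32)
    for y in range(r, h - r):
        for x in range(r, w - r):
            q = guide_tgt[y - r:y + r + 1, x - r:x + r + 1].ravel().astype(np.float32)
            sy, sx = coords[int(np.argmin(((src - q) ** 2).sum(axis=1)))]
            out[y, x] = style[sy, sx]
    return out
```

Because every output pixel is copied from the exemplar rather than sampled from a learned model, detail can only come from what the artist actually painted.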
How It Works: Workflow and Operating Flow
Keyframe and propagation
The starting point is the keyframe, which the artist paints or modifies. EbSynth analyzes the guiding video (optical flow, patch matching) and propagates those modifications to neighboring frames, respecting motion, structure, and detail.
If the movement or orientation of the subject changes too drastically (angles, rotations, occlusions), it is advisable to add intermediate keyframes to maintain quality.
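To make the propagation step concrete, here is a minimal sketch of the flow-guided warping idea using OpenCV's Farnebäck optical flow. This is not EbSynth's actual pipeline, which combines flow with the patch-based synthesis described above; it only shows the core mechanism of carrying a painted keyframe along the motion of the guide footage:

```python
import cv2
import numpy as np

def propagate_one_frame(key_styled: np.ndarray, key_gray: np.ndarray,
                        next_gray: np.ndarray) -> np.ndarray:
    """Warp the stylized keyframe toward the next frame along dense optical
    flow computed on the original (guide) footage, given as grayscale."""
    # Flow from the NEXT frame back to the keyframe, so that for each target
    # pixel we know where to sample the keyframe (backward warping).
    # Positional args: pyr_scale, levels, winsize, iterations, poly_n,
    # poly_sigma, flags.
    flow = cv2.calcOpticalFlowFarneback(next_gray, key_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = next_gray.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x + flow[..., 0]).astype(np.float32)
    map_y = (grid_y + flow[..., 1]).astype(np.float32)
    return cv2.remap(key_styled, map_x, map_y, cv2.INTER_LINEAR)
```

Chaining this frame after frame accumulates drift and leaves holes wherever the subject occludes itself, which is exactly why intermediate keyframes are the recommended fix.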
Generation via AI prompt (optional)
An interesting addition is the integration of AI features for generating keyframes from text prompts. You can ask for a "watercolor style" or a "pen sketch" and get a starting point you can then refine by hand. The strength parameter controls how closely the generated result adheres to the original video frame, so the subject stays recognizable.
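EbSynth does not document which model powers this feature, but the strength parameter behaves like the one in standard image-to-image diffusion pipelines, where it sets how far generation may drift from the input. A hypothetical equivalent using Hugging Face diffusers (the checkpoint and values here are illustrative, not EbSynth's backend):

```python
import torch
from diffusers import StableDiffusionImg2ImgPipeline
from PIL import Image

# Any img2img-capable checkpoint works; this one is just an example.
pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

frame = Image.open("keyframe.png").convert("RGB")

# strength near 0 stays close to the original frame; near 1 the prompt
# dominates and the subject may become unrecognizable.
styled = pipe(prompt="watercolor painting, soft washes of color",
              image=frame, strength=0.45, guidance_scale=7.5).images[0]
styled.save("keyframe_watercolor.png")
```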
Finishing and export
After propagation you can refine the result with fades, opacity adjustments, smooth transitions, and selective corrections. Once you are satisfied, you export the finished video in a few clicks.
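A common finishing trick, already familiar from the first EbSynth, is to propagate the same segment from two neighboring keyframes and crossfade where the runs overlap, hiding the moment one propagation degrades and the next takes over. A minimal sketch of that blend:

```python
import numpy as np

def crossfade(run_a: list, run_b: list) -> list:
    """Linearly blend two overlapping propagated runs of equal length,
    fading from keyframe A's propagation into keyframe B's."""
    n = len(run_a)
    out = []
    for i, (a, b) in enumerate(zip(run_a, run_b)):
        t = i / max(n - 1, 1)  # 0.0 at the first frame, 1.0 at the last
        blend = a.astype(np.float32) * (1.0 - t) + b.astype(np.float32) * t
        out.append(blend.astype(a.dtype))
    return out
```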
Advantages, Limitations, and Best Practices
EbSynth offers many advantages, but it is useful to know the critical points to use it effectively:
Advantages
* Preserved detail: the content of the keyframe remains coherent in propagated frames, without “hallucination” artifacts typical of generative AI.
* Maximum artistic control: the artist decides what to modify and where.
* Rapid iteration: with real-time preview it becomes easier to correct mistakes.
* Flexibility: browser support + an offline Studio version.
Limitations & critical points
* Scenes with heavy occlusion or complex motion may generate artifacts.
* Elements that appear new (not present in the keyframe) cannot be coherently “invented.”
* Hardware performance matters: 4K videos or long sequences require a capable GPU.
* It is essential to choose representative keyframes and add intermediate frames where needed.
Community & user best practices
* On forums like Reddit, many users report that giving the keyframe stronger contrast than the footage helps EbSynth "favor" the desired detail (see the sketch after this list).
* Isolating elements in separate layers gives finer control (e.g. face, background).
* Using soft lighting and rich textures helps improve optical flow tracking in the guiding video.
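To apply the contrast tip from the first bullet, one simple approach is to push the painted keyframe's contrast slightly beyond the footage before running the synthesis. A quick sketch with OpenCV (the gain and bias values are illustrative starting points, not community-endorsed numbers):

```python
import cv2

key = cv2.imread("keyframe_painted.png")
# convertScaleAbs computes |alpha * pixel + beta|: alpha > 1 raises
# contrast, a small negative beta keeps midtones from washing out.
boosted = cv2.convertScaleAbs(key, alpha=1.15, beta=-10)
cv2.imwrite("keyframe_boosted.png", boosted)
```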
Final Thoughts & Prospects
EbSynth represents a fascinating bridge between manual editing and guided automatic propagation: it does not aim to replace the artist, but to empower them. With EbSynth V2, the flow becomes more fluid, interactive, and suited for professional contexts, while preserving the original philosophy: no “magical AI,” but texture synthesis guided by the artist.
Usage modes include both a browser-based version (useful for prototyping and lightweight workflows) and an on-premise Studio version with fully offline processing, so no footage needs to be uploaded.
The message is clear: experiment, share results, and contribute to the evolution of a tool that has the potential to become a benchmark in the fields of motion design, artistic VFX, and advanced video retouching.
