
Hoplite is a new VFX and 3D animation studio making a strong entrance right from its launch with a Star Wars fan film created by a team of passionate artists.
True to the spirit of the saga while showcasing exceptional visual quality, this short film is much more than a simple tribute: it is a genuine demonstration of the team's technical and artistic abilities.
Star Wars 212, the movie:
Maurine, thank you for taking the time to speak with us, and congratulations on this amazing fan film!
To start, how many people contributed to the film? How did you divide the roles between in-house teams and freelancers? How long did the entire production take?
Hoplite is a small studio, and apart from the management team, everyone worked on a volunteer basis, approaching this project as a personal adventure. As for the distribution of roles, each artist focused on their specialty while also adapting to other tasks based on their skills.
The project was overseen by one of the co-founders, who handled directing, team coordination, as well as a large part of the modeling, rigging, and animation. Other artists supervised their respective areas of expertise, from texturing to the final rendering.
As for me, I mainly contributed to scheduling, overall organization, and marketing aspects.
In total, there were 14 of us, including the composer, though this number varied depending on the production stage. The short film was completed over approximately 1.5 years.

I imagine the pipeline was somewhat customized?
Did you establish a solid, well-thought-out pipeline from the start, or did it evolve over time?
The pipeline didn’t change much over time. Since the artists are in different countries, we had to find a solution with significant storage capacity and fast upload and sharing speeds.
We opted for the cloud, which allows for smooth usage adapted to our production needs. It’s a very simple but effective architecture. This project also helps us identify which tools need to be implemented for an even more seamless production, and we’re already working on their deployment!
Did you follow the classic process: storyboard, previsualization, production, post-production, or did you take a different approach?

We started with a storyboard, quickly followed by the layout phase. However, this project required numerous back-and-forths between layout, animation, and shot validation for editing. Everyone had the opportunity to give their input throughout the process.
This was important to us because these exchanges helped identify what worked or didn’t in the film while enriching it with the team’s ideas and creative vision. The final edit was fully locked when the sequences were sent to the composer.
What were the main software tools used? In what contexts, and why did you choose them?
The list of tools we used is quite long!
For modeling, rigging, and animation, we primarily used Maya, with some motion capture through Move AI and Motorica for certain animation bases. Regarding environments, the choice between 3ds Max and Maya was left to the artists' preference.
Textures were created with Substance, while Houdini was essential for FX. For lighting and rendering, we combined Blender and Houdini. Finally, compositing was done in Nuke, and editing in DaVinci Resolve.
Although these software tools are commonly used in production, the choice of Blender for rendering is worth mentioning. Free, comprehensive, fast, and easy to deploy, it proved to be a major asset. Meanwhile, Houdini perfectly complemented the workflow for complex VFX rendering.

On the production side, did you use classic tracking tools like Flow or Kitsu? Or did you take a more unconventional approach with Google Sheets or Miro, for example?
Since this fan film had no production budget, we kept it simple by using a Google Sheet for production tracking.

Can we see any pre-production elements like mood boards, sketches, or other visual references?
Yes, here are some images from the storyboard. Our references are, of course, inspired by the entire Star Wars universe, including the films, original concept art, video games, and animated series.
We were also influenced by Syama Pedersen’s work on his Astartes project, particularly in terms of framing and visual rendering.

Did you film any reference footage to choreograph the shots?
We filmed some reference footage, mainly to study the biomechanical aspects of certain animations rather than to choreograph the shots. The shot composition itself was mostly developed using existing references.
Did you create the environments based on the desired animation, or did the environment dictate the placement of characters and cameras?
It depended on the shot. For the early sequences, a fairly large base environment was created, allowing the spaceship animation to adapt to it. For the closer shots that followed, we focused more on framing and repositioned environmental elements based on the desired animation.
For example, if a trooper needed to jump onto an object, we simply added it to the environment to match the action. This flexible approach allowed us to adjust each scene to achieve the precise look we wanted.
As for references, the animators filmed numerous takes but also watched certain droid sequences repeatedly to stay true to the original animation style.

We’re lucky to have this amazing breakdown:
That’s an impressive shot! You can clearly see the evolution of the animation from the layout to the final render. The character uses their jetpack to gain height, adding dynamism and enhancing the epic feel of the scene.
How did the team come up with this change? At what point in production was the idea integrated?
The choice to use the jetpack was made from the start. After two or three layout tests, we noticed that the shot lacked dynamism. Cody has always had a jetpack, yet it was never really used in the Star Wars universe, and we were determined to highlight it. This shot was the perfect opportunity. Fans' reaction to the jetpack in the teaser was incredibly enthusiastic!
You also mentioned the use of Mocap. Did you use software like VirtuCamera for handheld camera movements, or did you animate the camera with keyframes?
The cameras were entirely animated with keyframes, with noise and camera-shake effects layered on top to enhance the illusion of a handheld shot. Motion capture was used as a base for some character animations through Move AI.
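To make the idea concrete, here is a minimal Blender Python sketch of one common way to layer procedural shake on top of a keyframed camera. The team animated their cameras in Maya, so this illustrates the general technique rather than their actual setup; the camera name and noise values are hypothetical:

```python
import bpy

# Hypothetical shot camera; it must already have keyframed rotation_euler channels.
cam = bpy.data.objects["ShotCam"]
action = cam.animation_data.action

# Add a Noise F-curve modifier to each rotation channel so the keyframed move
# stays intact and the handheld-style jitter simply rides on top of it.
for fcurve in action.fcurves:
    if fcurve.data_path == "rotation_euler":
        noise = fcurve.modifiers.new(type='NOISE')
        noise.strength = 0.01                     # subtle angular jitter
        noise.scale = 12.0                        # lower = faster shake
        noise.phase = fcurve.array_index * 37.0   # desynchronize X/Y/Z channels
```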
Lasers are flying in all directions! How did you handle this aspect? Did you use a particle system or animate the geometry directly?
We implemented two distinct systems to handle the laser shots, both entirely created in Houdini. The first system randomly generates particles on each droid’s weapon, projecting them in the direction of the barrel. The second system follows the same principle but includes an on/off control system, allowing for precise keyframing of the shots. This approach provides both a dynamic result and greater control over the animation of battle sequences.
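Both systems were built in Houdini, but the on/off gating idea behind the second one is easy to sketch in plain Python. In this hypothetical illustration, each emitter fires along its barrel direction, either on a random per-frame probability (like the first system) or only on artist-keyed frames (like the second); all names and values are made up:

```python
import random
from dataclasses import dataclass, field

@dataclass
class LaserEmitter:
    """Toy model of the two firing modes: random bursts vs. keyed on/off control."""
    barrel_pos: tuple                 # world-space muzzle position
    barrel_dir: tuple                 # normalized firing direction
    mode: str = "random"              # "random" or "keyed"
    fire_probability: float = 0.15    # per-frame chance to fire in random mode
    keyed_frames: set = field(default_factory=set)  # frames the artist keyed "on"

    def emit(self, frame):
        """Return (position, direction) for a bolt if this frame fires, else None."""
        if self.mode == "random":
            fires = random.random() < self.fire_probability
        else:                          # "keyed": precise, keyframe-driven control
            fires = frame in self.keyed_frames
        return (self.barrel_pos, self.barrel_dir) if fires else None

# A background droid sprays randomly; a hero droid fires on exact frames.
background = LaserEmitter((0.0, 1.6, 0.0), (0.0, 0.0, 1.0))
hero = LaserEmitter((2.0, 1.6, 0.0), (0.0, 0.0, 1.0),
                    mode="keyed", keyed_frames={1012, 1019, 1026})

for frame in range(1010, 1030):
    for emitter in (background, hero):
        bolt = emitter.emit(frame)
        if bolt is not None:
            print(frame, bolt)
```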
For this shot, what was your biggest technical challenge and why?

The biggest challenge was the complexity of the visual effects: laser shots, smoke, fire, jetpack effects, and so on, all of which made the renders resource-intensive and time-consuming.
The second challenge, which applied to the entire project, was managing the rendering of different layers. We used Blender, Maya, and Houdini to compute the images, and certain parameters, such as motion blur, were difficult to handle because each render engine processes them differently. This created slight variations between renders, making compositing more complex.
For this particular shot, we had to manage multiple layers for characters, VFX, and other elements, requiring a highly meticulous rendering process.
Where can we follow you and see your work?
You can follow us on Instagram and LinkedIn for studio and production updates, and contact us at hoplitevfx@gmail.com.