This month’s VJ clip reward on Patreon is CoreTech. Prepare to get electrified:

Available On Patreon

CoreTech is a collection of 5 VJ clips available in 1080p and 2160p DXV formats. It features a sci-fi device composed of beat-synced coils and humming grills, captured from multiple angles. These clips can help superfans transform their event into a futuristic hi-tech environment. This reward will be available to superfans on Patreon until June 30th, 2023.

Superfans will also receive the blissful Resolume Arena project (v7.15) that I spent hours perfecting. I meticulously tinkered with positions, rotations, mirrors, cloners, and more to capture the full electric charge hidden within this device.

Creation Process

CoreTech was created using assets from the Sci-fi Kitbash Mega Elements, along with a sphere and a couple of ring objects.

I prepared the asset in Cinema 4D, utilizing the knowledge I acquired while creating the InnerBeats robot head. After importing it into Unreal Engine, I assigned metallic and emissive materials to the different parts. This immediately illuminated the scene and made it look interesting. (I explained how to create flashing emissive materials in the Metaverser creation process post.)

I created a couple of rings, each in different sizes, directly within Unreal Engine using the modeling tool. I applied constant emissive materials to them and, in a newly created sequence, animated their position above the Core. I wanted some dashed rings but couldn’t find a way to create them within Unreal Engine, so I made them in Cinema 4D and imported them as well.

I played around with the size, position, and colors of the sphere and rings until I was satisfied with the animation.

I positioned a camera in front of the Sci-fi device to capture a shot that would include the entire device, which I called a “full shot”. I tested a few different camera focal lengths, such as 35mm, 24mm, and 16mm, and ultimately chose the 16mm because it created a distortion effect that, in my opinion, made the parts appear to protrude out of the screen. I rendered a test clip using the Path Tracer renderer, which handled the metalness and emissivity of the materials exceptionally well. I was pleased with the result. Here’s a useful tutorial about rendering with the Path Tracer.
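The distortion from the 16mm lens comes down to field of view: shorter focal lengths see a much wider angle, which stretches objects near the frame edges. As a rough sketch (assuming a 36mm-wide full-frame filmback, which is an assumption; the actual sensor width depends on the Cine Camera settings), the horizontal FOV for each focal length I tested can be computed like this:

```python
import math

def horizontal_fov(focal_length_mm, sensor_width_mm=36.0):
    """Horizontal field of view (degrees) for a given focal length."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

for f in (35, 24, 16):
    print(f"{f}mm -> {horizontal_fov(f):.1f} degrees")
```

Going from 35mm to 16mm nearly doubles the horizontal FOV, which is what makes the device's parts appear to protrude toward the viewer.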

Next, I duplicated that camera and positioned it closer to the center of the device. After testing, I determined that a 35mm focal length worked well for the shot as there was no need for the distortion effect. I then duplicated the first sequence (where I had already animated the sphere and rings), named it “Close-up”, brought it into the level, opened it for editing, deleted the “full shot” camera, and replaced it with the “Close-up” camera.

I repeated this workflow two more times, each time creating a different camera: one for the side of the device and another at an angle to the Core.

While creating the angled shot, I felt I could add some flair with a dynamic camera move. So, I duplicated the camera and attached it to an empty actor that I created and positioned at the center of the Core. This let me use the actor as the camera’s pivot, much like a “null” object in Cinema 4D or After Effects. In retrospect, I should have used the Camera Rig Crane, which also has a pivot point, but this method worked effectively too.

I added this pivot actor to another duplicated sequence and animated its rotation to make the camera rotate 360 degrees around the core. I also animated the position and rotation of the camera to move closer or further away from the core. For this shot, I wanted the focus of attention to be on the core and not the background, so I activated the camera’s Depth of Field feature, opened the iris to 1.8, adjusted the exposure, and set the focus to track the sphere at the center of the core.
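The pivot trick is simple geometry: rotating the parent actor sweeps the attached camera along a circle around it. A minimal sketch in plain Python (the pivot coordinates and radius here are illustrative, not the actual scene values; in Unreal the attachment hierarchy does this math for you):

```python
import math

def orbit_position(pivot, radius, angle_deg, height=0.0):
    """Camera position on a circle of the given radius around the pivot.
    Rotating the pivot actor by angle_deg moves an attached camera,
    offset by `radius`, to this point."""
    a = math.radians(angle_deg)
    x = pivot[0] + radius * math.cos(a)
    y = pivot[1] + radius * math.sin(a)
    return (x, y, pivot[2] + height)

# One full 360-degree orbit, sampled every 45 degrees (keyframe intervals)
pivot = (0.0, 0.0, 150.0)
keys = [orbit_position(pivot, 400.0, a) for a in range(0, 361, 45)]
```

Animating the camera's own position on top of this (as I did to move closer or further from the core) just varies `radius` and `height` over time.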

However, when I rendered a test clip, I noticed a strange halo around the emissive rings. Upon further investigation, I discovered that in the post-process settings, under the Rendering/Path Tracing dropdown, I needed to check the box for “Reference Depth Of Field.” I learned how to do this from a tutorial titled Path Tracer Working With Depth of Field. I didn’t even know this was an issue.

Creating The Deck In Resolume Arena

As usual, once I had a test clip rendered, I took it to Arena and experimented with various effects. I checked how the hues rotated and scaled, applied colorizing to clips, used mirrors, cloners, and even sliced the clips with keystone crop and other tricks. I thoroughly enjoyed this process and gained some insights that I applied to the device in Unreal Engine before the final renders. For instance, I added small objects that collide with the side parts of the device to emphasize the beat better and made more parts of the device flicker with lights in sync with the beat.
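Syncing flashes and collisions to the beat boils down to time arithmetic: at a known BPM, each beat lands on a predictable frame. A small sketch (the BPM and fps values are illustrative assumptions, not the project's actual settings):

```python
def beat_frames(bpm, fps=30, duration_s=10):
    """Frame numbers that land on each beat for a clip at the given fps."""
    seconds_per_beat = 60.0 / bpm
    frames = []
    t = 0.0
    while t <= duration_s:
        frames.append(round(t * fps))
        t += seconds_per_beat
    return frames

# e.g. at 120 BPM and 30 fps, a flash keyframe lands every 15 frames
```

Placing emissive-intensity or collision keyframes on these frames keeps the rendered clip beat-synced when the VJ software plays it at the matching BPM.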

I’m delighted to share this project with superfans and hope that inspecting its different effect stacks can inspire, teach, or simply help people replicate them on their own clips.

What's Next?

I have started collecting ideas for future rewards, along with links to assets and images, on a Trello board. Oh my, this list is filling up with beautiful ideas. Choosing the theme for the next reward is difficult, but not impossible ))

I am always happy to hear from you, visual enthusiasts. If you want to share anything with me, you can email me at [email protected] or DM me on Patreon, Facebook or Instagram.

If you haven’t already joined the ranks of superfans, consider becoming a part of the wild bunch who are already receiving monthly VJ clip rewards from us. They bring fresh content to their shows and leave their audience (and clients) amazed by their performances. 🤘