
Unreal Engine is bringing a ton of tutorials, from virtual concert halls to immersive eye-candy FX


Some events translate poorly online. But making worlds in Unreal Engine has always involved sitting behind a display – so these tutorials prove essential. And they might just help audiovisual producers make new immersive work.

Epic is hosting Unreal Fest, but since we’re largely interested in new visual effects and event production, I’ll stay away from the gaming topics. What I think is promising is that Unreal may prove a virtual venue for both artists and technicians. It suggests solutions not only for online experiences, but also new skills, new media, new delivery tools – and ways of standing out for years ahead.


As usual, Unreal tends not to tilt these events toward artists, but that doesn’t mean there aren’t tools to discover.

Oh and – update. While it has no direct bearing on Unreal Engine itself, Fortnite was booted from both Apple and Google app stores after Epic Games circumvented the payment mechanisms of those two companies. It does seem like a reckoning is coming with closed stores, between developers unhappy that the 30% cut is too high, and conversely Apple facing the danger of losing the entire Chinese market if it can’t distribute popular apps like WeChat. In both cases, Android is less impacted – you can simply sideload those apps. But Epic is suing Google, as well.

Virtual events

Here’s the big one – Grayson Edge talks about how events like the Travis Scott concert were pulled off in UE4:

For reference:

But it’s also possible that Unreal Engine can work as a way of creating virtual venues for concerts, with the kinds of acoustics a real-world space would have. 4.25 added convolution reverb and ambisonic soundfield rendering to help you pull that off. And clearly with the present situation in mind, the folks at Unreal show off how that might work to produce a virtual concert space:
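The core idea behind convolution reverb is simple, even if Unreal’s implementation is not: record (or simulate) a space’s impulse response, then convolve your dry signal with it so the audio picks up that room’s reflections and decay. A minimal NumPy sketch of that idea – the decaying-noise “room” here is a toy stand-in for a real measured impulse response:

```python
import numpy as np

def convolution_reverb(dry, impulse_response):
    """Apply a room's impulse response to a dry signal via FFT convolution.

    The wet signal is the linear convolution of the two, with length
    len(dry) + len(impulse_response) - 1 -- the extra tail is the room decay.
    """
    n = len(dry) + len(impulse_response) - 1
    # Multiply spectra, then inverse-transform: convolution in the time domain
    wet = np.fft.irfft(np.fft.rfft(dry, n) * np.fft.rfft(impulse_response, n), n)
    # Normalize the peak to avoid clipping
    return wet / np.max(np.abs(wet))

# Toy example: a single click played through an exponentially decaying
# noise burst, which crudely mimics a reverberant hall.
sr = 48000
dry = np.zeros(sr // 10)
dry[0] = 1.0                                   # unit impulse ("click")
rng = np.random.default_rng(0)
ir = rng.standard_normal(sr // 2) * np.exp(-np.linspace(0, 8, sr // 2))
wet = convolution_reverb(dry, ir)
```

In practice you would load a measured impulse response of the venue you want to emulate; the math stays the same.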

Live control, live events

Working with live, real-time control opens up performance possibilities – even if the performance winds up being virtual.

DMX ins and outs? Yes! MIDI control? Sure! And here’s where this gets crazy – you might be rigging both real-world and virtual lights, together. It’s like being inside and outside the Matrix at the same time. (Just in case you decided to take both the red and blue pills. I shudder to think what you’d do with the Marshmallow Test.)
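For the uninitiated, DMX512 – the lighting protocol Unreal now speaks – is conceptually tiny: each packet carries a start code followed by a “universe” of up to 512 one-byte channel levels, and fixtures listen on their assigned channels. A minimal Python sketch of building one such frame (the fixture layout in the example is hypothetical, and actual transmission hardware/timing is out of scope):

```python
def build_dmx_frame(levels):
    """Build one DMX512 data packet: a zero start code followed by
    512 channel values, one byte each (0-255).

    `levels` maps 1-based channel numbers to levels; unset channels
    stay at 0 (blackout).
    """
    universe = bytearray(512)
    for channel, value in levels.items():
        if not 1 <= channel <= 512:
            raise ValueError("DMX channel out of range: %d" % channel)
        universe[channel - 1] = max(0, min(255, value))  # clamp to one byte
    return bytes([0x00]) + bytes(universe)  # 0x00 = start code for dimmer data

# Hypothetical rig: channel 1 = master dimmer, channels 2-4 = RGB color
frame = build_dmx_frame({1: 255, 2: 255, 3: 64, 4: 0})
```

The same universe-of-channels model applies whether the fixture on the other end is a physical moving head or a virtual light inside the engine – which is exactly why rigging both together works.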

And yeah, I see an 808 and a Sensel Morph in there! Live external control:

Game jams

Learn from others how to make a game jam successful – for the uninitiated, that’s a way of cranking through quick development and experimenting with ideas, hackathon-style:

3D production and vfx

Working with Blender will be a big boon to artists developing in that tool. Unreal of course isn’t open source, but the combination of free as in freedom with free as in beer (for most uses) sure isn’t bad. (There are also other tutorials on importing material from Blender to Unreal, so you can make your models in Blender.)

Volumetric effects are just spectacular in Unreal now, with their Niagara system:

And Niagara in general is worth a look – a serious VFX particle system in a game engine:
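Strip away the GPU scheduling and node graphs, and a particle system like Niagara boils down to a loop: an emitter spawns particles with initial attributes, a per-frame update applies forces and ages them, and expired particles are culled. A toy CPU sketch of that loop in NumPy – the attribute names are mine, not Niagara’s API:

```python
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.8])

def spawn(n, rng):
    """Emit n particles: origin position, random upward velocity, random lifetime."""
    return {
        "pos": np.zeros((n, 3)),
        "vel": rng.uniform([-1, -1, 4], [1, 1, 8], size=(n, 3)),
        "age": np.zeros(n),
        "lifetime": rng.uniform(1.0, 3.0, size=n),
    }

def update(p, dt):
    """One simulation tick: integrate gravity, advance positions, cull the dead."""
    p["vel"] += GRAVITY * dt
    p["pos"] += p["vel"] * dt
    p["age"] += dt
    alive = p["age"] < p["lifetime"]
    return {key: values[alive] for key, values in p.items()}

rng = np.random.default_rng(42)
particles = spawn(1000, rng)
for _ in range(240):                  # simulate 4 seconds at 60 fps
    particles = update(particles, 1 / 60)
```

Real systems layer spawn rates, per-particle modules, and rendering on top, but the spawn/update/cull cycle is the skeleton underneath.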

It even integrates with Houdini:

This stuff is transforming in-camera effects even in TV production. Sure, you may not be HBO, but it also suggests a blurring between real-time effects and what you watch on TV or in a film:

Audio crash course

It’s good to see that Aaron McLeran is still on the audio team. And here, audio folks can feel good knowing they already understand a lot of this:

Of course, tons more online information on the Unreal Engine site.




