
Octane and Unreal for Realtime and VR Architectural Visualisation

On-set Facilities

We took a look at realtime rendering for architectural visualisation, looking towards a future of archviz in VR. If you're into architecture, 3D, and thinking about space, you'll enjoy this walk into the future and the tools and methods that are emerging.

VR Architecture Visualisation

Very nice, can I go there? You can now.

First, a quick look at what 3D architectural visualisation is:

3D architectural visualisation is what it says it is: using 3D tools and technology to visualise the spaces and forms that humans will inhabit, work, live and play in. Architecture has many stakeholders, especially on a public project, where there will be engineers, architects, council planners, even government, and of course non-specialists, from the public and local media to taste leaders and financial institutions. Yet no matter where you sit among them, you need to see what's being proposed long before a spade hits the ground (I know no-one digs with a spade anymore, this is not 1950, but you get my point).

How has architectural visualisation been done up until now?

3D architectural visualisation (archviz) has become a vital part of the development process. The visualisation team works closely with the architects to gather as much information as possible: plans, sketches, mood boards, models, anything that will give them a good start when they try to replicate it all in 3D. Most often the architecture team hands a model over to the 3D visualisation team to start working on.

But no building is ever born in a vacuum. Buildings, homes, public spaces, they all live somewhere, and this is where the art of 3D visualisation comes in: artists texture, light and bring the visualisation to life.

Three generations of my family ran construction businesses and I grew up on building sites. None of them was ever dull; a bit dirty maybe, but they are surrounded by life, neighbourhoods, wildlife, the sky and, in my case, the UK weather.

Once the model has been textured and placed in a living world, the render team will often propose a number of points of view, and they may run a virtual camera to create moving images of the proposed site. These are sent in low-res to the stakeholders to sign off. Once the low-res points of view have been signed off, it's time to hit the render button and get those GPUs humming. Eventually, after a lot of high-res render work, the final points of view and animations are delivered, colour corrected, edited together and presented. So where are we heading, what is realtime rendering and how can architects use it?

Bring in Unity and Brigade / Unreal and Datasmith: the arrival of the game engines.

Enter stage left, the game engine; stage right, Maya, 3ds Max and Cinema 4D. Combining DCC tools with the game engines, it's now possible to create high-resolution virtual experiences that render in real time, or, as it's being called, realtime architectural visualisation. And it's happening now. Download Unity and OTOY's Brigade, or download Unreal Engine and Datasmith.

Photo Realistic London Apartment 4K Unity Engine | RTX 2080 Ti | i9 9900K 5.1GHz

How do you start rendering in realtime?

We are not talking about running virtual reality architectural visualisations on your Xbox or PlayStation. Where we are heading requires serious GPU power and computing. The kind of computers that are usually the domain of AI development and deep learning are now powering VR architecture experiences. Below is the spec of a middleweight realtime VR / gaming rendering machine that will drive a virtual visualisation. Running high-end simulations in VR takes a minimum of 90fps and direct SLI links to your (ideally multiple) GPUs. You can build VR visualisation machines for anywhere between £5,000 and £35,000.

  • Full Specification
  • Intel i9 9900K overclocked to 5.1GHz, no delid
  • Custom Waterloop
  • EK CoolStream XE360 Radiator
  • Swiftech Apogee XT CPU water block
  • XSPC Dual 5.25″ Drive Bay Reservoir V2 with Single D5 Vario Pump
  • Gigabyte Z390 Aorus Master Motherboard
  • NVIDIA Palit Gaming Pro OC RTX 2080 Ti Hybrid Modded
  • NZXT Kraken G10 Bracket + Kraken X41 AIO 140mm
  • NVIDIA GeForce 417.71 Driver
  • 32GB of Corsair Red LED Vengeance 3200MHz DDR4
  • Corsair Graphite 780T Black
  • Corsair HX 1000i PSU
  • Samsung 850 EVO 500GB
  • SanDisk SSD Plus 480GB
  • Samsung 970 EVO 250GB
  • TeamGroup L5 3D Lite 480GB
  • Toshiba 3TB HDD
  • WD Blue 3TB HDD
  • Acer XR342CK 3440×1440 Ultra Wide
  • Philips 436M6VBPAB Momentum 4K HDR monitor
  • Windows 10 Professional 64Bit Version 1809
Image courtesy of Ronen Bekerman.
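
To put that 90fps target in numbers: at 90 frames per second you have roughly 11 milliseconds to render each frame, and a stereo headset asks for two views inside that window. The snippet below is a minimal sketch of that frame-budget arithmetic; the per-eye resolution figures are illustrative placeholders, not a measured headset spec.

    # Rough VR frame-budget arithmetic (illustrative figures, not a benchmark).

    TARGET_FPS = 90                        # common minimum for comfortable VR
    FRAME_BUDGET_MS = 1000.0 / TARGET_FPS  # ~11.1 ms to render each frame

    # Hypothetical per-eye render target; real headsets vary.
    EYE_WIDTH, EYE_HEIGHT = 1440, 1600
    PIXELS_PER_FRAME = EYE_WIDTH * EYE_HEIGHT * 2   # stereo: two views per frame

    pixels_per_second = PIXELS_PER_FRAME * TARGET_FPS

    print(f"Frame budget: {FRAME_BUDGET_MS:.1f} ms")
    print(f"Pixels per frame (both eyes): {PIXELS_PER_FRAME:,}")
    print(f"Pixels per second the GPU must fill: {pixels_per_second:,}")

That per-second pixel count, on top of photoreal materials and lighting, is why the hardware list above leans so heavily on GPU power and cooling.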

The early stages of delivering VR architectural experiences are the same as delivering static 3D visualisations: you gather as much information as possible from your team, then ideally you visit the proposed site. You collect as many photos, textures, even sounds, as possible to bring back to the studio. You then take the architect's model, start building in 3D, add textures, animate people, maybe add cars and life.

But then you have to get your new assets into your engine of choice, and this can be tricky, as you will want to retain all that lovely rendering detail. At present game engines are not quite up to the level of, say, V-Ray or Octane. Not yet, anyway.

This is where the real technical knowledge comes in, and you had better be prepared for some serious testing. We've spent months, dare I say even years, perfecting how to get game engines to behave themselves when rendering photorealistic 3D assets. One thing you will need is plenty of power.

There are a number of good videos on YouTube that will help you create and export UV maps, and show you how to bake your 3D lighting and textures in Unity or Unreal. We use Datasmith, a plugin for Unreal Engine that enables you to import 3D objects into Unreal (a sketch of that step follows below). We like to use Octane for photoreal renders, but be warned: it's a considerable time sink and a pain in the backside, and you'll be going backwards and forwards between C4D and UE like never before.
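
As a rough illustration of where Datasmith sits in the pipeline, here is a minimal sketch of importing a .udatasmith file through Unreal's editor Python scripting. Treat the class and method names (DatasmithSceneElement, construct_datasmith_scene_from_file, import_scene) as assumptions to verify against your engine version, and the file paths as hypothetical.

    # Hedged sketch: importing a Datasmith scene with Unreal's editor Python API.
    # The class/method names are assumptions to check against your UE version.
    import unreal

    DATASMITH_FILE = "C:/Projects/ArchViz/Export/apartment.udatasmith"  # hypothetical DCC export
    DESTINATION = "/Game/ArchViz/Apartment"                             # hypothetical Content Browser folder

    # Parse the .udatasmith file into an in-memory scene description.
    scene = unreal.DatasmithSceneElement.construct_datasmith_scene_from_file(DATASMITH_FILE)
    if scene is None:
        raise RuntimeError("Could not read the Datasmith file")

    # Import the meshes, materials and lights into the project, then release the scene.
    result = scene.import_scene(DESTINATION)
    scene.destroy_scene()
    print("Datasmith import finished:", result)

Scripting the import like this pays off when a design revision arrives and the whole scene has to be re-imported rather than rebuilt by hand.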

But stick with it. You won't be sorry.

The idea, to me at least, is to use VR to create truly immersive architectural visualisations where stakeholders can explore a proposed environment, seeing and hearing what it will be like to be there in person. The good thing is, you can still deliver traditional 3D visualisations alongside the VR experience; it's what we call a "create once, exploit many" content creation model. You can find out more about what we do at OSF here.

 
