Unreal Engine 5 tech demo was just as demanding on the GPU as Fortnite, according to Epic
By Neil Soutter on July 1st, 2020 at 02:30pm - original article from game-debate

Last month, Epic Games had quite the reveal when they announced Unreal Engine 5 with a stunning tech demo. There was a lot of nifty tech being used in the demo, but most importantly it allows developers to include photorealistic textures and high-quality geometry without sacrificing much performance (though we're pretty sure the demo ran at sub-30fps).

But according to Nick Penwarden, the Vice President of Engineering at Epic Games, that tech demo was actually just as demanding on the GPU as Fortnite:

"I can say that the GPU time spent rendering geometry in our UE5 demo is similar to the geometry rendering budget for Fortnite running at 60fps on consoles."

Basically, the GPU time spent rendering the game's geometry is pretty much the same in both the Unreal Engine 5 tech demo and Fortnite. That's a pretty big deal when it comes to performance optimization, especially in upcoming next-gen titles.
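
To put that budget in perspective: at 60fps a game has roughly 16.7ms to produce each frame, and the geometry pass Penwarden mentions is only one slice of that. Here's a minimal back-of-the-envelope sketch; the 25% geometry share is purely illustrative and not a figure Epic has given:

```cpp
#include <cstdio>

int main() {
    // At 60fps the whole frame has to be produced in 1000/60 ms, about 16.7ms.
    const double frameBudgetMs = 1000.0 / 60.0;

    // The geometry pass is only one slice of that budget; the rest goes to
    // lighting, shading, post-processing and more. The 25% split here is
    // purely illustrative, not Epic's actual figure.
    const double geometrySliceMs = frameBudgetMs * 0.25;

    std::printf("Frame budget at 60fps: %.1f ms\n", frameBudgetMs);
    std::printf("Illustrative geometry slice: %.1f ms\n", geometrySliceMs);
    return 0;
}
```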

Obviously the GPU does more than just geometry rendering, which is why the UE5 tech demo still didn't run quite as well as Fortnite would overall, but geometry is a large chunk of the workload. Fortnite is a very well-optimized game, no matter what you think of it, so being able to get a photorealistic image to run as well as Fortnite on lower-end hardware could potentially be a game changer.

This is all thanks to the new Nanite technology, also known as Nanite virtualized micropolygon geometry, which allows artists to create as much geometric detail as they want. The Nanite geometry is then scaled and streamed in real time, completely eliminating the need for artists to bake details into maps and manually author LODs (Levels of Detail).

Essentially, as a player gets closer to an object, more detail is streamed in, and vice versa when moving away from an object.
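
For a sense of what's being eliminated, here's a minimal sketch of traditional distance-based LOD swapping, the hand-authored system Nanite replaces with continuous streaming. All of the names, triangle counts, and distance thresholds below are hypothetical:

```cpp
#include <vector>

// A hand-authored LOD chain of the kind Nanite does away with: the artist
// bakes a handful of fixed-detail meshes and the engine swaps between them
// based on camera distance. All names and numbers here are hypothetical.
struct MeshLOD {
    int   triangleCount; // detail level baked by the artist
    float maxDistance;   // farthest camera distance at which this LOD is shown
};

// Pick the most detailed LOD whose distance threshold still covers the camera.
// Assumes the chain is ordered from most to least detailed.
const MeshLOD& SelectLOD(const std::vector<MeshLOD>& chain, float cameraDistance) {
    for (const MeshLOD& lod : chain) {
        if (cameraDistance <= lod.maxDistance) {
            return lod;
        }
    }
    return chain.back(); // beyond every threshold: use the coarsest mesh
}

int main() {
    std::vector<MeshLOD> chain = {
        { 100000,  10.0f }, // LOD0: full detail up close
        {  25000,  50.0f }, // LOD1: mid-range
        {   5000, 200.0f }, // LOD2: far away
    };
    const MeshLOD& chosen = SelectLOD(chain, 35.0f); // picks LOD1 here
    (void)chosen; // Nanite replaces this whole dance with continuous streaming
    return 0;
}
```

Nanite's pitch is that nobody has to author that chain anymore: the engine picks the right amount of detail on the fly from the full-detail source geometry itself.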

As I stated previously, this could be a game changer for lower-end hardware, as you'll no longer need the biggest and most expensive GPU to get the kind of graphics and performance you want. Those higher-end cards will still be better and help with higher frame rates, but in terms of making games more accessible (especially with the worries over CD Projekt Red's upcoming graphically stunning Cyberpunk 2077) this is clearly a big step.

What do you think? Could we see a hardware revolution next year? Will we need higher-end cards anymore? Or will they still have their place? And how would this affect the games and hardware industries overall? Let us know!