Hello all,
I don’t really agree; the video is from a gamedev’s, or rather a gamer’s, perspective.
Gamedev in general is known for terrible WLB and low pay. Combine that with the quarterly revenue cycles of big corps and you get absolute trash built to hit deadlines.
People who call UE4/5 trash should try using the alternatives (guess how many exist?!). There are major issues with certain tech implementations, but devs are free not to use them.
Cryengine was fun to use but it had major issues back in the day. O3D is just a broken mess that Amazon offloaded to open source; it’s so shit that the engine editor crashes at idle.
Unity was and probably still is ok for indie devs but it’s been years since I last tried it.
UDK was genuinely fun and so was UE4.
Source/Hammer is just too old for me personally, idk about licensing it either.
Godot, hmm. Not a fan, let’s say. But a good starting point for indie devs in the open source scene.
Also, devs are being lazy, putting all the enhanced graphics behind RT and making the GPU do all the work. Older games had quite nice graphics for their time without needing some special feature name or fully offloading everything to the hardware.
Using the extra hardware on the GPU is great, no?
And laziness is a byproduct of corporate culture. Do you really think the likes of Ubisoft/Activision etc. do not have the in-house talent to address these technical limitations? They have access to the complete source code and the license to modify it.
A lot of these engine features are built in collaboration with big studios and hardware vendors.
Maintaining an in-house engine can be a pain and requires a fairly high investment. It is usually deemed not worth it by most.
The majority of indie games fail btw, so there is no incentive to optimize.
Yes, if it’s used for a genuine enhancement without taking away something from you.
What has been happening of late is that the dev effort to build pre-baked raster lighting has been scrapped altogether in many games.
What that means for an end user (regardless of the hardware they own) is that RT becomes mandatory.
So e.g. in a game like CP2077, you could choose to keep RT off. The game would still look good, so you could bump up from QHD to UHD on a 5080, or have it perfectly playable at FHD on a Z1 Extreme.
Alternatively, you could keep RT on for the 5080, if that’s what you prefer, for a small visual boost at the cost of lower res.
In current-gen games (e.g. Star Wars Outlaws), the lack of raster lighting = mandatory RT, which ends up making the game look terrible on the same Z1E and takes away the choice of higher res on the same 5080.
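To make that trade-off concrete, here’s a minimal Python sketch (toy model, hypothetical names, not code from any real engine) of why shipping a baked raster path preserves the player’s choice, while scrapping it makes RT mandatory for everyone:

```python
# Toy model: which lighting path a player actually ends up with,
# depending on whether the studio shipped baked raster lighting.

def pick_lighting_path(ship_baked_lightmaps: bool, gpu_has_fast_rt: bool) -> str:
    """Return the lighting path available to the player (hypothetical helper)."""
    if ship_baked_lightmaps:
        # Fallback exists: strong GPUs can opt into RT, weak ones stay on raster.
        return "rt" if gpu_has_fast_rt else "baked_raster"
    # No raster fallback shipped: every GPU must trace rays,
    # including handhelds like a Z1 Extreme.
    return "rt"

# Baked lighting shipped: a handheld still gets a playable raster path.
print(pick_lighting_path(ship_baked_lightmaps=True, gpu_has_fast_rt=False))   # baked_raster
# Baked lighting scrapped: the same handheld is forced onto RT.
print(pick_lighting_path(ship_baked_lightmaps=False, gpu_has_fast_rt=False))  # rt
```

It’s deliberately oversimplified, but it captures the point: the choice only exists if the studio pays for the baked path.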
That’s a business and technical decision due to timelines. Not really an issue with specific game engines.
Do you remember the CP2077 launch on last gen entry level consoles? They couldn’t run it at all iirc.
Never said it’s a game engine problem.
It’s more of a business decision to reduce the effort that goes into hand-coding elements that can now be offloaded to the new game engines (but at a big cost to the end user).