If you know this and understand the history of the game engine, then you know your earlier comment about “bad optimization” has nothing to do with it. It’s a game engine built in a different era, one that carries certain limitations with regard to the modern multicore trend. Optimization has nothing to do with the engine’s innate limitations in this regard, as there isn’t much to be done short of a total engine rewrite or a move to a different, modern engine, which is likely way more hassle than any dev wants to deal with.
Anyway, we’ll see if the DX11 improvements they are working on can help ease the pain some feel.
A 7700K @ 4.9 GHz and 32 GB of DDR4-3600 15-15-15-36 are not bottlenecking anything. And if they are, I know the culprit is in fact this game’s optimization. rofl
@powbam: In this context, I’m not going to argue about semantics. You’re probably right from a technical POV, but it’s irrelevant whether it’s engine limitations or bad optimization. This game is running like crap on even the fastest systems, and the players are the wrong people to blame. I just wish people would stop with “your CPU is limiting you” etc. It’s hilarious.
At this point I’m starting to think you do not know what you are talking about.
Tip: This is the fastest CPU available for gaming, with the highest single-core performance. And it’s overclocked. I can’t upgrade here. I can only add more cores, which GD doesn’t give a damn about.
While I can agree with the CPU argument, since clearly enough people experience the issue that it shouldn’t be lightly shrugged off, I feel that the “semantics” are important. Saying it’s badly optimized is placing blame on the devs and frankly insulting them, while most of us who do know the truth and status of the engine’s innate limitations can only roll our eyes every time we see it said. Would I like to see these issues resolved? Yup. But I will understand if they are unable to due to feasibility, cost, and time.
I don’t blame people who say it tho. It’s said out of ignorance in most cases. Ignorance can be cured with knowledge except in the case of people wracked by bullheadedness/stupidity. It’s much harder to cure those types.
The simple truth is in 2009 they licensed (and effectively inherited) a game engine that first saw the light of day in 2006. They worked with what they had and were successfully funded by TQ fans, most of whom probably knew full well what they were working with. They made it thru Kickstarter and Steam EA and released the game. Not bad for building off an aged engine that came about in a time when dual-core was still a crying baby.
and that is where you are wrong. A game is more than just graphics, you have enemy AI, pathing, physics etc.
Much of that is on the CPU and if a game is more intense in that aspect it can slow things down, regardless of graphics.
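To illustrate the point, here’s a toy model (all numbers made up) of why a faster GPU doesn’t help when single-core CPU work such as AI, pathing, and physics is the gate: the GPU can only render as fast as the CPU feeds it, so per-frame time is governed by whichever stage finishes last.

```python
# Toy pipeline model (illustrative only, not Grim Dawn's actual timings):
# each frame, the CPU stage (AI, pathing, physics on one core) and the
# GPU stage (rendering) overlap, so the frame is gated by the slower one.

def frame_fps(cpu_ms: float, gpu_ms: float) -> float:
    """Approximate FPS when CPU and GPU stages run as a pipeline."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# A big mob fight: 25 ms of single-core CPU work, 8 ms of GPU work.
baseline   = frame_fps(cpu_ms=25.0, gpu_ms=8.0)   # CPU-bound: 40 FPS
faster_gpu = frame_fps(cpu_ms=25.0, gpu_ms=4.0)   # doubling GPU speed: still 40 FPS
faster_cpu = frame_fps(cpu_ms=12.5, gpu_ms=8.0)   # halving CPU work: 80 FPS

print(baseline, faster_gpu, faster_cpu)  # → 40.0 40.0 80.0
```

In this model, upgrading the GPU changes nothing while the CPU stage dominates, which matches the “regardless of graphics” point above.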
Also, how old the original engine is isn’t very important. Crate changed a lot compared to TQ. I doubt you mind that the Unreal engine started out almost 20 years ago…
In Ceno’s history of gaming, maybe, but the real-world gaming community uses a variety of games and game engines to benchmark their hardware. Benchmarks imply a lot about the capabilities of your system.
GD is a CPU-heavy game, and unfortunately most of the load is still carried on a single core; my guess is this is inherited from the original engine design, from a time when most people were running dual-core processors and quad-cores were considered an enthusiast chip.
The GPU can only process as much information as the CPU and PCI-E bus can feed it. The big tell for CPU bottlenecking is that as your frame rate drops, your GPU usage also drops; if the GPU were the bottleneck, the frame rate would drop while GPU usage stayed maxed out.
I did a bit of testing with Jiaco’s xmax mod early on in closed testing and watched the massive mob spawns drag GPU usage down to like 10% till the game finally froze and crashed. Meanwhile one core on the CPU is working its little heart out but can’t process all the info fast enough to feed the hungry GPU…
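The “big tell” above can be sketched as a simple rule over logged samples. This is a minimal sketch with arbitrary illustrative thresholds; the (fps, GPU-utilization%) pairs are assumed to come from an external logger such as MSI Afterburner or similar.

```python
# Classify logged (fps, gpu_util%) samples: during an FPS drop, a starved
# GPU points at a CPU bottleneck, a pegged GPU points at a GPU bottleneck.
# fps_floor and gpu_busy are made-up thresholds for illustration.

def diagnose(samples, fps_floor=40, gpu_busy=90):
    """Label each logged (fps, gpu_util) sample."""
    verdicts = []
    for fps, gpu_util in samples:
        if fps >= fps_floor:
            verdicts.append("ok")
        elif gpu_util >= gpu_busy:
            verdicts.append("GPU-bound")  # GPU maxed out while FPS drops
        else:
            verdicts.append("CPU-bound")  # GPU starved: CPU can't feed it
    return verdicts

log = [(60, 70), (25, 10), (30, 98)]
print(diagnose(log))  # → ['ok', 'CPU-bound', 'GPU-bound']
```

The xmax scenario above would look like the middle sample: FPS collapsing while GPU usage sits around 10%.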
I am not too harsh on Crate about this; they have really cleaned up and pushed the limits of their engine heaps, considering where it started. When you compare this with PoE, GD has continued to make forward progress while PoE had to scale back most of their aura effects etc. to compensate for the limitations of their engine.
To be fair tho, Unreal is a totally different case and an actively developed, upgraded, modernized thing with full intention of being pimped out for use by devs.
You’re nicely contradicting other users like powbam here on how much the age of the engine is a limiting factor (keyword: multi-core).
But it is irrelevant anyway. I cannot upgrade to anything better in terms of gaming CPU performance since I already have the absolute maximum here. Shadows and lighting at only “high” it is, then. Vsync off makes everything stutter on my monitor, so it’s not an option for me.
But please guys, refrain from that snarky “upgrade your CPU” advice. It’s BS in my case and it really triggers me.
I don’t see anyone telling you to do that. I only see people telling you that the CPU is where most things slow down in Grim Dawn, and that being able to run ‘better looking’ games at higher/more consistent fps implies nothing about the capability of your performance here.
RE sunandsteel: …yes? I don’t recall bringing up benchmarks/benchmarking software.
How many of you people who are getting unacceptable FPS are using Win10?
This game uses an updated version of an engine whose recommended OS was XP; it simply doesn’t run great on W10.
I see someone mentioned he was running W8, but the latest updates for that have Telemetry and other junk baked into them as well.
Another person mentioned that installed software doesn’t affect the performance, I wouldn’t take that for granted.
In W10, people are constantly complaining about unplayable FPS in their games after an update (and there has recently been a big one), or about 100% disk usage.
This is because the OS is doing a lot of things in the background.
Did you know that by default it’s a p2p client that uses your bandwidth to fuel updates for other users?
This is one of the things that should be opt-in but you actually have to turn it off, it’s hidden under a vague name like “Delivery Optimization.”
And then there are features like indexing services, Telemetry, Win Defender scans, Win Update, Xbox DVR and others: they all use resources and tax your storage drives.
Furthermore, are you running an AV program with active protection and daily scans? Many of them are huge resource hogs as well.
So even if you’re “not running anything,” your system might be.
As several other people have pointed out in this thread, they manage to get playable FPS with far worse specs than some of the people who are having trouble.
So clearly it’s possible to have the game in a playable state.
Note that I agree that this game is very demanding, but if you’re stuck on 5-30 FPS with beastly rigs, perhaps it’s an idea to look into your system software and configuration more closely: you might be pointing fingers at the wrong culprit.
And sunandsteel is right. When your system can run today’s most demanding software no problem and an old-ass, fugly (in comparison) ARPG gives you a headache, then there’s just something off. The input in computational performance compared to the output in optics, physics and AI is completely out of proportion in GD.
I love this game to bits no matter its limitations. Just hoping this will change a bit in the future.
Edit: @Weyu: Good points, but no for me. No matter whether I run it on W7, on Ubuntu with Wine, or on W10: the moment I turn on Vsync together with everything maxed, FPS drops into the 30s along with some stuttering. There’s just no helping it.
Use adaptive v-sync from the Nvidia panel.
And set lighting to low; I still get drops to 30 FPS sometimes, but it’s definitely better.
There’s nothing else to do, unfortunately :undecided: .
Slightly, but not completely. The main difference is that UE has received more and more frequent updates. The point remains that the age of the original engine is not important and that the GD engine got upgrades.
I am saying the age of the original engine is completely irrelevant to this, there are many older engines that have been upgraded over the years and work just fine. GD’s engine also has major differences to TQ’s.
I agree that distributing the load better across cores would obviously help (but is not easy…)
From that perspective I also do not see DX11 helping in most cases
But it is irrelevant anyway. I cannot upgrade to anything better in terms of gaming CPU performance since I already have the absolute maximum here. Shadows and lighting at only “high” it is, then. Vsync off makes everything stutter on my monitor, so it’s not an option for me.
What resolution are you using? For me everything is at a near-constant 60 FPS (due to vsync) on my laptop with almost maxed settings (4x AA instead of whatever max is, and I am not even sure how much difference that would make).
2560x1600. And yes, you do have heavy drops too, I’m nearly 100% certain. There’s just no way you don’t have them if you really have everything fully maxed. Some people are more sensitive than others, however.
Edit: Options other than lighting and shadows (and vsync, obviously) generally seem to have a rather negligible impact on my fps.
That’s unfortunate since your specs are clearly good enough to run the game.
Like you said, maybe you just have to live with slightly lower settings if it’s too hard to pinpoint the actual issue, and hope it’s mitigated in the future on the developer’s side.
It’s a known issue; even this very forum has a sticky about issues with W10 and a potential solution.
And there are countless reports of people getting 60+ FPS prior to upgrading and significantly worse performance after that (on the exact same machine). You can see it everywhere from these forums to Steam and Reddit.
So clearly there is a correlation between W10 and some games’ performance.
Yes, it is obvious that the game is poorly optimized (probably due to the engine, but devs can still be lazy). The actual problem comes from the Win 10 AU + Nvidia 37x.x drivers + new cards combo. People are crying about it on the Nvidia forums.