CPU requirements to run Grim/Grimmest smoothly

Let’s talk Grimmer and Grimmest gameplay, and performance issues.

This is, I guess, what a lot of modmakers and mod players struggle with: framerate issues. Grim Dawn, as we know, is not optimized for multi-core CPUs. If you check your CPU activity while running GD, you will see one core running at high load and the others nearly idle. Turning down graphics options does not solve this problem.

The Grim/Grimmest mod puts an even greater strain on that single core, and on a lot of PCs Grim Dawn becomes near unplayable, perhaps not in Foggy Banks, but certainly on the Plains of Strife. A good way to test this is going to Fort Ikon and visiting the area where your stash is.

Using my Phenom II X4 955BE CPU (running at 3.2GHz), my framerate drops to 40fps just standing still. When I head out there, things get worse. If I set my resolution to 800×600 with everything off/lowest/disabled/simple, it's still 40fps, because the CPU is the bottleneck here.

The above is meant as info for those who didn’t know about this.

What I am wondering is: what can we do? Any experiences with CPU affinity settings in the OS? Editing options.txt? Any tricks?

And above all, if all else fails: what CPU can really handle Grim & Grimmest? It will have to be a CPU with great single-core performance, which does not necessarily mean an i7 or even an i5. Perhaps the i3-6100 is a good bet for a gamer on a budget, because high-GHz i5s are quite costly.

And above all folks, stop lowering your video settings, because, honestly, check your GPU usage during Grim Dawn :wink: It's probably only at 20% load.

i5 4670K @ 4.2GHz.
Every time I launch GD I set affinity to last two cores and set priority to realtime. Feels a bit smoother.
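For anyone who wants to script that affinity trick instead of clicking through Task Manager on every launch, here is a minimal sketch of the same idea. It is Linux-only (`os.sched_setaffinity` does not exist on Windows) and pins the current process to its last two available cores; treat it as an illustration of the technique, not a GD-specific fix.

```python
import os

# Pin the current process to its last two available logical cores,
# mirroring the "set affinity to last two cores" trick from Task Manager.
# Linux-only: os.sched_setaffinity is not available on Windows.
available = sorted(os.sched_getaffinity(0))
last_two = set(available[-2:])
os.sched_setaffinity(0, last_two)
print(sorted(os.sched_getaffinity(0)))
```

On Windows the rough equivalent from a shortcut is `cmd /c start /affinity C /high "Grim Dawn.exe"`, where the affinity argument is a hex bitmask of cores (C = cores 2 and 3). Note that `realtime` priority can starve system threads; `/high` is usually the safer choice.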

Holy crap, are you kidding me? So even with that CPU (quite a beast at 4.2GHz) you are not getting smooth gameplay at Grimmest spawn rates?

That's pretty sad to hear. There are not many faster CPUs out there, especially not affordable ones. The fastest i7s around are only marginally faster than your 4670K when looking at single-core performance.

And are you sure the CPU is still the bottleneck? Doesn't lowering graphics settings (and I mean ultra-low resolution and ultra-low settings) solve the problem?

I just said how to improve smoothness a bit. There are places and/or moments where some micro-stuttering or FPS drops happen. But overall it runs great. In fights, 60-80fps with a GTX 970.

AA 4x, rest maxed.
Vsync/triple buffering off
Depth of field off.

The most FPS-costly setting on my system seems to be Shadows. There is a big performance difference between High and Very High. But the visual quality is worth it.

For engineering reasons beyond my level of understanding, GD is not well optimized. I sometimes struggle to maintain 30 FPS, while I can run The Witcher 3 maxed out at 75 FPS.

Difference being Witcher 3 is more GPU intensive whereas GD is more CPU intensive. Comparing apples and oranges, there.

Well, CPU-intensive in the sense that it has very poor optimization.

Thanks for the feedback, Dex. Some stuttering (perhaps not even caused by the CPU) and also some slowdowns at certain moments: that's not too bad.
Shadows have been an FPS-eater in games for as long as I can remember, that is, over 20 years :wink:

You have one of the top 10-15 CPUs though, still costing at least 250€, and you have a nice overclock on top of that (1000MHz above the default non-turbo clock, if I'm right). I'm curious about the findings of other people with more mainstream gaming CPUs, like the i5-6400, i5-4460, or FX-8350, at or around their regular clock speeds.

Very curious too. I had an i7 with 16GB and an AMD 7970M in an Alienware laptop. I could play at mid-high settings. My current rig is an old dual-Xeon workstation with 128GB and a new NVIDIA 960. It plays at max settings with hiccups as frequent as vanilla, but some harder hiccups when certain combinations come up. This sort of shocks me, as this rig does not benchmark well and the CPUs rate very badly compared to today's chips.

I think I even hosted MP a bit in grimmest without issue with current rig.

Isn’t this the wrong subforum?

Negative, actually the optimization is pretty good from what I can tell. You try coming up with a better way to make all the pathing calculations (which seemingly would be the most CPU-intensive, and would only be worsened by Grimmer/Grimmest) work fluidly.

I do not think so: this thread is specifically about the Grimmer/Grimmest mod, and about why this mod, incorporated into many other mods and hugely attractive to play, is a game-breaker for most people.

I am nothing but a hobby programmer at best but - how about using all available cores, for starters? :wink:

Some background information. Since at least the 90s, CPU performance was mainly tied to frequency. So there they came and went: the 486DX2-66, the Pentium 100 (MHz), the Pentium 133, the Pentium 166, etcetera. Things just got faster and faster (and hotter and hotter). However, somewhere around or shortly after the time this game's engine was released, chip manufacturers realized they were hitting a GHz wall, and dual-core CPUs started to appear. Since then (that's about 10 years ago), multi-core CPUs have become commonplace, even in cellphones. And most software nowadays is written to make use of those multiple cores.

Grim Dawn does not support multiple cores or multithreading. That is where the problem is. Start Task Manager, open the Performance tab, play some Grim Dawn, alt-tab back, and you can see for yourself. About pathing calculations: my 5-year-old PC with its 3.2GHz quad-core easily handles games like Warcraft or Age of Empires, where the number of units needing pathing runs into the hundreds.

In fact, most PCs from 15 years ago could calculate pathing instructions for 100+ units without much trouble.
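To put a rough number on that claim, here is a toy single-core benchmark: plain A* pathfinding on a grid, run once per "unit". This is a minimal sketch under my own assumptions (open 32×32 grid, uniform step cost, 200 units, nothing from GD's actual engine), but it shows that a few hundred path queries are cheap on one core.

```python
import heapq
import time

def astar(grid, start, goal):
    """Shortest path length via A* on a 4-connected grid.
    grid[y][x] == 1 means the cell is blocked; returns None if unreachable."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_heap = [(h(start), 0, start)]
    best_g = {start: 0}
    while open_heap:
        f, g, pos = heapq.heappop(open_heap)
        if pos == goal:
            return g
        if g > best_g.get(pos, float("inf")):
            continue  # stale heap entry
        x, y = pos
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            nx, ny = nxt
            if 0 <= ny < len(grid) and 0 <= nx < len(grid[0]) and not grid[ny][nx]:
                ng = g + 1
                if ng < best_g.get(nxt, float("inf")):
                    best_g[nxt] = ng
                    heapq.heappush(open_heap, (ng + h(nxt), ng, nxt))
    return None

# 200 "units" each pathing corner-to-corner across an open 32x32 grid
grid = [[0] * 32 for _ in range(32)]
t0 = time.perf_counter()
for _ in range(200):
    length = astar(grid, (0, 0), (31, 31))
elapsed_ms = (time.perf_counter() - t0) * 1000
print(length, f"{elapsed_ms:.0f} ms")
```

On any modern core this finishes in a fraction of a second, and real engines additionally cache paths and stagger queries across frames, so raw pathfinding alone is unlikely to explain the single-core load.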

Now I'm standing in Fort Ikon, looking down on the Plains of Strife, with perhaps 10 or 15 monsters in sight, and my framerate drops down to 40.

But hey, I was not starting a GD-bashing thread: it's an old game revived by indies, and we have to deal with it. I'm just curious which CPUs can handle the Grim/Grimmest mods.

Yeah, thanks for pointing that out.

GD has multicore support.

That seems to be a semantic issue. What I meant is that Grim Dawn tends to put nearly all of its load on a single CPU core, leaving the other cores close to idle, resulting in game-breaking framerate issues when going past Fort Ikon, near the Undercity, or when trying out one of the most popular mods: Grimmer & Grimmest.

(pic not from me, but check the resource monitor and framerate)

Interesting. I don’t see that behavior on my system.
I see 2 cores used almost equally if I set affinity to those cores only.

You don’t see that behavior on your system? :wink:

Then why does everyone else? And what, in the first place, prompted you to manually assign CPU cores using the OS affinity settings, if not an attempt to fix the fact that GD was only using one core?

Nah, we're not getting anywhere by insisting there is nothing wrong with GD's multicore handling, no matter who is quoted. The fact that the devs deny the problem actually worries me: because then there will never be a fix :cry:

I'd gladly upgrade to an i3-6100 or i5-6500 if I knew that was enough to counter the very poor multicore optimization. But it's probably not.
And what the common majority already knows, namely that you need a fucking beastly spaceship machine to try Grimmer/Grimmest if you don't want a 10fps slideshow, is, I guess, proven true…

And thus, Grim Dawn’s modding community will be severely crippled (cuz everyone likes more mobs and more action), and thus, Grim Dawn will die out a lot sooner than it has to.

I really don't get it. Why would a dev deny that GD's multicore support is flawed? Just look at the image I linked to earlier. It's blatantly clear and leaves no room for doubt that something is badly wrong.

I can play Cities: Skylines on this PC just fine. There are THOUSANDS of units getting pathing instructions in that game. Don't fool anyone by claiming GD handles this well. If even an i5 at 4.2GHz with forced affinity experiences slowdowns caused by CPU bottlenecks, in a single-player ARPG, just from playing a mod that increases the spawn rate a little, then something is very badly wrong.

It's not right. I can have 25 units on screen and get no FPS drops. Yet when standing in Fort Ikon (as in that screenshot), looking at perhaps 10 or 15 mobs at best, CPU load goes to 100% and my FPS drops.

Meh, i’m talking to a wall.

cough I don’t >.> <.< cough

But I can run Grimmer and Grimmest more or less fine. i7-4790K @ 4.4GHz, GTX 770, 16GB of DDR3 1600MHz RAM, running GD off a 512GB SSD. I have not dipped below 30fps in G&G.

Okay, not everyone :wink: It is however commonplace in the most popular mods around, like DAIL. After all - this is the mods forum.

Thanks for sharing your rig + G&G experiences!

However, you have an even more expensive chip than the other guy who replied: a 350-euro i7 CPU, turbo'd to 4.4 (!) GHz. And still, I understand, the game drops below 40fps. Wow.

That rules out upgrading to an i5-6500 or something to fix the issue. Thanks for saving me the 500-euro upgrade: GD was the only game I'm playing at the moment that needed more power. Now I can wait until another game comes along that requires an upgrade.

Thanks all for the feedback. It is as already suspected: G&G requires a high-end PC/workstation to make up for bad optimization.

I too play Cities: Skylines, and you need to understand that multi-core support is much, much harder for game developers than you might think. The Cities: Skylines devs actually had an amazing post on Reddit discussing the matter, since many people complained of subpar performance, mainly blaming the Unity engine. The fact of the matter is that true parallelization is a task too large for even the largest developers, so what they do instead is assign specific functions to specific cores.

Since we are both familiar with Skylines, let's use that as an example:

1 core is dedicated to just AI
1 core is dedicated to just water physics
the other 2 cores I believe work somewhat together, trying to balance misc functions

A game like Grim Dawn is pretty "on demand" in terms of what is calculated. In Skylines, vehicles are always moving and the water is always being simulated; in Grim Dawn, not so much. In terms of pure CPU instructions we might have pathfinding and AI on one core and physics on another, while the rest is left to handle misc functions. The issue here is that both of the things I mentioned only "spike" when you're fighting mobs, not all the time, since monsters off screen are not roaming around, or have set instructions.

What I am trying to get at is that some games, no matter what you think, are capped in terms of performance. I have quite the system myself, and I recently needed to turn down the corpse decay time, as a large pile of mobs would really hurt my FPS.

The devs of Skylines have already said that the game is as optimized as it can be given its design, and that the only performance gain to be had comes from increasing CPU speed, not the number of cores.

From a Reddit AMA:

“The game is best optimized for a 4 cores CPU (being the most mainstream and quite fitted to our simulation needs), we have the main thread, audio, pathfinding, simulation & water flow. Unity does under the hood also use threads quite heavily (main and rendering) and we do use some extra worker thread during the loading/saving process, so 8 cores should yield a noticeable performance improvement but it will not be in the 2x faster figures.”
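The per-subsystem threading scheme that quote describes can be sketched in a few lines. This is an illustration, not engine code: the subsystem names and the doubling "work" are made up, and in CPython the GIL prevents true CPU parallelism, but the structure (one long-lived worker thread per subsystem, fed through queues, spread across cores by the OS scheduler) is the same one a native engine would use.

```python
import queue
import threading

def worker(name, jobs, results):
    """One long-lived thread per subsystem; exits on a None sentinel."""
    while True:
        job = jobs.get()
        if job is None:
            break
        results.put((name, job * 2))  # stand-in for the real subsystem work

subsystems = ["ai", "water", "pathfinding"]  # hypothetical names
results = queue.Queue()
inboxes = {}
threads = []
for name in subsystems:
    inbox = queue.Queue()
    t = threading.Thread(target=worker, args=(name, inbox, results))
    t.start()
    inboxes[name] = inbox
    threads.append(t)

inboxes["ai"].put(21)        # hand the AI subsystem one job
for inbox in inboxes.values():
    inbox.put(None)          # sentinel: shut every worker down
for t in threads:
    t.join()

out = results.get()
print(out)  # ('ai', 42)
```

The design tradeoff the Skylines devs describe falls out of this shape: each subsystem scales with one core's speed, so adding cores beyond the number of subsystems buys little.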

I myself have a 6-core 5820K, and it works only marginally better than my old i5, due to the fact that most games do not make use of more than 4 cores. I chose that specific CPU to give me some breathing room as game programming becomes more advanced, and in addition I needed the extra cores for the 3D rendering I do as a hobby.

Please do not blame the developers for not trying to optimize the game, since it really has come a long way since I first started playing it some time ago.

If you would like further explanations or someone to debate with, I am happy to oblige :slight_smile: