Game is not optimized properly

There are more links. When I say "accidentally pay" I mean in the heat of the moment (in a moment of need, you decide to pay before you do your research).
It is ill-advised to update drivers other than the graphics card driver, because the latest drivers are NOT always better-performing than the old ones, graphics drivers included. The majority of end users don't even know any driver besides the graphics driver exists. They just start their PC (BIOS), connect to the internet (Ethernet LAN driver) and do their job. To this day I have never installed a monitor driver, because I haven't needed one and monitors didn't come with a CD anyway. Most of the "utilities" on the driver CD that ships with a product are resource-wasting programs (look at the MSI motherboard CD, oh gosh).

Windows 10 or 8.1 handles 95% of the driver work most of the time. Even geeks (I mean hardware websites and reddit users) have accepted that Windows does a good job when it comes to drivers. If a driver conflict does rarely occur, then you should look for a driver on THE MANUFACTURER'S WEBSITE, not in some tool that "finds" problems that do not exist so that you will pay to fix them. You can keep using those programs, but I can assure you it is a waste of money and resources.

To prove my point, I downloaded the most famous of these tools and ran a driver scan on my computer. They claimed 3-5 Intel drivers were out of date. Then I went to Intel's website and downloaded their own driver utility. Somehow, Intel's utility couldn't find any updates… So it is one hell of a scam, because it forces you to pay in order to update.

Well more power to ya :wink: None of the ones I use make me pay. I wouldn’t use them if they did.

Sent from my Z812 using Tapatalk

It's worse than that: chances are malware is already on your computer anyway.
Good luck


format C:

A free solution.

I hear that system32 is where all your computer’s malware gets saved; all you have to do is delete that folder and you’ll never have to worry about malware again!

don't actually do this. Why EVER would you think this would work?

:wink:

@Serimert thanks for the concern. My comp has been malware-free since, well, forever. I know what I'm doing. I tend to avoid the kool-aid they serve up at reddit, which helps too.

Sent from my Z812 using Tapatalk

I had a look at Samsung's website and at the monitor you have (a 2010 model), and its "TV-centric" design may be the limiting factor in what you can do with it.

It’s a shame. I’ve no doubt in my mind that if this wasn’t a stumbling block, you’d see what it is you’ve been missing in many games in past years.

My rig is over 4 years old now(with the exception of the GTX 970 I got last summer), and the monitor is just a plain ASUS 60 hz model. I have every setting maxed with zero issues. Of course I get some dips below 60 FPS when large groups are present, but nothing as bad as what some of you guys are reporting. I don’t think I have the latest nVidia driver, but it’s not very old either.

I do use a monitor preset that makes the screen darker because I like the way the game looks like that. That darker preset also helped D3 look better than the candy land graphics it shipped with. Hopefully you guys can get your shit lined out because GD is a great game and it sucks to not enjoy playing it because of graphics issues.

Never noticed any fps drops or anything. But I do find it weird that my PC makes so much more noise when playing GD than when I'm playing BF4 or any other much more graphically demanding game.

Well, the game's engine is pretty old now and I don't think it plays well with multi-core CPUs. I started playing Titan Quest and IT back in 2009 on my brand spanking new i7-920 and had issues similar to the ones I'm having now, though back then it was mainly stuttering with fps drops in intensive fights.

On my current rig (i7-6700k, GTX 1070 Strix, 16GB DDR4, 144Hz G-Sync monitor), I get 200-250 fps in town, outside at night, and in interiors, dropping to maybe 150 fps fighting a few mobs. However, go above 25-30 mobs and I drop to 50-70 fps, which is kind of OK. With big groups of mobs plus heroes… that can tank my fps to 30-40, which is not OK.

You have yet to understand two simple problems here:
a) uncapped frames
b) PC architecture

PC architecture has one nasty problem that has yet to be solved: the CPU and the GPU cannot access the same region of memory at the same time.
Uncapped frame rates in serial-API code just make this worse.
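To make the "uncapped frames" half of this concrete: an uncapped render loop submits work as fast as it can, while a capped one spends the leftover frame budget idle. This is just a generic sketch in Python (not Grim Dawn's actual code, and the 60 fps target is an arbitrary choice for illustration):

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms per frame at 60 fps

def run_frames(n, simulate_work=lambda: None):
    """Run n frames, sleeping off whatever is left of each frame's budget."""
    start = time.perf_counter()
    for _ in range(n):
        frame_start = time.perf_counter()
        simulate_work()  # game update + render submission would go here
        elapsed = time.perf_counter() - frame_start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)  # idle instead of rendering extra frames
    return time.perf_counter() - start

total = run_frames(30)
print(f"30 capped frames took {total:.2f}s")  # roughly 0.5s at a 60 fps cap
```

Without the `sleep`, the loop would spin through frames as fast as the hardware allows, which is exactly the contention scenario described above.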

The kind of architecture that doesn't run into these problems, if the coder knows what they're doing, is found in the Xbox One and the PS4. Their SoC is linked to a single unified memory system, so CPU and GPU can address the same region of memory at the same time. It's the reason such heavily limited systems can get away with Driveclub [PS4] (which can't easily be done on any PC to date - if at all) and Quantum Break [Xbox One].
Not to mention, they use non-serial APIs and Async Compute.

“With big groups of mobs plus heroes…that can tank my fps to 30-40, which is not OK”

Now you’re simply being silly.

And what exactly is wrong with that? If you spent that kind of cash on a computer you should be able to expect better. My FX-6300 @ 4.5 gets bogged the fuck down and goes down to 20-25 fps. Is that also not OK, even though no other game gets me anywhere close to that unless I'm cranking the gfx entirely too high?

I love GD and I would play it even if the fps were worse :smiley: but this shit is not okay

Er…please elaborate. I play The Witcher 3, DOOM, Fallout 4 and Dragon’s Dogma quite a bit and they are all pretty much maxed out (they are tweaked for best performance) and rarely do I drop under 80 fps.

But the one thing they have in common… they can utilise my 6700k's cores better than GD. And Witcher 3 and GTA V seem to do a good job of using both the physical and logical cores.

I’m still going to play the game 'cause it’s a damned good game, just annoying that it has these fps issues.

I updated from a 570 to a 970 recently and now I can play at 60 fps with max settings most of the time, but in big fights the fps dipped into the 20s, so I had to lower particles and lighting a bit. It's still happening, but not as badly; I don't want to lower any more settings or the game starts to look like TQ :stuck_out_tongue: I'm running an i5-4460, not overclocked.

Just wanted to let you know this worked for me. I had maxResourceThreads at 8 because, well, I figured my i7 with hyperthreading would use all 8 logical cores. I'd previously tried changing which cores were ticked under Set Affinity in the process's details, but I was unticking the virtual cores, not core 0. I also had to go into the NVIDIA Control Panel and create a custom resolution at 59Hz, as it wasn't there by default. This fixed up the game wonderfully on my and my buddy's machines (both Skylake i7s; he has a 1080 while I have a 970).
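For anyone wanting to replicate the affinity part of this: on Windows, process affinity is a bitmask where bit N corresponds to logical core N, so "everything except core 0" has a specific value. A quick sketch of the arithmetic (Python used only for illustration; the 8-logical-core count matches the hyperthreaded quad-core i7 described above):

```python
# Build an affinity bitmask for an 8-thread CPU (4 cores + HT, like a Skylake i7)
# that excludes logical core 0, as described in the post. Bit N = logical core N.
LOGICAL_CORES = 8
mask = sum(1 << core for core in range(1, LOGICAL_CORES))
print(f"mask = {mask:#x} ({mask:08b})")  # 0xfe (11111110): cores 1-7 on, core 0 off
```

Unticking the "CPU 0" checkbox in Task Manager's Set Affinity dialog produces the same mask.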

For some reason it isn't capping the frame rate at 59: I'm getting hundreds of FPS in town, and in light combat I'm still getting ~70-90. But I haven't seen a single drop below 40 fps since I made the changes, and even that only happens in the heaviest waves in the Crucible.

I appreciate you posting this stuff! I feel silly because I feel like I should have known better, but you clearly have programming knowledge, right?

Thanks again, highly underrated suggestion! Grim dawn is actually a pleasure to play now! :smiley:

Interesting. What if we have a 144hz monitor?

Note that, with a 4790k and 980, I managed to correct my FPS drops by setting lighting to low and shadows to medium. Everything else is maxed out. Just curious as to if the above would make my performance even better because, as we all know, this game needs all the performance help it can get!

Still has a permanent spot on my SSD though :stuck_out_tongue:

blackshark, I'd give it a try. Does your monitor have G-Sync? I'd imagine it does if it's 144Hz.

All I know about 144Hz monitors is that the gamers who buy them tend to like fast-paced games like QuakeLive and seem to be able to detect the difference between 90fps and 144fps.
That's about all I can tell you.

Grim Dawn’s engine design is not the problem.
We’ll have to wait a few years before some pretty significant changes start to get adopted by game developers.
It’s hard to change when you’re currently making a lot of money, if you get my drift.

Correct. I bought the monitor specifically for FPS games such as CSGO, Overwatch, etc. As long as I'm at 60fps+, I don't really care about frame rate outside of those games.

My question is more "If I have a 144Hz monitor, what should I set the refresh rate to in Grim Dawn's in-game settings or the NVIDIA Control Panel?" I'd prefer not to have to downclock the refresh rate to 60Hz if at all avoidable.

Also, what is the default maxResourceThreads value, and why do we set it to 2 if the default isn't 2?

Thanks!

Nope. I bought the monitor almost 2 years ago when Gsync was in its expensive infancy.