Ultra-low-quality textures - 8K not ready - 4K is okay-ish

Facts so far:

  • Last DLC ever for Grim Dawn
  • The game's VRAM usage is pathetically low

Low VRAM usage is good for people with computers from 15 years ago… But this is the last DLC - why not give an option to have the best textures possible?

At 4K resolution the textures look okay-ish! Honestly, I'm quite sure people will be playing at 8K in a few years, and they will want to play this game. How will it look on 8K monitors? Ultra old, that's how. This game isn't going anywhere; it will be played for years.

10 years ago, GPUs already had 8 GB of VRAM.

Please, for the sake of the last DLC - make the textures amazing and 8K-proof. There's no reason not to let the game use 8 GB of VRAM. The general trend is that 16 GB of VRAM is close to becoming the gaming majority, and the Steam hardware survey shows people's cards carrying more and more VRAM.

Other feedback: no complaints, everything looks really awesome.

There are also projects such as "Complete Remaster of Grim Dawn's vanilla textures, I bring you GrimTex" already achieving this, no?

1 Like

Just wanted to say that 8K has 4x the pixels of 4K, so it requires roughly 4x the computing power and 4x the memory.
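Quick sanity check on the arithmetic (plain 16:9 pixel counts, nothing game-specific):

```python
# Back-of-the-envelope: 8K vs 4K pixel counts (standard 16:9 modes).
res_4k = 3840 * 2160   # 8,294,400 pixels
res_8k = 7680 * 4320   # 33,177,600 pixels

print(f"8K / 4K pixel ratio: {res_8k / res_4k:.0f}x")    # -> 4x

# Raw framebuffer cost at 4 bytes per pixel (RGBA8), per buffer:
print(f"4K framebuffer: {res_4k * 4 / 2**20:.0f} MiB")   # ~32 MiB
print(f"8K framebuffer: {res_8k * 4 / 2**20:.0f} MiB")   # ~127 MiB
```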

I’m already extremely happy they updated the UI to work decently at 4K.

2 Likes

Part of Grim Dawn's charm is being accessible, and the majority of the player base runs low specs.

5 Likes

My 8-year-old 1060 with 6 GB would like to have a word. Some people still play on even worse GPUs, I believe. Yeah, let's just show them the middle finger by bumping the system requirements to AAA level; that would definitely help FoA sales.

2 Likes

4K is still the minority,
and 8K is likely never becoming mainstream if it has so little traction that even the screen manufacturers are abandoning the concept :sweat_smile:

2 Likes

Going back through 10 years of assets to increase their fidelity makes 0 sense unless it was coupled with a paid remaster. Especially if we had to make it a video option, which would mean maintaining 2x the assets and bloating the installation for everybody.

8K resolution for consumers is basically dead. All the major display makers are pulling out of developing it further.

6 Likes

Even if you guys wanted to make a Grim Dawn Remastered, the old engine wouldn't have handled it, right? Hence it's better to make Grim Dawn 2?

You really have to explain what you mean by that “computing power”. Words are not magic spells, and “computing power” does not exist.

Textures only need bandwidth, which is practically irrelevant on any gaming GPU… No benchmark has ever shown that higher VRAM usage reduces FPS, let alone reduces it significantly. Nobody even tests this, because it is not a thing.
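To put rough numbers on the memory side (napkin math assuming BC7 block compression at 1 byte per texel, which is common in modern games; the figures are illustrative, not measured from Grim Dawn):

```python
# Rough VRAM footprint of one block-compressed texture. Illustrative
# only - not measured from Grim Dawn. BC7 stores 1 byte per texel,
# and a full mip chain adds about 1/3 on top of the base level
# (geometric series: 1 + 1/4 + 1/16 + ...).
def texture_mib(side: int, bytes_per_texel: float = 1.0) -> float:
    base = side * side * bytes_per_texel
    return base * 4 / 3 / 2**20

for side in (1024, 2048, 4096):
    print(f"{side}x{side} BC7 + mips: {texture_mib(side):5.1f} MiB")
# 1024 -> 1.3 MiB, 2048 -> 5.3 MiB, 4096 -> 21.3 MiB
```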

How is your 1060 only 8 years old? I had a 1070 10 years ago, and it did have 8 GB of VRAM.

Also, I'm not sure why people with ancient GPUs are so against high-res textures. You can play on ultra-low settings; nobody is forcing you to pick the modern textures this game kind of lacks.

ok buddy

1 Like

I've seen modders add textures for Skyrim, which they seem to upscale in “bulk” somehow, and Skyrim, I would say, has a serious amount of textures, yet modders pulled it off just like that.
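The basic idea looks something like this (a minimal sketch; Pillow's Lanczos filter stands in for the AI upscalers like ESRGAN that Skyrim modders actually use, and the folder names and 2x factor are made up for illustration):

```python
# Minimal bulk texture upscale. Lanczos resampling stands in for the
# AI upscalers (e.g. ESRGAN) modders actually use; the folders and
# the 2x factor are illustrative.
from pathlib import Path
from PIL import Image

SRC, DST, SCALE = Path("textures_in"), Path("textures_out"), 2
DST.mkdir(exist_ok=True)

for path in SRC.glob("*.png"):
    with Image.open(path) as img:
        big = img.resize((img.width * SCALE, img.height * SCALE),
                         Image.LANCZOS)
        big.save(DST / path.name)
        print(f"{path.name}: {img.size} -> {big.size}")
```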

Bloating the installation? That could be an issue, but I have a small SSD and I just install one or a few games at a time. I never went raving on forums about every modern release, complaining “I have only 100 GB of space”; I just quietly uninstalled a few of the largest games and played one game at a time.

P.S. Sorry for replying; I never use these forums, and it's probably been many years since I typed here, but seeing the insane pushback, I suppose I won't visit much after this post.

Well, you know what they say about assuming. Your initial post seemed to assume both that going over the textures is somewhat trivial (which a team member confirmed it is not) and that most people have better hardware today than when the game came out (which is true to an extent, but really doesn't apply to everyone and doesn't match the expectations of this game's community).

GD was never a game about graphics; the engine it's built on was kinda aged before the game was even finished!

Being told you're wrong never feels nice, but I don't think people were coming at you specifically there. Bottom line is, the game is highly unlikely to see many graphical changes beyond fixes, QoL and accessibility. And uh, snow, apparently.

3 Likes

Simple: I bought it in 2018. And even if the whole GTX 10 series was released in 2016, it doesn't really matter, because only the high-end GPUs actually had more than 6 GB of VRAM. Talk about system requirements again.

1 Like

Who can even tell the difference at a certain point?

The University of Cambridge's display resolution calculator, which is based on a Meta-funded study published in Nature in October by researchers at the university's Department of Computer Science and Technology and Meta, suggests that your eyes can only make use of 8K resolution on a 50-inch screen if you're viewing it from a distance of 1 meter (3.3 feet) or less. Similarly, you would have to sit pretty close (2 to 3 meters/6.6 to 9.8 feet) to an 80-inch or 100-inch TV for 8K resolution to be beneficial. The findings are similar to those from RTINGs.com.
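The geometry behind that is easy to check yourself (this uses the classic ~60 pixels-per-degree limit for 20/20 vision; the study measured somewhat different thresholds, but the trigonometry is the same):

```python
# Pixels per degree (ppd) a 16:9 screen delivers at a given viewing
# distance, compared against the classic ~60 ppd limit for 20/20
# vision (1 arcminute per pixel). 50-inch screen as in the study.
import math

def ppd(h_pixels: int, diag_inches: float, distance_m: float,
        aspect: float = 16 / 9) -> float:
    width_m = diag_inches * 0.0254 * aspect / math.hypot(aspect, 1)
    fov_deg = 2 * math.degrees(math.atan(width_m / (2 * distance_m)))
    return h_pixels / fov_deg

for d in (1.0, 2.0, 3.0):
    print(f"50-inch at {d} m: 4K = {ppd(3840, 50, d):5.1f} ppd, "
          f"8K = {ppd(7680, 50, d):5.1f} ppd")
# 4K already gives ~66 ppd at 1 m, so beyond that distance the extra
# 8K pixels fall past what 20/20 vision can resolve.
```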

4 Likes

it’s not about telling a difference, it’s about having bigger(est) number,
gief 32k or riot :triumph:

1 Like

This.

Hell, even 4K's penetration is still pretty poor, simply because most PC gamers don't have the spare cash for a GPU capable of driving 4K well enough. Especially now that the LLM-bros have created the dumbest bubble since the Tulip Bubble and driven up GPU and RAM prices massively. Maybe in a decade 4K will be more standard, but yeah, from memory the two main resolutions are still 1080p and 1440p.

Fortunately, supporting 4K resolutions is doable with GD despite the age of the engine, which was made for CRTs and early non-1080p LCDs with weird resolutions. Alas, it still required more work than the OP seems to understand, because this stuff has to be coded and tested due to it originally being non-scaling. As in the old days, every resolution had to be coded in manually…

As for 8K, in the PC space that's going nowhere anytime soon due to the GPU side of things. I can see uses for it in home cinema etc., but that's far easier to pull off with projectors, or by tiling smaller-res panels, than by trying to make a single 8K panel, given the failure rate a panel of that density can have.

Anyhow, as per usual, this thread is a classic example of “early adopter syndrome”, where people with more money than sense proceed to proclaim their fancy new hardware is totes the future and must be supported NOW, irrespective of actual adoption rates or costs, ignoring an economic cost/benefit balance that is firmly unfavourable in the present and near term.

The same thing happened with 4K very early on, and it was a constant issue with SLI etc. due to the difficulty of supporting it at the game level, as neither company managed to create a layer that would simplify using two GPUs at once for any app. Which is why SLI never became mainstream and has always been niche for gaming.

4K managed a bit better of course, due to adoption by TVs and consoles. But to be frank, unless you dump enough money on the top Nvidia GPU, you aren't doing 4K natively; you're going to be upscaling, because everything else lacks the raw computing power and memory to do native 4K with all the eye candy on. And thanks to the LLM bubble of total idiocy, that's not going to change anytime soon, because it's eaten up so much money we're doomed to a global recession.

3 Likes

It doesn't matter how bad a game's graphics are. When you start playing, you get used to them in a few minutes.

1 Like

Well said.

This one really makes me sad, especially given how dumb AI is. It's like the biggest bullshitter ever, aka a very good salesperson (the kind that's nowhere to be seen once you've handed over the money).
I hope it dies down soon and we can go back to using PCs for normal PC stuff again.

2 Likes

Okay, someone got salty at me for calling a spade a spade, so here we go with an “edited” take again:

*facepalm*

There's so much chutzpah in this comment it boggles the mind. Also, fun fact: that whole “bandwidth doesn't matter” BS only applies to consoles and very high-end PCs with PCIe 5.0 M.2 SSDs plus top-end GPUs. For us mere peasants, bandwidth is an issue and 8 GB of GPU memory is still the norm, especially in these times of GPU prices going insane, never mind SSD and RAM prices. Swapping textures etc. in and out of memory comes with an obvious overhead when you're limited to 8 GB.

But I guess you've never had to deal with lower-end PC gaming, otherwise you'd know this and wouldn't have proceeded to make an utter schmuck of yourself in this thread. But such is the fate of those with too much money and no sense.

Oh, and fun fact: lower-end GPUs these days only come with 128-bit buses, which does impose a bandwidth limit. PCIe 4.0 means that's less of an issue, but unless the GPU comes with 24 GB of memory, there will be I/O limitations pushing 4K textures natively. Though to be frank, you're not going to bother with 4K on those anyhow, due to the limitations of the GPU cores, as the lower-end ones, bar Intel's offerings, sadly represent crap value for money these days. You're really better off buying a 2nd-hand mid-range one :confused:

Well, if the prices hadn’t gone to complete shit that is.
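To put concrete numbers on the bus-width point: GPU memory bandwidth is just bus width times per-pin data rate (illustrative GDDR6-class configurations, not any specific card):

```python
# GPU memory bandwidth = (bus width / 8) bytes/transfer * per-pin rate.
# Illustrative GDDR6-class configurations, not any specific card.
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

for bus_bits, rate, tier in ((128, 18, "low end"),
                             (256, 18, "mid range"),
                             (384, 21, "high end")):
    print(f"{bus_bits}-bit @ {rate} Gbps ({tier}): "
          f"{bandwidth_gbs(bus_bits, rate):4.0f} GB/s")
# 128-bit -> 288 GB/s, 256-bit -> 576 GB/s, 384-bit -> 1008 GB/s
```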

The main issue though is storage space vs download speeds, since not everyone has decent internet, especially in the USA. Or Australia, lawl - NBN for the fail. Which means not everyone wants to regularly download massive games full of assets they're unable to make use of. Hell, I can still only get 40 Mbps off Steam myself internationally, and I'm on fibre to the house with a decent router here in NZ. So it can take most of a day for some games to download unless Steam's hosting them here in NZ. Contrast that with Gamepass, which goes up to 140 Mbps.
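The download math is easy to check too (game sizes picked for illustration):

```python
# Hours to download a game of a given size at a given link speed.
# Sizes are illustrative; real downloads add protocol overhead.
def hours(size_gb: float, mbps: float) -> float:
    return size_gb * 8 * 1000 / mbps / 3600  # GB -> megabits -> seconds -> hours

for size_gb in (50, 100, 150):
    print(f"{size_gb} GB at 40 Mbps: {hours(size_gb, 40):4.1f} h, "
          f"at 140 Mbps: {hours(size_gb, 140):4.1f} h")
# 150 GB at 40 Mbps is ~8.3 h - most of a day once overhead bites.
```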

Anyhow, in future it would pay greatly for you to think critically and take into account others' situations and the economic realities of PC hardware, especially when prices have gotten so obscene. That would prevent you from getting slammed by others with the facts of these issues.


Right, that should do, probably, lawl. Though on Steam I'd still get hit because it's “argumentative” :upside_down_face: or whatever the Steam mod “decided” was wrong.

4 Likes