Indiana Jones and the Great Circle (2024)

Amazed people rate the actual game so highly. It's as generic as it gets and has the same dumb, low-effort stealth system that all AAA games have, where all the AI are seemingly blind.

Graphics are pretty, that's about it though; generic AAA gaming with no real depth to anything other than the visuals.

But @Nexus18 says GOTY.

Does seem a bit marmite going by the user feedback I'm seeing. The only thing that's for sure outstanding is the graphics.

Will pick it up with a 6070Ti :p
 
I mean, it's not doing anything unique compared to every other game with stealth, puzzles to figure out, new skills to learn to enhance your combat and so on, but the most important thing is that it's just fun, and if you like the films, you'll like the game... It is nothing like Uncharted or the new Tomb Raider games. It does require you to be an Indy fan to fully appreciate it though. If you also liked the game on the Xbox back in the day, you'll like this, as it's probably the closest game in terms of mission design etc.

 
Amazed people rate the actual game so highly. It's as generic as it gets and has the same dumb, low-effort stealth system that all AAA games have, where all the AI are seemingly blind.

Graphics are pretty, that's about it though; generic AAA gaming with no real depth to anything other than the visuals.
I'd agree with this. It's more to satisfy the tech head in me. My gamer side is also confused by all the game of the year shouts.
 
It's liked so much because most people are Indy fans and it's like you're playing in one of his movies.
 
I get the nostalgia for the movies, I'm an Indy fan myself. But again, atmosphere doesn't equal gameplay. GOTY needs great gameplay too.
Imo Balatro with 8-bit gfx has better gameplay than Indy :D and I'd rate it a better "game".
 
Yeah, this game looks nice and atmospheric, but the gameplay is a pretty generic "stealth" game.

Thief 2 from 2000 has better stealth gameplay, AI and mechanics.
 
Ooooooooof, this is buttery smooth with MFG x4 and looks + feels great too :cool: Seems Nvidia's latency indicator is broken though, as it definitely has extra input lag over no FG (yet it's reporting 2.3ms compared to the no-FG latency of 45ms :cry:). But it doesn't feel bad at all, especially when using a controller.
MFG turns my game into an instant stuttering mess! As in, dips to 3-4 fps every other second :confused:
 
MFG turns my game into an instant stuttering mess! As in, dips to 3-4 fps every other second :confused:

What GPU, CPU and res? The only time I see a drop is when there's a save and a cutscene happens (fps drops to 60); aside from that, buttery smooth.

Absolutely stunning, finally getting the most from the AW32's 240Hz refresh rate.

 
What GPU, CPU and res? The only time I see a drop is when there's a save and a cutscene happens (fps drops to 60); aside from that, buttery smooth.

Absolutely stunning, finally getting the most from the AW32's 240Hz refresh rate.
5070Ti & 5600X. I can max out all the settings (apart from RT) at 4K. Really smooth.

If I turn on MFG at any level, it's unplayable!

By the way, is that the 4K QD-OLED monitor you have? I'm so tempted to get one.
 
5070Ti & 5600X. I can max out all the settings (apart from RT) at 4K. Really smooth.

If I turn on MFG at any level, it's unplayable!

By the way, is that the 4K QD-OLED monitor you have? I'm so tempted to get one.

Your CPU might be bottlenecking you somewhat, especially if you're using DLSS Performance (where the render res is much lower). Make sure your VRAM isn't being over-utilised, which, if you're not using full path tracing, it shouldn't be. It might be worthwhile dropping the texture pool setting, as even without RT, at Supreme it will try to offload as much as possible into VRAM (frame gen/MFG adds to VRAM usage).
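If you want to actually watch that while playing, here's a minimal sketch using the nvidia-ml-py Python bindings (the pynvml module); the device index and one-second polling interval are just illustrative:

```python
# Minimal VRAM monitor via nvidia-ml-py (pip install nvidia-ml-py).
# Polls used/total memory once a second; run it alongside the game and
# watch whether texture pool + frame gen pushes usage near the card's limit.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU; adjust if needed

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM: {mem.used / 1024**3:5.1f} / {mem.total / 1024**3:5.1f} GiB")
        time.sleep(1.0)
finally:
    pynvml.nvmlShutdown()
```

If used memory creeps within a GB or so of total, drop the texture pool a notch.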

And yup, that's it, the AW32 QD-OLED 3rd gen. Great monitor, especially now that we have GPUs which can get close to the refresh rate.
 
At 4K, an X3D chip shouldn't be too much of a bottleneck for a 5070Ti. However, the 5600X is quite a drop down from the 5800X3D in gaming, so I wouldn't be surprised if that's where the bottleneck is at the lower end if DLSS is used to significantly reduce the render resolution.

On average, the difference between a 5800X3D and a 9800X3D is less than 10% with a 4090 at 1440p. That narrows even further at 3440 ultrawide and even more at 4K.
 
At 4K, an X3D chip shouldn't be too much of a bottleneck for a 5070Ti. However, the 5600X is quite a drop down from the 5800X3D in gaming, so I wouldn't be surprised if that's where the bottleneck is at the lower end if DLSS is used to significantly reduce the render resolution.

On average, the difference between a 5800X3D and a 9800X3D is less than 10% with a 4090 at 1440p. That narrows even further at 3440 ultrawide and even more at 4K.

Really, I think it depends on the game. RT especially is heavy on the CPU and needs a fast processor to keep the GPU busy.

I had a 5800X3D and upgraded to a 9800X3D as I was bottlenecking my 4080.

Also, new titles these days have a recommended CPU of a 7800X3D.

Even at 4K there are games that will heavily tax the CPU.

It depends on the title. Some titles are weighted more towards the GPU and less to the CPU, some vice versa, and some need both a beefy CPU and GPU.

I personally think we are emerging into this era.

Like I said, I see a lot of the new titles have a recommended CPU of a 7800X3D at 4K resolution, so I use that as a benchmark.

At 1080p a 9800X3D is 40-50% faster than a 5800X3D, so in a title that has more demands on the CPU side, whatever the resolution, the gap between the processors will be evident.
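To put some (made-up) numbers on that: frame rate is roughly set by whichever of the CPU or GPU takes longer per frame, so a faster CPU only shows once GPU frame time drops below the CPU's. A toy sketch, with hypothetical frame times purely for illustration:

```python
# Toy bottleneck model: each frame waits on whichever of the CPU or GPU
# takes longer, so fps ~= 1000 / max(cpu_ms, gpu_ms).
# All frame times below are invented purely to illustrate the shape.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

CPU_SLOW, CPU_FAST = 8.0, 5.5  # hypothetical "5800X3D-ish" vs "9800X3D-ish"
for res, gpu_ms in [("1080p", 4.0), ("1440p", 7.0), ("4K", 14.0)]:
    slow, fast = fps(CPU_SLOW, gpu_ms), fps(CPU_FAST, gpu_ms)
    print(f"{res:>5}: {slow:6.1f} fps -> {fast:6.1f} fps "
          f"({(fast / slow - 1) * 100:+.0f}%)")
```

With those invented numbers the fast CPU is ~45% ahead at 1080p, ~14% at 1440p and level at 4K, which is the same shape as the benchmark gaps being discussed.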
 
Really, I think it depends on the game. RT especially is heavy on the CPU and needs a fast processor to keep the GPU busy.

I had a 5800X3D and upgraded to a 9800X3D as I was bottlenecking my 4080.

Also, new titles these days have a recommended CPU of a 7800X3D.

Even at 4K there are games that will heavily tax the CPU.

It depends on the title. Some titles are weighted more towards the GPU and less to the CPU, some vice versa, and some need both a beefy CPU and GPU.

I personally think we are emerging into this era.

Like I said, I see a lot of the new titles have a recommended CPU of a 7800X3D at 4K resolution, so I use that as a benchmark.

I'm aware of all this, which is why I said it's around 10% on average at 1440p. I used that because it's a rough middle ground between 4K and 1080p, and with DLSS used at 4K it brings the render resolution down to those levels.
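For reference, the per-axis scale factors behind that: DLSS Performance renders at 50% of output resolution per axis, so 4K Performance is internally 1080p. A quick sketch (the scale factors are the commonly published ones; individual games can override them):

```python
# Approximate per-axis DLSS scale factors as commonly published by Nvidia;
# treat the output as a guide rather than a per-game guarantee.
SCALES = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def render_res(out_w, out_h, mode):
    """Internal render resolution for a given output resolution and mode."""
    s = SCALES[mode]
    return round(out_w * s), round(out_h * s)

for mode in SCALES:
    w, h = render_res(3840, 2160, mode)
    print(f"4K + DLSS {mode}: ~{w}x{h}")
# Performance at 4K comes out at 1920x1080, i.e. the GPU is rendering at
# 1080p, which is why CPU differences start to show up again.
```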

While it depends on the title, the majority of titles are GPU-bound, and the 10% difference you see in benchmark suites is skewed upward a bit by the inclusion of a few CPU-weighted titles.

As for how many titles recommend a 7800X3D, I can't say I've seen many. I just checked the requirements for this game and there's no mention of any X3D CPUs, even with full ray tracing at ultra with a 4090 as the recommended GPU.
 
I'm aware of all this, which is why I said it's around 10% on average at 1440p. I used that because it's a rough middle ground between 4K and 1080p, and with DLSS used at 4K it brings the render resolution down to those levels.

While it depends on the title, the majority of titles are GPU-bound, and the 10% difference you see in benchmark suites is skewed upward a bit by the inclusion of a few CPU-weighted titles.

As for how many titles recommend a 7800X3D, I can't say I've seen many. I just checked the requirements for this game and there's no mention of any X3D CPUs, even with full ray tracing at ultra with a 4090 as the recommended GPU.

These two recommend a 7800X3D at 4K, and I imagine there will be more in 2025.

KCD2
Spiderman 2
 
So by "a lot of the new titles" you meant 2 :p

Also, I would never use those requirements as a benchmark. Using KCD2 as an example, even if you make it more CPU-bound at 1080p, there is no difference between something like a 12600K and a 9800X3D with a 4090 as the GPU.
 
So by "a lot of the new titles" you meant 2 :p

Also, I would never use those requirements as a benchmark. Using KCD2 as an example, even if you make it more CPU-bound at 1080p, there is no difference between something like a 12600K and a 9800X3D with a 4090 as the GPU.

Possibly. But for me it was Space Marine 2 that prompted me to upgrade. I saw my GPU usage bouncing around 85-95% with big drops down to 30% at times (5800X3D).
 
The latest patch, DLSS update or driver (take your pick) has fubared FG for me. Anyone else?
 
Possibly. But for me it was Space Marine 2 that prompted me to upgrade. I saw my GPU usage bouncing around 85-95% with big drops down to 30% at times (5800X3D).

I haven't seen my 5080's GPU usage ever drop to 30% (including with Space Marine 2 on launch [haven't got it installed anymore]). I do sometimes see it drop to 95%, but that's to be expected since I mainly use DLSS Performance mode with the transformer model.
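For anyone wanting to check the same thing on their own rig, a variation of the earlier VRAM sketch that logs GPU utilisation instead (again the pynvml bindings, illustrative only); sustained in-game dips well below ~95% usually point at a CPU limit:

```python
# Same idea as the VRAM sketch above, but logging GPU utilisation instead.
# Sustained in-game dips well below ~95% usually mean the CPU can't keep
# the GPU fed. Device index 0 assumed.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    while True:
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)
        print(f"GPU {util.gpu:3d}%  mem-bus {util.memory:3d}%")
        time.sleep(1.0)
finally:
    pynvml.nvmlShutdown()
```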
 
So by "a lot of the new titles" you meant 2 :p

Also, I would never use those requirements as a benchmark. Using KCD2 as an example, even if you make it more CPU-bound at 1080p, there is no difference between something like a 12600K and a 9800X3D with a 4090 as the GPU.

There will be others in the future.

As time progresses, CPU requirements creep up.

Of course a 5800X3D or 5700X3D will be fine for a while yet.

But for me, I decided there would be no GPU upgrade until the 6000 series, so I'd be on the same hardware for the next two years.

So I decided to upgrade the CPU as, like I said, in SM2 I saw I was being bottlenecked... plus the recommended specs of some new games being a 7800X3D.

I thought, why not if I'm gonna be on this for the next two years.
 