
NVIDIA 4000 Series

Sounds like a lot of marketing speech. A bit like with Apple saying why they are better than Google (Android) or Microsoft (Windows). In the end, all that matters is really the generational uplift per pricing tier and price/performance. It also seems the worse the generational price/performance uplift, the more the marketing slogans get wheeled out.

What is quite clear: neither Nvidia nor AMD seems to be doing much in this regard this generation.

I think he meant AMD vs Nvidia in regards to pricing. We can probably all agree pricing has been very poor from both of them on the whole this gen.
 
Sounds like a lot of marketing speech. A bit like with Apple saying why they are better than Google (Android) or Microsoft (Windows). In the end, all that matters is really the generational uplift per pricing tier and price/performance. It also seems the worse the generational price/performance uplift, the more the marketing slogans get wheeled out.

What is quite clear: neither Nvidia nor AMD seems to be doing much in this regard this generation.
It's probably half and half. But things like FG and improvements to shader execution, as well as dedicated path tracing hardware, have allowed RTPT to be possible at over 100fps. Whether those 100fps are "real" or not makes no difference; the reality is that the latency is low and the frame quality is high. That is all that matters, and it is exactly what we get even up to 4K resolution. That in itself is great progress this generation. I've now seen it in front of my very eyes and can see the potential it has for some amazing things, and only later this year, once UE5.1 and 5.2 games start to roll out, will we see where GPUs really stand, I guess.

Now the issue is game developers: if game devs are too lazy because they know hardware can now just work around shoddy optimisation using FG/DLSS/brute force etc., then they will just cut corners and slowly patch up games over many months, because the executives at publishers only care about release deadlines.

I see no major advances from any other GPU vendor. Intel is still in its early days, but they are steadily improving performance as their drivers mature, whilst AMD seem to be asleep at the wheel. Nvidia still has yet to release a bunch more features that will further improve quality and performance on 40 series cards, and we still have no word on FSR 3 from AMD, so once again they will be two generations behind once it finally comes out. On top of that, the reports of the 7900XTX drawing up to 100 watts while simply idle are quite something lol, when a 4090 is 18 watts in the same state.
 
Sounds like a lot of marketing speech. A bit like with Apple saying why they are better than Google (Android) or Microsoft (Windows). In the end, all that matters is really the generational uplift per pricing tier and price/performance. It also seems the worse the generational price/performance uplift, the more the marketing slogans get wheeled out.

You can define it as marketing, because Nvidia sure as hell do use it to market their advantages over the competition, but at the end of the day that doesn't stop some of those listed items actually being worthwhile and very real things which, shock horror, people actually use..... It's like saying people shouldn't buy Android or Apple for the unique features they offer, but purely based on hardware :o

You do realise there is far more to cost than just sheer hardware, right? Or do you expect Nvidia and every other company to be Robin Hood and only charge for the cost of making the item and nothing else? Maybe I should go and tell our clients to stop charging for their software and only charge their customers for the monthly cloud hosting costs in Azure, GCP and AWS, I'm sure they would love to hear that :o
 
Nvidia found its new crypto boom

The AI boom is real, and demand is currently so high that ordering H100 GPUs now comes with a 6 month waiting period, and then cards are sold 40% over MSRP.

Considering Jensen just took a pay cut due to lower profits, I doubt he can rely on AI just yet, and as with all booms it's usually followed by a bust.
 
His pay cut was from like 25 million dollars, a cut of 2.5 million dollars, wasn't it? It's pretty obvious that the "pay cut" won't even scratch the surface of his leather jacket, annoyingly!
 
Nvidia found its new crypto boom

The AI boom is real, and demand is currently so high that ordering H100 GPUs now comes with a 6 month waiting period, and then cards are sold 40% over MSRP.


I did say this a while ago: ChatGPT/OpenAI alone is absolutely massive, literally the equivalent of the internet, and Nvidia are leading this. All the major cloud providers, i.e. Microsoft, Google and Amazon, want to use their hardware and will be paying top dollar for it; the profits from this alone will, in the long run, easily trump desktop gaming GPUs.


Exciting but scary times ahead!

You can define it as marketing, because Nvidia sure as hell do use it to market their advantages over the competition, but at the end of the day that doesn't stop some of those listed items actually being worthwhile and very real things which, shock horror, people actually use..... It's like saying people shouldn't buy Android or Apple for the unique features they offer, but purely based on hardware :o

You do realise there is far more to cost than just sheer hardware, right? Or do you expect Nvidia and every other company to be Robin Hood and only charge for the cost of making the item and nothing else? Maybe I should go and tell our clients to stop charging for their software and only charge their customers for the monthly cloud hosting costs in Azure, GCP and AWS, I'm sure they would love to hear that :o

Yup, Intel are taking AI/machine learning very seriously too. AMD weren't, but IIRC Lisa has recently changed her mind. Smart move, as it would have been suicide for AMD to keep ignoring this, although I suspect AMD will be playing catch-up for some time, and their focus will be primarily on the business/partnership side rather than PC gaming, i.e. their usual approach.

Considering Jensen just took a pay cut due to lower profits, I doubt he can rely on AI just yet, and as with all booms it's usually followed by a bust.

Given the profound effect AI is having on the market as of right now, and it's only been released this year..... it absolutely will not be a bust. This isn't anything like a fake currency that had no real benefit/use.

 
Some of you guys do realise that Nvidia are giving away free cards, right?


The RTX 4060 is not the only card that is included in the giveaway. According to the promotion terms and conditions, the prize pool will feature RTX 4090, RTX 4080, RTX 4060 Ti, RTX 4060 graphics cards, monitors, and even a full RTX 4090 PC worth $7500.

What is worth noting is that not all 460 cards will go directly to gamers. NVIDIA has split the prize pool to also give away 100 RTX 4060 Ti and 200 RTX 4060 cards to content creators. The two hundred RTX 4060 cards are to be further given away to their communities.

The new Summer of RTX is not a one-off chance to win the hardware. This giveaway will last for weeks and will encourage gamers to leave comments under social media posts.

How to enter:

  1. Follow NVIDIA social channels for key prompts and instructions
  2. Like/comment/tag/share posts across social
  3. Use #RTXOn (or any hashtag that is prompted) across any post throughout summer
  4. Be on the lookout for your favorite streamers to join in on the celebration

So a new social media campaign to promote Nvidia cards. Lots of people will be repeating all the Nvidia bumpf to try and win a new card.


I think he meant AMD vs Nvidia in regards to pricing. We can probably all agree pricing has been very poor from both of them on the whole this gen.

But that post was some guy on YouTube just listing every Nvidia marketing point, and it seemed like some non-technical person parroting stuff. I mean, you could also do the same for Intel or AMD/ATI over the last 30 years too. But look above at the new Nvidia competition, which is probably why.

Even with RT, it was PowerVR which demoed amongst the first consumer examples of RT. Adaptive Sync was standardised by the VESA consortium, and they were in the process of wheeling it out (AMD and Intel are part of it), etc., and Nvidia pushed forward with it. Even stuff people think Nvidia invented, like unified shaders, tessellation, etc., was first implemented on ATI dGPUs. Intel has technically better H264/H265 capability than Nvidia now, etc., and a cheap Intel CPU or £130 A380 can do AV1. They were the first to properly do MCM CPUs (think the Q6600), so AMD wasn't the first to go that way. Intel did external cache too (Broadwell L4), and did stuff like Lakefield.

At this point I am fed up with both of them and their little pricing cartel. I wonder how long before the US regulators or the EU find some sort of collusion? They have both been accused of doing this:


It's probably half and half. But things like FG and improvements to shader execution, as well as dedicated path tracing hardware, have allowed RTPT to be possible at over 100fps. Whether those 100fps are "real" or not makes no difference; the reality is that the latency is low and the frame quality is high. That is all that matters, and it is exactly what we get even up to 4K resolution. That in itself is great progress this generation. I've now seen it in front of my very eyes and can see the potential it has for some amazing things, and only later this year, once UE5.1 and 5.2 games start to roll out, will we see where GPUs really stand, I guess.

Now the issue is game developers: if game devs are too lazy because they know hardware can now just work around shoddy optimisation using FG/DLSS/brute force etc., then they will just cut corners and slowly patch up games over many months, because the executives at publishers only care about release deadlines.

I see no major advances from any other GPU vendor. Intel is still in its early days, but they are steadily improving performance as their drivers mature, whilst AMD seem to be asleep at the wheel.

The thing is, large scale hardware based upscaling, and even hardware based frame insertion, were done in consoles and AV gear beforehand. They are useful to have, but the base performance, VRAM, etc. all have to be there, at a reasonable price. Unless you buy an RTX4090 you are basically getting screwed over with subpar hardware.

My bigger issue is that all of these companies wheel out these advantages when they know they have poor value hardware. Apple does it all the time, especially at the lower end and midrange tiers, and they make it sound like it's reinventing the wheel, and they try to do it because they also want to charge more for less. But even they have been hit hard by the slowing markets.

At this point, how about releasing decent hardware at all pricing levels PLUS the "advantages" instead? That way it will be the icing on the cake. ATM, it seems the cake is more icing and less sponge! :p

Daniel Owen also likes some of the Nvidia features but he summed up the current issues quite well:

Probably applies to AMD too.
 
Some of you guys do realise that Nvidia are giving away free cards, right?


So a new social media campaign to promote Nvidia cards.

But that post was some guy on YouTube just listing every Nvidia marketing point, and it seemed like some non-technical person parroting stuff. I mean, you could also do the same for Intel or AMD/ATI over the last 30 years too. But look above at the new Nvidia competition, which is probably why.

Even with RT, it was PowerVR which demoed amongst the first consumer examples of RT. Adaptive Sync was standardised by the VESA consortium, and they were in the process of wheeling it out (AMD and Intel are part of it), etc., and Nvidia pushed forward with it. Even stuff people think Nvidia invented, like unified shaders, tessellation, etc., was first implemented on ATI dGPUs. Intel has technically better H264/H265 capability than Nvidia now, etc., and a cheap Intel CPU or £130 A380 can do AV1. They were the first to properly do MCM CPUs (think the Q6600), so AMD wasn't the first to go that way.

The thing is, large scale hardware based upscaling, and even hardware based frame insertion, were done in consoles and AV gear beforehand.

My bigger issue is that all of these companies wheel out these advantages when they know they have poor value hardware. Apple does it all the time, and they make it sound like it's reinventing the wheel, and they try to do it because they also want to charge more for less. But even they have been hit hard by the slowing markets.

At this point, how about releasing decent hardware at all pricing levels PLUS the "advantages"? That way it will be the icing on the cake. ATM, it seems the cake is more icing and less sponge! :p

Daniel Owen also likes some of the Nvidia features but he summed up the current issues quite well:

And what relevance does giving away free hardware have? :confused: Yes, this is what you call "marketing". You do realise that these companies have dedicated marketing departments who handle social media, review samples and so on, i.e. teams completely different and separate from the software engineers, hardware engineers, architects and so on.....

On the adaptive sync topic, this is always such a silly argument too. Nvidia GPUs at the time could not make use of adaptive sync, which was the primary reason Nvidia brought out the G-Sync module; that, and it was years ahead of what adaptive sync/FreeSync would offer. The first FreeSync monitor didn't arrive for what, something like 2 years after G-Sync? And when they did arrive, they were awful in QC and overall quality: low FreeSync range, no low frame rate compensation, no variable pixel overdrive, black screens/flickering and so on. Yeah, they were a lot cheaper, but personally I and others would rather pay a premium to get a smoother/better, less buggy experience. Again, another example of where companies can charge a premium when they are first to the market and have a superior product.... You can guarantee if AMD were in this position, they would dump their "open source", "good for the gaming community" motto.

AMD/Intel may have been first with things like tessellation and earlier encoding methods, but that doesn't change the fact that Nvidia are leading here now with their NVENC AV1 dual encoders.

hardware based upscaling, and even hardware based frame insertion, were done in consoles and AV gear beforehand.

And?

Again, you are being like humbug, living in the past and making out like these methods are the "exact" same, when the reality is they are very different and, most importantly, far better. Unless you have some actual evidence to back up these claims and debunk the in-depth analysis that the likes of DF etc. provide?

Yes, in an ideal world we want to see both hardware and software advancements, not just software, but the reality is that software and more efficient methods of handling workloads, especially with AI/machine learning, are the way forward rather than always just relying on hardware advancements. I can't see this approach changing any time soon.
 
At this point I am fed up with both of them and their little pricing cartel. I wonder how long before the US regulators or the EU find some sort of collusion? They have both been accused of doing this:

Would be interesting if it ever comes out and they get fined for price fixing. The fine will likely be tiny though, and do more damage to AMD.

I am looking forward to seeing what we get next gen from both vendors.
 
My 4090 FE is giving me black screens. It's done it 4 times while gaming; the system appears to keep going, and I have to use the restart button and it comes back. Tonight I was gaming for about 90 minutes, everything fine, but then about 5 minutes after I finished gaming I lost the signal again. This time it didn't come back with multiple restarts. I have an LG CX OLED so only HDMI ports, and the FE only has 1x HDMI port. Fortunately I had a DisplayPort to HDMI cable and used that and got a signal, then the HDMI port started working again.

Little bit concerned; the FE is on a water block, so I really don't need any RMA issues with Nvidia. It never gets even warm, and the game I've been playing when it crashes is Insurgency, so it's not even being taxed; it barely hits 200w to deliver 120 fps.

Anyone come across this? Latest drivers. System: Aorus Z590 Master, 10900k, 32gb G-Skill 3600, Nvidia 4090 FE, Corsair 1000w, dual custom loops, LG C1 49" OLED.
 
My 4090 FE is giving me black screens. It's done it 4 times while gaming; the system appears to keep going, and I have to use the restart button and it comes back. Tonight I was gaming for about 90 minutes, everything fine, but then about 5 minutes after I finished gaming I lost the signal again. This time it didn't come back with multiple restarts. I have an LG CX OLED so only HDMI ports, and the FE only has 1x HDMI port. Fortunately I had a DisplayPort to HDMI cable and used that and got a signal, then the HDMI port started working again.

Little bit concerned; the FE is on a water block, so I really don't need any RMA issues with Nvidia. It never gets even warm, and the game I've been playing when it crashes is Insurgency, so it's not even being taxed; it barely hits 200w to deliver 120 fps.

Anyone come across this? Latest drivers. System: Aorus Z590 Master, 10900k, 32gb G-Skill 3600, Nvidia 4090 FE, Corsair 1000w, dual custom loops, LG C1 49" OLED.
I had the same thing on an LG C2 OLED the first few days with my 4090. In my case it was the HDMI cable; I bought a new one and all worked out well.
 
 