Do AMD provide any benefit to the retail GPU segment?

They are the same architecture, from the same vendor and the same GPU generation, yet the lower-end card with less bandwidth but more VRAM is faster than the higher-end card with more bandwidth but less VRAM.

What is the logical reason for that discrepancy?
 
I'm not being a ###### to you, I'm just trying to get you to see what you should be seeing :)
What difference does it make if it's bandwidth or the amount of vram itself that causes problems? It's a fact that there is an issue there and the 3060 (the one you compared the 6600xt to) doesn't have that issue.
 
What difference does it make if it's bandwidth or the amount of vram itself that causes problems? It's a fact that there is an issue there and the 3060 (the one you compared the 6600xt to) doesn't have that issue.

Well, because you said it was bandwidth related. It isn't: the 3070 has more bandwidth than all of them.
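For what it's worth, here is a rough sketch of the bandwidth arithmetic being argued over. The bus widths and GDDR6 data rates below are the commonly quoted retail specs, not figures taken from this thread.

```python
# A rough sketch of the peak memory bandwidth arithmetic, assuming the
# commonly quoted retail specs (bus width in bits, GDDR6 data rate in Gbps).

def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s = bus width in bytes x data rate."""
    return (bus_width_bits / 8) * data_rate_gbps

cards = {
    "RTX 3060 12GB": (192, 15.0),  # 192-bit bus, 15 Gbps GDDR6
    "RTX 3070 8GB": (256, 14.0),   # 256-bit bus, 14 Gbps GDDR6
}

for name, (bus, rate) in cards.items():
    print(f"{name}: {bandwidth_gbs(bus, rate):.0f} GB/s")

# RTX 3060 12GB: 360 GB/s
# RTX 3070 8GB: 448 GB/s
# The 3070 has the higher peak bandwidth, so when it falls behind the 3060
# the bottleneck has to be something other than bandwidth.
```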
 
@Bencher the 3060 with a 192-bit bus and 12GB of VRAM is faster than the RTX 3070 with a 256-bit bus and 8GB of VRAM.

Explain that. :)
What difference does it make if it's bandwidth or the amount of vram itself that causes problems? It's a fact that there is an issue there and the 3060 (the one you compared the 6600xt to) doesn't have that issue.


He didn't say 6600 XT..


He said 3060 12GB vs 3070 8GB


Also, it makes a huge difference whether it's bandwidth or the amount of VRAM. Do you work in the IT industry? If you do, God help your employers if you don't understand the difference between bandwidth/bus width and RAM/VRAM capacity when you recommend systems to them or their clients.
 
Honestly, Nvidia having the console market would be for the better. Just imagine how far ahead we would be in game visuals now if Nvidia owned the market here and gave developers what they want, rather than AMD deciding what is best for them. AMD have proven they are quite happy to put in as little effort as possible and only come out with solutions when they are pushed by their community to get on par, or as they slide behind even more, e.g. Reflex/low lag, FreeSync, ray tracing, upscaling and now frame generation. Had it not been for Nvidia, we'd probably still be waiting for these things :o

AMD don't decide. Microsoft and Sony are the ones who decide, by giving AMD the contract, because it was AMD who offered the right product at the right spec and price. Nvidia did once provide the GPU for the original Xbox and the PS3, and the latter especially was hardly some kind of utopian vision. You seem to have an incredibly short memory, or are at least very selective about some of Nvidia's business practices.
 
He didn't say 6600 XT..


He said 3060 12GB vs 3070 8GB

This is what he said.

The 6600xt suffers from severe bandwidth issues, in recent games the 3060 is much faster than the 6600xt (forspoken / tlou / RT hogwarts / deadspace etc.)

Post #31...

 
AMD don't decide. Microsoft and Sony are the ones who decide, by giving AMD the contract, because it was AMD who offered the right product at the right spec and price. Nvidia did once provide the GPU for the original Xbox and the PS3, and the latter especially was hardly some kind of utopian vision. You seem to have an incredibly short memory, or are at least very selective about some of Nvidia's business practices.
Nvidia with their magical x86 CPUs. Oh wait, the contract happened because AMD was the only vendor that could provide an x86 CPU and a decent dGPU. The only other company in the running was Intel. Plus Nvidia annoyed Microsoft over the original Xbox, and ATI designed the first unified-shader GPU for the Xbox 360. It gets even worse when you remember that hardware-based image reconstruction methods first appeared on the AMD-powered consoles, i.e. checkerboard rendering. Shows you how much PCMR is influenced by marketing, even more than the console crowd! :cry:
 
He didn't say 6600 XT..


He said 3060 12GB vs 3070 8GB


Also, it makes a huge difference whether it's bandwidth or the amount of VRAM. Do you work in the IT industry? If you do, God help your employers if you don't understand the difference between bandwidth/bus width and RAM/VRAM capacity when you recommend systems to them or their clients.
He did. Previous page, he was "complaining" that Nvidia sold nine times as many 3060s as AMD sold 6600 XTs, even though the latter was the better card, or something along those lines.
 
He did. Previous page, he was "complaining" that Nvidia sold nine times as many 3060s as AMD sold 6600 XTs, even though the latter was the better card, or something along those lines.

OK, but the question that was asked here was avoided/not answered?


@Bencher the 3060 with a 192-bit bus and 12GB of VRAM is faster than the RTX 3070 with a 256-bit bus and 8GB of VRAM.

Explain that.
:)

Too many walls of text confusing the thread. But the point is: why is the 3060 12GB model beating the 3060 Ti/3070/3070 Ti in many of the recent games?


Anyway, I'll leave you guys to it... the answer is simple:


VRAM size
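To put a rough number on why running out of VRAM hurts so much more than a narrower bus, here is a back-of-the-envelope sketch. The PCIe 4.0 x16 figure of ~32 GB/s is the usual theoretical peak, not something measured in this thread.

```python
# Back-of-the-envelope: what happens once a game's working set no longer
# fits in VRAM. Peak figures only; real throughput is lower, but the ratio
# is the point.

local_vram_bw_gbs = 448.0  # RTX 3070 peak GDDR6 bandwidth (256-bit @ 14 Gbps)
pcie4_x16_bw_gbs = 32.0    # PCIe 4.0 x16 theoretical peak to system RAM

ratio = local_vram_bw_gbs / pcie4_x16_bw_gbs
print(f"Local VRAM is roughly {ratio:.0f}x faster than spilling over PCIe")
# -> Local VRAM is roughly 14x faster than spilling over PCIe

# Which is why a card with plenty of VRAM but a narrower bus (3060 12GB) can
# pull ahead of a card with more bandwidth but too little VRAM (3070 8GB)
# once assets start spilling into system memory.
```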
 
Oh look, another thread going down the VRAM rabbit hole.... :cry: Don't know how people aren't tired of this **** now :cry:

AMD don't decide. Microsoft and Sony are the ones who decide, by giving AMD the contract, because it was AMD who offered the right product at the right spec and price. Nvidia did once provide the GPU for the original Xbox and the PS3, and the latter especially was hardly some kind of utopian vision. You seem to have an incredibly short memory, or are at least very selective about some of Nvidia's business practices.

Back then I agree, but not anymore. I do also agree that price is the main factor, but personally I'm happy to pay a premium if it meant getting better/faster advancement in visuals.
AMD, with their weaker hardware and lacklustre features, are directly holding back the next era of graphics. Dare I say, had Nvidia been in current consoles, DLSS would be in every console game; meanwhile FSR......
 
Oh look, another thread going down the VRAM rabbit hole.... :cry: Don't know how people aren't tired of this **** now :cry:



Back then I agree, but not anymore. I do also agree that price is the main factor, but personally I'm happy to pay a premium if it meant getting better/faster advancement in visuals.
AMD, with their weaker hardware and lacklustre features, are directly holding back the next era of graphics. Dare I say, had Nvidia been in current consoles, DLSS would be in every console game; meanwhile FSR......

Sony and Microsoft want nothing to do with Nvidia on their consoles; only Nintendo do so far, and that's not exactly cutting-edge graphics there.

Also, what you forget is that AMD are making custom APUs for Sony and Microsoft, so they're basically making what they want with AMD's current technology.

Did you forget what Nvidia did on the PS3 or SEGA consoles...?

Clue with SEGA: quads vs triangles.
 
Sony and Microsoft want nothing to do with Nvidia on their consoles; only Nintendo do so far, and that's not exactly cutting-edge graphics there.

Also, what you forget is that AMD are making custom APUs for Sony and Microsoft, so they're basically making what they want with AMD's current technology.

Did you forget what Nvidia did on the PS3 or SEGA consoles...?

Clue with SEGA: quads vs triangles.

As I said above, that was back then (a long time ago too....) when ATI/AMD were actually on par with Nvidia in many ways, but these days, not so much.
 
As I said above, that was back then (a long time ago too....) when ATI/AMD were actually on par with Nvidia in many ways, but these days, not so much.
https://www.electronicdesign.com/te...-research-nvidias-quadratic-processor-the-nv1

The Sega Deal

The big win, PR-wise, was its partnership with Sega. Sega of America established an exclusive licensing agreement with Nvidia and said it would convert Sega’s Saturn and arcade software to CD-ROMs for PCs equipped with Nvidia’s multimedia accelerators. Nvidia set up an exclusive license for PC 3D accelerator AIBs, meaning that Saturn-based games could not be ported to any other 3D PC AIB hardware.

That software pushed Sega Saturn’s sales to more than one million units in Japan in the wake of its release in November 1994. Launched in the U.S. in May 1995, the Sega Saturn quickly sold out of its limited initial U.S. distribution. The company said it would be in full distribution by early September. Saturn games were set to be released for Nvidia-based products three to six months after appearing for the Saturn console.


Tom Kalinske, president and CEO of Sega’s US division, said at the time that he believed the markets for gaming PCs and game consoles such as the Sega Saturn would thrive in the future. The PC games market could reach approximately 20% to 25% of the video game market by 1990, he estimated. Sega said it would spend $30 million to $50 million on advertising for Saturn over the next year.

According to Nvidia, Sega’s software would take advantage of every facet of the NV1 processor, driving the chip to its limit. Huang, Nvidia’s CEO, believed that “PC consumers are going to be stunned with the results of our combined efforts.”

Intel also entered the picture and planned a presentation at SIGGRAPH with Nvidia. Intel said it would also make Sonic the Hedgehog available to OEMs (Intel ported it using its native signal processing software, or NSP). That effort later resulted in a fight with Microsoft that got to the U.S. Department of Justice (but that’s a story for another time).

The Sega deal was a significant coup for Nvidia. The company’s quad-patch approach had some industry analysts wondering if it was too radical a departure from conventional tri-meshes to get developers on board.
Conclusion

But Nvidia was too far ahead of the curve, as it turned out. The PC industry ended up staying with polygonal graphics. There were criticisms of Nvidia’s approach, including the assertion that 2D graphics performance could not compete with other options at the time, particularly in DOS. The NV1’s audio quality failed to live up to expectations, and the AIB was expensive. The result? It failed to gain ground in the market.



Nvidia co-developed the NV2 with Sega for use in the Dreamcast console. But Sega ended up dropping Nvidia as a partner and supplier. The NV2 never came to market, and when Nvidia failed to recoup its investment, the company almost folded. Nvidia dropped its spherical approach in one of the PC industry’s most daring turnarounds, repositioning the company to a polygon design. In 1997, it introduced the Riva 128—one of the most successful graphics chips of all time.

Same NV1 tech used in the SEGA Saturn..



 
Probably that too!
:cry:


Something worth watching regarding the Xbox, and some history lessons for the young ones. I posted this a while back in the consoles section, but it's worth watching to see how and why Microsoft did what it did with the Xbox. Really good documentary.


 
Did you forget what Nvidia did on the PS3

The running joke there was that Sony basically had to go hat in hand to Nvidia and ask them to come up with something for the PS3. Prior to this, the Cell was going to handle everything: CPU tasks, audio tasks and the graphics side of things. Then the cocaine wore off and they realised it wasn't capable of doing what they thought, so Nvidia gave them the RSX "Reality Synthesizer", which was basically a custom 7800 GTX.
 