The world of technology is full of myths, and graphics cards are among the components most affected by them. I am sure that as soon as you read this, some of these myths came to mind, although you may also have thought of others that you are not sure are true or false because you have never been able to disprove them completely.
I am aware that many of our readers still run into this, and that some of these myths have become so entrenched that for certain people they are “universal” truths when, in fact, they are completely false. To help you overcome these myths and get a more realistic and accurate view of graphics cards, today I want to share with you six myths that we must leave behind.
Before we start, I want to point out that some of the myths we are going to look at made some sense a few years ago, and they were even true for certain generations of graphics cards, but that does not make them universal truths that we should still treat as valid today.
In this regard, I want to highlight something very important: both AMD and NVIDIA have had clearly superior graphics generations that made their graphics cards better than those of their rival, but that happened at a specific time and should not be interpreted as something absolute and permanent.
With this in mind, I believe that we are ready to begin. If you are planning to buy a new graphics card and need help choosing, I recommend you take a look at our shopping guide. Now, make yourselves comfortable, we have a lot of interesting things to read.
1.-NVIDIA or AMD graphics cards are better by brand alone
This myth is deeply tied to the idea of “status”, and it is also fueled by the personal preferences of consumers, who turn either of these two companies into a kind of idol, going so far as to defend it tooth and nail with astonishing blindness.
Needless to say, this is a problem. Buying something because of a preconceived idea of status can lead us to make a bad purchase, and reaching that level of absurd worship of a brand can have far worse consequences. Over the past two decades I have had conversations with fans of one brand or the other, and I have encountered people who were incapable of accepting even the slightest criticism of their favorite company, a problem that continues to this day and is still very much alive.
NVIDIA graphics cards are not better simply because they are from NVIDIA, and the same goes for AMD graphics cards. Each company has good, less good and bad generations, although fortunately in recent years we have not seen a truly bad generation of graphics cards from either of them. So we should not prejudge or get carried away by this myth: each manufacturer offers distinct value, and each graphics card has its advantages and disadvantages, regardless of whether it is from AMD or NVIDIA.
2.-Graphics memory is the most important thing in a graphics card
Another of the most popular myths, and one of the most recurrent. Users with less knowledge of the technology world are the most prone to being swayed by the “greatness of numbers”. Yes, I am referring to the classic “this graphics card has twice the graphics memory”, a very simple phrase that encourages the other person to think that, for that reason alone, it has to perform twice as well as the other one, when in fact it may perform even worse.
It is true that this problem is less common today than in previous generations, as manufacturers are opting to cut costs and it is no longer usual to see underpowered graphics cards with lots of graphics memory, although there are still situations that can mislead us. For example, a GeForce RTX 3060 has 12 GB of graphics memory, while a GeForce RTX 3070 Ti has 8 GB. We might think that the former performs better and is the better card, but in fact the opposite is true.
The fact that a graphics card has more memory does not mean it is better than another with less graphics memory. In the end, what matters is the architecture of the graphics card, its raw power, the generation it belongs to, which determines how advanced it is, and its hardware configuration. So, for example, the Radeon RX 6600 XT has only 8 GB of graphics memory, while the Radeon VII has 16 GB, but the former is more powerful, more efficient and, in addition, has dedicated hardware to accelerate ray tracing.
We could give many other examples, such as the GeForce RTX 2060, which has only 6 GB of graphics memory but is able to outperform the 11 GB GeForce GTX 1080 Ti in ray-traced games thanks to the performance improvement achieved by its RT cores, and the value of its tensor cores and DLSS should also be taken into account. In the end, graphics memory is only one of many figures to consider when evaluating a graphics card, and it is not the most important one.
3.-Graphics memory does not improve gaming performance
We could say that this myth is the nemesis of the previous one, since it minimizes the real importance of graphics memory. As we said above, it is true that graphics memory is not one of the most important aspects when choosing a graphics card, but this does not mean that it has no importance or that it does not affect gaming performance; in fact, the opposite is true.
The first thing we must be clear about is that, in order to run a game, we need a minimum amount of graphics memory. If we do not meet this minimum, the game may work, but it will give us major problems, among which we can highlight:
- Texture errors, popping and graphical glitches.
- Consistently lower performance.
- Stutters and stalls as a result of the graphics memory constantly being emptied and refilled.
Not having enough graphics memory to run a game is one of the most serious problems we can face, because the GPU will not be able to keep everything it needs stored in that memory, and it will have to constantly repeat work cycles it has already completed. These will be joined by new work cycles, which will end up saturating the GPU and leaving performance very low.
The amount of graphics memory, and its bandwidth, can significantly affect the performance of a game, and can even prevent us from accessing certain quality settings if we do not reach a certain level. However, it is true that once we have passed the optimal level, having a larger amount of graphics memory will not make any difference. This is essentially the same as with RAM.
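If you want to see how close a game is getting to your card’s limit, you can monitor graphics memory usage while playing. The following is a minimal sketch, assuming an NVIDIA GPU with the nvidia-smi utility available on the PATH (on AMD cards, the Radeon Software overlay reports the same figure); it simply reads the used and total VRAM reported by the driver.

```python
# Minimal sketch: read current VRAM usage via nvidia-smi.
# Assumes an NVIDIA GPU and the nvidia-smi command-line tool installed.
import subprocess

def vram_usage():
    """Return (used_mib, total_mib) for the first GPU reported by nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    used, total = out.splitlines()[0].split(", ")
    return int(used), int(total)

if __name__ == "__main__":
    used, total = vram_usage()
    print(f"VRAM in use: {used} MiB of {total} MiB ({100 * used / total:.0f}%)")
```

Run it (or any monitoring overlay) while a game is loaded: if usage sits permanently at the limit, the stutters described above are very likely the graphics memory being emptied and refilled over and over.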
4.-AMD graphics cards consume more, and get hotter, than NVIDIA graphics cards
What can I say, it is a classic, and the truth is that it did make sense in some graphics generations, but in recent years it has become a myth. If we just look at the power consumption values listed in the specifications of each graphics card, we can already see that the power consumption of the GeForce RTX 30 series is higher than that of the Radeon RX 6000, and with the GeForce RTX 20 and Radeon RX 5000 things were also very close.
For example, the GeForce RTX 3080, which is the direct rival of the Radeon RX 6800 XT, has a TGP of 320 watts, while the latter has a TBP of 300 watts. AMD has managed to fine-tune power consumption considerably with its latest graphics cards, and it shows. However, if we bring performance into the equation and look deeper into things like ray tracing and image reconstruction and upscaling, the picture looks better for NVIDIA, as it uses a more advanced architecture.
Continuing with the above example, the GeForce RTX 3080 performs much better than the Radeon RX 6800 XT in ray tracing, and supports second-generation DLSS, a technology clearly superior to AMD’s FSR 1.0 and FSR 2.0. These two points are more than enough to justify the difference in power consumption between the two, but the important thing is that there is no abysmal difference in power consumption between the graphics cards of the two companies, and this myth no longer makes any sense.
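To see why raw wattage alone tells us little, it helps to normalize performance by power. The sketch below is purely illustrative: the FPS and power figures are placeholder values, not measurements, and the card names are generic; the point is simply that a card drawing a few extra watts can still be the more efficient one if it renders proportionally more frames.

```python
# Illustrative sketch: compare efficiency as frames per watt instead of raw board power.
# The FPS and wattage values below are placeholders, not benchmark results.

def frames_per_watt(avg_fps: float, board_power_w: float) -> float:
    """Average frames rendered per second for each watt of rated board power."""
    return avg_fps / board_power_w

cards = {
    "Card A (320 W rated)": (100.0, 320.0),  # placeholder average FPS, rated power
    "Card B (300 W rated)": (90.0, 300.0),
}

for name, (fps, watts) in cards.items():
    print(f"{name}: {frames_per_watt(fps, watts):.3f} FPS per watt")
```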
AMD graphics cards also do not get hotter just because they are from AMD, this is another misconception that stopped making sense quite some time ago. It is true that the Sunnyvale firm sometimes launched generations that reached very high temperatures, such as the Radeon R9 290, but NVIDIA also did the same with models such as the GeForce GTX 480, and today it is absurd to generalize by maintaining this myth.
5.-The AMD graphics card drivers are a disaster: much worse than those of NVIDIA
Being honest, we have to admit that AMD has made some major mistakes with its graphics card drivers in recent years. In fact, we can remember the case of 2020, when faulty drivers caused many users of the Radeon RX 5700 and RX 5700 XT to suffer severe crashes and black screens.
It is true that AMD still has room for improvement in certain areas, but its drivers are not a disaster, and they are not much worse than those of NVIDIA’s graphics solutions. In fact, something very curious happens: AMD has taken so much care in improving the optimization of its drivers that they have earned a “fine wine” reputation, meaning that some of its graphics cards have aged like fine wine thanks to driver improvements.
AMD’s drivers have also evolved in terms of interface and advanced features, and the move to Radeon Software Adrenalin was a major leap for the Sunnyvale company, giving it the foundation it needed to compete much more closely with NVIDIA. It has built on that foundation with numerous improvements, and has polished the design and interface with great success.
Game support has also improved a lot over the years, and today we can find many titles that are deeply optimized to run better on AMD Radeon graphics cards, such as AC: Valhalla. With all this on the table, I think it is pretty clear that this is a myth to forget.
6.-Only top-of-the-line graphics cards can run 4K games
Many users still believe that playing in 4K is limited to the most powerful graphics cards, and that this resolution is something “new” that has only recently started to be used, so only the most current models are ready to work with it. Nothing could be further from the truth; in fact, the GeForce GTX 980 Ti was one of the first graphics cards that was actually capable of running 4K games in a more than acceptable way.
I had one of those graphics cards, and to give you an idea, it was capable of running games like Battlefield 4 and Far Cry 4 in 4K at maximum quality while maintaining a stable 30 FPS or more. Other less demanding titles such as Tomb Raider 2013 and CoD Advanced Warfare ran at 55 and 80 FPS in 4K, also at maximum quality.
Playing in 4K is not a goal we set just two days ago; it has been possible for years, although newer video games have raised the requirements, so older graphics cards are no longer capable of delivering good performance at that resolution with current titles. This is undisputed, but as of today a GeForce RTX 2070 or a Radeon RX 5700 is perfectly capable of running 4K titles, and they are not top-of-the-line models.
It is true that having a more powerful graphics card will give us greater fluidity, but this does not mean we cannot enjoy an optimal experience in 4K with an upper-mid-range or even mid-range graphics card. For example, the GeForce RTX 3060 is capable of running Battlefield V in 4K while maintaining 60 FPS, and gets 82 FPS in DOOM Eternal and 55 FPS in Death Stranding, all without having to resort to DLSS.