The five worst graphics cards in the last ten years

In recent years, the PC world has gone through an important shift that has shaken up the graphics card market. The jump from DirectX 11 to DirectX 12 and the growing popularity of Vulkan have hit numerous models from past generations that do not support both APIs well, and the steady growth in graphics memory consumption means that many solutions that were once "the best of the best" are starting to become obsolete.

We could say it is simply the way of things. Every PC component has a useful life that is partly determined by its raw specifications, but that life can be shortened or lengthened by other important factors, such as support for advanced technologies and the direction the industry chooses to take. The latter is very important. In fact, we have already seen how stagnation around an outdated API, DirectX 11, could hurt more advanced graphics cards designed for DirectX 12 and Vulkan.

I am convinced that more than one reader will know which period we are referring to in the previous example: the stage in which the two generations of Kepler-based NVIDIA GPUs, the GTX 600 and GTX 700 series (with the exception of the GTX 750 and 750 Ti, which were based on Maxwell), coexisted with the Radeon HD 7000 and Radeon RX 200. The dominance of DirectX 11 between 2012 and 2016, coupled with the low graphics memory requirements of most games, kept Kepler a very capable architecture, but today, with the rise of DirectX 12 and Vulkan, the Radeon HD 7000 and Radeon RX 200 have aged much better.

Raw performance is important, that much is clear, but when we talk about graphics cards we also have to consider other factors we have already anticipated, such as the architecture, support for APIs and advanced technologies, and of course the amount of graphics memory installed. AMD did better in this regard, since it shipped larger memory configurations and bet on technologies that did not enjoy good support at the time, such as asynchronous compute, but it also made mistakes.

We could give many examples, but I think the clearest one came with the arrival of the Radeon R9 Fury and Fury X, two graphics cards that relied on HBM memory and were limited to 4 GB. This became a very serious problem once games started consuming more graphics memory, and both fell behind their direct NVIDIA competitors. The GTX 980 Ti, with 6 GB of graphics memory, has aged better than the Radeon R9 Fury X.

It is clear that both NVIDIA and AMD have made mistakes in recent years, although fortunately today both know how to play their cards well and we have a high level of competition in the graphics card sector. NVIDIA democratized ray tracing with the RTX 3080 and RTX 3070, and the Sunnyvale giant responded strongly with the launch of the Radeon RX 6000, a generation of graphics cards that is shaping up to offer good value in terms of price-performance.

In the coming weeks we will look at the performance of this new generation of graphics cards and tell you everything you need to know to judge whether the different models are really a good buy. In the meantime, and to liven up the wait, we invite you to join us in reviewing the five worst graphics cards of the last ten years. We have left out the classic low-end models, which usually offer terribly poor value, in order to give you a more interesting, complete, and less "predictable" article.

1. The five worst graphics cards in recent history: GeForce GTX 690


There was certainly plenty to choose from here, mostly because of how badly NVIDIA's Kepler architecture has aged. It is a fact that this architecture was a very important change, as it marked the disappearance of the shader "hot clock" and allowed NVIDIA to triple the number of shaders compared to the previous generation, based on Fermi.

To give you an idea, the GTX 580 had 512 shaders while the GTX 680 had 1,536. The latter tripled the shader count of the former, but those shaders ran at a lower frequency because, as we said, the "hot clock" was eliminated; that feature had allowed the shader cores to run at a much higher frequency than the rest of the GPU.
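
To put that trade-off in numbers, here is a minimal sketch using the usual formula for theoretical FP32 throughput (shaders × 2 FLOPs per cycle × shader clock) and the publicly listed reference clocks, which are approximate and can vary between partner cards:

```python
# Theoretical FP32 throughput = shaders x 2 FLOPs per cycle (FMA) x shader clock.
# Reference clocks (approximate): the GTX 580's shaders ran at a 1,544 MHz
# "hot clock", while the GTX 680's shaders ran at the 1,006 MHz base clock.

def fp32_gflops(shaders: int, shader_clock_ghz: float) -> float:
    return shaders * 2 * shader_clock_ghz

gtx_580 = fp32_gflops(512, 1.544)    # ~1,581 GFLOPs (Fermi)
gtx_680 = fp32_gflops(1536, 1.006)   # ~3,090 GFLOPs (Kepler)
print(f"GTX 680 / GTX 580: {gtx_680 / gtx_580:.2f}x")  # ~1.95x, not 3x
```

In other words, tripling the shader count bought roughly double the theoretical throughput once the lower clock is taken into account.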

However, this important development was undermined by the fact that the architecture was designed from the start with DirectX 11 in mind. That focus allowed NVIDIA to outperform AMD's Radeon HD 7000 in most cases at the time, and it sparked a heated controversy with two opposing positions: those who said NVIDIA was right to make a solid bet on the present rather than the future, and those who thought exactly the opposite, that AMD's cards would age better.

The truth is that both positions were correct. Those who bought a Kepler-based GTX enjoyed good performance for several years, and those who bought a Radeon HD 7000 were rewarded by seeing their graphics card age better than the competing alternatives.

Why do we think the GeForce GTX 690 is one of the worst graphics cards of the last ten years? The answer is very simple: it is a model that has all of Kepler's shortcomings and all of the issues associated with a dual-GPU setup as well. This graphics card is equivalent to two GTX 680s on a single board, which means that each GPU is limited to 2 GB of graphics memory and that unlocking its full potential depends on how well each game supports SLI mode.

Its launch price was more than 1,000 euros, and it has aged so badly that today it is no better than a GTX 680 in most cases due to the poor SLI support that most games offer. It was a terrible investment that has aged very badly, and it therefore deserves a place on this list.

2. The five worst graphics cards in recent history: GeForce GTX TITAN Z


It was the successor to the previous card, and while it is true that it was a real object of desire at the time, it has aged just as badly as the GTX 690 because it has precisely the same shortcomings: it is based on the Kepler architecture and suffers from all the complications of dual-GPU graphics cards. However, we must say in its favor that this model at least carried 6 GB of GDDR5 per GPU.

Its hardware configuration is much more powerful than that of the GTX 690, which is entirely understandable, since the GTX TITAN Z is essentially two GTX 780 Ti Black Editions mounted on a single circuit board. It packs 2,880 shaders and 6 GB of GDDR5 per GPU (12 GB in total), but it has very poor support for DirectX 12 and Vulkan and has not aged well at all.

Nowadays it can only be used in a single-GPU configuration in most games, which means that, at best, it performs at the level of a GTX 780 Ti Black Edition. That places it below the current lower mid-range solutions that can be bought for less than 150 euros, something completely disappointing for a graphics card that cost more than 3,000 euros at launch.

For many NVIDIA fans, this graphics card has become a valued collector's item. That, together with the small number of units sold at the time, means it commands a very high price on the used market. If you have one and are thinking of selling it, you can get a good price if you find the right buyer, but in terms of performance its value today is very low.

In games based on DirectX 11 and compatible with SLI mode, such as Crysis 3, the GTX TITAN Z made a significant difference, but even in the best case it was still not a good option, especially considering that at the time (2014) you could find a Radeon R9 295X2, which performed the same or slightly better in some cases, for around 1,500 euros.

3. The five worst graphics cards in recent history: Radeon R9 Fury X


The Radeon R9 Fury X attracted attention not only because of the high number of shaders this AMD graphics card used (4,096), but also because it was equipped with HBM memory. The acronym stands for "High Bandwidth Memory". It is a very special type of memory because it is stacked in 3D and can form configurations with very wide data buses. On this particular model the memory ran at 1 GHz effective, but over a 4,096-bit bus, for a total bandwidth of 512 GB/s.
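
That figure falls out of a simple calculation: bandwidth is the bus width in bytes multiplied by the effective transfer rate. A minimal sketch, with the GTX 980 Ti's reference GDDR5 configuration (384-bit bus at 7 GT/s) added for comparison:

```python
# Memory bandwidth = (bus width in bits / 8 bits per byte) x effective rate in GT/s.
def bandwidth_gbs(bus_width_bits: int, effective_rate_gts: float) -> float:
    return bus_width_bits / 8 * effective_rate_gts

fury_x = bandwidth_gbs(4096, 1.0)      # HBM: 4,096-bit bus at 1 GT/s -> 512 GB/s
gtx_980_ti = bandwidth_gbs(384, 7.0)   # GDDR5: 384-bit bus at 7 GT/s -> 336 GB/s
print(f"Fury X: {fury_x:.0f} GB/s, GTX 980 Ti: {gtx_980_ti:.0f} GB/s")
```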

Like the other Radeon graphics cards of the time based on the GCN architecture, the Radeon R9 Fury X was supposed to make good use of the Vulkan and DirectX 12 APIs, and its raw performance was well above that of the GTX 980 Ti (8.6 TFLOPs versus 6 TFLOPs). However, in practice the performance of both was very similar for the most part, and over time the AMD solution has aged much worse.

The Radeon R9 Fury X is still a graphics card that performs well at 1080p in any current game, and in most cases it also holds up at 1440p with maximum quality settings, but the 4 GB of graphics memory take their toll and it has aged much worse than the GTX 980 Ti. Considering that both cost the same when they hit the market, it is clear that the latter was the better investment.

This is a clear example of the problems that come with building such an unbalanced high-end graphics card. It was a big mistake, and one quite difficult to understand: pairing a very powerful configuration with an amount of graphics memory typical of the mid-range, when AMD had already shipped graphics cards with 8 GB of graphics memory (Radeon R9 290X and R9 390-390X). I understand that the high cost of HBM memory motivated this decision, but that does not make it any less of a mistake.

Driver development was not good either, and in relatively old DirectX 11 games that are still widely played today, its performance is closer to the GTX 980 than to the GTX 980 Ti. That goes a long way toward explaining why it deserves a place on this list of the worst graphics cards of the past ten years.

4. The five worst graphics cards in recent history: 3 GB GTX 1060


When NVIDIA introduced the GTX 1060, it surprised us with an unusual move: it chose to split it into two models, one with 3 GB of graphics memory and the other with 6 GB. A priori we might think that, beyond this difference, both graphics cards were the same, but nothing could be further from the truth.

The 3 GB GTX 1060 has 1,152 shaders, 72 texture units, 48 raster units, and a 192-bit bus, in addition to its 3 GB of GDDR5 at 8 GHz. The other model, by contrast, has 1,280 shaders and 80 texture units, plus double the graphics memory. These differences create a growing performance gap between the two and have turned the 3 GB GTX 1060 into a graphics card that has not aged well at all.
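
A quick sketch, using the specs listed above, shows the size of the cut beyond the memory capacity (the 192-bit bus and 8 GHz GDDR5 are shared, so raw memory bandwidth is the same ~192 GB/s on both):

```python
# Reference specs of the two GTX 1060 variants as listed above.
gtx_1060_3gb = {"shaders": 1152, "tmus": 72, "vram_gb": 3}
gtx_1060_6gb = {"shaders": 1280, "tmus": 80, "vram_gb": 6}

shader_deficit = 1 - gtx_1060_3gb["shaders"] / gtx_1060_6gb["shaders"]
print(f"Shader deficit of the 3 GB model: {shader_deficit:.0%}")  # ~10%
print(f"VRAM: {gtx_1060_3gb['vram_gb']} GB vs {gtx_1060_6gb['vram_gb']} GB")
```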

Back then, this model cost more than an RX 580, a graphics card that has held up much better over time and has even enjoyed a kind of "second youth" thanks to the driver-level improvements introduced by AMD.

There is no question that the 3 GB GTX 1060 performs well in many games, but the small amount of graphics memory and the spec cuts take their toll. In some games we have seen minimum frame rates drop below 30 FPS, and in titles that demand a large amount of graphics memory even at 1080p, such as DOOM Eternal, it not only "chokes" but also forces us to significantly reduce graphics quality, since otherwise we cannot maintain a completely stable 60 FPS.

It was a bad investment, and with the new generation of consoles just around the corner its situation can only get worse.

5. The five worst graphics cards in recent history: Radeon RX 5500 XT 4 GB


I am sure more than one of you threw your hands up on reading this pick, but there is a very simple explanation. When AMD introduced the Radeon RX 5500 XT, it announced two models: a 4 GB version and an 8 GB version. Both share the same specifications, meaning they do not differ in the number or frequency of their shaders, only in the amount of memory installed.

Up to that point, AMD was following the approach of the Radeon RX 570 and 580, which were also available in 4 GB and 8 GB versions. However, there are two main problems. The first is that the RX 5500 XT does not offer any significant performance improvement to justify its higher price: it performs roughly on par with the RX 580 while costing more money. It was not a good value buy.

On the other hand, we have to keep in mind that the Radeon RX 5500 XT is limited to a PCIe x8 interface (Gen4 or Gen3). This is an important detail that significantly affects performance when the card is installed in a PCIe Gen3 x16 slot, since it only uses eight lanes and is constrained by the reduced bandwidth available. The problem does not occur when it is plugged into a PCIe Gen4 slot, and it also affects the 8 GB model, albeit to a lesser extent.
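
To put that limitation in numbers, here is a minimal sketch of the usable link bandwidth in each scenario, using the approximate per-lane rates of the PCIe standard after 128b/130b encoding (protocol overhead beyond encoding is ignored):

```python
# Approximate usable bandwidth per PCIe lane (GB/s), after 128b/130b encoding:
# Gen3 runs at 8 GT/s (~0.985 GB/s per lane), Gen4 at 16 GT/s (~1.969 GB/s).
PER_LANE_GBS = {"gen3": 0.985, "gen4": 1.969}

def link_bandwidth_gbs(gen: str, lanes: int) -> float:
    return PER_LANE_GBS[gen] * lanes

print(link_bandwidth_gbs("gen3", 8))    # ~7.9 GB/s  - RX 5500 XT in a Gen3 slot
print(link_bandwidth_gbs("gen3", 16))   # ~15.8 GB/s - a full x16 card in the same slot
print(link_bandwidth_gbs("gen4", 8))    # ~15.8 GB/s - RX 5500 XT in a Gen4 slot
```

In a Gen3 system the card is left with roughly half the bandwidth of a full x16 link, which hurts the 4 GB model the most, since it spills over into system memory more often.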

In short, the 4 GB Radeon RX 5500 XT is more expensive than the RX 580 and carries an important limitation that degrades performance in every case unless you have a PC with a PCIe Gen4 slot, which is still rare to this day. With that in mind, it should come as no surprise that it has earned a place among the worst graphics cards of recent years.
