Nvidia RTX 2080 Ti is approaching


So you saw the recently released BF V cinematics and gameplay footage… and the thought may have crossed your mind: “I’m going to need a new GFX card”.

Lo and behold… Nvidia rumours emerge! Well, a bit more than rumours to be fair. It seems that as early as this Monday, Nvidia will be launching their new RTX 2080 and RTX 2080 Ti.

Spec-wise, the folks at videocardz.com have the breakdown. It looks a bit like this:

So, it has the same amount of memory as the existing GeForce GTX 1080 Ti - 11GB - but it now runs GDDR6 clocked at 14 Gbps, with a theoretical bandwidth of 616 GB/s - approximately 132 GB/s more than the 1080 Ti.

The CUDA core count is also up: 4352, compared to the 1080 Ti’s 3584.
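The bandwidth figures above check out on the back of an envelope. A quick sketch, assuming the 2080 Ti keeps the 1080 Ti’s 352-bit memory bus (a commonly reported spec, not stated in this post):

```python
# Peak memory bandwidth: per-pin data rate x bus width, over 8 bits/byte.
# Assumes a 352-bit bus on both cards (the widely reported spec).

def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

rtx_2080_ti = bandwidth_gb_s(14, 352)   # GDDR6 at 14 Gbps
gtx_1080_ti = bandwidth_gb_s(11, 352)   # GDDR5X at 11 Gbps

print(rtx_2080_ti)                 # 616.0 GB/s
print(rtx_2080_ti - gtx_1080_ti)   # 132.0 GB/s more than the 1080 Ti
```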

Check out Nvidia’s RTX real-time cinematic promo video, albeit for the 6000. Tasty.

Price unconfirmed. Might require a remortgage. If I owned my own house. :stuck_out_tongue:


Yeah, saw the vid last night and was reading up about it all. Looks tasty, but the price will probably be silly. At least the other cards should come down.


I’m waiting for this and benchmarks and then most likely just getting a 1080Ti


As stookz says, the really nice thing about the latest crazy cards coming out is the price drop on the still-awesome cards like the 1080 and 1080 Ti, which will help me when I update my system :slight_smile:


I mainly need benchmarks for Monster Hunter World at 4K, because that’s the only game I can’t run at full details, 4K and 60fps. It’s highly unlikely the new card would hit 60fps, as the current 1080 Ti only gets me around 35fps, so a 100% performance increase seems utopian. Then again, MH:W runs at 60fps in 1440p, which is enough for me anyway, so there is barely any incentive to get a new card right now. The only new game that might be power hungry is BF 5, but even BF1 ran well at 4K (60fps, almost max details) and I’m not sure if I’ll pick it up.
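For what it’s worth, the uplift needed here is less than a full doubling. A rough sketch using the fps figures from the post above:

```python
# Uplift needed to take MH:W at 4K from the 1080 Ti's ~35fps to 60fps
# (fps figures as quoted in the post above).
current_fps = 35
target_fps = 60

needed_uplift = target_fps / current_fps - 1
print(f"{needed_uplift:.0%}")   # 71% - less than double, but still a huge
                                # ask for a single GPU generation
```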


The only thing that worries me though is that after the cryptocurrency boom and the GPU shortages everywhere, Nvidia stockpiled and invested a lot into getting enough cards to market. Now that the crypto bubble is gone they are still left with the cards, which will probably lead to one of two outcomes.

  • They take the loss and sell the 10-series cards at a lower price
  • They price the RTX series cards higher than the usual $/€800 / £600ish, so as not to compete with their existing stock

Tbh, I could see the second scenario being more likely than the first - e.g. the RTX 2080 will cost $1000 / £800. It would still be cheaper to get a 10-series card, and they don’t take the loss. We’ll find out later today I suppose!




Well predicted @Cobaas

Here’s a breakdown of pricing for various cards on the ebuyer website:


(Scroll to bottom)


That’s a fair bit of pocket money!


I like how the tab for Ebuyer opens with “cheap…”

Ain’t nothing cheap there bucko!


There are already 41 people on Scan buying the new MSI 2080Ti…


Jesus fucking wept:

PSU cables needed: 1 x 6-pin PCIe and 2 x 8-pin PCIe


Bloody hell they’re expensive!


I really need to get my head back into what AMD have to offer and how their price vs punch compares to Nvidia these days.

Kate is still rocking her 7950 2GB, which is still beasting considering its age, and we’re only really looking to replace it because of some serious coil whine and the fans starting to go. Until it died I was really happy with my 7990 3GB.


AMD’s strongest GPU at the moment is the Vega 64, and it is roughly equivalent to a 1080. AMD need to take some of the money they made on Ryzen 3, 5 and 7 and Threadripper CPUs and get it into new GPUs!

I think custom-cooler Vega 64s are pretty decent now, but yeah, come on AMD, make something new!

I used to rock an AMD GPU, but Nvidia have been ahead of them for quite a long time.


Yeah, was about to say a similar thing…

AMD are in a funny place. Their Ryzen range has some distinct advantages over Intel - more cores for less cost (until you get to Threadripper levels) - but realistically, not much software, and therefore not many people, will take advantage of it. Their single-core speeds are lacking in comparison.

I love my R9 390 (well, apart from the heat) - it is a solid little card and matches Nvidia’s mid-range cards.

Maybe that is AMD’s problem: they are competing against two major rivals - Intel AND Nvidia - and can’t concentrate on just one of them.


I think I will skip this generation. My thoughts are twofold. Raytracing, as neat as it sounds, probably isn’t implemented well enough, or doesn’t run well enough, to be worth the money. Game developers need at least a year to implement the new tech so it’s present in enough games.

I kinda doubt the performance increase too. I guess it will be slightly below last gen’s 30% (1080 Ti over 1080), and certainly below the massive increase of the 1080 Ti over the older 980 Ti (almost 60%).
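That guess lines up with the raw specs quoted earlier in the thread. A quick sanity check from the CUDA-core counts alone (clock-speed and per-core improvements ignored, so this is only a rough floor):

```python
# Raw-spec uplift from CUDA-core counts quoted earlier in the thread.
# Ignores clocks and any per-core (IPC) changes, so treat as a rough guide.
rtx_2080_ti_cores = 4352
gtx_1080_ti_cores = 3584

uplift = rtx_2080_ti_cores / gtx_1080_ti_cores - 1
print(f"{uplift:.0%}")   # 21% - indeed below last gen's ~30% jump
```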

By the time the new tech is widespread enough in games, the 3080 Ti will probably be out, and that will again be a massive upgrade over the older gen, as the hardware will have matured enough too!


I have read a lot about these new cards. This is the first gen of cards capable of raytracing in hardware, and people have been expecting a lot more out of them when running raytracing than is realistic, because they don’t really understand what it takes to do raytracing in real time.

I liken the RTX cards to the first-gen VR headsets: very impressive, but the tech will improve. These cards need to exist so devs can learn to code real-time raytracing, and when the next-gen cards come out the raytracing performance will be even more impressive.

I have read some comments that liken the RTX cards to the original PhysX cards! Remember those!? Now that technology is built into GPUs.

I am interested to see how the RTX cards perform in traditional rasterisation rendering. It looks like the non-Ti cards have fewer CUDA cores, but we don’t yet know if Nvidia have made IPC improvements to the cores, so we need to wait for actual benchmarks to see how these handle regular rasterisation vs the previous gen.

Nvidia have outdone themselves with marketing BS on this one: they refer to raytracing performance, which wasn’t possible on previous-gen cards, and then show “graphs” of how many times better the new cards are than previous generations, rather than any meaningful numbers comparable to current-gen stuff.

I’m waiting to see the actual performance on current and older games!


It will be similar to the first VR headsets: not a great user experience, but pure gold for developers to build on. The cards themselves could be nice for sciency stuff, though - they could reduce rendering time quite significantly in certain projects.

Let’s wait for numbers, maybe they prove me wrong :exploding_head: