
Nvidia Announces GTX 780 Ti 12GB GDDR5 – 4K and 4K Surround Gaming

HardwarePal: We previously stated that Nvidia would announce a GK110-based GPU, and the day has finally arrived. But while we, like most people, expected the GPU to be a Ti version of the GTX 770 (i.e., a GTX 770 Ti), Nvidia has gone a step further and announced the GTX 780 Ti.

Read Full Story >>
hardwarepal.com
SlyGuy, 1038d ago (edited)

"But according to us, the GTX 780 Ti may have 2880 Cuda cores while having a 384-bit memory interface and up to 12GB GDDR5 memory."

Not confirmed, but I hardly think it will come with 12GB of VRAM, unless maybe it were the 790.
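For context on the rumored spec, peak memory bandwidth follows directly from the bus width and the per-pin data rate. A minimal sketch of that arithmetic, assuming a 7 Gbps effective GDDR5 rate (a typical figure for cards of this era, not stated in the article):

```python
def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: bus width in bytes times per-pin rate."""
    return (bus_width_bits / 8) * data_rate_gbps

# Rumored 384-bit interface; 7 Gbps per pin is an assumed rate.
print(peak_bandwidth_gbs(384, 7.0))  # 336.0 GB/s
```

Note that the 12GB capacity rumor is independent of this: bandwidth depends on bus width and clock, not on how much memory hangs off the bus.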

KingPin, 1038d ago

What's the point of this?
Let's be honest here: the price of this thing is going to be insane, and no games in the next few years will be able to take full advantage of the hardware. By the time they can, cards like this will be the norm in the market. So buying anything like this now is a complete waste of money, if they actually plan to sell it in the near future.

SirBradders, 1038d ago

You could say that about any product, though. The initial price is extortionate, but after a while it drops, and if no one purchased the initial few, companies wouldn't be able to keep releasing updated tech. If you've got the money, spend it (although I ain't, lol).

KingPin, 1038d ago

Well, I was comparing it with other parts like the CPU and RAM. I mean, having a PC with something like 64GB of RAM is possible now, but it isn't worth having; getting one with 32GB might be, if you're doing extreme video editing. Same with CPUs: 64-bit and dual-core chips came out around the time 32-bit was starting to show its age, and it's the same with quad-cores and octa-cores now.

But I see your point, so yeah, not disagreeing with you. Although I'd like to see what graphics would look like now with this card getting maxed out.

andrewer, 1038d ago

Why the need to say GDDR5? Every GPU nowadays uses it... With those rumors of GDDR6 coming next year, I started thinking it was really GDDR6 lol.
