FuckHead-Fi
424
Nov 23, 2018
The RTX series is a joke.
MiTo
13955
MiTo
Nov 23, 2018
@FuckHead-Fi But for this price, isn't it a far better option than the 1080 Ti?
Sadness
0
Nov 24, 2018
@MiTo I agree.
Anzial
1494
Nov 24, 2018
@MiTo 1080 Tis are much cheaper and have more VRAM. The RTX feature on these is useless, as any RTX title will slow to a crawl when you enable it. On top of that, the 1080 Ti has a proven track record of reliability, while the RTX launch is plagued with quality control issues. It's half-baked tech that Nvidia clearly rushed to market before they ironed out all the kinks. All in all, it's a failed launch, not unlike Nvidia's FX series back in the day.
MiTo
13955
MiTo
Nov 24, 2018
@Anzial The only thing I disagree with is your first statement. Where did you find a 1080 Ti for cheaper than this? I'm sure I must be missing something...
Xannen
63
Nov 25, 2018
@MiTo No. The 2080 offers only marginally better performance for hundreds of dollars more. The Turing series isn't worth the price hike: you're literally paying for Nvidia's R&D for this series. They are passing the cost of their development onto gamers for ray-tracing capabilities that nobody asked for.
techwiz
235
Nov 25, 2018
@Xannen I have to disagree with you on that last point; ray tracing is pretty sweet. The fact that we can do it in real time is also pretty amazing. Edit: link about Disney's use of ray tracing from 2015: https://www.engadget.com/2015/08/02/disney-explains-hyperion-renderer/
Xannen
63
Nov 26, 2018
@techwiz I was at the Nvidia announcement for Turing in Vancouver. I'm not at all questioning that ray tracing and next-gen lighting are a key next step. What I have an issue with, as do many others, is how they are releasing this now: clearly half-baked, with few to no games to take advantage of it, and those that do carry such a huge frame rate penalty that it makes the 20 series a wasted effort in many respects. I do applaud Nvidia for taking that step, I really do... I just don't think they put enough time into it; the whole release needed to bake for another couple of months.
Anzial
1494
Nov 26, 2018
@MiTo Reddit hardwareswap.
techwiz
235
Nov 26, 2018
@Xannen To be fair, this is how new technology always rolls out. It was this way with PhysX and Hairworks, to a lesser extent Ansel, and now RTX. Devs are only now getting their hands on the hardware, so they're only now able to produce engines that can take advantage of the new software platform. Give it a few more months and everyone will need RTX hardware to get the "best" experience. I think the first game with Hairworks came to market six months after the Nvidia announcement. Just gotta have some patience.
Anzial
1494
Nov 26, 2018
@techwiz To be fair, none of those techs affected FPS as much as RTX does. BF5's very limited RTX implementation halves the performance! Maybe the next gen will provide satisfactory RTX performance; the 20 series simply doesn't have the muscle to deliver any kind of decent RTX experience, even though it costs an arm and a leg and RTX is touted as its main feature.
techwiz
235
Nov 26, 2018
@Anzial I mean, yeah... but it has to start somewhere. The 2080 Ti is still a good card when DXR is turned off, and a 1080 Ti would die trying to do real-time ray tracing like the RTX series can. It's all relative. Ray tracing isn't new technology, but hardware-accelerated ray tracing is, and that's what allows this to be enabled at all in games, which I think is pretty boss. Before RTX, you had to pre-render and pre-bake all these reflections so that they looked nice and performed well in real time. Being able to perform those calculations on the fly is pretty dope and will definitely lead to significantly nicer-looking games going forward. It might also reduce game sizes, since shaders won't have to be pre-baked as much anymore. First-gen tech gripes aside, DXR looks really promising, but it's absolutely the new "but can it run Crysis?"
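To put the pre-baked-vs-on-the-fly point in concrete terms, here's a minimal toy sketch in C++ (my own illustration using classic ray/sphere math; it has nothing to do with NVIDIA's actual RTX/DXR pipeline): every pixel fires a primary ray and every reflective hit fires another ray, every frame, which is exactly the work that used to be baked offline into cube maps and light maps.

```cpp
// Toy sketch only: classic ray/sphere math, not NVIDIA's RTX/DXR pipeline.
// It just counts how many rays one 1080p frame needs when reflections are
// computed on the fly instead of being pre-baked.
#include <cmath>
#include <cstdio>

struct Vec3 {
    double x, y, z;
    Vec3 operator+(Vec3 o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator-(Vec3 o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 operator*(double s) const { return {x * s, y * s, z * s}; }
    double dot(Vec3 o) const { return x * o.x + y * o.y + z * o.z; }
};

struct Sphere { Vec3 center; double radius; };

// Ray/sphere intersection: solve |o + t*d - c|^2 = r^2 for t (d is normalized).
bool hit(Vec3 origin, Vec3 dir, const Sphere& s, double& t, Vec3& normal) {
    Vec3 oc = origin - s.center;
    double b = 2.0 * oc.dot(dir);
    double c = oc.dot(oc) - s.radius * s.radius;
    double disc = b * b - 4.0 * c;
    if (disc < 0.0) return false;
    t = (-b - std::sqrt(disc)) / 2.0;
    if (t < 1e-4) return false;                 // ignore self-intersections
    Vec3 n = (origin + dir * t) - s.center;
    normal = n * (1.0 / std::sqrt(n.dot(n)));
    return true;
}

long rays_cast = 0;

// Follow mirror-reflection bounces recursively: this is the per-pixel,
// per-frame work that used to be pre-baked into cube maps / light maps.
void trace(Vec3 origin, Vec3 dir, const Sphere& s, int depth) {
    if (depth == 0) return;
    ++rays_cast;
    double t; Vec3 n;
    if (!hit(origin, dir, s, t, n)) return;
    Vec3 p = origin + dir * t;
    Vec3 r = dir - n * (2.0 * dir.dot(n));      // reflect dir about the normal
    trace(p, r, s, depth - 1);
}

int main() {
    Sphere s{{0.0, 0.0, -3.0}, 1.0};
    const int W = 1920, H = 1080, BOUNCES = 2;  // 1 primary ray + up to 1 reflection
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            // one primary ray per pixel through a simple pinhole camera
            Vec3 d{(x - W / 2.0) / H, (y - H / 2.0) / H, -1.0};
            trace({0.0, 0.0, 0.0}, d * (1.0 / std::sqrt(d.dot(d))), s, BOUNCES);
        }
    }
    std::printf("rays cast for one 1080p frame: %ld\n", rays_cast);
    return 0;
}
```

Even this one-sphere toy casts over two million rays for a single 1080p frame; a real scene multiplies that by shadow and GI rays against millions of triangles, which is why pre-baking has been the norm.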
Anzial
1494
Nov 27, 2018
@techwiz Again, RTX on the 2080 Ti is meaningless because it destroys performance. Nvidia should have introduced the tech and sold it cheap to get things rolling with ray tracing, but with this overpriced approach, Nvidia leaves the door wide open for AMD and/or Intel to come up with their own ray-tracing tech and take over the market, after which RTX will be useless, not unlike PhysX. Point being, the inflated prices on the RTX series severely limit its adoption, and game developers will have no incentive to develop anything for RTX unless Nvidia pays for it, like they did with Hairworks/PhysX. And look where those ended up. Nowhere, that's where. The same thing will happen to RTX.
techwiz
235
Nov 27, 2018
@Anzial OK, you seem to have some fundamental misunderstandings of these technologies, so let me try to address them one at a time.

PhysX, you say, is useless, but it's the #1 physics-enabling library in use today. Almost every major physics engine uses PhysX acceleration whenever possible over OpenCL because it's more optimized than the generic OpenCL counterpart. You probably don't even realize it, but I bet a large portion of the games you enjoy utilize PhysX in some way whenever possible.

Hairworks: same as PhysX. Everywhere you have hair, fur, fuzz, grass, or separate leaves, there's a good chance it's leveraging Hairworks or PhysX to make it look more convincing. Again, probably not something you notice unless you're looking for it, and not advertised as a prominent feature in games, because most people have no idea what it is.

Ray tracing, like I said, isn't new technology. We've been doing ray tracing for well over a decade in software and some hardware with OpenCL. Hell, even Unity 3D has some ray tracing capabilities built in. What Nvidia is offering is a hardware platform that accelerates a task that used to be measured in seconds per frame, not frames per second. This is an extremely expensive operation. DXR isn't some software doohickey they can just switch on and call it a night; it's all new silicon. We have been able to enable ray tracing in games for the better part of a decade, but no one does it 100% because it's so damn expensive. You can see smaller implementations of ray tracing in games like Inside, where the lighting interacts with physical objects to create that spooky atmosphere. Nvidia's hardware is enabling that for *all* lighting in real time, not just a spotlight or two. I bet CAD users will find the RTX enhancements well worth the extra money to cut their render times significantly; AutoCAD uses ray tracing extensively for its materials-based lighting render pipeline.

Now, with all that said, if you think Nvidia is going to burn billions to research a technology and then offer it to you at a discount, I'll have whatever you're smoking. There isn't another card on the market that can do ray tracing like the RTX series can. For the time being, this custom silicon has them untouchable with regard to this technology until others come up with their own silicon, which, as you might imagine, doesn't just happen overnight (as evidenced by Nvidia saying it took them years to make this happen). So no, Intel and AMD aren't going to come out of the woodwork with their own ray tracing tech anytime soon; at least not for another two years, based on typical silicon R&D timescales. Making a new product is hard and expensive af.

If the RTX series is too expensive, just go for a 10 series card instead. You won't be missing much, since, like you said, most games don't really support ray tracing yet and the 20 series cards only perform a bit better than the 10 series at non-DXR tasks. No big whoop. People who do need ray tracing, however, like pros doing CAD work or physics sims, will be very happy with this new lineup. But you can't expect Nvidia to sell dope tech for pennies over the base cost of a regular GPU; that's just bad business and won't allow them to continue their research. This is a taste of the future, just like CUDA and PhysX were back in the days of Fermi and Kepler.
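To put some rough numbers on "seconds per frame, not frames per second," here's a quick back-of-the-envelope sketch (again in C++; the rays-per-pixel figure is my own assumption, not an NVIDIA spec):

```cpp
// Back-of-the-envelope only: the rays-per-pixel figure below is an assumption,
// not an NVIDIA spec. It just shows the scale of work real-time ray tracing
// implies, when offline renderers took seconds (or minutes) per frame.
#include <cstdio>

int main() {
    const double pixels      = 1920.0 * 1080.0; // one 1080p frame
    const double rays_per_px = 4.0;             // assumed: 1 primary + a few shadow/reflection rays
    const double fps         = 60.0;

    const double rays_per_frame  = pixels * rays_per_px;
    const double rays_per_second = rays_per_frame * fps;

    std::printf("rays per frame : %.1f million\n", rays_per_frame / 1e6);   // ~8.3 million
    std::printf("rays per second: %.1f million\n", rays_per_second / 1e6);  // ~497.7 million
    // Each of those rays is an acceleration-structure traversal plus triangle
    // intersection tests, which is the work the new silicon is there to speed up.
    return 0;
}
```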
Anzial
1494
Nov 28, 2018
@techwiz RTX in CAD? You made me lol. Nvidia is not going to do CAD RTX. CAD and other professional software is served by an entirely different line of video cards; ever heard of Quadros? Those come with entirely different drivers and software to support professional use. I'm sure Nvidia will come up with an RTX equivalent, but it's not going to be the same thing there. On top of which, AMD/Intel are likely to offer their own vision of ray-tracing acceleration, and they won't be using RTX.

As for the billions in R&D costs, Nvidia spent that on AI, not ray tracing; RTX was obviously something they just slapped on at the last moment, hence such terrible ray-tracing performance on these video cards. Nvidia is making boatloads of money from this research elsewhere, but on the gaming front they will lose, and lose big, and they won't care.

As for PhysX and Hairworks: name the latest titles that use them. BF5, for example, introduced RTX and yet completely ignores PhysX and Hairworks. There's barely ANY game that uses those technologies. You are the one who doesn't realize that in most cases developers will just use cross-platform libraries to implement those effects, not Nvidia-proprietary ones. Yes, that is slower, but this way they avoid relying on Nvidia-proprietary tech that most users don't have. The majority of PCs around the world run on Intel integrated GPUs, plus there's a sizeable AMD presence, and game makers can't ignore those in favor of Nvidia-only tech. So yeah, most developers just don't bother with PhysX/Hairworks, because they take time to implement, which adds cost to what they want to be as optimized as possible for the widest range of configurations. With the way RTX is priced, it will meet the same fate.
techwiz
235
Nov 28, 2018
@Anzial What are you talking about, buddy? You do realize Quadros and GTX cards share the same silicon, right? The Titan line is the fine line between Quadro and GTX, not to mention Quadros cost $2-4k a pop. Also, every game made with Unreal Engine or Unity 3D utilizes PhysX for its physics engine. And Battlefield has had PhysX support since at least the launch of DICE's Frostbite engine with Bad Company and its destructible environments. And all the Tomb Raiders since the series reboot have used Hairworks. One last FYI: PhysX and the rest of Nvidia's GameWorks toolkit are cross-platform and hardware-agnostic; Nvidia just promises better performance on Nvidia cards due to their tailored hardware and drivers. But no, devs don't opt for cross-platform stuff that often; look at how popular DirectX is. Big companies just make the graphics engine support multiple platforms, like DX12 and Vulkan.
Anzial
1494
Nov 28, 2018
@techwiz Wow, you are so clueless it's just pointless to argue with you any more. Frostbite does NOT use PhysX and never did; they have their own physics engine, and if you don't believe me, check it yourself with the PhysX indicator in-game. Tomb Raider uses AMD's TressFX and PureHair, not Nvidia's Hairworks. On top of that, your complete misunderstanding of the difference between GTX and Quadro is astounding, to put it mildly. Anyway, I'm done with you; you can continue to live in your own bubble, just don't be shocked when reality hits you in the face.
techwiz
235
Nov 28, 2018
@Anzial You're right, Frostbite is powered by Havok, which does not have a PhysX backend. I thought it did. But Quadro and GeForce are definitely the same silicon. Here are Nvidia's own words on the difference between GeForce and Quadro: https://nvidia.custhelp.com/app/answers/detail/a_id/37/~/difference-between-geforce-and-quadro-gpus Not much on there is hardware-specific; it's mostly software differentiation. The same GPU silicon is just configured for a different workload, like more CUDA cores or higher-precision FPUs. They're not different products; that would be silly and expensive. As for Tomb Raider, maybe I'm mixing it up with The Witcher? I remember there being a lot of hype about Hairworks vs. TressFX, and I think I mixed up which was which. IIRC, TressFX was less polished than Hairworks, but Hairworks ran poorly on AMD (as per usual). Still, both The Witcher and Tomb Raider ran cross-platform... because both of those techs are just libraries and algorithms. They're not platform- or hardware-specific in any way, just like every other SDK out there...