PC Building Guide and Discussion #12

Commander Clueless

Apathy of the Leaf
Sep 10, 2008
15,560
3,463

guinness

Not Ingrid for now
Mar 11, 2002
14,521
301
Missoula, Montana
www.missoulian.com
Personally, I don't think 30-50 FPS is bad at all, but with the market moving towards 1440p, 4k, HDR, 144 Hz monitors, or sometimes all of the above, woof.

These cards would still smoke at 4K though, just without any raytracing.
 

Matias Maccete

Chopping up defenses
Sep 21, 2014
9,708
3,647
30-50 FPS? Hell no. I'm so out of the loop with all of this. Since I got my 1070 and 8600k I stopped following any of it, since I'm not planning on upgrading for several years. I got like 4 years out of my 670 and 5 out of my 3570k; I'll shoot for similar with this setup.

I'll need to google what the hell raytracing is. It seems like a weird thing to focus on if they can't even hit 60 FPS with it, though. Hey, these new super expensive cards are way better than our last series (at raytracing, and even then they struggle). Yeah f*** that noise, seems better to wait until raytracing is more prevalent and a high-end card can actually handle it.
 

Nuckles

_________
Apr 27, 2010
28,436
3,919
heck
Geforce RTX 2080 Ti: Raytracing-Performance in Shadow of the Tomb Raider (Off-screen-Gameplay)

Apparently, raytraced Shadow of the Tomb Raider runs at 30 to 50 FPS @ 1080p with a 2080 Ti. So the 2080 and 2070 are probably pretty worthless for raytracing, huh?
For as much as I've disliked what nvidia has done so far with the announcement of the RTX series, I wouldn't be surprised if the driver for the 20xx cards isn't quite optimized yet, and that's part of the reason why they aren't being released for another month.
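To put those FPS numbers in perspective, the per-frame time budget is just 1000 ms divided by the frame-rate target, so the budget shrinks fast as targets climb. A quick, illustrative sketch (the function name is made up, not from any real tool):

```python
# Per-frame time budget, in milliseconds, for a given FPS target.
def frame_budget_ms(fps):
    return 1000.0 / fps

# At 30 FPS a renderer gets ~33 ms per frame; at 144 FPS, under 7 ms.
for fps in (30, 50, 60, 144):
    print(f"{fps:>3} FPS -> {frame_budget_ms(fps):.1f} ms per frame")
```

So a card holding 30-50 FPS with raytracing on is spending 20-33 ms per frame, several times the ~7 ms budget a 144 Hz monitor asks for.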
 

Matias Maccete

Chopping up defenses
Sep 21, 2014
9,708
3,647
Huh, the only video that directly compared a scene with raytracing on and off wasn't all that crazy. There was a Battlefront-like demonstration that looked great, but it didn't have any raytracing-off footage, and also didn't seem to be from an actual game. I'll reserve judgement until I see a lot more, but I'm not seeing what the huge deal is yet.
 

Commander Clueless

Apathy of the Leaf
Sep 10, 2008
15,560
3,463
As a graphics programmer, I wish.

It's not exactly new, by the way. Most of our knowledge and techniques around lighting in graphics are a result of ray tracing algorithms.

Very true, although I believe the "new" technology is the real-time side of it. As I understand it, that has previously required far too much rendering power to be realistic.

With that in mind, the ability to do it in real time at 30-50 FPS at 1080p on a $1,200 ($1,600 CAD) card is extremely impressive. Unfortunately, from a consumer perspective, it hardly seems worth the cost... particularly as it appears to still very much be in the "early adopter" stage.

As a disclaimer, I'm the type of person who will happily reduce or turn off shadows in games for increased performance, as they tend to carry a lot of weight and I don't find them to be a big deal personally, so I'm probably a bit biased. I'm also a bit of a cheap bastard. :laugh:

Also, considering some pre-orders are already sold out, I'm guessing not everyone agrees with me. :laugh:
 

Commander Clueless

Apathy of the Leaf
Sep 10, 2008
15,560
3,463
I wish I'd grabbed the 2080 from EVGA. The non-ultra one was $750, which is pretty good. Love their warranties.

Probably best to wait to see benchmarks, reviews, and what the price settles out at anyway.


Although I will say I love the look of the new EVGA cards.
 

waffledave

waffledave, from hf
Aug 22, 2004
33,475
15,883
Montreal
Very true, although I believe the "new" technology is the real-time side of it. As I understand it, that has previously required far too much rendering power to be realistic.

With that in mind, the ability to do it in real time at 30-50 FPS at 1080p on a $1,200 ($1,600 CAD) card is extremely impressive. Unfortunately, from a consumer perspective, it hardly seems worth the cost... particularly as it appears to still very much be in the "early adopter" stage.

As a disclaimer, I'm the type of person who will happily reduce or turn off shadows in games for increased performance, as they tend to carry a lot of weight and I don't find them to be a big deal personally, so I'm probably a bit biased. I'm also a bit of a cheap bastard. :laugh:

Also, considering some pre-orders are already sold out, I'm guessing not everyone agrees with me. :laugh:

Oh for sure. Doing it in real time is truly impressive and insane. But I don't think it's a gimmick or fad, I think that was the goal all along. Most advancements we've made in computing have been us simply waiting for tech to be good enough to pull off things that were originally theoretical.
 

Commander Clueless

Apathy of the Leaf
Sep 10, 2008
15,560
3,463
Oh for sure. Doing it in real time is truly impressive and insane. But I don't think it's a gimmick or fad, I think that was the goal all along. Most advancements we've made in computing have been us simply waiting for tech to be good enough to pull off things that were originally theoretical.

You may very well be correct. I think the "gimmicky" part of the equation, from the consumer side, is being hard-sold something that ends up seeming like simply enhanced shadows/lighting/reflections.

Of course, from the professional side, that's probably a gross oversimplification... however, to gamers, that's about what you're getting, at a significant price hike and, most likely, a performance hit.

As a gamer, it's hard not to draw mental parallels to features like HairWorks, although I realize it's quite different considering raytracing involves an actual shift in hardware to accomplish.

For the record, I do think this represents an industry shift. I'm just not convinced it's worth adopting this early into it.
 

Sined

The AndroidBugler!
Jun 25, 2007
7,129
25
I guess we'll see, but the evidence I have so far doesn't sound good.
Raytracing is the theoretical mathematical representation of how light works in real life.
While the jury is out on whether or not it runs well on current available technology, there is absolutely no doubt that this is the future of photorealism in any medium that uses computer generated graphics.
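To make that concrete, here is a toy sketch of the core idea: fire one ray through each pixel and test it against the scene geometry (here a single hard-coded sphere). All names and constants are illustrative, not any real renderer's API; a real game traces many rays per pixel and adds shading, bounces, and denoising on top.

```python
import math

# One hard-coded sphere: centered 3 units in front of a camera at the origin.
SPHERE_CENTER = (0.0, 0.0, -3.0)
SPHERE_RADIUS = 1.0

def intersect_sphere(origin, direction):
    """Distance t along a normalized ray to the nearest hit, or None on a miss."""
    oc = tuple(o - c for o, c in zip(origin, SPHERE_CENTER))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - SPHERE_RADIUS ** 2
    disc = b * b - 4.0 * c  # quadratic 'a' is 1 for a normalized direction
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def shade(width=8, height=8):
    """Trace one ray per pixel: '#' where the sphere is hit, '.' elsewhere."""
    rows = []
    for j in range(height):
        row = ""
        for i in range(width):
            # Map the pixel to a point on an image plane at z = -1, then
            # normalize to get the ray direction through that pixel.
            x = 2 * (i + 0.5) / width - 1
            y = 1 - 2 * (j + 0.5) / height
            length = math.sqrt(x * x + y * y + 1)
            d = (x / length, y / length, -1 / length)
            row += "#" if intersect_sphere((0.0, 0.0, 0.0), d) is not None else "."
        rows.append(row)
    return rows
```

Even this hit-or-miss version is one intersection test per pixel per object; the full effect shoots extra rays for shadows, reflections, and global illumination, which is why real-time performance is so hard to reach.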
 

Commander Clueless

Apathy of the Leaf
Sep 10, 2008
15,560
3,463
I guess we'll see, but the evidence I have so far doesn't sound good.

I think it's just early, if that makes sense.

As far as I understand it, it's already being used heavily in CGI rendering for movies and the like. It makes sense to bring it to a real-time environment like gaming, but until now that's been out of reach.

As usual, it will take time to adopt support for it. If we can get it to a point where it's viable without asking too many sacrifices in terms of performance and price, I think you'll see it take off.

Just the opinion of a random idiot on an internet forum, though, so take it for what it's worth. :laugh:
 

Matias Maccete

Chopping up defenses
Sep 21, 2014
9,708
3,647
Raytracing long term could be great; what I'm questioning is dropping a thousand bucks for a card that can only do it at 1080p @ 30 FPS. It screams "wait until the next gen or two" to me. I'd like to see more videos comparing gameplay with and without before I say anything about the tech itself, but as far as the cards go, I'm skeptical that they'll be worth getting for raytracing.
 

guinness

Not Ingrid for now
Mar 11, 2002
14,521
301
Missoula, Montana
www.missoulian.com
Raytracing long term could be great; what I'm questioning is dropping a thousand bucks for a card that can only do it at 1080p @ 30 FPS. It screams "wait until the next gen or two" to me. I'd like to see more videos comparing gameplay with and without before I say anything about the tech itself, but as far as the cards go, I'm skeptical that they'll be worth getting for raytracing.

Pretty much, they will be awesome cards for everything else.

The ray tracing does sound fantastic and will make for more realistic games, but the math is hard and the amount of VRAM required is substantial. However, between the reviewers (tech enthusiasts, who probably aren't paying for these cards) and miners, I'm not happy with the price/feature creep.

But I'm just sitting here with a 970 and a 580, and other than the 970's gimped VRAM, I feel those cards can do 1080p just fine. The 580 can do 4K for games on older engines.
 

Matias Maccete

Chopping up defenses
Sep 21, 2014
9,708
3,647
Pretty much, they will be awesome cards for everything else.

The ray tracing does sound fantastic and will make for more realistic games, but the math is hard and the amount of VRAM required is substantial. However, between the reviewers (tech enthusiasts, who probably aren't paying for these cards) and miners, I'm not happy with the price/feature creep.

But I'm just sitting here with a 970 and a 580, and other than the 970's gimped VRAM, I feel those cards can do 1080p just fine. The 580 can do 4K for games on older engines.
Yeah, I'm in no rush to upgrade. Hell, my 670 did well enough at 1080p for 4 years. When I start having to turn games down to medium, I'll start looking for deals, and when I find one I'll jump on it. I got my 1070 for $335 shortly before the crypto insanity. I'll do the same in a few years, and by that time ray tracing will probably be more widely supported, and cards will be able to run it at higher FPS.
 
