**Benchmarking the AMD Radeon 7 in Fortnite, Black Ops, and Battlefield**
Having a little bit of future-proofing is nice, but I don't think it changes my mind. Next we have Fortnite, running at full 4K Epic settings, so pretty much everything completely maxed out here. For reference, we were able to get just about 60 FPS on the RTX 2080. Look, Fortnite's hard to benchmark. You can't get the exact same scenario over and over again, so I try to at least be in the same location and keep things as consistent as possible. Yeah, I'll call that about 49 FPS. It's, again, better than Vega 64, but what we're really lacking here is any kind of real win for the Radeon 7 yet. It's fine, I mean, it's not bad. We are able to play a lot of 4K games at close to 60 FPS maxed out, but close to 60 FPS is not the same as 60 FPS, which is pretty much what the RTX 2080 is giving us.
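Since Fortnite has no built-in benchmark, one way to keep these eyeballed numbers honest is to log frame times over a repeatable run and average them afterwards. Here's a minimal sketch of that idea, assuming a hypothetical CSV export from an overlay or logging tool with a `frametime_ms` column; the file name and column name are placeholders, not any specific tool's format.

```python
import csv
import statistics

def summarize_run(path: str) -> None:
    # Load one frame time (in milliseconds) per row from the assumed CSV format.
    with open(path, newline="") as f:
        frametimes_ms = [float(row["frametime_ms"]) for row in csv.DictReader(f)]

    # Average FPS = total frames rendered / total time elapsed.
    avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

    # "1% low": average of the slowest 1% of frames, a rough consistency check.
    fps_samples = sorted(1000.0 / ft for ft in frametimes_ms if ft > 0)
    lows = fps_samples[: max(1, len(fps_samples) // 100)]
    one_percent_low = statistics.fmean(lows)

    print(f"Average FPS: {avg_fps:.1f}")
    print(f"1% low FPS:  {one_percent_low:.1f}")

summarize_run("fortnite_4k_epic_run1.csv")
```

Averaging a logged run like this smooths out the fact that no two Fortnite matches play out identically, which is exactly the problem with calling a number by eye.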
With Black Ops, we're able to run it at 4K maxed out again. The only thing I'm doing is turning the anti-aliasing down to Medium, but besides that, we should be good. For reference, we were able to get 72 FPS on the 2080 like this, so let's see what the Radeon 7 will give us. Also, the GPU's at 99 degrees right now? What, no! Whoa, whoa, whoa, we're pulling, like, 420 watts from the wall now. That's, like, 40 or 50 more watts than we were pulling earlier. Okay, if you ignore the ridiculous temperature, Black Ops is actually taking advantage of all of our memory. Look at that, we're at 10 gigs of VRAM? Is it gonna hit a limit at some point? It's just going up. It's just (chuckling), it's just going up.
Now, I guess the standard caveat applies: this is pre-release, I don't have final drivers, so there could be some weird bugs, but nothing that I was reading about said anything about this. Performance is actually pretty good here; we're getting 75 to 80 frames per second. And you know what, the VRAM has actually stabilized. It's using about 13.5 gigs of VRAM, which is ridiculous, but if you've got 16 gigs, why not? Coming back to it later, that VRAM usage actually looks accurate: we were seeing about 15 gigs used, so everything makes much more sense, and Black Ops was actually telling the truth.
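If you'd rather double-check that kind of VRAM reading than trust the overlay in the moment, a rough approach is to log usage over the session and look at the peak and where it settles. A minimal sketch, assuming a hypothetical per-second monitoring log with a `vram_mb` column; the format is made up for illustration.

```python
import csv

def vram_summary(path: str) -> None:
    # Load VRAM samples (assumed megabytes, once per second) and convert to GB.
    with open(path, newline="") as f:
        samples_gb = [float(row["vram_mb"]) / 1024 for row in csv.DictReader(f)]

    peak = max(samples_gb)
    # "Settled" value: average of the last minute, once allocation stops climbing.
    tail = samples_gb[-60:]
    settled = sum(tail) / len(tail)

    print(f"Peak VRAM:    {peak:.1f} GB")
    print(f"Settled VRAM: {settled:.1f} GB")

vram_summary("blackops_4k_session.csv")
```

Looking at a whole session rather than a single readout is what separates "the counter keeps going up" from "it actually levels off around 13 to 15 gigs."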
After doing a little bit of digging, it turns out AMD has changed the way they report temperature on the new GPUs. Instead of just the edge temperature, which reads, like, 70 to 74 degrees and has its own limit, the card now also reports the hottest spot on the entire GPU, known as the junction temperature, and that can actually run up to 110 degrees safely. The digging also confirmed that, yes, Black Ops does actually use more than 8 gigs of VRAM. Whether or not that makes a big difference to performance is kinda hard to tell, but it did match the 2080.
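To make the two readings concrete, here's a toy check along those lines: each sensor gets its own ceiling, and the scary-looking 99-degree number only matters against the junction limit. The edge limit below is an assumed placeholder; the 110-degree junction figure is the one AMD quotes for the Radeon 7.

```python
EDGE_LIMIT_C = 85       # assumed edge-sensor ceiling, for illustration only
JUNCTION_LIMIT_C = 110  # hottest-spot (junction) ceiling AMD quotes for Radeon 7

def within_spec(edge_c: float, junction_c: float) -> bool:
    """True if each reported temperature is inside its own limit."""
    return edge_c <= EDGE_LIMIT_C and junction_c <= JUNCTION_LIMIT_C

# The Black Ops moment: roughly 74 C on the edge sensor, 99 C at the junction.
print(within_spec(edge_c=74, junction_c=99))  # True -- alarming-looking but in spec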
Like the other games, we're running Battlefield at 4K with DX12 enabled and pretty much everything on Ultra. The main difference is that ray tracing is disabled, since I want to keep things fair between the Radeon and the RTX cards. For reference, the RTX 2080 delivered about 48 FPS here. Let's see what the Radeon 7 can do. This pretty much settles in around 50 to 51 FPS, so this is actually the first game where we've seen the Radeon 7 do better than the RTX 2080.
All right, I'll call that 51. There are some legitimately cool things about the Radeon 7. The fact that it's the first seven-nanometer GPU is interesting, but the main thing holding it back is that it's essentially a Vega 64 that has been shrunk down and given higher clock speeds and more memory; at its, well, core, it's still pretty much the Vega we've had for the last couple of years. It comes close to the 2080, but it can't quite take the win, not unless you're really taking advantage of more than that 8 gigs of VRAM. So I'm curious, would you want to pick up the Radeon 7 or the RTX 2080, or would you rather just not spend $700 on a graphics card?
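For reference, here are the rough averages from the runs above side by side; treat them as ballpark readings rather than rigorous results, and the percentages are just the Radeon 7's number relative to the 2080's.

```python
# Ballpark averages from the runs above: (RTX 2080 FPS, Radeon 7 FPS).
results = {
    "Fortnite, 4K Epic":          (60, 49),
    "Black Ops, 4K (AA Medium)":  (72, 77),  # Radeon 7 hovered around 75-80
    "Battlefield, 4K Ultra DX12": (48, 51),
}

for game, (rtx_2080, radeon_7) in results.items():
    diff_pct = (radeon_7 - rtx_2080) / rtx_2080 * 100
    print(f"{game:28s} 2080: {rtx_2080} FPS   Radeon 7: {radeon_7} FPS   ({diff_pct:+.0f}%)")
```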