

RTX 4060 Ti review roundup – how is the 4060 Ti rated?

The results are in for the RTX 4060 Ti

The RTX 4060 Ti is almost here, which means the review embargo has lifted – just 24 hours or so before release. Plenty of top tech sites have reviewed the latest GPU in Nvidia's 40-series, so let's talk you through our RTX 4060 Ti review roundup.

This more mid-range option could provide suitable power at a more modest budget. Its previous-generation counterpart, the RTX 3060 Ti, was a massive success – so let's see if the 4060 Ti can follow suit. Nvidia's 8GB version is arriving first, so that's what's in the firing line right now.

We’ve had a look at some of the best reviews on the web right now. The verdict is in, and it seems that many reviewers are drawing the same conclusions. Let’s see how well the RTX 4060 Ti has fared.

RTX 4060 Ti review roundup

Now we can jump into the RTX 4060 Ti reviews. In the build-up to release day, we've been covering everything about the 4060 Ti – from the size to the hashrate and more. Despite all that, there's perhaps nothing better than a hands-on review.

This is where we get a better judgement on the new graphics card, and any shared opinions can sway our own thoughts on the RTX 4060 Ti. So, let’s see if the new GPU is doing well, or you should be looking at some equivalents instead.

First up, Dexerto, who open their review with a 3 out of 5 star rating. A lot of focus is put on the 8GB of VRAM included in this version of the RTX 4060 Ti – and whether it's enough, especially for $399. Three stars isn't exactly the warmest welcome from Dexerto.

The lack of sufficient VRAM is a point of contention for many consumers, especially those who want to play at higher resolutions. Dexerto do point out that the card is geared more towards 1080p, though. Their verdict is that the card makes too many compromises to be highly rated.

Now onto Tom’s Hardware, who are technically a little more generous with a 3.5/5 rating. They point out similar issues though, with the memory capacity and bus width compromising on performance at higher resolutions. They are happy that the $399 price tag remains unchanged from last-gen though.

They compare the 4060 Ti with last gen's RTX 3070, noting that the 8GB of VRAM limits the GPU's future-proofing. They also point out that DLSS 3 frame generation increases latency, so for many users it may not be worth the compromises made elsewhere.

PC Magazine is up next, and again, another 3.5 out of 5 rating. So far, the card is looking pretty average and nothing too special. The praise is for the 1080p performance and cool, efficient operation. They actually mention the price as a pro too – at least compared to the current competition.

When it comes to cons, yet again the memory (chiefly the limited bandwidth) is criticized. They label the 1440p performance as ‘so-so’, and the potential for 4K gaming is limited. So, lots of focus on resolution as expected. They do point out that ray tracing performance is great for 1080p.

Ars Technica offers a contrast right off the bat, labeling the RTX 4060 Ti as "one of the best $400 GPUs" – though "it's still a bit of a letdown". The main problem here is that the generational performance increase just isn't what they expected.

They do praise the power consumption and 1080p ray tracing performance, and commend it a little at 1440p. However, they don't see it as a strong enough upgrade over the RTX 3060 Ti, pointing out that DLSS frame generation is a nice feature but "not a magic wand".

The first thing we notice from Techspot's review is that they're quick to call out the 8GB of VRAM, especially at $400 these days. In fact, they go on to say the 16GB model should be the only option, as 8GB just doesn't cut it anymore at this level.

They've put together a bunch of benchmarks, as you'd expect, and the gap between the 3060 Ti and 4060 Ti isn't big enough in some titles. With that, they deem the RTX 4060 Ti overpriced, saying even the 16GB edition should be around $350, with the 8GB model at no more than $300.

Finally we have PCWorld, who are just as disappointed. They point out that it’s not really a suitable upgrade over the previous generation, let down by the fact that a card aimed mostly at 1080p can still cost $400. They opt for a 2.5/5 rating, with “technical decisions” letting the card down.

They do praise the card as ideal for high refresh rates at 1080p, but plenty of cards on the market already achieve this. DLSS 3, Reflex, and power efficiency are where the RTX 4060 Ti earns some points – but overall, it's too much cash for a 1080p GPU.

Does the RTX 4060 Ti have good reviews?

Unfortunately, the RTX 4060 Ti has not received the best reviews. The highest rating we've spotted following the review embargo is just 3.5/5 stars.

Many of the reviews focus on the downgrade in memory bandwidth and the less-than-ideal 8GB of VRAM, which hurt the GPU at 1440p and 4K resolutions. Although the card performs well at 1080p with ray tracing and maxed-out settings, many reviewers are disappointed that a card chiefly designed with 1080p in mind can still cost $400.

However, many admit that the price point is not the worst thing when compared to the rest of the competition right now, especially with the addition of DLSS 3 frame generation and a lower TDP.

RTX 4060 Ti review roundup FAQs

Is the RTX 4060 Ti good value?

At an MSRP of $399, the 8GB RTX 4060 Ti inherits the same price as its predecessor. This makes it reasonable value for money, but only compared to the rest of the market.

Many reviews are quick to point out that the card is still overpriced, but this is much the same for many of the 40-series GPUs.

Is the RTX 4060 Ti released yet?

Reviews for the RTX 4060 Ti turn up just a day before release, so even if you spot an early review, that doesn't mean the card is out just yet.

Reviews go live on May 23rd, and the RTX 4060 Ti releases on May 24th.



RTX 4060 Ti vs RTX 2080 Ti: which GPU is better?

Which is better – the 4060 Ti or the 2080 Ti?

The RTX 4060 Ti is the latest addition to Nvidia's mid-range GPU series, boasting the powerful Ada Lovelace architecture that delivers impressive performance for mid-level gamers. However, how does it compare to the well-established RTX 2080 Ti? Let's delve into the comparison of the RTX 4060 Ti vs RTX 2080 Ti.

While the RTX 4060 Ti is set to launch on May 24, 2023, the RTX 2080 Ti was released on September 27, 2018, giving it a solid four years to establish its presence in the market. Over this time, the RTX 2080 Ti received driver updates, further enhancing its efficiency and optimizing its performance.

Here at WePC, we take pride in providing accurate and accessible information. It’s essential to note that while some of our content may contain opinion-based analysis, we strive to present factual information whenever possible. So, let’s dive into the details of the RTX 4060 Ti versus the RTX 2080 Ti.

4060 Ti vs 2080 Ti: Architecture

The 40 series and the 20 series cards represent a significant generational leap in GPU technology, with two full generations separating them. The new 40 series cards introduce exclusive features such as Nvidia DLSS 3 and frame generation, which have the potential to enhance the performance capabilities of these cards and bring them closer to their counterparts. However, it’s important to consider that their performance is still influenced by the games they support and the specific specifications they offer.

Now, let’s delve into the details of the individual cards. The original RTX 2080 Ti is built on the TU102-300A-K1-A1 GPU variant. Manufactured using TSMC’s 12nm process, it boasts an impressive 18.6 billion transistors packed within a 754 mm² die size. On the other hand, the RTX 4060 Ti features the AD106-350/351 processor, based on the Ada Lovelace architecture. Manufactured using TSMC’s 5nm process, it has a smaller die size of 190mm², although the exact transistor count is currently undisclosed.

RTX 4060 Ti vs 2080 Ti: specifications

Here are the specifications of the two GPUs we are comparing today:

| Spec | RTX 4060 Ti (8/16GB) | RTX 2080 Ti |
| --- | --- | --- |
| GPU | AD106-350/351 | TU102-300A-K1-A1 |
| GPU process | TSMC 5nm | TSMC 12nm |
| CUDA cores | 4,352 | 4,352 |
| Base clock | 2.31 GHz | 1.35 GHz |
| Boost clock | 2.54 GHz | 1.54 GHz |
| VRAM | 8/16GB GDDR6 | 11GB GDDR6 |
| Memory clock / bandwidth | 18 Gbps / 288 GB/s | 14 Gbps / 616 GB/s |
| Memory interface | 128-bit | 352-bit |
| TDP/TGP | 160/165W | 250W |
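The bandwidth figures above follow directly from the memory clock and bus width: per-pin data rate times bus width, divided by eight bits per byte. A quick sketch (the helper name is our own, not from any GPU library):

```python
# Peak memory bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8 bits per byte.
# The inputs below are the spec-sheet values for each card.

def memory_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Return peak memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

# RTX 4060 Ti: 18 Gbps over a 128-bit interface
print(memory_bandwidth_gbs(18, 128))   # 288.0

# RTX 2080 Ti: 14 Gbps over a 352-bit interface
print(memory_bandwidth_gbs(14, 352))   # 616.0
```

The narrower 128-bit bus is why the newer card ends up with less than half the raw bandwidth despite the faster memory clock.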

RTX 4060 Ti vs RTX 2080 Ti: things to consider

Here are some very important considerations to make when choosing between the RTX 4060 Ti and the RTX 2080 Ti.

GPU core frequency

Clock speed, also known as GPU core frequency, represents the rate at which the GPU processor operates and carries out tasks. The silicon die within the GPU processes instructions at a given rate, and with each new generation clock speeds typically rise, leading to enhanced performance.

When comparing the RTX 4060 Ti and the RTX 2080 Ti, the boost clock of the 4060 Ti surpasses that of the 2080 Ti by a whole 1 GHz, letting it churn through work noticeably faster. In terms of raw processing power, the 4060 Ti clearly takes the lead.


Video memory

Video memory in a graphics card plays a crucial role as a pixel buffer for graphics, ensuring smoother and more efficient operation, especially at higher resolutions. A graphics card with greater memory bandwidth and capacity excels at handling the demanding pixel counts of larger resolutions.

If your aim is 4K gaming, it becomes essential to have a graphics card capable of delivering that level of performance. The 8GB of VRAM on the 4060 Ti isn't substantial – Nvidia is addressing this by releasing a 16GB version – and it falls short of the 2080 Ti's 11GB. Both cards use GDDR6, but the 4060 Ti's memory runs at a faster 18 Gbps against the 2080 Ti's 14 Gbps.

That per-pin speed advantage doesn't tell the whole story, though: across its narrower 128-bit bus, the 4060 Ti's total bandwidth is 288 GB/s versus the 2080 Ti's 616 GB/s. In practice, the much larger L2 cache of the Ada Lovelace architecture offsets the narrower bus by reducing how often the GPU has to reach out to VRAM at all.

Processor cores

The internal structure of a GPU processor is crucial as it determines how different core types are organized. These core types, such as CUDA cores for processing and RT cores for ray tracing, along with other components like TMUs and ROPS, directly impact a graphics card’s capabilities.

GPU Power consumption

TDP (thermal design power) and TGP (total graphics power) provide insight into a GPU's power consumption and potential heat generation.

In terms of power consumption, the RTX 4060 Ti operates at 160W, while the 2080 Ti requires 250W – a full 90W more. This means the 2080 Ti demands a more robust power supply, generates more heat, and tends to run hotter overall. The lower-powered RTX 4060 Ti, on the other hand, runs cooler and is cheaper to feed, both in PSU cost and in thermal headroom.
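As a rough illustration, that 90W gap translates into a measurable running cost. A minimal sketch, assuming four hours of daily gaming at $0.15/kWh (both figures are our own assumptions, not from the article):

```python
# Back-of-the-envelope yearly electricity cost for each card's power draw.
# HOURS_PER_DAY and PRICE_PER_KWH are illustrative assumptions.

HOURS_PER_DAY = 4       # assumed gaming time per day
PRICE_PER_KWH = 0.15    # assumed electricity price in USD

def yearly_cost_usd(watts: float) -> float:
    """Yearly electricity cost for a component drawing `watts` continuously while gaming."""
    kwh_per_year = watts * HOURS_PER_DAY * 365 / 1000
    return kwh_per_year * PRICE_PER_KWH

saving = yearly_cost_usd(250) - yearly_cost_usd(160)  # 2080 Ti vs 4060 Ti
print(f"${saving:.2f} per year")
```

Under those assumptions, the 4060 Ti saves roughly $20 a year in electricity, on top of allowing a cheaper power supply.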

RTX 4060 Ti Vs RTX 2080 Ti: Performance

The RTX 4060 Ti is expected to outperform the 2080 Ti in most applications, and even if it falls slightly short, technologies like DLSS will help bridge the gap. It’s remarkable that we’ve reached a point where the lower-tier GPUs of a current generation can surpass the top-tier GPUs of previous generations, and it has only taken two generations to achieve this.

Is the RTX 4060 Ti better than the RTX 2080 Ti?

The RTX 4060 Ti is considered better than the RTX 2080 Ti in several aspects.

In terms of GPU core frequency, the RTX 4060 Ti has a boost clock that is 1 GHz higher than the 2080 Ti, indicating that it can process tasks more efficiently and deliver better performance in raw power.

In terms of processor cores, both the RTX 4060 Ti and the 2080 Ti have the same 4,352 CUDA cores. However, the 4060 Ti's cores are newer, faster, and more efficient, giving it the edge in performance.

Lastly, in terms of power consumption, the RTX 4060 Ti operates at 160W, while the 2080 Ti requires 250W. This means that the 4060 Ti demands less power, generates less heat, and provides a more cost-effective solution in terms of pricing and thermal capacity.

Considering these factors, the RTX 4060 Ti emerges as the better choice compared to the RTX 2080 Ti in terms of overall performance, power efficiency, and value for money.

Final word

In conclusion, the RTX 4060 Ti surpasses the RTX 2080 Ti in various aspects, making it the superior choice. With a higher GPU core frequency and faster processing power, the 4060 Ti outperforms the 2080 Ti in raw performance.

Moreover, the 4060 Ti operates at a lower power consumption, resulting in cooler operation and cost-effectiveness. Overall, the RTX 4060 Ti provides better performance, power efficiency, and value for money when compared to the RTX 2080 Ti.

Nvidia Unleashes GeForce RTX 4070 Ti, Advanced RTX 40

GeForce RTX 4070 Ti


We’ve seen this graphics card before, though. The presentation didn’t call explicit attention to the fact, but Nvidia France accidentally confirmed the RTX 4070 Ti is the new name for Nvidia’s controversial “unlaunched” RTX 4080 12GB, which will now cost $799 instead of the $899 originally intended when it was called a 4080. That’s still $200 more than the last-gen 3070 Ti and $100 more than the higher-tier RTX 3080 debuted at, so Nvidia’s terrible RTX 40-series pricing continues unabated.

Shaving the extra $100 off helps a bit but at first blush, it still seems like very poor value indeed—and its existence makes the RTX 4080’s $1,200 sticker price even more egregious. Is the RTX 4080 really worth $400 more than the 4070 Ti and $500 more than the RTX 3080?

Note that all of these games support DLSS 3, so you are unlikely to see this sort of performance gain over the 3090 Ti in titles that don't actively support the cutting-edge feature.


Stay tuned for GeForce RTX 4070 Ti reviews to find out. With a street date of January 5, you shouldn’t be waiting long. Nvidia claims the new GPU will outpunch the last-gen RTX 3090 Ti flagship and if that’s true, it’ll be a beast of a graphics card no matter what its price is. Whether it’s worth the cost remains an open question given Nvidia’s recent track record.

RTX 40-series laptops


The RTX 40-series is also coming to laptops. This is a very welcome announcement if mobile pricing winds up being less insane than the desktop GPUs. Nvidia claims this is the biggest leap in generational performance ever while running much more efficiently than before. And, for the first time ever, this generation’s laptop lineup is spearheaded by Nvidia’s ferocious xx90-class—previous flagships drew too much power for consideration in notebooks and topped out with xx80-class GPUs instead.

Nvidia calls these a “new class of laptops.” The company also claims “GeForce RTX 40 Series GPUs will be featured in over 170 laptops from all of the major manufacturers.”


The GeForce RTX 4090 and 4080 for laptops unlock the ability for surround 4K gaming at 60 frames-per-second—gaming with three 4K displays arrayed around you. A bit of a weird flex for laptops, but you love to see the power. Nvidia didn’t provide many hard metrics but said creators can also expect to see 2x faster video exports (are Lovelace’s dual AV1 encoders also in the mobile version?). Look for RTX 4080 and 4090 laptops to hit the streets on February 8 with prices starting at $1,999.


Those are likely to be found exclusively in high-end creator and gaming laptops, though. For the rest of us, there’s the RTX 4050, 4060, and 4070 for laptops—yes, even though we’ve yet to see any of those GPUs in desktop form. Nvidia says that they’ll deliver better-than-RTX 3080 mobile performance at a third of the power (given DLSS 3’s inclusion, take that with a big grain of salt). Nvidia says at least some of these laptops will target 80fps 1440p gaming and will be available on February 22 starting at $999. That could represent an increase in price, however. The RTX 30-series mobile lineup started at $999 with the RTX 3060, whereas this new lineup starts at $999 with the lower-tier RTX 4050.

RTX 4080 on GeForce Now


Nvidia also announced that the RTX 4080 is coming to its GeForce Now streaming service via a new “Ultimate” tier that replaces the previous RTX 3080 tier at the same $19.99/mo. price.

Nvidia claims the new RTX 4080 server pods deliver 5x the teraflops of an Xbox Series X and a new competitive 240Hz mode will help deliver ultra-low latency that’s roughly twice as responsive as what you get with console gaming. You also get those juicy DLSS 3 capabilities.

GeForce Now’s RTX 4080 capabilities will start firing up in select North American and European data centers sometime in January, with a wider release planned for later this quarter. All current RTX 3080 tier subscribers will be automatically switched over to the Ultimate subscription once RTX 4080 servers fire up.

Review Roundup: iPad Mini and Fourth-Generation iPad

Ahead of the iPad Mini and fourth-generation iPad becoming available to customers Friday, early reviews of both devices have hit the web. You can see in the collection below that the reviews are fairly positive, discussing how great the build quality is, the lightness of the tablet, and how well it fits in your hand. Starting with the iPad mini:

The Loop:

I use my iPad mini for tasks rather than watching videos or playing games, but I use it a lot. This is a Wi-Fi model, which was on all the time and I have yet to see anything cause a significant drain on the battery. The battery is lasting days for me and it is on 24/7.


In fact we found the brightness and color reproduction to be improved over the iPad 2, comparable to the latest Retina displays. Colors are very pleasing to the eye and viewing angles, as ever with an Apple display, do not disappoint. You can line up as many friends as you like and sit them shoulder-to-shoulder, they’ll all have a bright, clear picture. Yes, mini owners may have to make do with some resolution envy, but they at least won’t be lacking in any other regard.

The Verge:

And it does raise the floor here. There’s no tablet in this size range that’s as beautifully constructed, works as flawlessly, or has such an incredible software selection. Would I prefer a higher-res display? Certainly. Would I trade it for the app selection or hardware design? For the consistency and smoothness of its software, or reliability of its battery? Absolutely not. And as someone who’s been living with (and loving) Google’s Nexus 7 tablet for a few months, I don’t say that lightly.


While we're on the subject of the screen, let's not beat around the bush — if there is a weakness of this device, it's the screen. But that statement comes with a very big asterisk. As someone who is used to a "retina" display on my phone, tablet, and even now computer, the downgrade to a non-retina display is quite noticeable. This goes away over time as you use the iPad mini non-stop, but if you switch back to a retina screen, it's jarring.


On the other hand, what will make some think twice about buying an iPad mini is the price. Starting at £269 for a WiFi only model, this is £100 dearer than the Kindle Fire HD or the Nexus 7, which is now available in a 16GB version for £159.

Whether it’s worth it depends on how much of a premium you put on great design and a vast ecosystem of apps. Apple will sell a lot of these little beauties, that’s for sure.


The iPad Mini is a design shift from the iPad, and perhaps the biggest one in the iPad’s entire history. Despite how popular the iPad’s been, it’s not really a device that’s very comfortable to use when not sitting down or at a desk. It’s a use-when-you-get-there device, or use-when-comfortably-seated. An iPhone or iPod Touch is truly mobile, and the iPad is only halfway there.


Even though this screen isn’t state of the art, it’s O.K. If you’ve ever laid your eyeballs on the ultra-smooth text rendered by the Retina iPad, its text will look fuzzy by comparison, especially at teensier type sizes. But the tradeoff it presents compared to the 7-inchers — fewer pixels, but more space — is reasonable enough.


In shrinking the iconic iPad, Apple has pulled off an impressive feat. It has managed to create a tablet that’s notably thinner and lighter than the leading small competitors with 7-inch screens, while squeezing in a significantly roomier 7.9-inch display. And it has shunned the plastic construction used in its smaller rivals to retain the iPad’s sturdier aluminum and glass body.


What will surprise you is the weight. The specs already show that the iPad mini is lighter than the Kindle Fire, 308g v 395g (and 340g for the Nexus 7); even if you add on a Smart Cover, it’s still lighter than the uncovered Kindle Fire. It’s thinner too. This is a device that will be ideal for holding in one hand for reading on train rides or other commuting; or you might even forget it’s in that coat pocket.


Apple quotes up to 10hrs of wireless browsing over Wi-Fi for the iPad mini, or up to 9hrs if you’re using the tablet’s cellular connection. In practice, with a mixture of browsing, some video playback, games, music – both locally-stored and streaming – and messaging, we comfortably exceeded Apple’s estimate. In fact, we exceeded 11hrs of use before encountering a battery warning.

Fox News:

Those tablets don’t have the complete experience that the iPad does. Come on: The iPad is still the gold standard for tablet computing after all. With stellar hardware and hundreds of thousands of apps, the iPad is the Kleenex of facial tissue. The Tivo of DVRs. It has all the perks of using an iOS device: AppStore, iMessages, FaceTime, etc.

Moving on to the 4th gen iPad:


If you were going to get an iPad before, obviously, you’ll want to get this one now. In fact, you don’t even have a choice — Apple has discontinued the third-generation model. The prices remain the same across the board as do all of the other features (WiFi/LTE, Retina display, etc).

Yes, it is kind of lame for those of us who bought third-generation models that Apple updated the line so quickly, but well, that’s Apple. To me, the fourth-generation leap doesn’t seem to be nearly as big as the leap from the first to second generation or from the second to third generation, so perhaps take some solace in that.


The third-generation iPad arguably didn’t need refreshing; in fact, if Apple hadn’t opted to change to Lightning, it could realistically have held off changing its largest tablet until early 2013, as per its typical yearly refresh cycle. That makes for a reasonably straightforward upgrade decision if you’re a 3rd-gen iPad owner. Unless you’re desperate for Lightning – perhaps you’ve also got an iPhone 5, and want to use all the same accessories rather than buy the adapter dongle – then we’re yet to see apps that really demand the potent A6X chipset.

The Verge:

For now, if you’re within your return window you should probably swap for the newest iPad, but if not? Rest assured you’re not really missing that much. Not yet, at least.


In my testing, battery life seems to have remained the same despite the processor, and so have the cameras. In fact, the camera is one of the places where the impact of the A6X processor can be seen: taking pictures is astonishingly fast, and picture quality is improved thanks to the A6X's image signal processor.



ASUS Noctua RTX 3080 OC review

All glory to the Hypnowl

ASUS and Noctua first teamed up to create a custom RTX 3070, and now they've released a new card, which we put through its paces in this ASUS Noctua RTX 3080 OC review. It aims to improve upon one of the most popular graphics cards, with the promise of better, quieter cooling – albeit at a much bigger size, coming in at over four slots. The new card brings Noctua's aesthetics to the graphics card world.


Packaging and aesthetics

Coming in a beefy box with a handle, it feels almost like a briefcase for your new graphics card. The box needs to be big to hold the sheer size of the Noctua 3080: at 310 x 144.8 x 87.5 mm (12.2 x 5.7 x 3.45 inches), it's a whopping 4.3-slot card, so it's no surprise it needs such a hefty box to carry it.

The Noctua version is based on ASUS' TUF model but improves on the original design, with an enlarged copper baseplate covering a wider area – the VRAM as well as the die itself – to keep temperatures under wraps, and a bigger heatsink for greater thermal capacity. These are paired with two Noctua NF-A12x25 fans in place of the stock three 92mm fans. The superior fans are optimized for better airflow and static pressure, giving the card lower GPU and VRAM temperatures while running quieter than the original TUF, with a promised improvement of 4.5 dB, 3°C on the GPU, and 14°C on the VRAM over the previous offering.

The build itself is sturdy and surprisingly good-looking. With such a large heatsink and fans you'd expect a lot of bending and GPU sag, but ASUS has reinforced the card well to prevent that. A strong backplate and IO shield connect the furthest points together into a solid, well-connected structure with no loose areas, preventing the bending over time that can ruin a graphics card's aesthetics in the long term.

Noctua is known for its unique color choices of beige and brown, which it has brought to the 3080's fans. The design around the fans, however, is a less polarizing, more toned-down color choice, using a darker brown and silver to create an owl-like look on the front, with the spinning fans as a hypnotizing pair of eyes. Put together, it's a well-designed and thought-out graphics card.



Specification and price

Tech Specs

| Spec | ASUS Noctua RTX 3080 OC |
| --- | --- |
| Core clock speed | 1440 MHz base, 1815 MHz boost |
| Stream processors | 8,704 |
| Memory size | 10 GB GDDR6X |
| Memory clock | 19 Gbps |
| Memory bus | 320-bit |
| Card bus | PCIe 4.0 x16 |
| Outputs | 2x HDMI 2.1, 3x DisplayPort 1.4a |
| Power connectors | 2x 8-pin |
| PSU required | 850 W |
| Thermal design | Modified TUF cooler with 2x NF-A12x25 120 mm fans |
| Dimensions | 310 x 144.8 x 87.5 mm (12.2 x 5.7 x 3.45 inches) |


Pros

Excellent cooling, outperforming other solutions

Great build quality to support its bigger size

Plenty of headroom for overclocking

Quiet and powerful performance

Cons

Much more expensive than other custom solutions

Color scheme not to everyone's taste

Uses the older 10 GB model

The Noctua 3080 is based upon the 10GB version of the GPU – the older variant with 10 GB of GDDR6X memory. Across a 320-bit memory interface clocked at 19 Gbps, it has a bandwidth of 760.3 GB/s, with the majority of the card's specs kept the same.

It does offer an overclocked engine clock, with a boost of up to 1815 MHz, which its 8,704 CUDA cores can potentially run at. It supports four displays across five ports: two HDMI 2.1 and three DisplayPort 1.4a. It uses just two 8-pin power connectors, with a recommended power supply of 850W.

The cooling solution is based on the ASUS TUF model, increasing the heatsink's width to 120.50 mm, height to 42.20 mm, and depth to 94 mm, along with an enlarged copper base plate that also covers the memory chips for improved thermals. Two NF-A12x25 120mm fans then push optimized airflow through the heatsink to dispel the heat.

For pricing, ASUS has announced a retail price of $950 – much higher than the 3080's original $699 MSRP. However, current 3080 pricing is already well above MSRP, so it's not too far off, if you can find one in stock anywhere. At the time of writing, we could only find it on Newegg at a much higher inflated price of $1,349.99.




We compared the ASUS Noctua RTX 3080 against an MSI Gaming X Trio RTX 3080, looking in particular at the core temperatures the cards achieved, the frequency they ran at, and their fan speeds. Using benchmarks and stress testing, we see how well the cards perform under pressure.

Our test bench is made up of:

CPU: AMD Ryzen 5 5600x

Cooler: BeQuiet Dark Rock 4

RAM: 2x8GB Corsair Vengeance 3200MHz (XMP)

Motherboard: AORUS X570 Pro (re-sizeable bar enabled)


Case: Cooler Master MF 700

From the results, we see the same GPU clock utilization and power usage from both cards, captured from one minute before a 10-minute FurMark run to one minute after. Although not perfectly aligned, it shows the relative behavior of the two.

The two cards have the same overclock on them, so the frequency between the two stays the same: a steady rate of around 1500 MHz during the test, peaking near 2000 MHz upon finishing.

Then we see the fan RPMs. The Noctua responds much more aggressively, jumping to 1369 RPM before ramping down to an average of around 1100 RPM, whilst the MSI jumps to around 800-900 RPM and ramps up afterwards, peaking at 1366 RPM and averaging around that same speed.

We can also see the power draw of the two cards. Both peaked at 340 W, about 20 W higher than the TDP of the 3080, and the MSI card is capable of drawing more power thanks to the extra power connector it uses.

Lastly, the GPU temperatures show how well the Noctua fans keep things down. The Noctua's GPU peaks at 60.1°C and, with the fans kicking in, drops below 58°C. The MSI's temperature keeps climbing even once its fans start spinning, peaking at 70.7°C after four minutes before settling around 69°C – and it takes much longer to cool down after the test stops than the Noctua card.

With the lower RPM and bigger size, the fans run quietly. Although we do not have the means to truly compare sound levels, neither card was particularly noisy. The Noctua option causes less turbulent air, though you can hear some coil whine.

Either way, it keeps the GPU temperature much lower, with a better default fan curve that responds to the workload better than the MSI card's.


Overall, the ASUS Noctua RTX 3080 is a superbly built graphics card, giving you the iconic brown of full-size Noctua fans and an owl-like design, with excellent cooling to match. It's capable of great sustained performance and has headroom for pushing further. It's a great choice if you like the design and have the space to actually fit it in, although it does come at an inflated price that carries the premium of the card. So if you have the means, it could be the card for you – and we're likely to see more from this collaboration further down the line.

Meet Surface Laptop Studio, The RTX-Powered Notebook

The Surface Laptop Studio won't come cheaply, though. Microsoft has priced the notebook at $1,599.99 and above, with prices climbing to $3,099.99 for the most premium model. The Surface Laptop Studio is available for preorder beginning today, and will ship on Oct. 5.

Content creation improves via the new display

With the Surface Laptop Studio, Microsoft has simply swapped one iconic hardware design for another. The original Surface Book defied description. Though it was probably most often used as a traditional clamshell notebook, a detachable hinge allowed it to be used as a tablet. The drawback was that the tablet component lacked a kickstand, making the tablet a bit awkward to tote about and actually use, save for inking. The Surface Laptop, by contrast, is simply a traditional clamshell notebook PC, and as our Surface Laptop 4 review shows, a pretty good one—even if it faces some tough competition.

Microsoft’s new “dynamic woven hinge” on the Surface Laptop Studio solves that problem. In both its design as well as its name you can see how the Surface Laptop Studio has evolved from the design of the Surface Studio all-in-one. The Surface Laptop Studio operates in Laptop mode, but the 14.4-inch PixelSense display can pull forward into what Microsoft calls Stage mode, where the display is thrust forward, covering the keyboard.

The hinge on the Surface Laptop Studio can support the display, though it’s not the best angle for inking.

Here, the Surface Laptop Studio looks very reminiscent of the HP Elite Folio, though it appears that the metal hinge may be able to support the display somewhat as it pulls forward. When fully pulled forward and pressed down flat, the Surface Laptop Studio can also operate in Studio mode for inking. PCWorld was offered a bit of hands-on time with the Surface Laptop Studio prior to the launch, and we can report that the hinge seems extremely sturdy, with additional support in the "pull forward" and Studio modes.

The "hybrid" concept extends to other aspects of the Surface Laptop Studio's design, as well. For one, the Surface Laptop Studio is offered in just one size: a 14.4-inch display neatly combines the Surface Book's former 13.5-inch and 15-inch form factors into a single "PixelSense Flow" touchscreen display with 2,400×1,600 resolution and Dolby Vision HDR support. PixelSense Flow apparently refers to the new Dynamic Refresh Rate feature built into Windows 11. When enabled, the feature allows displays that support it to run at a higher refresh rate for smoother inking. In the case of the Surface Laptop Studio, the display can run at up to 120Hz.

Nevertheless, that’s a step down from the Surface Book 3, which offered either 3000×2000 (267 ppi) for the 13.5-inch display or 3240×2160 (260 ppi) for the 15-inch option. At 201 ppi, Microsoft’s Surface Laptop Studio appears to be trading pixel resolution for refresh rate—not unheard of, to be fair, in a world where 300Hz 1080p gaming laptops exist.
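The pixel-density comparison above is simple geometry: PPI is the diagonal pixel count divided by the diagonal size in inches. A minimal sketch of that arithmetic, using the panel specs quoted in this article:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Surface Laptop Studio: 2400x1600 on a nominal 14.4-inch panel
# (works out to just over 200; Microsoft quotes 201 PPI)
print(round(ppi(2400, 1600, 14.4), 1))

# Surface Book 3: 3000x2000 at 13.5 inches and 3240x2160 at 15 inches
print(round(ppi(3000, 2000, 13.5)))  # ~267
print(round(ppi(3240, 2160, 15.0)))  # ~260
```

The slight gap between the computed figure and Microsoft's quoted 201 PPI likely comes down to the exact panel diagonal versus the rounded 14.4-inch marketing number.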

This, too, may indicate that Microsoft is fixing one of the Surface Book 3’s flaws: battery life. The Surface Book 3 lasted about 12 hours on battery—not bad at all, but not on par with previous Surface Books. Microsoft is claiming that the Surface Laptop Studio will yield between 18 and 19 hours of battery life, an untested claim that, if true, would outperform the Surface Book 3 by a substantial margin.

A pair of Thunderbolt 4 ports are on the side of the Surface Laptop Studio.

But the Surface Laptop Studio also marks Microsoft’s conversion to Thunderbolt, with a pair of Thunderbolt 4/USB 4 ports that can be used for charging or for I/O purposes. Essentially, a Surface Laptop Studio owner will have a choice between using an existing Surface Dock or investing in the small but growing ecosystem of Thunderbolt 4 docks.

There's also a new Surface Slim Pen 2 hidden beneath the keyboard. The Slim Pen 2 doesn't come bundled with the Surface Laptop Studio, but a new magnetic cubby underneath the keyboard can be used to dock and charge the new pen. The Slim Pen 2 supports 4,096 levels of pressure and smooth inking, and adds a new haptic motor that simulates the feel of inking on different surfaces—which felt a bit gimmicky during our hands-on time.

There's another small twist to the physical design: the thickness. The Surface Laptop Studio measures 12.7 x 9.0 x 0.7 inches—slightly thinner than a Surface Book, but chunkier than the Surface Laptop 4, which measures 0.57 inches thick. That appears to be due to thermal venting around the bottom outside edges of the Surface Laptop Studio; that venting, plus any additional cooling needed for the RTX hardware, likely contributed to the extra thickness. The Surface Laptop Studio also weighs between 3.83 pounds and 4.00 pounds—lighter than the Surface Book 3, but heavier than the Surface Laptop 4.

Microsoft’s Surface Laptop Studio hides a pen cubby underneath the keyboard.

Windows Hello 2.0 boosts Teams calls, logins

If the pull-forward display improves inking for content creation, then Microsoft’s improvements in its camera and input system should improve your next Teams call—key changes considering that Teams chat will be front and center in Windows 11. For one, Microsoft included what executives are calling Windows Hello 2.0: the depth camera goes beyond the visible range to identify you, but the facial tracking is now smart enough to factor in beards, glasses, even surgical masks. Even better, the 1080p front-facing camera on the Surface Laptop Studio can now adjust for the lighting on your face, so that you’ll be in focus and well exposed.

Likewise, audio has improved. Microsoft drilled additional mics into the Gorilla Glass to more clearly pick up your voice, and the Surface Laptop Studio now includes quad Omnisonic speakers—two near you, two further back—plus Dolby Atmos sound for clear conferencing audio and music playback.

RTX hardware makes this a gaming PC

Over the past few years, internal upgrades have dominated the conversation around any new Surface. With the Surface Laptop Studio, both the internal and external improvements are worth talking about.

The Surface Laptop Studio is one of the first major PC platforms we've seen that uses Intel's "Tiger Lake" H35 chips, quad-core, 11th-gen CPUs that Intel launched in January. The Surface Laptop Studio will ship in two configurations: a Core i5-11300H, and a Core i7-11370H. The H35 was designed for ultraportable gaming, with a target of over 70fps at 1080p resolution, generally at "high" graphical settings. That's in line with the other major addition to the Surface Laptop Studio: a discrete GeForce RTX GPU from Nvidia.

Only the Core i7 model will ship with the GeForce RTX 3050 Ti. When Nvidia launched these GPUs this past May, the company noted that the RTX 3050 Ti represents a marked step down in traditional gaming performance, with roughly a third of the CUDA cores and half the tensor cores of the GeForce RTX 3080. The Dynamic Refresh Rate feature, too, will be used to enable smooth frame rates at up to 120Hz.

Don’t forget about Windows 11

We’ve included the Surface Laptop Studio’s basic features below, followed by the prices. Our earlier Surface Book 3 review includes those features for comparison.

Surface Laptop Studio features:

Display: 14.4-inch PixelSense Flow touch display (2400×1600 (201 PPI), 120Hz)

Processor: Intel Core i5-11300H or Core i7-11370H (both H35-series)

Graphics: Intel Iris Xe (Core i5-11300H) or Nvidia RTX 3050 Ti with 4GB GDDR6; Commercial: Nvidia RTX A2000 with 4GB GDDR6

Memory: 16GB/32GB LPDDR4X RAM

Storage: 256GB, 512GB, 1TB, or 2TB SSD (all removable)

Ports: 2 USB-C (Thunderbolt 4/USB 4.0), 1 Surface Connect, 3.5mm headphone jack

Camera: User-facing: 1080p

Battery: Core i5 (up to 19 hours), Core i7 (up to 18 hours)

Wireless: 802.11ax (Wi-Fi 6); Bluetooth 5.1

Operating system: Windows 11 Pro/Home, or Windows 10 Pro

Dimensions (inches): 12.7 x 9.0 x 0.7in

Weight: 3.83lb (Core i5), 4.00 lb (Core i7)

Color: Platinum

Price: Starting at $1,599

Surface Laptop Studio configurations, prices

Core i5/16GB memory/256GB SSD/integrated GPU: $1,599.99

Core i5/16GB/512GB/iGPU: $1,799.99

Core i7/16GB/512GB/RTX 3050 Ti: $2,099.99

Core i7/32GB/1TB/RTX 3050 Ti: $2,699.99

Core i7/32GB/2TB/RTX 3050 Ti: $3,099.99

Editor’s note: This article originally published on Sept. 22, but was updated on Sept. 23 to include a video of our hands-on impressions.
