Thursday, July 25, 2019

Is G-Sync worth it? We dive in to see if gamers should invest

Screen tearing and stuttering in PC games just suck. We all know it, and VSync really isn't a great solution. Nvidia acknowledged the issue by introducing Adaptive VSync to help eradicate visual artifacts, but that was only a temporary fix. Nvidia's current answer is a hardware-based method called G-Sync that requires a specific display and a discrete GeForce graphics chip. Is G-Sync worth it? We dig in to find out.

The reasons why we need VSync, G-Sync, and similar technologies are already explained in a separate article, "What is VSync". If you haven't read the article, here's a quick recap to get you up to speed:

Problem #1: Tearing happens when the GPU outputs frames faster than the display can refresh.
Solution: VSync caps the framerate to the display's refresh rate.

Problem #2: Stuttering happens when the capped GPU can't keep pace with the display's refresh rate.
Solution: VSync caps the framerate again, this time to half the display's refresh rate.
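To make the timing concrete, here's a minimal sketch of how those two caps play out. It's an illustrative Python model, not real driver code, and the 60Hz refresh rate and framerate figures are our own example numbers.

    # Illustrative model of classic VSync behavior (example numbers only).
    refresh_hz = 60  # hypothetical display refresh rate

    def vsync_fps(gpu_fps):
        """Framerate you actually see with classic VSync enabled."""
        if gpu_fps >= refresh_hz:
            # Problem #1 solved: never output faster than the display refreshes.
            return refresh_hz
        # Problem #2 solved: drop to half the refresh rate to avoid stutter.
        return refresh_hz / 2

    print(vsync_fps(90))  # 60   -> tearing avoided
    print(vsync_fps(50))  # 30.0 -> stutter avoided, at the cost of input lag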

For a long time, VSync was the "answer" to our tearing and stuttering woes. Capping the framerate to the display's refresh rate eliminated tearing, and capping it to half the refresh rate eliminated stuttering. However, that latter mode introduces input lag because your actions aren't immediately reflected on screen. Our starting point is 2012, when Nvidia's Adaptive VSync entered the GeForce scene.

Adaptive VSync

Nvidia introduced Adaptive VSync in its 300 Series drivers. The aim was to provide the benefit of VSync without the annoying stutter. Like VSync, Nvidia's version locked your game's framerate to the display's refresh rate, eliminating the ugly tearing effect.

The difference between VSync and Adaptive VSync pertains to what happens when the GPU begins to struggle. As we previously described, VSync will drop the framerate down to half the display's refresh rate to eliminate stuttering. That means you're getting 30 frames per second (or 30Hz) on a 60Hz display.

Adaptive VSync didn't lock the framerate when the GPU began to struggle. Instead, it released the lock and let the framerate float until performance improved, then locked it to the refresh rate again.
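As a rough sketch (our simplification, not Nvidia's actual driver logic), Adaptive VSync amounts to toggling the VSync lock per frame based on whether the GPU is keeping up. The 60Hz figure is again an assumed example.

    # Simplified model of Adaptive VSync (illustrative only).
    refresh_hz = 60  # hypothetical display refresh rate

    def adaptive_vsync_fps(gpu_fps):
        """Return (vsync_locked, displayed_fps) for a given GPU framerate."""
        if gpu_fps >= refresh_hz:
            # GPU keeps up: lock to the refresh rate, so no tearing.
            return True, refresh_hz
        # GPU struggles: release the lock and let the framerate float.
        # Tearing can creep back in, but the jarring drop to 30fps is gone.
        return False, gpu_fps

    print(adaptive_vsync_fps(75))  # (True, 60)
    print(adaptive_vsync_fps(48))  # (False, 48) instead of VSync's 30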

While this solution helped maintain a tear-free visual experience and eliminated stutter, Nvidia wanted to provide gamers with a better, more immersive experience.

Related: Nvidia's new program helps creatives find the perfect laptop

Enter G-Sync

Nvidia introduced G-Sync in 2013. It's based on variable refresh rate technology and removes the need for VSync and Adaptive VSync altogether. This method relies on a proprietary module residing within the display that replaces the typical scaler board and chip. A scaler board handles image processing: decoding the input signal, scaling the image to the panel, controlling the backlight, and so on. That means Nvidia's G-Sync module has full control over the display's refresh rate.

Nvidia's first G-Sync module supported 60Hz displays and DisplayPort only. The second version added support for the 144Hz to 240Hz range along with HDMI 1.4, though G-Sync itself still required a DisplayPort connection. The third version added HDR support and upgraded the connections to HDMI 2.0 and DisplayPort 1.4.

On the PC side, you need a compatible discrete GeForce graphics chip. Drivers serve as a communication platform between the GPU and external G-Sync module so that both remain synchronized throughout gameplay. That's where the variable refresh rate comes into play.

The entire tearing and stuttering issue is all about timing: with VSync enabled, the GPU becomes a slave to the display. But with a variable refresh rate, the GPU and display are effectively on the same page. For example, when a complete frame arrives in the GPU's front buffer, the hardware scans to see whether the display is in the middle of vertical blanking (the period between screen refreshes). This scanning process is what prevents screen tearing.

What's unknown is how G-Sync handles dropped framerates when games stress the GPU. One theory is that G-Sync merely repeats frames in accordance with the panel's variable refresh rate capabilities as the GPU struggles. Meanwhile, if the GPU pushes out more frames than the display's maximum refresh rate, G-Sync mimics VSync and caps the framerate.
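Here's a toy model of that behavior, following the description above rather than anything Nvidia has confirmed: the panel refreshes whenever a frame is ready, repeats frames below its floor (the theory just mentioned), and caps output at its ceiling. The 30Hz to 144Hz window is an assumed example.

    # Toy model of G-Sync-style variable refresh pacing (assumptions only).
    vrr_min_hz, vrr_max_hz = 30, 144  # hypothetical panel limits

    def panel_refresh_hz(gpu_fps):
        """Refresh rate the panel would run at for a given GPU output."""
        if gpu_fps > vrr_max_hz:
            # Above the ceiling, G-Sync mimics VSync and caps the framerate.
            return vrr_max_hz
        if gpu_fps < vrr_min_hz:
            # Below the floor, repeat each frame until the panel is back in
            # range (the frame-repeating theory described above).
            multiplier = 2
            while gpu_fps * multiplier < vrr_min_hz:
                multiplier += 1
            return gpu_fps * multiplier
        # In range: the panel simply refreshes when a new frame arrives.
        return gpu_fps

    for fps in (200, 90, 20):
        print(fps, "->", panel_refresh_hz(fps))
    # 200 -> 144 (capped), 90 -> 90 (synced), 20 -> 40 (frames doubled)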

While all of this hardware talk sounds lovely, you'll need to meet these minimum requirements on the PC side:

Desktop:
  GPU: GeForce GTX 650 Ti BOOST
  Driver: R340.52 or higher
  Platform: Windows 7 / 8.0 / 8.1 / 10
  Protocol: DisplayPort 1.2

Notebook:
  GPU: GeForce GTX 965M
  Driver: R352.06 or higher
  Platform: Windows 7 / 8.0 / 8.1 / 10
  Protocol: DisplayPort 1.2

Now here's the kicker: the display. You can't hook up any display and expect G-Sync to work. Because the feature works on a hardware level on both sides of the cable, you'll need a display packing Nvidia's module, or a display that supports VESA's Adaptive Sync standard. We'll get to that latter option in a moment.

Related: Best MSI laptops to buy in 2019 – gaming, creation, and workstations

G-Sync displays aren't cheap

HP Omen X 35 Display

Since 2013, display manufacturers like Acer, Asus, Dell, HP, and more have produced desktop and laptop monitors with Nvidia's module. Resolutions and refresh rates are not a limiting factor, but support for 4K HDR didn't hit the G-Sync scene until 2018. Overall, G-Sync display prices aren't exactly cheap.

Nvidia lists all G-Sync enabled displays here. There are 61 in total that promise a "premium" experience, ranging from 23.8 inches to 38 inches. Only one display supports HDR, but not at 4K (UHD). We pulled a few samples from its G-Sync (vanilla) lineup, which we discuss below.

With the introduction of G-Sync HDR, Nvidia introduced the G-Sync Ultimate brand. Currently, you'll find a mere five models listed under this banner supporting HDR at 4K (UHD), high refresh rates, and brightness levels of over 1,000 nits. Note that the HP unit listed below is based on Nvidia's Big Format Gaming Display design.

Big difference in price, right? The "vanilla" G-Sync batch offering standard features ranges from $379 to $818; we simply pulled random selections from Nvidia's current list. Overall, these prices aren't bad, especially for the Dell and MSI units, but you'll likely pay more than you would for a typical desktop monitor due to Nvidia's proprietary technology and quality control. Nvidia says these panels go through more than 300 certified tests.

The second list, G-Sync Ultimate, is based on that third design we previously mentioned: G-Sync HDR. You get the latest technologies backed by high refresh rates, high resolutions, ultra-low latency, and multi-zone backlighting. If you want everything G-Sync has to offer right now, be prepared to pay four-digit prices. For that money, you could buy an entire gaming desktop.

Finally, all displays in both groups have a variable refresh rate range between 1Hz and their maximum: 60Hz to 250Hz. That's not the case with Nvidia's third display group.

Related: PCI Express 4.0: What it is and why it's important

Meet G-Sync Compatible

BenQ ZOWIE XL2740 display

Nvidia introduced this program in 2019. These monitors do not include G-Sync modules but do have a variable refresh rate. They're based on VESA's Adaptive-Sync, a standard added to DisplayPort 1.2a in 2014. AMD began supporting Adaptive-Sync in 2015 with a software-side solution called FreeSync in its Radeon drivers. FreeSync provides a communication line between Radeon GPUs and off-the-shelf scalers in Adaptive-Sync displays. We bring up AMD because many Adaptive-Sync displays carry AMD's FreeSync branding.

"The DisplayPort Adaptive-Sync specification was ported from the Embedded DisplayPort specification through a proposal to the VESA group by AMD. DisplayPort Adaptive-Sync is an ingredient feature of a DisplayPort link and an industry standard that enables technologies like Radeon FreeSync technology.," AMD states in its FAQ.

In addition to DisplayPort, Adaptive-Sync monitors began supporting variable refresh rates through HDMI connections in late 2017 with the release of HDMI 2.1. G-Sync, on the other hand, remains locked to DisplayPort connections. G-Sync only supports a static refresh rate over HDMI.

GeForce gamers now have a larger display selection.

Nvidia currently lists 33 displays under its G-Sync Compatible banner. Here are the variable refresh rate ranges:

  • 50Hz – 144Hz
  • 48Hz – 240Hz
  • 48Hz – 144Hz
  • 48Hz – 120Hz
  • 40Hz – 165Hz
  • 40Hz – 144Hz
  • 30Hz – 144Hz
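Those floors matter. Unlike the module-equipped displays above, which sync everything from 1Hz up to their maximum, a G-Sync Compatible panel only syncs inside its window. The quick sketch below, using the ranges listed, shows how the same framerate can be synced on one panel and fall outside the window on another; Nvidia doesn't spell out what the driver does below the floor, so we make no assumption about that here.

    # Check a framerate against the G-Sync Compatible windows listed above.
    windows = [(50, 144), (48, 240), (48, 144), (48, 120),
               (40, 165), (40, 144), (30, 144)]

    fps = 45  # hypothetical in-game framerate
    for low, high in windows:
        status = "synced" if low <= fps <= high else "outside the window"
        print(f"{low}Hz-{high}Hz: {status} at {fps} fps")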

We also pulled a few G-Sync Compatible examples from Nvidia's list.

The prices aren't bad, and in some cases they're slightly cheaper than vanilla G-Sync solutions. The key takeaway: if you already own one of these 33 displays and you're upgrading to a GeForce GPU, you don't necessarily need to buy a G-Sync display. As Nvidia states, you get a "baseline" experience. If you want something better, with a wider variable refresh rate range, then you'll need to make that G-Sync display investment.

Related: 120Hz adaptive displays: The future or just a gimmick?

So umm, is G-Sync worth it or not?

We've spent some time explaining all the G-Sync options, but is G-Sync worth it? That's a decision each gamer ultimately needs to make. If you're a lightweight gamer who only plays Roblox, then G-Sync would be an unnecessary, costly upgrade. We're not saying Roblox and similar platforms don't deserve the best hardware, but they're also not targeting photo-realistic experiences at high framerates. If anything, you should invest in a decent, low-cost GPU and a 60Hz display.

PC gamers with money to spend should take the G-Sync plunge. But even if you don't want to dump loads of cash into a combined GPU and display upgrade, you can grab a display like the MSI Oculux NXG251R along with MSI's GeForce GTX 1660 Ti Ventus XS 6G OC add-in graphics card for around $650. Together they should provide a decent G-Sync experience without the need for a bank loan.

Do esports gamers need G-Sync? They want performance, not fidelity: an extremely high framerate with virtually no input lag. The ideal solution is a game running at 1,920 x 1,080 on a 240Hz display, with the framerate pinned at or above the refresh ceiling, where variable refresh barely comes into play. Why invest in a feature you'll likely never use? Of course, if esports gamers want to play Rage 2 and Destiny 2 on the side with the highest fidelity possible, then an investment in G-Sync is recommended.

The bottom line is this: what do you want from your PC games? If you don't care about screen tearing and stuttering, then don't invest. If you simply want to compete in esports, G-Sync shouldn't be a priority. However, PC gamers wanting the most immersive experience possible should consider G-Sync. The investment may cost an arm and a leg, but you'll experience beautiful worlds the latest consoles can't match.

What about notebooks?

Acer Predator Triton 500

We didn't forget about notebook gamers, though our argument mostly centers on desktops. The fact is, the same rules apply here: if you're a hardcore PC gamer wanting high fidelity on the go, invest in a laptop with a powerful GeForce GPU and a high-refresh display with G-Sync. What's more, you can purchase a standalone G-Sync display if you want an experience that surpasses your built-in screen. Just make sure your laptop has a DisplayPort or Mini DisplayPort connector.

Of course, if you're purchasing a laptop for business first and gaming second, chances are you don't need G-Sync. But this writer is admittedly churning out this article on an Alienware laptop packing a GeForce GPU and a G-Sync display. There's nothing wrong with mixing business with pleasure if you're willing to spend the money.



from Android Authority https://ift.tt/2ycYDWc
