What Does 'FG' Or 'MFG' On An Nvidia Graphics Card Actually Mean?

When Nvidia unveiled its RTX 50-series graphics cards at CES in January this year, the company talked about some interesting numbers: 240 FPS (frames per second) in "Cyberpunk 2077." However, this figure came not from a flagship GPU costing more than $1,500, but from a mid-range graphics card priced around $800. That's because this mid-range graphics card was leveraging Nvidia's Frame Generation (FG) and Multi Frame Generation (MFG).

Nvidia used CES to showcase the performance benefits of its new DLSS (Deep Learning Super Sampling) technologies on GeForce RTX GPUs, especially the RTX 50 series. If you play "Call of Duty," "Fortnite," "Cyberpunk 2077," or other graphics-intensive games, you know how a single stutter can spoil the overall gameplay experience, while a split-second of input lag can get you eliminated. A smooth frame rate with minimal latency is a must in these games.

Nvidia graphics cards do promise big FPS numbers to make games visually smoother with the FG and MFG features. However, a section of the gaming community is skeptical, arguing that this visual smoothness comes at the cost of responsiveness, which is essential in competitive gaming.

All about Frame Generation and Multi Frame Generation

Frame Generation is a feature on newer Nvidia graphics cards that inserts an AI-generated frame between two real frames to make in-game motion look smoother. After analyzing two real frames, FG predicts what an intermediate frame should look like and generates it to fill the gap.

Due to these "fake frames," the frame counter may read 120 FPS even if the graphics card is natively rendering the game at 60 FPS. Multi Frame Generation, available on RTX 50-series graphics cards, goes further and inserts up to three AI-generated frames between two real ones, meaning a game running at 60 FPS can display up to 240 FPS with MFG.
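The arithmetic behind those counter readings is simple: each natively rendered frame is followed by one generated frame with FG, or up to three with MFG. A rough back-of-the-envelope sketch (the function name is just for illustration, not an Nvidia API):

```python
def displayed_fps(native_fps, generated_per_real_frame):
    # Each real frame is followed by N AI-generated frames,
    # so the on-screen rate is native * (1 + N).
    return native_fps * (1 + generated_per_real_frame)

print(displayed_fps(60, 1))  # FG:     60 native -> 120 displayed
print(displayed_fps(60, 3))  # MFG 4x: 60 native -> 240 displayed
```

This is why a mid-range card rendering 60 real frames per second can report the same headline number as a far more powerful GPU rendering natively.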

Nvidia highlights FG/MFG in its announcements and advertisements because these FPS numbers look impressive in comparison charts, especially for a mid-range graphics card such as the RTX 5070. Since that kind of performance previously required a flagship GPU, frame generation lets Nvidia show off the potential of its graphics cards to handle demanding games.

Why higher FPS is not the full story

While frame generation and multi-frame generation deliver a smooth visual experience for gamers, those impressive numbers don't translate to the responsiveness that actually matters in competitive gaming. Since "fake" frames don't carry user input the way real frames do, latency remains a key issue in fast-paced games. Even if the game looks smoother at 120 or 240 FPS with FG and MFG, respectively, it will still only react to your inputs at the "real" frame rate. This leads to input lag, which is why some gamers believe these performance metrics can be misleading.
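To see why the displayed number doesn't change responsiveness, compare frame times. A quick sketch (an illustrative calculation, assuming a 60 FPS native rate boosted to 240 FPS on screen):

```python
def frame_time_ms(fps):
    # Time between consecutive frames, in milliseconds.
    return 1000.0 / fps

# The screen updates every ~4.2 ms at 240 FPS with MFG enabled...
print(frame_time_ms(240))
# ...but only the real frames respond to input, so the game still
# reacts to your mouse and keyboard every ~16.7 ms at 60 FPS native.
print(frame_time_ms(60))
```

In other words, MFG quadruples how often the picture changes, but the window in which your input actually registers stays tied to the native render rate.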

Nvidia graphics cards do feature Reflex technology to minimize system latency when Frame Generation is enabled, but it's only effective to an extent and can't eliminate the delay inherent in the AI frame prediction process. FG and MFG can still be really useful for those who play graphics-heavy single-player games like "Cyberpunk 2077" and "Red Dead Redemption 2," where motion smoothness is worth it for an immersive experience and the input lag is not a dealbreaker. But in fast-paced, competitive games like "Call of Duty," where split-second reactions matter, input lag can cost you the match.

For this reason, some believe that technologies like FG and MFG are becoming a substitute for game optimization and better graphics hardware. With Nvidia, AMD, and other graphics card makers building frame generation into new graphics cards, the worry is that developers will lean on DLSS and Frame Generation to hit high frame rates instead of optimizing their games to run well natively.
