
NVIDIA Rolls Out Project G-Assist System Assistant, GeForce Game Ready Driver, App Update And DLSS 4 with Multi Frame Generation Support To 9 Titles

by Rainier on March 25, 2025 @ 3:00 a.m. PDT

NVIDIA is releasing an experimental version of the Project G-Assist System Assistant feature for GeForce RTX desktop users, an update for the NVIDIA app, new GeForce Game Ready Driver and more titles adding DLSS 4 support.

Today, NVIDIA is releasing an experimental version of the Project G-Assist System Assistant feature for GeForce RTX desktop users via the NVIDIA app. Project G-Assist runs locally on GeForce RTX AI PCs and is built to help users control a broad range of PC settings, from optimizing game and system settings and charting frame rates and other key performance statistics to controlling select peripheral settings such as lighting — all via basic voice or text commands. G-Assist can provide real-time diagnostics and recommendations to alleviate system bottlenecks, improve power efficiency, optimize game settings, overclock your GPU, and much more.

AI developers and enthusiasts can also extend the capabilities of G-Assist by building custom plugins. To get started, NVIDIA has published a GitHub repository with samples and instructions for creating plugins that control PC peripherals, connect to third-party app APIs, and more. G-Assist is built with NVIDIA ACE, the same AI tech suite game developers use to breathe life into NPCs. You can download Project G-Assist through the NVIDIA app's Home tab in the Discovery section. More information on Project G-Assist can be found in this article. And to expand the possibilities of local generative AI workflows, NVIDIA NIM microservices are now available on RTX AI PCs and workstations, with AI Blueprints coming soon. To learn more about Project G-Assist's System Assistant plugin builder, NIM and AI Blueprints, check out this week's RTX AI Garage blog post.

In addition to Project G-Assist availability, the NVIDIA app has several other updates launching this week, including expanded DLSS override functionality that lets players fine-tune image quality or boost performance. The new NVIDIA app update adds a new option to the “DLSS Override - Super Resolution” setting for users who have GeForce Game Ready Driver or NVIDIA Studio Driver 572.83 WHQL or newer installed. Previously, the override could activate DLAA or DLSS Super Resolution Ultra Performance mode in games and apps lacking native support. Now, you can enable any DLSS preset, or alternatively customize the resolution scaling between 33% and 100%. Display Scaling and Display Color settings have also been moved over from the NVIDIA Control Panel, where they have been modernized and improved, taking another step toward unifying all NVIDIA GPU features in one responsive application. You can read more about the updates here.

NVIDIA is also rolling out support for DLSS 4 with Multi Frame Generation in new games including Enlisted, KARMA: The Dark World, The First Berserker: Khazan, and Warhammer 40,000: Darktide. And DLSS 4 is coming to Remedy Entertainment’s FBC: Firebreak at launch.

Here’s a roundup of the new games and updates that are integrating DLSS. You can read the full run-down on each game in this DLSS article: 

  • Enlisted: In this WW2 MMO shooter game, each player commands a squad of AI-controlled soldiers in battle, taking direct control as needed, giving battles massive scale and period-accurate intensity. Enlisted already includes support for DLSS Frame Generation, DLSS Super Resolution, and NVIDIA Reflex. With this latest update, Enlisted’s visual effects are further enhanced with DLSS 4 with Multi Frame Generation, magnifying performance by an average of 4.3X at 4K max settings for GeForce RTX 50 Series gamers. And via NVIDIA app, GeForce RTX gamers can upgrade DLSS Super Resolution to the latest transformer AI model, further enhancing image quality. 
  • FBC: Firebreak: This award-winning game from Remedy Entertainment is a three-player cooperative first-person shooter. Set in the Federal Bureau of Control, familiar to fans of the critically acclaimed Control, players and their team are on call to confront everything from reality-warping Corrupted Items to otherworldly monsters. DLSS 4 with Multi Frame Generation, the entire suite of DLSS 4 technologies, and NVIDIA Reflex will be available at launch in FBC: Firebreak. Additionally, all of the jaw-dropping Full Ray Tracing technologies seen in Alan Wake 2 will return in FBC: Firebreak, giving gamers the opportunity to greatly enhance image quality.
  • Half-Life 2 RTX Demo: Following last year's beta release, NVIDIA has now officially released RTX Remix. To demonstrate its capabilities, Half-Life 2 owners can download a free Half-Life 2 RTX demo from Steam, showcasing Orbifold Studios’ work in Ravenholm and Nova Prospekt ahead of the full game’s release at a later date. This community-made remaster of Valve’s classic is the work of Orbifold Studios, whose mission is to remaster the entirety of Half-Life 2, updating every texture, model, and level, and adding extra geometric detail to buildings and surfaces so they interact realistically with fully ray-traced lighting. The demo features DLSS 4 with Multi Frame Generation, DLSS Super Resolution, and DLSS Ray Reconstruction, as well as Neural Radiance Cache and RTX Volumetrics.
  • KARMA: The Dark World: Set in a dystopian 1984 East Germany, this first-person cinematic psychological thriller explores an alternate timeline where the Leviathan Corporation rules with an iron fist, controlling its citizens through mass surveillance, social class rules, mind-altering drugs and the promise that the gates to Utopia will open to those who serve. The game launches on Thursday, March 27 with support for DLSS Frame Generation, DLSS Super Resolution, and NVIDIA Reflex. GeForce RTX gamers can upgrade DLSS Super Resolution using the NVIDIA app’s new DLSS 4 overrides, and GeForce RTX 50 Series gamers can activate DLSS 4 with Multi Frame Generation. Head here to learn how.
  • The First Berserker: Khazan: From the extensive universe of Dungeon & Fighter comes a fast-paced adventure that follows the Great General Khazan from being falsely branded a traitor, through his escape, to his ultimate revenge on those who wronged him. The game is widely available on Thursday, March 27; Advanced Access Deluxe Edition buyers can access it now by downloading a two-level demo from Steam. All GeForce RTX gamers can activate DLSS Super Resolution, accelerating frame rates, as well as NVIDIA Reflex to reduce PC latency and make battles even more responsive. Alternatively, activate DLAA if you have performance to spare, maximizing image quality. GeForce RTX gamers can also upgrade DLSS Super Resolution using the NVIDIA app’s new DLSS 4 overrides, and GeForce RTX 50 Series gamers can activate DLSS 4 with Multi Frame Generation to multiply performance by 3.5X on average at 4K max settings. Head here to learn how.
  • Warhammer 40,000: Darktide: With the Nightmares and Visions update launching this week, players can return to the depths of Tertium, an unforgiving hive city of the 41st millennium, to root out heretics attempting to gain control of the city. This latest update introduces new content and upgrades players to DLSS 4, adding DLSS Multi Frame Generation for GeForce RTX 50 Series gamers, and updating DLSS Frame Generation and DLSS Super Resolution to the latest AI models.
  • Control: This visually stunning third-person action-adventure blends open-ended environments with the signature world-building and storytelling of renowned developer Remedy Entertainment. Now, with a recently released update, players can further enhance Control by upgrading DLSS Super Resolution to DLSS 4’s transformer AI model via the NVIDIA app, improving detail, clarity, image stability, film grain, shadows and textures. With the latest Ultra Ray Tracing preset, players will see improved image quality and temporal stability, texture quality enhancements, faster texture streaming, and support for a wider range of ultrawide monitor aspect ratios.
  • Kingdom Come: Deliverance II: Warhorse Studios’ Kingdom Come: Deliverance II has been a smash hit, selling millions of copies and winning plaudits aplenty for its engrossing, realistic role-playing and gripping story in a massive medieval open world. A recently released update gives all GeForce RTX gamers the ability to use NVIDIA’s new, enhanced DLSS Super Resolution transformer AI model, further refining image quality. To use it, head into Kingdom Come: Deliverance II’s Graphics Settings screen, scroll down to “Resolution scaling,” select “DLSS 4” as your “Technology,” select “Transformer” as your “DLSS Preset,” then hit “Confirm.” These steps activate DLSS Super Resolution’s Transformer Preset J; via the NVIDIA app’s overrides you can enable the even newer Preset K, which further enhances image quality. Additionally, the overrides allow you to use NVIDIA DLAA in Kingdom Come: Deliverance II.
  • RF Online Next: Netmarble N2’s RF Online Next is a new massively multiplayer online role-playing game based on the highly popular RF Online, which has been online for over 20 years, and played by over 20 million users from 54 countries. RF Online Next launched late last week in Korea with day-one support for DLSS Super Resolution. A global release will follow at a later date.

Generative AI is unlocking new capabilities for PCs and workstations, including game assistants, enhanced content-creation and productivity tools and more.

NVIDIA NIM microservices, available now, and AI Blueprints, coming soon, accelerate AI development and improve its accessibility. Announced at the CES trade show in January, NVIDIA NIM provides prepackaged, state-of-the-art AI models optimized for the NVIDIA RTX platform, including the NVIDIA GeForce RTX 50 Series and, now, the new NVIDIA Blackwell RTX PRO GPUs. The microservices are easy to download and run. They span the top modalities for PC development and are compatible with top ecosystem applications and tools.

The experimental System Assistant feature of Project G-Assist was also released today. Project G-Assist showcases how AI assistants can enhance apps and games. The System Assistant allows users to run real-time diagnostics, get recommendations on performance optimizations, or control system software and peripherals — all via simple voice or text commands. Developers and enthusiasts can extend its capabilities with a simple plug-in architecture and new plug-in builder.

Amid a pivotal moment in computing — where groundbreaking AI models and a global developer community are driving an explosion in AI-powered tools and workflows — NIM microservices, AI Blueprints and G-Assist are helping bring key innovations to PCs. This RTX AI Garage blog series will continue to deliver updates, insights and resources to help developers and enthusiasts build the next wave of AI on RTX AI PCs and workstations.

Ready, Set, NIM!

Though the pace of innovation with AI is incredible, it can still be difficult for the PC developer community to get started with the technology.

Bringing AI models from research to the PC requires curation of model variants, adaptation to manage all of the input and output data, and quantization to optimize resource usage. In addition, models must be converted to work with optimized inference backend software and connected to new AI application programming interfaces (APIs). This takes substantial effort, which can slow AI adoption.

NVIDIA NIM microservices help solve this issue by providing prepackaged, optimized, easily downloadable AI models that connect to industry-standard APIs. They’re optimized for performance on RTX AI PCs and workstations, and include the top AI models from the community, as well as models developed by NVIDIA.

NIM microservices support a range of AI applications, including large language models (LLMs), vision language models, image generation, speech processing, retrieval-augmented generation (RAG)-based search, PDF extraction and computer vision. Ten NIM microservices for RTX are available, supporting a range of applications, including language and image generation, computer vision, speech AI and more. Get started with these NIM microservices today:

  • Language and Reasoning: Deepseek-R1-distill-llama-8B, Mistral-nemo-12B-instruct, Llama3.1-8B-instruct
  • Image Generation: Flux.dev
  • Audio: Riva Parakeet-ctc-0.6B-asr, Maxine Studio Voice
  • RAG: Llama-3.2-NV-EmbedQA-1B-v2
  • Computer Vision and Understanding: NV-CLIP, PaddleOCR, Yolo-X-v1
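
Because the microservices connect to industry-standard APIs, existing client code can usually be pointed at a locally running NIM with little more than a base-URL change. Here is a minimal sketch assuming an OpenAI-compatible chat endpoint served locally; the port, endpoint path and model identifier are assumptions for illustration, not values confirmed by this article:

    # Minimal sketch: querying a locally running NIM LLM microservice through
    # an OpenAI-compatible API. The base_url, port and model name are assumed;
    # check the microservice's documentation for your deployment's values.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:8000/v1",  # assumed local NIM endpoint
        api_key="not-needed-for-local",       # local microservices typically ignore the key
    )

    response = client.chat.completions.create(
        model="meta/llama-3.1-8b-instruct",   # assumed identifier for the Llama3.1-8B-instruct NIM
        messages=[{"role": "user", "content": "Summarize what DLSS 4 Multi Frame Generation does."}],
        max_tokens=200,
    )

    print(response.choices[0].message.content)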

NIM microservices are also available through top AI ecosystem tools and frameworks.

For AI enthusiasts, AnythingLLM and ChatRTX now support NIM, making it easy to chat with LLMs and AI agents through a simple, user-friendly interface. With these tools, users can create personalized AI assistants and integrate their own documents and data, helping automate tasks and enhance productivity.

For developers looking to build, test and integrate AI into their applications, FlowiseAI and Langflow now support NIM and offer low- and no-code solutions with visual interfaces to design AI workflows with minimal coding expertise. Support for ComfyUI is coming soon. With these tools, developers can easily create complex AI applications like chatbots, image generators and data analysis systems.

In addition, Microsoft VS Code AI Toolkit, CrewAI and Langchain now support NIM and provide advanced capabilities for integrating the microservices into application code, helping ensure seamless integration and optimization.
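
As one example of that integration, LangChain's NVIDIA connector can target either NVIDIA-hosted endpoints or a local NIM. A minimal sketch, assuming the langchain-nvidia-ai-endpoints package is installed and a NIM microservice is already serving locally (the port and model name are assumptions):

    # Minimal LangChain sketch pointing ChatNVIDIA at a locally running NIM.
    # The base_url, port and model identifier are assumptions for illustration.
    from langchain_nvidia_ai_endpoints import ChatNVIDIA

    llm = ChatNVIDIA(
        base_url="http://localhost:8000/v1",  # assumed local NIM endpoint
        model="mistral-nemo-12b-instruct",    # assumed identifier for the Mistral-nemo-12B NIM
    )

    print(llm.invoke("Draft a short changelog entry for a DLSS 4 update.").content)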

Visit the NVIDIA technical blog and build.nvidia.com to get started.

NVIDIA AI Blueprints Will Offer Pre-Built Workflows

NVIDIA AI Blueprints give AI developers a head start in building generative AI workflows with NVIDIA NIM microservices.

Blueprints are ready-to-use, extensible reference samples that bundle everything needed — source code, sample data, documentation and a demo app — to create and customize advanced AI workflows that run locally. Developers can modify and extend AI Blueprints to tweak their behavior, use different models or implement completely new functionality.

The PDF to podcast AI Blueprint will transform documents into audio content so users can learn on the go. By extracting text, images and tables from a PDF, the workflow uses AI to generate an informative podcast. For deeper dives into topics, users can then have an interactive discussion with the AI-powered podcast hosts.
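
The blueprint itself is still to come, but the general shape of such a workflow can be sketched with off-the-shelf pieces: extract the document text, then ask a locally hosted model to turn it into a two-host script. The endpoint and model name below are assumptions and the text-to-speech stage is omitted; this is not the blueprint's actual implementation:

    # Rough sketch of a PDF-to-podcast-style workflow, not the actual AI Blueprint:
    # extract text from a PDF, then ask a locally hosted model for a podcast script.
    from pypdf import PdfReader
    from openai import OpenAI

    text = "\n".join(page.extract_text() or "" for page in PdfReader("paper.pdf").pages)

    client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed-for-local")  # assumed local NIM
    script = client.chat.completions.create(
        model="meta/llama-3.1-8b-instruct",  # assumed model identifier
        messages=[{
            "role": "user",
            "content": "Turn the following document into a short two-host podcast script:\n\n" + text[:8000],
        }],
    )
    print(script.choices[0].message.content)
    # A text-to-speech stage would follow to produce the actual audio.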

The AI Blueprint for 3D-guided generative AI will give artists finer control over image generation. While AI can generate amazing images from simple text prompts, controlling image composition using only words can be challenging. With this blueprint, creators can use simple 3D objects laid out in a 3D renderer like Blender to guide AI image generation. The artist can create 3D assets by hand or generate them using AI, place them in the scene and set the 3D viewport camera. Then, a prepackaged workflow powered by the FLUX NIM microservice will use the current composition to generate high-quality images that match the 3D scene.

NVIDIA NIM on RTX With Windows Subsystem for Linux

One of the key technologies that enables NIM microservices to run on PCs is Windows Subsystem for Linux (WSL).

Microsoft and NVIDIA collaborated to bring CUDA and RTX acceleration to WSL, making it possible to run optimized, containerized microservices on Windows. This allows the same NIM microservice to run anywhere, from PCs and workstations to the data center and cloud.
Get started with NVIDIA NIM on RTX AI PCs at build.nvidia.com.

Project G-Assist Expands PC AI Features With Custom Plug-Ins

As part of Project G-Assist, an experimental version of the System Assistant feature for GeForce RTX desktop users is now available via the NVIDIA App, with laptop support coming soon.

G-Assist helps users control a broad range of PC settings — including optimizing game and system settings, charting frame rates and other key performance statistics, and controlling select peripheral settings such as lighting — all via basic voice or text commands.

G-Assist is built on NVIDIA ACE — the same AI technology suite game developers use to breathe life into non-player characters. Unlike AI tools that use massive cloud-hosted AI models that require online access and paid subscriptions, G-Assist runs locally on a GeForce RTX GPU. This means it’s responsive, free and can run without an internet connection. Manufacturers and software providers are already using ACE to create custom AI Assistants like G-Assist, including MSI’s AI Robot engine, the Streamlabs Intelligent AI Assistant and upcoming capabilities in HP’s Omen Gaming hub.

G-Assist was built for community-driven expansion. Get started with this NVIDIA GitHub repository, which includes samples and instructions for creating plug-ins that add new functionality. Developers can define functions in simple JSON format and drop configuration files into a designated directory, allowing G-Assist to automatically load and interpret them. Developers can even submit plug-ins to NVIDIA for review and potential inclusion.
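
The exact manifest schema lives in the GitHub repository; purely as a hypothetical illustration of the "functions defined in JSON, dropped into a directory" idea, a plug-in description might be generated like this (none of the field names or the directory path below are taken from NVIDIA's actual spec):

    # Hypothetical illustration only -- field names and the plugin directory are
    # invented, not NVIDIA's actual G-Assist manifest schema. See the GitHub
    # repository for the real format.
    import json, pathlib

    manifest = {
        "name": "room_lights",
        "description": "Set the color of the desk lighting strip.",
        "functions": [{
            "name": "set_light_color",
            "description": "Change the light color.",
            "parameters": {"color": {"type": "string", "description": "e.g. 'red' or '#00ff88'"}},
        }],
    }

    plugin_dir = pathlib.Path("plugins/room_lights")  # invented path for the example
    plugin_dir.mkdir(parents=True, exist_ok=True)
    (plugin_dir / "manifest.json").write_text(json.dumps(manifest, indent=2))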

Currently available sample plug-ins include Spotify, to enable hands-free music and volume control, and Google Gemini — allowing G-Assist to invoke a much larger cloud-based AI for more complex conversations, brainstorming sessions and web searches using a free Google AI Studio API key.
In the clip below, you’ll see G-Assist ask Gemini about which Legend to pick in Apex Legends when solo queueing, and whether it’s wise to jump into Nightmare mode at level 25 in Diablo IV:

For even more customization, follow the instructions in the GitHub repository to generate G-Assist plug-ins using a ChatGPT-based “Plug-in Builder.” With this tool, users can write and export code, then integrate it into G-Assist — enabling quick, AI-assisted functionality that responds to text and voice commands.

Watch how a developer used the Plug-in Builder to create a Twitch plug-in for G-Assist to check if a streamer is live:

More details on how to build, share and load plug-ins are available in the NVIDIA GitHub repository.

Check out the G-Assist article for system requirements and additional information.

Build, Create, Innovate

NVIDIA NIM microservices for RTX are available at build.nvidia.com, providing developers and AI enthusiasts with powerful, ready-to-use tools for building AI applications.

Download Project G-Assist through the NVIDIA App’s “Home” tab, in the “Discovery” section. G-Assist currently supports GeForce RTX desktop GPUs, as well as a variety of voice and text commands in the English language. Future updates will add support for GeForce RTX Laptop GPUs, new and enhanced G-Assist capabilities, as well as support for additional languages. Press “Alt+G” after installation to activate G-Assist.

Each week, RTX AI Garage features community-driven AI innovations and content for those looking to learn more about NIM microservices and AI Blueprints, as well as building AI agents, creative workflows, digital humans, productivity apps and more on AI PCs and workstations.
