Score:2

GPU acceleration over RDP on a Windows Server without virtualization


I'm trying to find out if it's possible to run a Windows Server with one GPU that is shared between all RDP clients, so that people could

  • create a session on the server
  • start some program with a UI which needs GPU acceleration
  • disconnect afterwards while the program stays running and gets full acceleration
  • later reconnect to the session

Maybe that's an unusual use case, because most of what I can find about Windows Server and GPUs is about virtualization, e.g. here, where it's even mentioned that

if your workload runs directly on physical Windows Server hosts, then you have no need for graphics virtualization; your apps and services already have access to the GPU capabilities and APIs natively supported in Windows Server

which might indicate that it is possible.

I've read about RemoteFX and GPU partitioning, e.g. here, but again it looks like this only applies to virtualization, and I don't care how fast RDP updates the remote screens as long as the running programs get full acceleration.

Am I searching for the wrong things? Is this even possible?

If it's possible, how would it impact performance when the session is connected and when it's disconnected?

Bernd Schwanenmeister: Please add details about the program's needs. "Some program with a UI which needs GPU acceleration" is too vague. What are the exact requirements? And was it ever tested on an out-of-the-box "vanilla" server installation with onboard graphics via RDP? Most things just run.
ridilculous: It's a WinUI application that uses some NVIDIA-specific extensions, e.g. one for external texture storage. I have never been able to test it on a Server over RDP, but it works fine on a Workstation over RDP.
ridilculous: @BerndSchwanenmeister I also plan to record the app window server-side, hoping I get the full FPS there without being throttled by RDP.
Score:2

Since it's a physical server, you need to instruct the server to use its own GPU for the RDP clients that connect to it.

The setting is here:

Local Computer Policy\Computer Configuration\Administrative Templates\Windows Components\Remote Desktop Services\Remote Desktop Session Host\Remote Session Environment

Then enable "Use hardware graphics adapters for all Remote Desktop Services sessions".
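If you prefer to script this instead of clicking through gpedit.msc, the policy is backed by a registry value under the Terminal Services policy key. Below is a minimal Python sketch (run elevated); the value name bEnumerateHWBeforeSW is what this policy is commonly documented to write, but verify it on your own build before relying on it:

import winreg

# Policy key that backs the Remote Desktop Services GPO settings
KEY_PATH = r"SOFTWARE\Policies\Microsoft\Windows NT\Terminal Services"

# Create/open the key with write access (requires an elevated process)
with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
    # 1 = policy enabled; the value name is an assumption, check your server
    winreg.SetValueEx(key, "bEnumerateHWBeforeSW", 0, winreg.REG_DWORD, 1)

Run gpupdate /force (or reboot) afterwards so the session host picks the policy up.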

I attached a screenshot; sorry, my OS is in French, but that's the location.

Please note that the clients that connect must also be running at least Windows 10.

If your application is not compute-intensive on the GPU, the limit you will hit first is GPU memory. You would have to calculate how many users can run the application before the video RAM is depleted.
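To put a rough number on that, here is a back-of-the-envelope sketch. It assumes a single NVIDIA GPU with nvidia-smi on the PATH; the 500 MiB per-session footprint is a made-up placeholder you should replace with a value measured for your application:

import subprocess

# Query total and currently used VRAM in MiB (single-GPU assumption)
out = subprocess.check_output(
    ["nvidia-smi", "--query-gpu=memory.total,memory.used",
     "--format=csv,noheader,nounits"], text=True)
total_mib, used_mib = (int(x) for x in out.strip().split(", "))

PER_SESSION_MIB = 500  # placeholder: measure your app's real footprint

sessions = (total_mib - used_mib) // PER_SESSION_MIB
print(f"Roughly {sessions} concurrent sessions before VRAM is depleted")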

[Screenshot: the policy location in the Group Policy editor, French locale]

ridilculous: Do you know whether the fact that this runs in a session and (at least while the session is connected) has to stream the screen contents to the client significantly affects performance, compared to running the same app on a Windows desktop with the same GPU?
yagmoth555: @ridilculous It does have an effect, but you will have to test to see whether it's a viable plan for your application. Multiple factors can cause low FPS or lag, e.g. if the worker is remote and the internet link is bad.
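One way to test that concretely: leave the app running, disconnect the session, and sample GPU utilization from a second session or a scheduled task, then compare the connected and disconnected numbers. A minimal sketch, again assuming an NVIDIA GPU with nvidia-smi on the PATH:

import subprocess, time

# Sample GPU utilization every 5 seconds for 5 minutes; run it once while
# the RDP session is connected and once after disconnecting, then compare.
for _ in range(60):
    util = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"], text=True).strip()
    print(f"{time.strftime('%H:%M:%S')}  GPU utilization: {util}%")
    time.sleep(5)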
Bernd Schwanenmeister: You should just try it. If it works on a workstation over RDP, it will most probably behave the same on a server.