I'm trying to find out whether it's possible to run a Windows Server with a single GPU that is shared between all RDP clients, so that people could:
- create a session on the server
- start some program with a UI which needs GPU acceleration
- disconnect afterwards while the program keeps running and still gets full acceleration (see the sketch after this list)
- later reconnect to the session
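To be clear about what I mean by the disconnect step, this is roughly how I'd check it (just a sketch; it assumes an NVIDIA GPU with nvidia-smi on the PATH, and myapp.exe is only a placeholder for the actual program):

```python
# check_gpu_after_disconnect.py -- run on the server (e.g. from another session
# or a scheduled task) after disconnecting from the RDP session.
# Assumes an NVIDIA GPU with nvidia-smi on the PATH; "myapp.exe" is a
# placeholder for the GPU-accelerated program in question.
import subprocess
import sys

TARGET = "myapp.exe"  # hypothetical name of the accelerated program

def gpu_process_table() -> str:
    """Return the plain nvidia-smi output, which includes the per-process table."""
    return subprocess.run(
        ["nvidia-smi"], capture_output=True, text=True, check=True
    ).stdout

if __name__ == "__main__":
    table = gpu_process_table()
    print(table)
    if TARGET.lower() in table.lower():
        print(f"{TARGET} still holds a context on the GPU.")
        sys.exit(0)
    print(f"{TARGET} no longer appears in the GPU process list.")
    sys.exit(1)
```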
Maybe that's an unusual use case, because most of what I can find about Windows Server and GPUs seems to be about virtualization, e.g. here, where it's even mentioned that
> if your workload runs directly on physical Windows Server hosts, then you have no need for graphics virtualization; your apps and services already have access to the GPU capabilities and APIs natively supported in Windows Server
which might indicate that it is possible.
I've read about RemoteFX and GPU partitioning, e.g. here, but again this looks like it only applies to virtualization, and I don't care how fast RDP updates the remote screens as long as the running programs get full acceleration.
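As a sanity check on the "full acceleration" part, this is the kind of thing I'd run inside the session to see which adapter DirectX actually exposes there (only a rough sketch that shells out to dxdiag; if apps only see something like the Microsoft Basic Render Driver instead of the physical GPU, they aren't getting hardware acceleration no matter how the screen updates are encoded):

```python
# dxdiag_adapter_check.py -- run inside the RDP session to see which display
# adapter DirectX reports for that session.
import subprocess
import tempfile
import time
from pathlib import Path

def dxdiag_report() -> str:
    """Run dxdiag with its documented /t switch and return the text report."""
    out = Path(tempfile.gettempdir()) / "dxdiag_report.txt"
    subprocess.run(["dxdiag", "/t", str(out)], check=True)
    # The report should exist once dxdiag exits; wait briefly just in case.
    for _ in range(30):
        if out.exists() and out.stat().st_size > 0:
            break
        time.sleep(1)
    return out.read_text(encoding="utf-8", errors="ignore")

if __name__ == "__main__":
    for line in dxdiag_report().splitlines():
        stripped = line.strip()
        # The "Display Devices" section lists the adapter name and driver model.
        if stripped.startswith(("Card name:", "Driver Model:", "Display Memory:")):
            print(stripped)
```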
Am I searching for the wrong things? Is this even possible?
If it's possible, how would it impact performance when the session is connected and when it's disconnected?
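In case it matters, this is how I'd try to measure that difference myself: just a sketch that assumes a single NVIDIA GPU (nvidia-smi) and uses the built-in query session command; someuser is a placeholder for the account owning the session:

```python
# log_gpu_vs_session.py -- poll GPU load together with the RDP session state,
# so that connected and disconnected phases can be compared afterwards.
# Assumes a single NVIDIA GPU (nvidia-smi on the PATH); "someuser" is a
# placeholder for the user owning the session.
import csv
import subprocess
import time
from datetime import datetime

SESSION_USER = "someuser"
INTERVAL_SECONDS = 5

def gpu_load() -> tuple[str, str]:
    """GPU utilisation (%) and used memory (MiB), assuming one GPU."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu,memory.used",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    util, mem = [part.strip() for part in out.split(",")]
    return util, mem

def session_state() -> str:
    """Session state (Active, Disc, ...) for SESSION_USER from 'query session'."""
    out = subprocess.run(["query", "session"], capture_output=True, text=True).stdout
    for line in out.splitlines():
        if SESSION_USER.lower() in line.lower():
            tokens = line.split()
            # The state column follows the numeric session ID.
            for i, token in enumerate(tokens):
                if token.isdigit() and i + 1 < len(tokens):
                    return tokens[i + 1]
    return "not found"

if __name__ == "__main__":
    with open("gpu_vs_session.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "gpu_util_pct", "mem_used_mib", "state"])
        while True:
            util, mem = gpu_load()
            writer.writerow([datetime.now().isoformat(), util, mem, session_state()])
            f.flush()
            time.sleep(INTERVAL_SECONDS)
```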