
Task spooler multi-slot "slot index" for GPU allocation


I am trying to use task spooler (tsp) to schedule CI/CD testing tasks for a CUDA project. The system contains multiple GPUs, and a single task uses most of the memory of one GPU, so I want to avoid giving a single GPU multiple tasks. tsp has a "MULTI-SLOT" mode, in which several active slots can pull from the queue simultaneously rather than just one. However, so far I see no way for the task itself to deduce which slot it has been assigned, which means I don't know which GPU to run the task on. I could check which GPUs are being utilized with something like nvidia-smi, but that could cause race conditions if multiple items are pulled from the queue within a short time.

Is there a way to deduce this slot number, or send it to the process somehow?

Marco:
What about one task spooler server per GPU?
Jan Heemstra:
@Marco that would have no race conditions, but it would mean I'd have to balance the input queues myself to get similar throughput. Since tsp is a per-user queue, it would also mean adding a user for every GPU in the system. In my case I don't think it would be worth the headache over just using a single GPU.
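The nvidia-smi race mentioned in the question can also be sidestepped without per-GPU queues by having each queued job claim a GPU atomically with a lock file. This is a hedged sketch, not a tsp feature: `run_on_free_gpu`, the lock paths, and the GPU count are all made-up names, and it assumes `flock(1)` from util-linux is available.

```shell
# Hypothetical wrapper: run the given command pinned to the first GPU
# whose lock file can be claimed atomically. The function name, lock
# paths, and GPU count are illustrative assumptions.
run_on_free_gpu() {
    for gpu in 0 1 2 3; do
        exec 9>"/tmp/gpu$gpu.lock"         # fd 9 holds this GPU's lock
        if flock -n 9; then                # non-blocking exclusive lock
            CUDA_VISIBLE_DEVICES=$gpu "$@" # run the job on that GPU
            rc=$?
            exec 9>&-                      # close fd 9, releasing the lock
            return $rc
        fi
    done
    echo "no free GPU" >&2
    return 1
}
```

Saved as a wrapper script and enqueued like `tsp ./run_on_free_gpu.sh ./my_test`, each job grabs whichever GPU is free when it starts, so the slot index never needs to be known. The lock is held for the lifetime of the job because fd 9 stays open while it runs.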
Marco:
No, task spooler is not a "per-user queue", it is a "per-socket queue", configured via the TS_SOCKET environment variable.
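To illustrate that point, here is a hedged sketch of one tsp server per GPU selected via TS_SOCKET. The `tsp_gpu` helper and the `/tmp/ts-gpu<N>` socket naming are assumptions made up for this sketch, not tsp defaults.

```shell
# Hypothetical helper: enqueue a command on the tsp server dedicated to
# one GPU. Each distinct TS_SOCKET value addresses an independent server;
# tsp_gpu and the /tmp/ts-gpu<N> socket paths are illustrative names.
tsp_gpu() {
    gpu="$1"; shift
    TS_SOCKET="/tmp/ts-gpu$gpu" tsp env "CUDA_VISIBLE_DEVICES=$gpu" "$@"
}

# One-time setup: give each server a single slot so a GPU never runs
# two jobs at once:
#   for gpu in 0 1; do TS_SOCKET=/tmp/ts-gpu$gpu tsp -S 1; done
# Enqueue on GPU 1's queue:
#   tsp_gpu 1 ./run_tests
```

No extra users are needed: all the queues belong to one user and differ only in their socket path, though the queue-balancing concern from the earlier comment still applies.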