Score:1

Killing all Python processes that are using either of the GPUs


I have the following output, and I don't want to enter the PID of each Python process that uses either of the GPUs one by one. How can I do so?

+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|    0   N/A  N/A      1831      C   python3.8                         137MiB |
|    0   N/A  N/A      2266      C   python3.8                         137MiB |
|    0   N/A  N/A      2612      C   python3.8                         137MiB |
|    0   N/A  N/A      2722      G   /usr/bin/X                          9MiB |
|    0   N/A  N/A      2758      C   python3.8                         137MiB |
|    0   N/A  N/A      2971      G   /usr/bin/gnome-shell                6MiB |
|    0   N/A  N/A     20403      C   python3.8                         137MiB |
|    0   N/A  N/A     21616      C   python3.8                         137MiB |
|    1   N/A  N/A      1831      C   python3.8                         137MiB |
|    1   N/A  N/A      2266      C   python3.8                         137MiB |
|    1   N/A  N/A      2612      C   python3.8                         137MiB |
|    1   N/A  N/A      2758      C   python3.8                         137MiB |
|    1   N/A  N/A     20403      C   python3.8                         137MiB |
|    1   N/A  N/A     21616      C   python3.8                         137MiB |
+-----------------------------------------------------------------------------+

Update: I tried both `killall` and `killall python3.8`, and neither worked:

[jalal@goku ~]$ nvidia-smi
Thu Jun 10 19:29:19 2021       
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 460.67       Driver Version: 460.67       CUDA Version: 11.2     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  GeForce GTX 108...  Off  | 00000000:05:00.0 Off |                  N/A |
|  0%   35C    P2    59W / 250W |    843MiB / 11178MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
|   1  GeForce GTX 108...  Off  | 00000000:06:00.0 Off |                  N/A |
|  0%   37C    P2    61W / 250W |    826MiB / 11178MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
                                                                               
+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|    0   N/A  N/A      1831      C   python3.8                         137MiB |
|    0   N/A  N/A      2266      C   python3.8                         137MiB |
|    0   N/A  N/A      2612      C   python3.8                         137MiB |
|    0   N/A  N/A      2722      G   /usr/bin/X                          9MiB |
|    0   N/A  N/A      2758      C   python3.8                         137MiB |
|    0   N/A  N/A      2971      G   /usr/bin/gnome-shell                6MiB |
|    0   N/A  N/A     20403      C   python3.8                         137MiB |
|    0   N/A  N/A     21616      C   python3.8                         137MiB |
|    1   N/A  N/A      1831      C   python3.8                         137MiB |
|    1   N/A  N/A      2266      C   python3.8                         137MiB |
|    1   N/A  N/A      2612      C   python3.8                         137MiB |
|    1   N/A  N/A      2758      C   python3.8                         137MiB |
|    1   N/A  N/A     20403      C   python3.8                         137MiB |
|    1   N/A  N/A     21616      C   python3.8                         137MiB |
+-----------------------------------------------------------------------------+
[jalal@goku ~]$ killall python3.8
[jalal@goku ~]$ nvidia-smi
Thu Jun 10 19:29:26 2021       
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 460.67       Driver Version: 460.67       CUDA Version: 11.2     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  GeForce GTX 108...  Off  | 00000000:05:00.0 Off |                  N/A |
|  0%   35C    P2    59W / 250W |    843MiB / 11178MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
|   1  GeForce GTX 108...  Off  | 00000000:06:00.0 Off |                  N/A |
|  0%   37C    P2    62W / 250W |    826MiB / 11178MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
                                                                               
+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|    0   N/A  N/A      1831      C   python3.8                         137MiB |
|    0   N/A  N/A      2266      C   python3.8                         137MiB |
|    0   N/A  N/A      2612      C   python3.8                         137MiB |
|    0   N/A  N/A      2722      G   /usr/bin/X                          9MiB |
|    0   N/A  N/A      2758      C   python3.8                         137MiB |
|    0   N/A  N/A      2971      G   /usr/bin/gnome-shell                6MiB |
|    0   N/A  N/A     20403      C   python3.8                         137MiB |
|    0   N/A  N/A     21616      C   python3.8                         137MiB |
|    1   N/A  N/A      1831      C   python3.8                         137MiB |
|    1   N/A  N/A      2266      C   python3.8                         137MiB |
|    1   N/A  N/A      2612      C   python3.8                         137MiB |
|    1   N/A  N/A      2758      C   python3.8                         137MiB |
|    1   N/A  N/A     20403      C   python3.8                         137MiB |
|    1   N/A  N/A     21616      C   python3.8                         137MiB |
+-----------------------------------------------------------------------------+
[jalal@goku ~]$ killall
Usage: killall [-Z CONTEXT] [-u USER] [ -eIgiqrvw ] [ -SIGNAL ] NAME...
       killall -l, --list
       killall -V, --version

  -e,--exact          require exact match for very long names
  -I,--ignore-case    case insensitive process name match
  -g,--process-group  kill process group instead of process
  -y,--younger-than   kill processes younger than TIME
  -o,--older-than     kill processes older than TIME
  -i,--interactive    ask for confirmation before killing
  -l,--list           list all known signal names
  -q,--quiet          don't print complaints
  -r,--regexp         interpret NAME as an extended regular expression
  -s,--signal SIGNAL  send this signal instead of SIGTERM
  -u,--user USER      kill only process(es) running as USER
  -v,--verbose        report if the signal was successfully sent
  -V,--version        display version information
  -w,--wait           wait for processes to die
  -Z,--context REGEXP kill only process(es) having context
                      (must precede other arguments)
guiverc: This is one of the reasons I like `killall`, but I don't really understand your question: are you trying to kill gnome-shell too? (You've listed it.)
terdon: What command gives you that output? Which of those processes do you want to kill?
Mona Jalal: The point is exactly not to kill gnome-shell, and to kill only the Python processes without entering their PIDs. @guiverc
guiverc: As I stated in my first comment, I'd use `killall` or `killall python3.8` in that example. Use `man killall` to read about your options (there are many, including pattern matching). Since I don't know where your paste came from, I don't know whether you'll need to adjust the command.
Mona Jalal: @guiverc please check the updated post.
guiverc: Sorry, I don't know what `nvidia-smi` shows, and the boxes I currently use are all AMD, so I can't explore what it is likely reporting. I therefore can't help interpret that output or pick the exact option you'll need (the answer may be that you also need the full path, e.g. `/usr/bin/python3.8`). I'd use `ps` myself.
Score:0
$ killall -9 python3.8

Here, SIGKILL (signal 9) forces the processes to exit; unlike the default SIGTERM, it cannot be caught, blocked, or ignored.
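The likely reason the plain `killall python3.8` in the question changed nothing is that the processes caught or ignored SIGTERM, which a stuck CUDA job can do; SIGKILL cannot be trapped. A minimal sketch of the difference, using a throwaway `sh`+`sleep` stand-in rather than a real Python job:

```shell
# Start a process that ignores SIGTERM, mimicking a hung job.
sh -c 'trap "" TERM; sleep 300' &
pid=$!
sleep 0.2                          # give the trap time to be installed

kill "$pid"                        # default SIGTERM: ignored by the trap
sleep 0.2
kill -0 "$pid" 2>/dev/null && after_term=alive || after_term=dead

kill -9 "$pid"                     # SIGKILL: cannot be trapped
wait "$pid" 2>/dev/null || true    # reap the child so the PID is really gone
kill -0 "$pid" 2>/dev/null && after_kill=alive || after_kill=dead

echo "$after_term $after_kill"     # the process survives TERM but not KILL
```

`kill -9` is usually a last resort, since the process gets no chance to clean up, but it is justified here because SIGTERM had already been tried. Once the killed processes are fully reaped, the driver reclaims their GPU memory.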

Credits to VG9t
