
AMD GPU OpenCL ROCm - rocminfo error message HSA_STATUS_ERROR_OUT_OF_RESOURCES

OS: Ubuntu 22.04.1
I've just installed OpenCL and ROCm; they appear to be working, and I can get some apps to render using the graphics card.
AMDGPU was installed with:
sudo amdgpu-install --opencl=legacy,rocr -y
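
For reference, listing which ROCm/OpenCL packages actually ended up installed might be useful; something like the following should work on a dpkg-based Ubuntu system (the grep pattern is just my guess at the relevant package names):

$ dpkg -l | grep -E 'rocm|amdgpu|ocl-icd|opencl'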

Output from clinfo:

Platform Name               AMD Accelerated Parallel Processing
Number of devices           1
Device Name                 Ellesmere
Device Vendor               Advanced Micro Devices, Inc.
Device Vendor ID            0x1002
Device Version              OpenCL 2.0 AMD-APP (3380.4)
Driver Version              3380.4 (PAL,HSAIL)
Device OpenCL C Version     OpenCL C 2.0
Device Type                 GPU
Device Board Name (AMD)     Radeon RX 580 Series
Device PCI-e ID (AMD)       0x67df
Device Topology (AMD)       PCI-E, 0000:01:00.0
Device Profile              FULL_PROFILE

I ran
$ sudo dmesg | grep -e "kfd"
and it output:
[ 6.379838] kfd kfd: amdgpu: Allocated 3969056 bytes on gart
[ 6.380067] kfd kfd: amdgpu: added device 1002:67df

So apparently the hardware is working ok. But when I run rocminfo I get an error message. Any idea how I can solve this?

$ rocminfo
ROCk module is loaded
hsa api call failure at: /long_pathname_so_that_rpms_can_package_the_debug_info/src/rocminfo/rocminfo.cc:1148 Call returned HSA_STATUS_ERROR_OUT_OF_RESOURCES: The runtime failed to allocate the necessary resources. This error may also occur when the core runtime library needs to spawn threads or create internal OS-specific events.
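
For what it's worth, my guess is that this is a permissions problem on the KFD device node rather than a real out-of-memory condition, so a check along these lines might be relevant (the render/video group requirement is an assumption on my part):

$ ls -l /dev/kfd /dev/dri/renderD*
$ groups $USER
# if the user is missing from the render/video groups:
$ sudo usermod -aG render,video $USER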

How do I get rocminfo to work?
