About isolate+
Assuming you are seeing this in the interactive screen output of the `top` command in the terminal, the `+` sign means the string has been shortened to fit in the current terminal window ... Expanding/maximizing the terminal window should reveal the rest of the process's command name string as `Isolated Web Co`, which should actually be `Isolated Web Content`, but the characters in excess of 15 are dropped as a result of the length limit on the command field in `/proc/{PID}/stat`.
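If you want to see that truncation for yourself, here is a minimal, illustrative Python sketch (the script name and PID argument are just examples) that prints the truncated command name from `/proc/{PID}/stat` next to the full command line kept in `/proc/{PID}/cmdline`:

```python
#!/usr/bin/env python3
# Illustrative sketch: compare the kernel-truncated command name (second
# field of /proc/<PID>/stat, at most 15 characters) with the full command
# line stored in /proc/<PID>/cmdline.
import sys

pid = sys.argv[1] if len(sys.argv) > 1 else "self"

with open(f"/proc/{pid}/stat") as f:
    stat = f.read()
# The command name sits between the first '(' and the last ')'.
comm = stat[stat.index("(") + 1 : stat.rindex(")")]

with open(f"/proc/{pid}/cmdline", "rb") as f:
    cmdline = f.read().replace(b"\0", b" ").decode(errors="replace").strip()

print(f"comm (truncated to 15 chars): {comm}")
print(f"full cmdline                : {cmdline}")
```

Saved as e.g. `comm_vs_cmdline.py`, you could run it against one of those processes with something like `python3 comm_vs_cmdline.py $(pgrep Isolated | head -n 1)`.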
This is related to Firefox's new Site Isolation feature, AKA Project Fission, which appears to be enabled by default in recent versions of Firefox ... It involves using GFX WebRender (a GPU-based 2D rendering engine) which relies on the system driver/support for your GPU ... If you experience issues from that, then it's most likely a GPU driver/setting issue and you need to either update/change the driver to a more stable/better-tested one (if available) or change the GPU driver settings if that is an option, e.g. selecting a single active GPU on NVIDIA Optimus/hybrid setups.
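As a quick first check of which kernel driver your GPU(s) are actually using, a small sketch along these lines can help (illustrative only; it assumes the standard `lspci` tool from pciutils is installed, which it is on most desktop distributions):

```python
#!/usr/bin/env python3
# Illustrative sketch: list the GPU devices (VGA/3D controllers) reported by
# `lspci -k` together with the kernel driver each one is currently using --
# a quick first check when a GPU driver/setting issue is suspected.
import subprocess

out = subprocess.run(["lspci", "-k"], capture_output=True, text=True, check=True).stdout

in_gpu = False
for line in out.splitlines():
    if not line.startswith(("\t", " ")):  # a new PCI device entry starts here
        in_gpu = "VGA compatible controller" in line or "3D controller" in line
        if in_gpu:
            print(line)
    elif in_gpu and "Kernel driver in use" in line:
        print("    " + line.strip())
```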
Disabling this safety feature is not recommended, for obvious security/fail-safe reasons. It can, however, be disabled in Firefox's configuration tab (`about:config`) by setting both the `fission.autostart` and `gfx.webrender.all` options to `false` and restarting Firefox afterwards.
About actuality
Other than that, you might find the following resource from Mozilla Support useful:
Firefox uses too much memory or CPU resources - How to fix
However, if you are assuming the web browser (Firefox, Chrome, ... etc.) is the only/default culprit in this interaction, then hold your horses ... Webpages are in fact sets of programmatic instructions i.e. they are computer programs that are written/coded to interact with and use system resources e.g. CPU, GPU, memory, disk space ... etc. ... The fact that they do so through a proxy (the web browser) in a somewhat controlled/isolated environment doesn't change what they actually are ... Therefore, like any other program, they can be useful or harmful, well written or otherwise, efficient with system resources or a resource-hogging son of a web.
I have tested the example website mentioned in your question:
https://www.edclub.com/sportal/program-3/2945.play
on both the Firefox and Google Chrome web browsers and noticed unexpectedly high CPU/GPU usage similar to the numbers mentioned in your question (although Google Chrome showed a bit less CPU usage, around 50%, as its GPU utilization seemed a bit more efficient) ... I then manually inspected the source code of that web page and found that it uses a lot (more than I would normally expect in a web page) of code-rendered scenes/views i.e. they paint the picture you see almost entirely from scratch using your computer's resources (please don't take it as if I'm criticizing their code; I'm merely describing it) ... I then ran a standardized web analysis tool (Lighthouse) against the page as well.
Therefore, the high load you see is normal ... It is the result of a rather intensive drawing/rendering process for the browser's web elements ... This will raise the CPU/GPU load/utilization for a short while during the initial scene drawing, then settle down, then spike again when new scene parts are drawn, e.g. when moving the view ... This is reflected in your system's load averages over 1, 5 and 15 minutes of 2.36, 1.20 and 0.80 respectively, which fall within the normal expected load on a modern (quad-core CPU or more) desktop computer.
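To put load average numbers like those in perspective, here is a tiny, illustrative Python sketch that reads `/proc/loadavg` and expresses each value as a fraction of the number of CPU cores, which is the usual rule of thumb for judging whether a machine is actually overloaded:

```python
#!/usr/bin/env python3
# Illustrative sketch: compare the 1/5/15-minute load averages to the number
# of CPU cores; values well below the core count mean the CPUs are not
# saturated (e.g. 2.36 on a 4-core machine is roughly 59% of capacity).
import os

cores = os.cpu_count() or 1
with open("/proc/loadavg") as f:
    one, five, fifteen = (float(x) for x in f.read().split()[:3])

for label, value in (("1 min", one), ("5 min", five), ("15 min", fifteen)):
    print(f"load {label:>6}: {value:5.2f}  ->  {value / cores:6.1%} of {cores} cores")
```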
About discrepancy
Web browsers have two main components: the UI and the engine ... While UIs are highly portable and behave much the same everywhere, engines sit much closer to the OS and the hardware and may behave differently between different operating systems or even between different hardware.
Engines vary, performance-wise, depending on the OS or hardware ... Some engines support/utilize certain OSs/hardware better than others ... Therefore, the same web browser/engine might, and often will, show some discrepancy between different OSs or even between different hardware on the same OS.
About alternatives
You can tweak the engine configuration for your needs:
For example, the rendering behavior of the Gecko engine in Firefox can be changed, e.g. to enable WebGPU, in `about:config` by setting (adding any option that doesn't already exist) `gfx.webrender.all`, `dom.webgpu.enabled`, `layers.gpu-process.enabled`, `layers.mlgpu.enabled`, `media.gpu-process-decoder` and `media.ffmpeg.vaapi.enabled` to `true`, then restarting Firefox afterwards ... A scripted way of setting the same prefs via a user.js file is sketched below.
Also, see this Ubuntu discourse topic about enabling hardware acceleration support for WebRender.
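If you prefer doing that from a script rather than clicking through `about:config`, here is a minimal, illustrative Python sketch (not an official Mozilla tool) that appends exactly the prefs listed above to a `user.js` file in your Firefox profile, which Firefox reads at startup just like values set in `about:config`; the profile path is a placeholder you'd replace with your own (see `about:profiles`):

```python
#!/usr/bin/env python3
# Illustrative sketch: append the prefs discussed above to user.js in a
# Firefox profile. Close Firefox first and replace PROFILE with your own
# profile directory (listed in about:profiles).
from pathlib import Path

PROFILE = Path.home() / ".mozilla/firefox/xxxxxxxx.default-release"  # placeholder

PREFS = {
    "gfx.webrender.all": "true",
    "dom.webgpu.enabled": "true",
    "layers.gpu-process.enabled": "true",
    "layers.mlgpu.enabled": "true",
    "media.gpu-process-decoder": "true",
    "media.ffmpeg.vaapi.enabled": "true",
}

lines = [f'user_pref("{name}", {value});' for name, value in PREFS.items()]
with open(PROFILE / "user.js", "a") as f:
    f.write("\n".join(lines) + "\n")

print(f"Appended {len(lines)} prefs -- restart Firefox to apply them.")
```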
You can always install another browser with a different engine:
As I mentioned above, after testing your example web page, Google Chrome showed less CPU usage than Firefox ... The reason, as I see it, is that the Blink engine used by Google Chrome (and presumably the rest of the Chromium browser family) seemed, on my system and probably on yours as well, to utilize the GPU better, which took much of the rendering load off the CPU, while the Gecko engine used by Firefox didn't seem to utilize the same GPU as well and therefore left most of the rendering work on the CPU instead.
About GPU
Why blame GPU support when it's the CPU load that is high? ... It's the CPU that's not doing its job flawlessly ... Can it be that the CPU support is the culprit?
No, if your CPU were not supported, you'd most likely have no way to even notice increased CPU load from a web content rendering process ... Trust me, you wouldn't have had a chance to install a web browser, let alone launch and use one.
CPU support is foremost a binary `0` or `1` matter ... Nothing in between would be productively workable.
That said, a CPU (AKA "micro-processor" a long time ago, when Earth was flat) is actually a silicon-based (so far) micro-processor chip ... Likewise, the GPU, RAID, network, encryption ... etc. have their own specialized and dedicated micro-processor chips that are designed to do a certain job, oftentimes more efficiently than the mighty CPU itself.
Can the CPU do everything alone? ... Sure, like in the old times when it was known as "the" micro-processor ... So computing life goes on ... Hardware features (graphics acceleration, network flow management, RAID management, encryption ... etc.) done with separate specialized micro-processor chips have software equivalents (graphics acceleration, network flow management, RAID management, encryption ... etc.) done on the CPU.
Will the CPU load be the same in either case? ... No ... Will it at least be equivalent to the total load of those hardware chips? ... No, it will be much higher, because those specialized chips are good at doing what they are made to do ... Take the GPU for example: it is made to handle many (far more than a CPU can) small tasks in parallel, e.g. rendering thousands of small parts of a picture/scene/view at the same time. The CPU, on the other hand, prefers bigger tasks with few parallel threads/parts and will finish those faster than a GPU, but will choke on rendering the canvas/scene/view of a very complex web component.
Web engines should prefer a GPU when rendering web content, but will fall back to the CPU when their attempt to use a GPU fails ... Moreover, it can be hard for developers to add support for all the GPUs on multiple platforms/OSs ... So, for the most part, multi-platform browsers keep painting/rendering on the CPU as an easy portable/compatible alternative, because CPUs have almost universal support on different OSs while GPUs don't ... i.e. it's common and normal to have computer users asking for a driver/support for a certain GPU on a certain OS, but how often do they ask for a driver for their CPU?
The reason CPUs are universally supported (apart from the obvious reason that no OS can run without proper CPU support) is that their architectures are limited in variety, their working mechanisms i.e. addresses, bridges, buffers ... etc. are standardized, and they are backward compatible by nature ... Other chips, however, mostly are not.