Score:1

Render view inside block with caching


My setup is a bit unconventional. I have a view displaying a block, relying on a contextual filter (let's call it product ID). I also have a custom block that renders this view programmatically because I need to include this block in multiple places on the page and I have some custom logic that pulls the actual product ID to call the view with. Basically:

$view = Views::getView('view_id'); 
$args = ['product_id' => $whatever_product];
return $view->buildRenderable('views_block_id', $args);
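
For completeness, the surrounding block plugin looks roughly like this (the module name, plugin ID, view ID and display ID are placeholders, and resolveProductId() just stands in for my custom lookup logic):

<?php

namespace Drupal\my_module\Plugin\Block;

use Drupal\Core\Block\BlockBase;
use Drupal\views\Views;

/**
 * Renders the product view for a programmatically resolved product ID.
 *
 * @Block(
 *   id = "my_product_view_block",
 *   admin_label = @Translation("Product view block")
 * )
 */
class ProductViewBlock extends BlockBase {

  /**
   * {@inheritdoc}
   */
  public function build() {
    // Custom logic that figures out which product this placement is about.
    $product_id = $this->resolveProductId();

    $view = Views::getView('view_id');
    if (!$view || !$view->access('views_block_id')) {
      return [];
    }

    return $view->buildRenderable('views_block_id', ['product_id' => $product_id]);
  }

}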

The process basically works, but as usual, I run into problems when there are several such blocks on the page. Views only caches using the block ID as a cache tag, so the first rendered view gets cached and displayed in all places. Naturally, switching off the cache would work:

return $view->buildRenderable('views_block_id', $args, FALSE);

but that's not exactly what I have in mind; I don't want to lose the benefits of caching.

My initial thought was quite simple: use custom cache tags in the view, thanks to the views_custom_cache_tag module. So I did, including the argument from the contextual filter:

views_block:view_id-views_block_id
custom:{{ arguments.product_id }}

But it still doesn't work.

Is there another way I've missed? I can't push new cache tags right before I try to render the view, and the usual view hooks don't get called in this case (the second block already gets the cached variant without ever getting near the hooks).

Comment from 4uk4:
*Naturally, switching off the cache would work* `return $view->buildRenderable('views_block_id', $args, FALSE);`. If this works, then use it. You don't need to cache the rendered result of the View when you are in a block that is cached on its own.
Comment from the asker:
If I have the same block twice on the page, each referring to a different product, I won't be very happy if the block caches itself with only one of them. :-) Come to think of it, though, the block probably caches itself with its own instance ID added, not just the generic one...
OK, thanks. If you copy the same into an answer, I'd be happy to accept it.
Score:2
Answer from 4uk4:

In a block you can switch off caching of the rendered View in ViewExecutable::buildRenderable:

$view->buildRenderable('views_block_id', $args, FALSE)

because the rendered result of each block instance is already cached.
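
Embedded in the block's build() method, that could look roughly like this (the view ID, display ID and resolveProductId() are just placeholders for the asker's own IDs and custom product lookup):

public function build() {
  $build = [];
  // Whatever custom logic determines the product ID for this placement.
  $product_id = $this->resolveProductId();

  $view = Views::getView('view_id');
  if ($view && $view->access('views_block_id')) {
    // Third argument FALSE: skip the View's own render caching; the block's
    // render cache (one entry per block instance) already covers it.
    $build['view'] = $view->buildRenderable('views_block_id', ['product_id' => $product_id], FALSE);
  }
  return $build;
}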

By the way, cache tags are not involved when caching variants of the same element; that is controlled only by cache keys and cache contexts. With $cache = FALSE you disable the cache keys, but not the contexts, which should still bubble up to the block level. If a context is missing, you can set a cache context manually, for example for the route or for a URL query argument if the product ID depends on it.
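
For example, if the product ID happens to come from a ?product= query parameter (just an assumption), the block plugin could declare that context like this:

// In the custom block plugin class; add "use Drupal\Core\Cache\Cache;" at the top.
// 'url.query_args:product' is only an example; use whatever context actually
// determines the product ID (for instance 'route').
public function getCacheContexts() {
  return Cache::mergeContexts(parent::getCacheContexts(), ['url.query_args:product']);
}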

Edit: I've removed the return statement because it might be necessary to build a real render array that embeds the View.

Comment from the asker:
In this particular case it works OK with the return; this is already the last line of the `build()` function. Thanks.
Comment from another user:
I have what I _think_ is a similar set-up, except that my single custom block may display one _or more_ view blocks. The views are referenced from a multi-value viewfield on the node, and in some cases the same view is referenced with a different display and arguments. In those cases, where the same view is referenced more than once, only the first display is used for all of them, and adding the `FALSE` argument to the `buildRenderable` method does not solve the issue.