Score:0

How do I disable caching of a custom token?

pl flag

I have created a custom token which reads values from the Apache request headers. It seems to work fine in my testing, but when another user subsequently accesses the token, it gets my values. It seems that the token value is being cached. How do I disable caching of custom token values? NOTE: because of the use case, I need to disable caching of the token value (or of the whole page), not just add a cache context.

I have searched for this and not found an answer. There are no cache-related settings in hook_token_info() or hook_tokens(). I have seen references to using bubbleable metadata for caching (https://www.drupal.org/node/2528662), but I am not clear on what I need to put there to disable caching.

/**
 * Implements hook_token_info().
 */
function fred_token_info() {
  $info = [
    'types' => [
      'fred_type' => [
        'name' => t('fred type'),
      ],
    ],
    'tokens' => [
      'fred_type' => [
        'APACHE_HEADERS' => [
          'name' => 'apache_request_headers()',
          'description' => t('Request headers as sent through Apache...not strictly a superglobal, but useful. Works in the Apache, FastCGI, CLI, and FPM webservers.'),
          'dynamic' => TRUE,
        ],
      ],
    ],
  ];
  return $info;
}

/**
 * Implements hook_tokens().
 */
function fred_tokens($type, $tokens, array $data, array $options, \Drupal\Core\Render\BubbleableMetadata $bubbleable_metadata) {
  $replacements = [];
  if ($type == 'fred_type') {
    foreach ($tokens as $name => $original) {
      $tokenParts = explode(':', $name);
      if (count($tokenParts) >= 2) {
        switch ($tokenParts[0]) {
          case 'APACHE_HEADERS':
            $fredType =& $_SERVER;
            break;
        }
        $FredKey = $tokenParts[1];
      }
      else {
        // The token does not have the expected NAME:KEY form; return what we have so far.
        return $replacements;
      }

      if (!isset($tokenParts[2]) || $tokenParts[2] !== 'raw') {
        // Sanitize the value before using it as the token replacement.
        $replacements[$original] = filter_var($fredType[$FredKey], FILTER_SANITIZE_STRING, FILTER_FLAG_NO_ENCODE_QUOTES | FILTER_FLAG_STRIP_LOW);
      }
    }
  }

  return $replacements;
}
id flag
Show your implementation.
cn flag
You more likely want to vary caching, rather than disabling it. Have a read of https://www.drupal.org/docs/drupal-apis/cache-api/cache-contexts
apaderno
us flag
Since this is a question about code, we need to see the code you wrote. Otherwise, the answer can only say that it is necessary to set a cache context; without seeing the code, we cannot say which cache context must be set.
pl flag
Edited to add code as requested.
4uk4
cn flag
You don't have access to this superglobal in Drupal. You could access \Drupal::request()->headers->get('X-foo') and add the cache context `headers:X-Foo`, but it would not be a good idea to put this token in random places, depending on how much the header value varies from request to request. Also, this doesn't work with anonymous page caching.
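For illustration, a minimal sketch of what this comment suggests, reusing the hook from the question; the X-Foo header name is only a placeholder taken from the comment, not part of the original code:

/**
 * Implements hook_tokens().
 */
function fred_tokens($type, $tokens, array $data, array $options, \Drupal\Core\Render\BubbleableMetadata $bubbleable_metadata) {
  $replacements = [];
  if ($type == 'fred_type') {
    foreach ($tokens as $name => $original) {
      if ($name === 'APACHE_HEADERS:X-Foo') {
        // Read the header through the request service instead of a superglobal.
        $replacements[$original] = \Drupal::request()->headers->get('X-Foo', '');
        // Vary the cached replacement by this header instead of disabling caching.
        $bubbleable_metadata->addCacheContexts(['headers:X-Foo']);
      }
    }
  }
  return $replacements;
}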
Score:3
in flag

You need to provide a cache context with your token. It's essentially a way to tell Drupal to 1) still cache your value, but 2) cache a value per context. Drupal has some built-in cache contexts; the one you want is the "user" context (i.e. cache a value per user).

To provide the context, use the $bubbleable_metadata object that hook_tokens() receives as its fifth parameter. This object is passed down as Drupal recurses through the token segments. Use it to attach the cache context with addCacheContexts().
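A minimal sketch, assuming the fred_tokens() implementation from the question; the only addition is the addCacheContexts() call:

/**
 * Implements hook_tokens().
 */
function fred_tokens($type, $tokens, array $data, array $options, \Drupal\Core\Render\BubbleableMetadata $bubbleable_metadata) {
  $replacements = [];
  if ($type == 'fred_type') {
    // Cache the replacement per user instead of reusing one cached value for everyone.
    $bubbleable_metadata->addCacheContexts(['user']);
    foreach ($tokens as $name => $original) {
      // ... compute $replacements[$original] as in the question ...
    }
  }
  return $replacements;
}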

id flag
Perhaps not the `user` context, only because the OP did not specify exactly which headers, but otherwise the above is always the answer to "why is this cached?".
Score:-2
pl flag

I eventually found an answer. Modify hook_tokens() to add the following line so that it executes before the tokens are computed (I put it as the first line of the function):

\Drupal::service('page_cache_kill_switch')->trigger();

Be aware that this prevents the affected page from being cached at all, whether the end user is logged in or not, but this is perfect for our use case.
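Putting that together with the hook from the question, a minimal sketch (the rest of the function body stays as posted above):

/**
 * Implements hook_tokens().
 */
function fred_tokens($type, $tokens, array $data, array $options, \Drupal\Core\Render\BubbleableMetadata $bubbleable_metadata) {
  // Prevent the page this token appears on from being served from the page cache.
  \Drupal::service('page_cache_kill_switch')->trigger();

  $replacements = [];
  // ... compute $replacements as in the question ...
  return $replacements;
}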

pl flag
If you are going to downvote, please give a reason. This solution matched my use case perfectly, and I included warnings about why it might not be a solution for everyone.