Score:1

fclose only creates file at script end when running drush command

cn flag

I'm really struggling with Drupal PHP memory/file/cache management. I've tried a lot of different approaches to avoid memory errors (PHP 8.1) when writing out a multi-megabyte file. My observation is that even though I've tried fflush() and fclose() on the file in previous attempts, the file only gets created at script end, not when I tell the script to create it. I understand that opening/closing a file is one of the most expensive operations to perform, and PHP is (most likely?) doing me a favor by waiting until the end to actually create the file, but AARGH! it runs out of memory. I've tried Drupal batch operations and they fail with OOM errors too. Does anyone have a working drush script that can process 1M nodes and export the data to a file without memory errors?

[EDIT] Per @clive (BTW you're awesome help!), here's my complete script. For the file access I have tried a JsonCollectionStreamReader that used yield, but I think this is really a Drupal cache issue (one of the two hardest things in CS :)), as this is the error:

PHP Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 20480 bytes) in /...../web/core/lib/Drupal/Core/Cache/DatabaseBackend.php on line 167

    <?php
    // Pass a premises nid as parameter to script.

    use Drupal\node\Entity\Node;
    use Drupal\Core\Cache\Cache;
    
    $premisesNid = $_SERVER['argv'][3];
    $premisesNode = Node::load($premisesNid);
    
    $premisesName = str_replace(' ', '', $premisesNode->getTitle());
    $permitsFile = 'public://PermitData-' . $premisesName . '.json';

    $fileOut = fopen($permitsFile, 'w') or die('Cannot open output file!');

    $permitNids = Drupal::entityQuery('node')
      ->accessCheck()
      ->condition('status', 1)
      ->condition('type', 'permit')
      ->condition('field_property', $premisesNode->id(), '=')
      ->execute();

    $exported = 0;
    foreach ($permitNids as $permitNid) {
      // Reset the permit structure so transactions from the previous permit
      // don't carry over between loop iterations.
      $permit = [];
      $keyValueData = nodeKeyValuePairs($permitNid);
      $permit['permitData'] = $keyValueData['values'] ?? [];

      // Collect the transactions for this permit.
      $transactionNids = Drupal::entityQuery('node')
        ->accessCheck()
        ->condition('status', 1)
        ->condition('type', 'transactions')
        ->condition('field_trans_parent', $permitNid, '=')
        ->execute();

      foreach ($transactionNids as $transactionNid) {
        $trans = nodeKeyValuePairs($transactionNid);
        $permit['trans'][] = $trans;
      }
      $line = json_encode($permit) . PHP_EOL;
      fwrite($fileOut, $line);
      $exported++;
      Drupal::service('entity.memory_cache')->deleteAll(); //<--Solution!
    }
    fclose($fileOut);
    echo 'Exported ' . $exported . ' permits.';

    /**
     * Returns all field values of a node, keyed by field name, under a
     * 'values' key. Returns an empty array if the node does not exist.
     */
    function nodeKeyValuePairs($anyNid): ?array {
      $values = [];
      $anyNode = Node::load($anyNid);

      if (empty($anyNode)) { // Node does not exist.
        return $values;
      }

      foreach ($anyNode as $key => $value) {
        $values['values'][$key] = $anyNode->get($key)->getValue();
      }

      // Invalidate this node's cache tags.
      $tags = ['node:' . $anyNode->id()];
      Cache::invalidateTags($tags);
      return $values;
    }
Kevin (in flag)
Are you streaming to the file or trying to do all of this in a single loop?
cn flag
I (think) I should be able to do this in a single loop by opening/closing the file in a separate function. (VERY inefficient, but I'm searching for an answer; as far as I can see PHP is not creating the file with fopen($fn, 'a'), fwrite, fclose, return.)
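For what it's worth, here is a minimal sketch of the open/append/close pattern that comment describes. appendLine() is a hypothetical helper (not part of the script in the question): each call opens the file in 'a' mode, writes one line, flushes and closes, so the file exists on disk as soon as the first line is written, at the cost of an open/close per write.

    <?php

    // Hypothetical helper: open in append mode, write one line, flush, close.
    function appendLine(string $uri, string $line): bool {
      $handle = fopen($uri, 'a');
      if ($handle === FALSE) {
        return FALSE;
      }
      fwrite($handle, $line . PHP_EOL);
      fflush($handle);
      return fclose($handle);
    }

    // Usage: the file is created/extended on every call, not at script end.
    appendLine('public://PermitData-test.json', json_encode(['nid' => 123]));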
cn flag
BTW, yes, I've tried streaming and yields, and they run out of memory too.
leymannx (ne flag)
Exporting 1M nodes... Can you maybe chunk the export into multiple files?
cn flag
That's the only (painful) way I've been able to make it work. Is there any way to force PHP to create a file and close it before the script finishes?
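To illustrate the chunking suggestion, here is a hedged sketch that reuses $premisesName, $permitNids and the nodeKeyValuePairs() helper from the script in the question; the chunk size of 5000 and the file-naming scheme are arbitrary assumptions.

    // Sketch: write each batch of permits to its own numbered file, clearing
    // Drupal's in-memory entity cache between batches.
    foreach (array_chunk($permitNids, 5000, TRUE) as $i => $chunkNids) {
      $chunkFile = 'public://PermitData-' . $premisesName . '-' . $i . '.json';
      $fileOut = fopen($chunkFile, 'w');
      foreach ($chunkNids as $permitNid) {
        $permit = nodeKeyValuePairs($permitNid); // Helper from the question.
        fwrite($fileOut, json_encode($permit) . PHP_EOL);
      }
      fclose($fileOut);
      \Drupal::service('entity.memory_cache')->deleteAll();
    }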
cn flag
This shouldn't be an issue with the file; you're opening a reference, not the whole thing (unless you're using file_get_contents or similar). Are you sure the OOM errors aren't coming from loading a million nodes into the static cache and not clearing it regularly, for example? Adding a bit of code to demonstrate what you're doing would be useful.
ru flag
Agree with @Clive. For example, the people from the simple_sitemap module had similar issues; take a look at how they solved it: https://www.drupal.org/project/simple_sitemap/issues/3170261
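For reference, a minimal sketch of the general pattern that issue describes (not the module's exact code): load nodes in small batches and reset the storage's in-memory cache after every batch so loaded entities don't accumulate. The batch size of 50 and the export step are placeholder assumptions; $permitNids comes from the script in the question.

    // Sketch: batch-load nodes, then reset the entity storage cache per batch.
    $storage = \Drupal::entityTypeManager()->getStorage('node');
    foreach (array_chunk($permitNids, 50) as $batchNids) {
      foreach ($storage->loadMultiple($batchNids) as $node) {
        // ... export $node here ...
      }
      // Frees the entities loaded in this batch from the static/memory cache.
      $storage->resetCache($batchNids);
    }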
cn flag
@Hudri NAILED IT! Simply clearing the cache before the run and putting Drupal::service('entity.memory_cache')->deleteAll(); in the loop allowed it to execute to completion. Many thanks.
cn flag
Forgot to mention: for my smaller test (> 3000 nodes) the output was a 1.6 GB file, written without issues in 18 seconds.