Fix: Fatal error: Allowed memory size exhausted error in PHP
Learn how to fix the 'Fatal error: Allowed memory size exhausted' error in PHP. Complete guide with solutions, optimization techniques, and memory management best practices.
The ‘Fatal error: Allowed memory size exhausted’ error is a critical PHP issue that occurs when a script exceeds the allocated memory limit. This fatal error stops script execution immediately and prevents your application from functioning properly. Understanding and resolving this error is crucial for building scalable PHP applications that handle data efficiently and maintain optimal performance.
This comprehensive guide explains what causes the memory exhausted error, why it happens, and provides multiple solutions to fix and prevent it in your PHP projects with practical code examples.
What is the ‘Allowed memory size exhausted’ Error?
The ‘Allowed memory size exhausted’ error occurs when:
- A PHP script consumes more memory than the configured limit
- Large datasets are processed without proper memory management
- Unbounded loops or runaway recursion accumulate memory without releasing it
- Large arrays or objects are stored unnecessarily
- File uploads exceed memory capacity
- Database queries return massive result sets
- Image processing operations consume excessive memory
- Caching mechanisms store too much data in memory
Common Error Messages:
Fatal error: Allowed memory size of X bytes exhausted (tried to allocate Y bytes)
Fatal error: Out of memory (allocated Z) (tried to allocate W bytes)
Fatal error: Maximum execution time exceeded (often related to memory issues)
Fatal error: Allowed memory size exhausted at memory_limit=128M
Understanding the Problem
PHP allocates a specific amount of memory to each script execution, controlled by the memory_limit directive in php.ini. When a script exceeds this limit, PHP terminates execution with a fatal error. This commonly happens when processing large files, handling big datasets, performing intensive operations, or having inefficient code that doesn’t release memory properly.
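A quick way to see where a script stands relative to that limit is to inspect it at runtime. A minimal, standalone check:

```php
<?php
// Inspect the configured limit and live usage at runtime.
// memory_limit is a shorthand string such as "128M", or "-1" for unlimited.
$limit = ini_get('memory_limit');
$current = memory_get_usage(true);   // bytes allocated from the system
$peak = memory_get_peak_usage(true); // high-water mark for this request

printf("memory_limit: %s\n", $limit);
printf("current: %.2f MB\n", $current / 1048576);
printf("peak: %.2f MB\n", $peak / 1048576);
```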
Typical PHP Project Structure:
MyPhpApp/
├── index.php
├── config/
│ ├── memory.php
│ └── performance.php
├── includes/
│ ├── memory_helpers.php
│ └── file_processors.php
├── src/
│ ├── Services/
│ │ ├── MemoryOptimizer.php
│ │ ├── FileProcessor.php
│ │ └── DataProcessor.php
│ └── Utils/
│ ├── CsvReader.php
│ └── ImageProcessor.php
├── uploads/
├── cache/
└── logs/
Solution 1: Proper Memory Configuration and Management
The most fundamental approach to prevent memory exhaustion is to configure and manage memory properly.
❌ Without Proper Memory Management:
<?php
// ❌ Loading entire large file into memory
$largeFile = file_get_contents('huge_file.txt'); // ❌ Could exhaust memory
$lines = explode("\n", $largeFile);
// ❌ Creating massive arrays without limits
$massiveArray = [];
for ($i = 0; $i < 1000000; $i++) {
$massiveArray[] = str_repeat('data', 1000); // ❌ Memory intensive
}
// ❌ Recursive function without proper termination
function infiniteRecursion($n) {
return infiniteRecursion($n + 1); // ❌ Will exhaust memory
}
?>
✅ With Proper Memory Management:
config/memory.php:
<?php
/**
* Memory configuration and management
*/
class MemoryManager {
/**
* Get current memory usage
*/
public static function getMemoryUsage() {
return [
'current' => memory_get_usage(true),
'peak' => memory_get_peak_usage(true),
'limit' => self::getMemoryLimit(),
'used_percentage' => self::getMemoryUsagePercentage()
];
}
/**
* Get memory limit in bytes
*/
public static function getMemoryLimit() {
$limit = ini_get('memory_limit');
if ($limit == -1) {
return PHP_INT_MAX; // Unlimited
}
return self::parseMemoryLimit($limit);
}
/**
* Parse memory limit string to bytes
*/
private static function parseMemoryLimit($limit) {
$limit = trim($limit);
$last = strtolower($limit[strlen($limit)-1]);
$limit = (int) $limit;
switch($last) {
// ✅ Intentional fall-through: 'g' applies all three multiplications
case 'g': $limit *= 1024;
case 'm': $limit *= 1024;
case 'k': $limit *= 1024;
}
return $limit;
}
/**
* Get memory usage percentage
*/
public static function getMemoryUsagePercentage() {
$usage = memory_get_usage(true);
$limit = self::getMemoryLimit();
if ($limit == PHP_INT_MAX) {
return 0;
}
return ($usage / $limit) * 100;
}
/**
* Check if memory is running low
*/
public static function isMemoryLow($threshold = 80) {
return self::getMemoryUsagePercentage() > $threshold;
}
/**
* Increase memory limit temporarily
*/
public static function increaseMemoryLimit($newLimit) {
$parsedLimit = self::parseMemoryLimit($newLimit);
$currentLimit = self::getMemoryLimit();
if ($parsedLimit > $currentLimit) {
ini_set('memory_limit', $newLimit);
return true;
}
return false;
}
/**
* Force garbage collection
*/
public static function forceGarbageCollection() {
if (function_exists('gc_collect_cycles')) {
gc_collect_cycles();
}
}
/**
* Monitor memory usage with callback
*/
public static function monitorMemoryUsage($callback, $threshold = 80) {
$usage = self::getMemoryUsagePercentage();
if ($usage > $threshold) {
if (is_callable($callback)) {
call_user_func($callback, $usage);
}
return true;
}
return false;
}
/**
* Get human-readable memory size
*/
public static function formatBytes($size, $precision = 2) {
$units = ['B', 'KB', 'MB', 'GB', 'TB'];
for ($i = 0; $size > 1024 && $i < count($units) - 1; $i++) {
$size /= 1024;
}
return round($size, $precision) . ' ' . $units[$i];
}
/**
* Check available memory
*/
public static function getAvailableMemory() {
$limit = self::getMemoryLimit();
$usage = memory_get_usage(true);
if ($limit == PHP_INT_MAX) {
return PHP_INT_MAX;
}
return $limit - $usage;
}
/**
* Estimate memory needed for operation
*/
public static function estimateMemoryNeeded($dataSize, $multiplier = 1.5) {
return $dataSize * $multiplier;
}
}
?>
includes/memory_helpers.php:
<?php
/**
* Memory helper functions
*/
/**
* Safely increase memory limit
*/
function safeIncreaseMemoryLimit($newLimit) {
$currentLimit = ini_get('memory_limit');
$newLimitBytes = parseMemoryLimit($newLimit);
$currentLimitBytes = parseMemoryLimit($currentLimit);
if ($newLimitBytes > $currentLimitBytes) {
ini_set('memory_limit', $newLimit);
return true;
}
return false;
}
/**
* Parse memory limit string
*/
function parseMemoryLimit($limit) {
$limit = trim($limit);
if ($limit == -1) return PHP_INT_MAX;
$last = strtolower($limit[strlen($limit)-1]);
$limit = (int) $limit;
switch($last) {
// ✅ Intentional fall-through: 'g' applies all three multiplications
case 'g': $limit *= 1024;
case 'm': $limit *= 1024;
case 'k': $limit *= 1024;
}
return $limit;
}
/**
* Get memory usage in human readable format
*/
function getMemoryUsageFormatted() {
return [
'current' => formatBytes(memory_get_usage(true)),
'peak' => formatBytes(memory_get_peak_usage(true)),
'limit' => formatBytes(parseMemoryLimit(ini_get('memory_limit')))
];
}
/**
* Format bytes to human readable
*/
function formatBytes($size, $precision = 2) {
$units = ['B', 'KB', 'MB', 'GB', 'TB'];
for ($i = 0; $size > 1024 && $i < count($units) - 1; $i++) {
$size /= 1024;
}
return round($size, $precision) . ' ' . $units[$i];
}
/**
* Check if memory is running low
*/
function isMemoryLow($threshold = 80) {
$usage = memory_get_usage(true);
$limit = parseMemoryLimit(ini_get('memory_limit'));
if ($limit == PHP_INT_MAX) return false;
$percentage = ($usage / $limit) * 100;
return $percentage > $threshold;
}
/**
* Force garbage collection with safety check
*/
function safeGarbageCollect() {
if (function_exists('gc_collect_cycles')) {
if (isMemoryLow(90)) {
gc_collect_cycles();
return true;
}
}
return false;
}
/**
* Process large arrays in chunks
*/
function processLargeArray($array, $callback, $chunkSize = 1000) {
$chunks = array_chunk($array, $chunkSize);
foreach ($chunks as $chunk) {
call_user_func($callback, $chunk);
// ✅ Force garbage collection after each chunk
if (function_exists('gc_collect_cycles')) {
gc_collect_cycles();
}
}
}
/**
* Read large files in chunks
*/
function readLargeFile($filename, $callback, $chunkSize = 8192) {
$handle = fopen($filename, 'r');
if (!$handle) {
throw new Exception("Could not open file: $filename");
}
while (!feof($handle)) {
$chunk = fread($handle, $chunkSize);
call_user_func($callback, $chunk);
// ✅ Periodic garbage collection
if (ftell($handle) % ($chunkSize * 100) === 0) {
safeGarbageCollect();
}
}
fclose($handle);
}
/**
* Process CSV file without loading entire file into memory
*/
function processCsvFile($filename, $callback, $delimiter = ',') {
$handle = fopen($filename, 'r');
if (!$handle) {
throw new Exception("Could not open CSV file: $filename");
}
$row = 0;
while (($data = fgetcsv($handle, 0, $delimiter)) !== FALSE) {
call_user_func($callback, $data, $row);
$row++;
// ✅ Garbage collect every 1000 rows
if ($row % 1000 === 0) {
safeGarbageCollect();
}
}
fclose($handle);
}
/**
* Optimize array by removing unnecessary data
*/
function optimizeArray($array) {
// ✅ Remove null values and empty strings
return array_filter($array, function($value) {
return $value !== null && $value !== '' && $value !== [];
});
}
/**
* Get object size estimation
*/
function getObjectSize($object) {
// ✅ Rough estimate via serialized length (not exact heap usage)
$serialized = serialize($object);
$size = strlen($serialized);
unset($serialized);
return $size;
}
?>
src/Utils/CsvReader.php:
<?php
/**
* Memory-efficient CSV reader
*/
class CsvReader {
private $filename;
private $handle;
private $delimiter;
private $header;
public function __construct($filename, $delimiter = ',', $hasHeader = true) {
$this->filename = $filename;
$this->delimiter = $delimiter;
if (!file_exists($filename)) {
throw new Exception("CSV file does not exist: $filename");
}
$this->handle = fopen($filename, 'r');
if (!$this->handle) {
throw new Exception("Could not open CSV file: $filename");
}
// ✅ Read header if exists
if ($hasHeader) {
$this->header = fgetcsv($this->handle, 0, $delimiter);
}
}
/**
* Process CSV row by row without loading entire file
*/
public function processRows($callback) {
$rowNumber = $this->header ? 1 : 0;
while (($row = fgetcsv($this->handle, 0, $this->delimiter)) !== FALSE) {
if ($this->header) {
// ✅ Convert to associative array
$rowData = array_combine($this->header, $row);
} else {
$rowData = $row;
}
// ✅ Call the callback function
$result = call_user_func($callback, $rowData, $rowNumber);
// ✅ Stop processing if callback returns false
if ($result === false) {
break;
}
$rowNumber++;
// ✅ Periodic memory cleanup
if ($rowNumber % 1000 === 0) {
$this->cleanupMemory();
}
}
}
/**
* Get specific rows without loading entire file
*/
public function getRows($startRow = 0, $limit = 100) {
$rows = [];
$currentRow = 0;
// ✅ Seek to start position
rewind($this->handle);
if ($this->header) {
fgetcsv($this->handle, 0, $this->delimiter); // Skip header
}
while (($row = fgetcsv($this->handle, 0, $this->delimiter)) !== FALSE) {
if ($currentRow >= $startRow) {
if ($this->header) {
$rowData = array_combine($this->header, $row);
} else {
$rowData = $row;
}
$rows[] = $rowData;
if (count($rows) >= $limit) {
break;
}
}
$currentRow++;
}
return $rows;
}
/**
* Count total rows in CSV file
*/
public function countRows() {
$count = 0;
$headerOffset = $this->header ? 1 : 0;
rewind($this->handle);
while (($row = fgetcsv($this->handle, 0, $this->delimiter)) !== FALSE) {
$count++;
}
return $count - $headerOffset;
}
/**
* Get memory usage statistics
*/
public function getMemoryStats() {
return [
'current_usage' => memory_get_usage(true),
'peak_usage' => memory_get_peak_usage(true),
'limit' => ini_get('memory_limit'),
'estimated_rows_processed' => 0 // Would need to track this
];
}
/**
* Cleanup memory periodically
*/
private function cleanupMemory() {
// ✅ Force garbage collection
if (function_exists('gc_collect_cycles')) {
gc_collect_cycles();
}
// ✅ Check memory usage
if (isMemoryLow(80)) {
// ✅ Additional cleanup if needed
$this->clearstatcache();
}
}
/**
* Clear stat cache to free memory
*/
private function clearstatcache() {
clearstatcache();
}
/**
* Close file handle
*/
public function close() {
if ($this->handle) {
fclose($this->handle);
}
}
/**
* Destructor to ensure file is closed
*/
public function __destruct() {
$this->close();
}
}
?>
index.php:
<?php
/**
* Main application with memory management
*/
require_once 'includes/memory_helpers.php';
require_once 'config/memory.php';
require_once 'src/Utils/CsvReader.php';
// ✅ Initialize memory monitoring
echo "<h1>Memory Management Demo</h1>\n";
// ✅ Show current memory usage
$memoryUsage = getMemoryUsageFormatted();
echo "<h2>Current Memory Usage</h2>\n";
echo "<p>Current: {$memoryUsage['current']}</p>\n";
echo "<p>Peak: {$memoryUsage['peak']}</p>\n";
echo "<p>Limit: {$memoryUsage['limit']}</p>\n";
// ✅ Check if memory is running low
if (isMemoryLow(75)) {
echo "<p style='color: red;'>⚠️ Memory usage is high!</p>\n";
} else {
echo "<p style='color: green;'>✅ Memory usage is acceptable</p>\n";
}
// ✅ Example: Process large dataset efficiently
echo "<h2>Processing Large Dataset</h2>\n";
// ✅ Create a large array simulation
$largeDataset = [];
for ($i = 0; $i < 10000; $i++) {
$largeDataset[] = [
'id' => $i,
'name' => 'User ' . $i,
'email' => 'user' . $i . '@example.com',
'data' => str_repeat('x', 100) // Small data chunk
];
}
echo "<p>Created dataset with " . count($largeDataset) . " records</p>\n";
// ✅ Process in chunks to manage memory
$processedCount = 0;
$chunkSize = 1000;
for ($i = 0; $i < count($largeDataset); $i += $chunkSize) {
$chunk = array_slice($largeDataset, $i, $chunkSize);
// ✅ Process chunk
foreach ($chunk as $record) {
// ✅ Simulate processing
$processedCount++;
}
// ✅ Free memory after each chunk
unset($chunk);
// ✅ Force garbage collection
if (function_exists('gc_collect_cycles')) {
gc_collect_cycles();
}
// ✅ Show progress and memory usage
if ($i % (2 * $chunkSize) === 0) {
$currentUsage = formatBytes(memory_get_usage(true));
echo "<p>Processed {$processedCount} records, Memory: {$currentUsage}</p>\n";
}
}
echo "<p>✅ Successfully processed {$processedCount} records</p>\n";
// ✅ Example: Reading large CSV file efficiently
echo "<h2>Reading Large CSV File</h2>\n";
// ✅ Create a sample CSV file for demonstration
$csvFile = 'sample_data.csv';
if (!file_exists($csvFile)) {
$handle = fopen($csvFile, 'w');
fputcsv($handle, ['id', 'name', 'email', 'age']);
for ($i = 1; $i <= 5000; $i++) {
fputcsv($handle, [$i, "User {$i}", "user{$i}@example.com", rand(18, 65)]);
}
fclose($handle);
}
try {
$csvReader = new CsvReader($csvFile);
$rowCount = 0;
$csvReader->processRows(function($row, $rowNum) use (&$rowCount) {
$rowCount++;
// ✅ Simulate processing
if ($rowNum % 1000 === 0) {
$currentUsage = formatBytes(memory_get_usage(true));
echo "<p>Processed row {$rowNum}, Memory: {$currentUsage}</p>\n";
}
// ✅ Stop if memory is getting low
if (isMemoryLow(85)) {
echo "<p style='color: orange;'>⚠️ Stopping due to high memory usage</p>\n";
return false; // Stop processing
}
return true; // Continue processing
});
echo "<p>✅ Successfully processed {$rowCount} CSV rows</p>\n";
} catch (Exception $e) {
echo "<p style='color: red;'>❌ Error processing CSV: " . $e->getMessage() . "</p>\n";
} finally {
// ✅ Clean up
if (file_exists($csvFile)) {
unlink($csvFile);
}
}
// ✅ Show final memory usage
$finalUsage = getMemoryUsageFormatted();
echo "<h2>Final Memory Usage</h2>\n";
echo "<p>Current: {$finalUsage['current']}</p>\n";
echo "<p>Peak: {$finalUsage['peak']}</p>\n";
// ✅ Force final garbage collection
if (function_exists('gc_collect_cycles')) {
$collected = gc_collect_cycles();
echo "<p>Garbage collected: {$collected} cycles</p>\n";
}
echo "<h2>Memory Management Tips</h2>\n";
echo "<ul>\n";
echo "<li>Process large datasets in chunks</li>\n";
echo "<li>Unset variables when no longer needed</li>\n";
echo "<li>Use generators for large data sets</li>\n";
echo "<li>Monitor memory usage during execution</li>\n";
echo "<li>Consider increasing memory limit for specific operations</li>\n";
echo "</ul>\n";
?>
Solution 2: Using Generators for Memory Efficiency
Implement generators to process large datasets without consuming excessive memory.
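Before the full classes, here is the core idea in isolation: a generator yields one record at a time, so only the current record is held in memory rather than the whole array. A minimal sketch with placeholder data:

```php
<?php
// Generator: produces records lazily, one per iteration,
// instead of materializing the entire dataset up front.
function generateRecords(int $count): Generator {
    for ($i = 0; $i < $count; $i++) {
        yield ['id' => $i, 'name' => "User $i"];
    }
}

$processed = 0;
foreach (generateRecords(100000) as $record) {
    $processed++; // per-record work goes here; memory stays flat
}
echo "Processed $processed records\n"; // Processed 100000 records
```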
src/Utils/ImageProcessor.php:
<?php
/**
* Memory-efficient image processor
*/
class ImageProcessor {
/**
* Process images in a memory-efficient way using generators
*/
public static function processImages($imagePaths) {
foreach ($imagePaths as $path) {
if (file_exists($path)) {
// ✅ Process image without storing in memory
$info = [
'path' => $path,
'size' => filesize($path),
'modified' => filemtime($path)
];
// ✅ Yield result instead of storing in array
yield $info;
// ✅ Clear any temporary resources
clearstatcache();
}
}
}
/**
* Resize images with memory management
*/
public static function resizeImage($sourcePath, $destPath, $maxWidth, $maxHeight) {
if (!file_exists($sourcePath)) {
throw new Exception("Source image does not exist: $sourcePath");
}
// ✅ Check if we have enough memory for this operation
$imageSize = filesize($sourcePath);
$estimatedMemory = $imageSize * 3; // Rough estimate for processing
if ($estimatedMemory > MemoryManager::getAvailableMemory()) {
throw new Exception("Not enough memory to process image: $sourcePath");
}
// ✅ Get image info
$imageInfo = getimagesize($sourcePath);
if (!$imageInfo) {
throw new Exception("Invalid image file: $sourcePath");
}
list($width, $height, $type) = $imageInfo;
// ✅ Calculate new dimensions
$ratio = min($maxWidth / $width, $maxHeight / $height);
$newWidth = (int)($width * $ratio);
$newHeight = (int)($height * $ratio);
// ✅ Create image resource based on type
switch ($type) {
case IMAGETYPE_JPEG:
$source = imagecreatefromjpeg($sourcePath);
break;
case IMAGETYPE_PNG:
$source = imagecreatefrompng($sourcePath);
break;
case IMAGETYPE_GIF:
$source = imagecreatefromgif($sourcePath);
break;
default:
throw new Exception("Unsupported image type: $type");
}
if (!$source) {
throw new Exception("Could not create image resource: $sourcePath");
}
// ✅ Create destination image
$destination = imagecreatetruecolor($newWidth, $newHeight);
// ✅ Preserve transparency for PNG and GIF
if ($type == IMAGETYPE_PNG || $type == IMAGETYPE_GIF) {
imagealphablending($destination, false);
imagesavealpha($destination, true);
$transparent = imagecolorallocatealpha($destination, 255, 255, 255, 127);
imagefilledrectangle($destination, 0, 0, $newWidth, $newHeight, $transparent);
}
// ✅ Resize image
imagecopyresampled(
$destination, $source,
0, 0, 0, 0,
$newWidth, $newHeight,
$width, $height
);
// ✅ Save resized image
switch ($type) {
case IMAGETYPE_JPEG:
$result = imagejpeg($destination, $destPath, 85);
break;
case IMAGETYPE_PNG:
$result = imagepng($destination, $destPath);
break;
case IMAGETYPE_GIF:
$result = imagegif($destination, $destPath);
break;
}
// ✅ Free memory
imagedestroy($source);
imagedestroy($destination);
if (!$result) {
throw new Exception("Could not save resized image: $destPath");
}
return [
'original_size' => $imageSize,
'new_size' => filesize($destPath),
'dimensions' => ['width' => $newWidth, 'height' => $newHeight],
'path' => $destPath
];
}
/**
* Process multiple images with memory management
*/
public static function processMultipleImages($imagePaths, $callback) {
$processed = 0;
$failed = 0;
foreach (self::processImages($imagePaths) as $imageInfo) {
try {
$result = call_user_func($callback, $imageInfo);
if ($result) {
$processed++;
} else {
$failed++;
}
// ✅ Periodic cleanup
if ($processed % 10 === 0) {
MemoryManager::forceGarbageCollection();
}
} catch (Exception $e) {
$failed++;
error_log("Image processing error: " . $e->getMessage());
}
}
return ['processed' => $processed, 'failed' => $failed];
}
/**
* Get memory-efficient image information
*/
public static function getImageInfo($path) {
if (!file_exists($path)) {
return null;
}
$size = filesize($path);
$info = getimagesize($path);
if ($info) {
return [
'path' => $path,
'size' => $size,
'width' => $info[0],
'height' => $info[1],
'mime_type' => $info['mime'],
'estimated_memory' => $size * 3 // Rough estimate
];
}
return null;
}
}
?>
src/Services/FileProcessor.php:
<?php
/**
* Memory-efficient file processor service
*/
require_once __DIR__ . '/../Utils/ImageProcessor.php';
class FileProcessor {
/**
* Process large files in chunks
*/
public function processLargeFile($filename, $chunkCallback, $chunkSize = 8192) {
if (!file_exists($filename)) {
throw new Exception("File does not exist: $filename");
}
$handle = fopen($filename, 'r');
if (!$handle) {
throw new Exception("Could not open file: $filename");
}
$position = 0;
$fileSize = filesize($filename);
while (!feof($handle)) {
$chunk = fread($handle, $chunkSize);
$bytesRead = strlen($chunk);
// ✅ Process chunk
$result = call_user_func($chunkCallback, $chunk, $position, $fileSize);
$position += $bytesRead;
// ✅ Check memory usage periodically
if ($position % ($chunkSize * 100) === 0) {
$this->manageMemory();
}
// ✅ Stop if callback returns false
if ($result === false) {
break;
}
}
fclose($handle);
}
/**
* Process uploaded files with memory management
*/
public function processUpload($uploadDir, $maxFileSize = 10485760) { // 10MB default
if (!isset($_FILES) || empty($_FILES)) {
return ['success' => false, 'message' => 'No files uploaded'];
}
$results = [];
foreach ($_FILES as $fieldName => $file) {
if ($file['error'] !== UPLOAD_ERR_OK) {
$results[$fieldName] = [
'success' => false,
'message' => $this->getUploadErrorMessage($file['error'])
];
continue;
}
// ✅ Check file size
if ($file['size'] > $maxFileSize) {
$results[$fieldName] = [
'success' => false,
'message' => 'File too large: ' . formatBytes($file['size'])
];
continue;
}
// ✅ Check available memory
$availableMemory = MemoryManager::getAvailableMemory();
if ($file['size'] > $availableMemory * 0.8) { // Leave 20% headroom
$results[$fieldName] = [
'success' => false,
'message' => 'Insufficient memory to process file'
];
continue;
}
// ✅ Generate safe filename
$extension = pathinfo($file['name'], PATHINFO_EXTENSION);
$safeName = uniqid() . '.' . $extension;
$destination = $uploadDir . '/' . $safeName;
// ✅ Move uploaded file
if (move_uploaded_file($file['tmp_name'], $destination)) {
$results[$fieldName] = [
'success' => true,
'message' => 'File uploaded successfully',
'path' => $destination,
'size' => $file['size']
];
} else {
$results[$fieldName] = [
'success' => false,
'message' => 'Failed to move uploaded file'
];
}
// ✅ Manage memory after each upload
$this->manageMemory();
}
return $results;
}
/**
* Process CSV files efficiently
*/
public function processCsvFile($filename, $rowCallback, $options = []) {
$delimiter = $options['delimiter'] ?? ',';
$hasHeader = $options['has_header'] ?? true;
$skipEmpty = $options['skip_empty'] ?? true;
$reader = new CsvReader($filename, $delimiter, $hasHeader);
$rowCount = 0;
$errorCount = 0;
try {
$reader->processRows(function($row, $rowNum) use (&$rowCount, &$errorCount, $rowCallback, $skipEmpty) {
$rowCount++;
// ✅ Skip empty rows if configured
if ($skipEmpty && $this->isEmptyRow($row)) {
return true; // Continue processing
}
try {
$result = call_user_func($rowCallback, $row, $rowNum);
if ($result === false) {
return false; // Stop processing
}
} catch (Exception $e) {
$errorCount++;
error_log("CSV processing error at row $rowNum: " . $e->getMessage());
}
// ✅ Manage memory every 1000 rows
if ($rowCount % 1000 === 0) {
$this->manageMemory();
}
return true; // Continue processing
});
} finally {
$reader->close();
}
return [
'success' => true,
'rows_processed' => $rowCount,
'errors' => $errorCount
];
}
/**
* Process JSON files efficiently
*/
public function processJsonFile($filename, $itemCallback) {
if (!file_exists($filename)) {
throw new Exception("JSON file does not exist: $filename");
}
$handle = fopen($filename, 'r');
if (!$handle) {
throw new Exception("Could not open JSON file: $filename");
}
$buffer = '';
$depth = 0;
$inString = false;
$escaping = false;
$itemStart = 0;
$position = 0;
$itemCount = 0;
while (!feof($handle)) {
$chunk = fread($handle, 8192);
$buffer .= $chunk;
for ($i = 0; $i < strlen($chunk); $i++) {
$char = $chunk[$i];
$absPos = $position + $i;
if (!$escaping) {
if ($char === '"') {
$inString = !$inString;
} elseif (!$inString) {
if ($char === '{' || $char === '[') {
if ($depth === 0) {
$itemStart = $absPos;
}
$depth++;
} elseif ($char === '}' || $char === ']') {
$depth--;
if ($depth === 0) {
// ✅ Extract complete JSON object/array
$jsonStr = substr($buffer, $itemStart, $absPos - $itemStart + 1);
try {
$jsonObj = json_decode($jsonStr, true);
if ($jsonObj !== null) {
call_user_func($itemCallback, $jsonObj, $itemCount);
$itemCount++;
// ✅ Manage memory periodically
if ($itemCount % 100 === 0) {
$this->manageMemory();
}
}
} catch (Exception $e) {
error_log("JSON parsing error: " . $e->getMessage());
}
}
}
}
}
$escaping = ($char === '\\' && !$escaping);
}
// ✅ Keep only unprocessed portion of buffer
if ($depth > 0) {
$buffer = substr($buffer, $itemStart);
} else {
$buffer = '';
}
$position += strlen($chunk);
}
fclose($handle);
return ['items_processed' => $itemCount];
}
/**
* Manage memory during processing
*/
private function manageMemory() {
// ✅ Force garbage collection
if (function_exists('gc_collect_cycles')) {
gc_collect_cycles();
}
// ✅ Check if memory is running low
if (isMemoryLow(85)) {
// ✅ Additional cleanup measures
clearstatcache();
// ✅ If still low, consider stopping or reducing chunk sizes
if (isMemoryLow(95)) {
throw new Exception("Memory exhausted during processing");
}
}
}
/**
* Check if row is empty
*/
private function isEmptyRow($row) {
if (!is_array($row)) {
return trim((string)$row) === '';
}
foreach ($row as $value) {
// ✅ Explicit comparison: empty() would wrongly treat "0" as empty
if (trim((string)$value) !== '') {
return false;
}
}
return true;
}
/**
* Get upload error message
*/
private function getUploadErrorMessage($errorCode) {
$messages = [
UPLOAD_ERR_INI_SIZE => 'File exceeds upload_max_filesize',
UPLOAD_ERR_FORM_SIZE => 'File exceeds MAX_FILE_SIZE',
UPLOAD_ERR_PARTIAL => 'File was only partially uploaded',
UPLOAD_ERR_NO_FILE => 'No file was uploaded',
UPLOAD_ERR_NO_TMP_DIR => 'Missing temporary folder',
UPLOAD_ERR_CANT_WRITE => 'Failed to write file to disk',
UPLOAD_ERR_EXTENSION => 'File upload stopped by extension'
];
return $messages[$errorCode] ?? 'Unknown upload error';
}
}
?>
Solution 3: Database Query Optimization
Optimize database queries to prevent memory exhaustion from large result sets.
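The key habit is fetching row by row instead of calling fetchAll() on a large result set. A self-contained sketch using an in-memory SQLite database (assumes the pdo_sqlite extension; with MySQL the same loop applies, optionally with buffered queries disabled):

```php
<?php
// Streaming rows with fetch() keeps one row in memory at a time;
// fetchAll() would materialize the entire result set at once.
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE users (id INTEGER, name TEXT)');
$insert = $pdo->prepare('INSERT INTO users VALUES (?, ?)');
for ($i = 1; $i <= 1000; $i++) {
    $insert->execute([$i, "User $i"]);
}

$stmt = $pdo->query('SELECT id, name FROM users');
$count = 0;
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    $count++; // process $row here
}
echo "Streamed $count rows\n"; // Streamed 1000 rows
```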
src/Services/DataProcessor.php:
<?php
/**
* Memory-efficient data processor
*/
class DataProcessor {
private $pdo;
public function __construct($pdo) {
$this->pdo = $pdo;
}
/**
* Process large database result sets efficiently
*/
public function processLargeResultSet($query, array $params, $callback, $batchSize = 1000) {
// ✅ Prepare statement
$stmt = $this->pdo->prepare($query);
$stmt->execute($params);
$processed = 0;
$batch = [];
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
$batch[] = $row;
$processed++;
// ✅ Process batch when it reaches the specified size
if (count($batch) >= $batchSize) {
call_user_func($callback, $batch);
// ✅ Clear batch to free memory
$batch = [];
// ✅ Manage memory
$this->manageMemory();
}
}
// ✅ Process remaining items
if (!empty($batch)) {
call_user_func($callback, $batch);
}
return $processed;
}
/**
* Paginate large result sets
*/
public function processPaginatedResults($baseQuery, array $params, $callback, $pageSize = 1000) {
$offset = 0;
$totalProcessed = 0;
$hasMore = true;
while ($hasMore) {
$query = $baseQuery . " LIMIT $pageSize OFFSET $offset";
$stmt = $this->pdo->prepare($query);
$stmt->execute($params);
$results = $stmt->fetchAll(PDO::FETCH_ASSOC);
$count = count($results);
if ($count > 0) {
call_user_func($callback, $results, $offset);
$totalProcessed += $count;
// ✅ Manage memory
$this->manageMemory();
}
$hasMore = $count === $pageSize;
$offset += $pageSize;
}
return $totalProcessed;
}
/**
* Stream large result sets
*/
public function streamResults($query, array $params, $callback) {
$stmt = $this->pdo->prepare($query);
$stmt->execute($params);
$processed = 0;
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
$result = call_user_func($callback, $row, $processed);
$processed++;
// ✅ Stop if callback returns false
if ($result === false) {
break;
}
// ✅ Manage memory periodically
if ($processed % 1000 === 0) {
$this->manageMemory();
}
}
return $processed;
}
/**
* Bulk insert with memory management
*/
public function bulkInsert($tableName, $data, $batchSize = 1000) {
if (empty($data)) {
return 0;
}
$columns = array_keys($data[0]);
$placeholders = '(' . implode(',', array_fill(0, count($columns), '?')) . ')';
$sql = "INSERT INTO $tableName (" . implode(',', $columns) . ") VALUES $placeholders";
$inserted = 0;
for ($i = 0; $i < count($data); $i += $batchSize) {
$batch = array_slice($data, $i, $batchSize);
$this->pdo->beginTransaction();
try {
$stmt = $this->pdo->prepare($sql);
foreach ($batch as $row) {
$stmt->execute(array_values($row));
$inserted++;
}
$this->pdo->commit();
} catch (Exception $e) {
$this->pdo->rollback();
throw $e;
}
// ✅ Manage memory after each batch
$this->manageMemory();
}
return $inserted;
}
/**
* Export large dataset to file
*/
public function exportToCsv($query, $params, $filename, $options = []) {
$delimiter = $options['delimiter'] ?? ',';
$enclosure = $options['enclosure'] ?? '"';
$headers = $options['headers'] ?? true;
$handle = fopen($filename, 'w');
if (!$handle) {
throw new Exception("Could not open file for writing: $filename");
}
$stmt = $this->pdo->prepare($query);
$stmt->execute($params);
$exported = 0;
// ✅ Write headers if requested
if ($headers) {
$columnNames = [];
for ($i = 0; $i < $stmt->columnCount(); $i++) {
$meta = $stmt->getColumnMeta($i);
$columnNames[] = $meta['name'];
}
fputcsv($handle, $columnNames, $delimiter, $enclosure);
$exported++;
}
// ✅ Write data rows
while ($row = $stmt->fetch(PDO::FETCH_NUM)) {
fputcsv($handle, $row, $delimiter, $enclosure);
$exported++;
// ✅ Manage memory periodically
if ($exported % 1000 === 0) {
$this->manageMemory();
}
}
fclose($handle);
return $exported - ($headers ? 1 : 0);
}
/**
* Get memory usage statistics
*/
public function getMemoryStats() {
return [
'current_usage' => memory_get_usage(true),
'peak_usage' => memory_get_peak_usage(true),
'limit' => ini_get('memory_limit'),
'percentage_used' => MemoryManager::getMemoryUsagePercentage()
];
}
/**
* Manage memory during processing
*/
private function manageMemory() {
// ✅ Force garbage collection
if (function_exists('gc_collect_cycles')) {
gc_collect_cycles();
}
// ✅ Check memory usage
if (isMemoryLow(85)) {
// ✅ Switch subsequent queries to unbuffered mode (MySQL-specific;
// affects statements prepared after this call, not ones already running)
$this->pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);
}
}
}
?>
Solution 4: Configuration and Server Optimization
Optimize server configuration and PHP settings for better memory management.
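Several of the relevant directives are PHP_INI_SYSTEM settings that cannot be changed with ini_set(); they belong in php.ini (or the FPM pool configuration). Representative values, to be tuned per workload:

```ini
; php.ini — representative memory-related settings (adjust to your workload)
memory_limit = 256M
max_execution_time = 300
post_max_size = 100M
upload_max_filesize = 100M

; OPcache directives are PHP_INI_SYSTEM: set them here, not via ini_set()
opcache.enable = 1
opcache.memory_consumption = 128
opcache.max_accelerated_files = 4000
```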
config/performance.php:
<?php
/**
* Performance and memory configuration
*/
class PerformanceConfig {
/**
* Optimize PHP settings for memory usage
*/
public static function optimizeMemorySettings() {
// ✅ Increase memory limit if needed (be cautious)
// ini_set('memory_limit', '256M'); // Adjust based on your needs
// ✅ Enable garbage collection
if (function_exists('gc_enable')) {
gc_enable();
}
// ✅ Set appropriate execution time
ini_set('max_execution_time', 300); // 5 minutes
// ✅ Opcache settings (note: these are PHP_INI_SYSTEM directives and
// only take effect when set in php.ini, not via ini_set())
if (extension_loaded('Zend OPcache')) {
ini_set('opcache.enable', 1);
ini_set('opcache.memory_consumption', 128);
ini_set('opcache.max_accelerated_files', 4000);
ini_set('opcache.validate_timestamps', 0); // Production: 0, Development: 1
}
// ✅ Optimize APCu if available
if (extension_loaded('apcu')) {
ini_set('apc.enabled', 1);
ini_set('apc.shm_size', '64M');
ini_set('apc.ttl', 3600);
}
}
/**
* Get current performance settings
*/
public static function getPerformanceSettings() {
return [
'memory_limit' => ini_get('memory_limit'),
'max_execution_time' => ini_get('max_execution_time'),
'max_input_time' => ini_get('max_input_time'),
'post_max_size' => ini_get('post_max_size'),
'upload_max_filesize' => ini_get('upload_max_filesize'),
'opcache_enabled' => extension_loaded('Zend OPcache') ? ini_get('opcache.enable') : false,
'apcu_enabled' => extension_loaded('apcu') ? ini_get('apc.enabled') : false
];
}
/**
* Temporarily increase memory for specific operations
*/
public static function withIncreasedMemory($callback, $memoryLimit = '256M') {
$originalLimit = ini_get('memory_limit');
try {
ini_set('memory_limit', $memoryLimit);
$result = call_user_func($callback);
return $result;
} finally {
// ✅ Restore original memory limit
ini_set('memory_limit', $originalLimit);
}
}
/**
* Check if system has enough resources
*/
public static function checkSystemResources() {
$memoryUsage = MemoryManager::getMemoryUsage();
$diskSpace = disk_free_space('.');
return [
'memory_available' => $memoryUsage['limit'] - $memoryUsage['current'],
'memory_percentage' => $memoryUsage['used_percentage'],
'disk_space_available' => $diskSpace,
'is_sufficient' => $memoryUsage['used_percentage'] < 80 && $diskSpace > 104857600 // 100MB
];
}
/**
* Optimize for large file processing
*/
public static function optimizeForLargeFiles() {
// ✅ Increase limits for file processing
ini_set('max_execution_time', 0); // Unlimited for file processing
ini_set('memory_limit', '512M'); // Higher limit for file processing
// NOTE: post_max_size and upload_max_filesize are PHP_INI_PERDIR —
// ini_set() cannot change them at runtime; set them in php.ini:
//   post_max_size=100M
//   upload_max_filesize=100M
// ✅ Disable all levels of output buffering for large files
while (ob_get_level() > 0) {
ob_end_flush();
}
}
/**
* Optimize for batch processing
*/
public static function optimizeForBatchProcessing() {
// ✅ Optimize for batch operations
ini_set('max_execution_time', 600); // 10 minutes for batches
ini_set('memory_limit', '256M');
// ✅ Enable garbage collection
if (function_exists('gc_enable')) {
gc_enable();
ini_set('zend.enable_gc', 1);
}
}
/**
* Get memory usage report
*/
public static function getMemoryReport() {
$usage = MemoryManager::getMemoryUsage();
return [
'current_usage_formatted' => MemoryManager::formatBytes($usage['current']),
'peak_usage_formatted' => MemoryManager::formatBytes($usage['peak']),
'limit_formatted' => MemoryManager::formatBytes($usage['limit']),
'percentage_used' => number_format($usage['used_percentage'], 2) . '%',
'status' => $usage['used_percentage'] > 80 ? 'WARNING' : 'OK',
'recommendation' => $usage['used_percentage'] > 80 ? 'Consider optimizing code or increasing memory limit' : 'Memory usage is acceptable'
];
}
}
?>
Working Code Examples
Complete Memory-Efficient Application:
<?php
/**
* Complete example of memory-efficient PHP application
*/
require_once 'includes/memory_helpers.php';
require_once 'config/memory.php';
require_once 'config/performance.php';
require_once 'src/Services/FileProcessor.php';
require_once 'src/Services/DataProcessor.php';
// ✅ Initialize performance optimization
PerformanceConfig::optimizeMemorySettings();
echo "<h1>Memory-Efficient PHP Application</h1>\n";
// ✅ Show initial memory report
$report = PerformanceConfig::getMemoryReport();
echo "<h2>Initial Memory Report</h2>\n";
echo "<ul>\n";
foreach ($report as $key => $value) {
echo "<li><strong>" . ucfirst(str_replace('_', ' ', $key)) . ":</strong> $value</li>\n";
}
echo "</ul>\n";
// ✅ Example: Process large dataset efficiently
echo "<h2>Processing Large Dataset Efficiently</h2>\n";
// ✅ Create a generator for large dataset
function generateLargeDataset($count) {
for ($i = 0; $i < $count; $i++) {
yield [
'id' => $i,
'name' => "User $i",
'email' => "user$i@example.com",
'data' => str_repeat('x', rand(50, 200))
];
}
}
$processedCount = 0;
$datasetGenerator = generateLargeDataset(50000); // 50k records
foreach ($datasetGenerator as $record) {
$processedCount++;
// ✅ Simulate processing
$record['processed_at'] = time();
// ✅ Manage memory every 5000 records
if ($processedCount % 5000 === 0) {
safeGarbageCollect();
$currentUsage = formatBytes(memory_get_usage(true));
echo "<p>Processed $processedCount records, Memory: $currentUsage</p>\n";
}
}
echo "<p>✅ Successfully processed $processedCount records efficiently</p>\n";
// ✅ Example: File processing
echo "<h2>File Processing Example</h2>\n";
$fileProcessor = new FileProcessor();
// ✅ Create a large test file
$testFile = 'large_test_file.txt';
$handle = fopen($testFile, 'w');
for ($i = 0; $i < 10000; $i++) {
fwrite($handle, "Line $i: " . str_repeat('data', 50) . "\n");
}
fclose($handle);
// ✅ Avoid count(file(...)) here — file() would load the whole file into memory
echo "<p>Created test file with 10000 lines (" . formatBytes(filesize($testFile)) . ")</p>\n";
// ✅ Process file in chunks
$linesProcessed = 0;
$fileProcessor->processLargeFile($testFile, function($chunk, $position, $totalSize) use (&$linesProcessed) {
$lines = explode("\n", $chunk);
$linesProcessed += count($lines) - 1; // the last element may be a partial line split across chunks, so this count is approximate
if ($linesProcessed % 2000 === 0) {
$currentUsage = formatBytes(memory_get_usage(true));
echo "<p>Processed approximately $linesProcessed lines, Memory: $currentUsage</p>\n";
}
return true; // Continue processing
}, 4096); // 4KB chunks
echo "<p>✅ Successfully processed file with $linesProcessed lines</p>\n";
// ✅ Clean up
unlink($testFile);
// ✅ Example: Database processing (simulated)
echo "<h2>Simulated Database Processing</h2>\n";
// ✅ Simulate database connection
class MockPDO {
public function prepare($query) {
return new MockStatement();
}
}
class MockStatement {
public function execute($params = []) {}
public function fetch($mode = PDO::FETCH_BOTH) {
static $counter = 0;
if ($counter < 10000) {
$counter++;
return [
'id' => $counter,
'name' => "Record $counter",
'value' => rand(1, 1000)
];
}
return false;
}
public function fetchAll($mode = PDO::FETCH_ASSOC) {
$results = [];
for ($i = 0; $i < 1000; $i++) {
$results[] = [
'id' => $i,
'name' => "Batch Record $i",
'value' => rand(1, 100)
];
}
return $results;
}
public function columnCount() { return 3; }
public function getColumnMeta($column) {
$columns = ['id', 'name', 'value'];
return ['name' => $columns[$column] ?? 'unknown'];
}
}
$mockPdo = new MockPDO();
$dataProcessor = new DataProcessor($mockPdo);
$recordsProcessed = 0;
$dataProcessor->streamResults("SELECT * FROM large_table", [], function($row, $index) use (&$recordsProcessed) {
$recordsProcessed++;
// ✅ Simulate processing
$row['processed'] = true;
if ($recordsProcessed % 2000 === 0) {
$currentUsage = formatBytes(memory_get_usage(true));
echo "<p>Processed $recordsProcessed database records, Memory: $currentUsage</p>\n";
}
return true; // Continue processing
});
echo "<p>✅ Successfully processed $recordsProcessed simulated database records</p>\n";
// ✅ Show final memory report
$finalReport = PerformanceConfig::getMemoryReport();
echo "<h2>Final Memory Report</h2>\n";
echo "<ul>\n";
foreach ($finalReport as $key => $value) {
echo "<li><strong>" . ucfirst(str_replace('_', ' ', $key)) . ":</strong> $value</li>\n";
}
echo "</ul>\n";
// ✅ Force final garbage collection
$collected = gc_collect_cycles();
echo "<p>Garbage collected: $collected cycles</p>\n";
echo "<h2>Memory Optimization Best Practices</h2>\n";
echo "<ol>\n";
echo "<li>Process large datasets in chunks</li>\n";
echo "<li>Use generators for memory-efficient iteration</li>\n";
echo "<li>Unset variables when no longer needed</li>\n";
echo "<li>Implement proper garbage collection</li>\n";
echo "<li>Monitor memory usage during execution</li>\n";
echo "<li>Optimize database queries for large result sets</li>\n";
echo "<li>Configure appropriate memory limits</li>\n";
echo "<li>Use streaming for large file operations</li>\n";
echo "</ol>\n";
?>
Best Practices for Memory Management
1. Process Data in Chunks
<?php
// ✅ Process large arrays in chunks
// NOTE: array_chunk() copies the whole array at once, so this pattern
// suits arrays that already fit in memory; for data too big to hold at
// all, use a generator instead
$largeArray = range(1, 100000);
$chunkSize = 1000;
foreach (array_chunk($largeArray, $chunkSize) as $chunk) {
// Process chunk
foreach ($chunk as $item) {
// Process item
}
// Free memory
unset($chunk);
gc_collect_cycles();
}
?>
2. Use Generators for Large Datasets
<?php
// ✅ Use generators for memory efficiency
function getLargeDataset() {
for ($i = 0; $i < 1000000; $i++) {
yield ['id' => $i, 'data' => "Item $i"];
}
}
foreach (getLargeDataset() as $item) {
// Process item - memory efficient
}
?>
3. Unset Large Variables
<?php
// ✅ Unset large variables when done
$largeData = file_get_contents('huge_file.txt');
// Process $largeData...
unset($largeData); // Free memory
?>
4. Monitor Memory Usage
<?php
// ✅ Monitor memory during processing
if (isMemoryLow(80)) {
// Take corrective action
gc_collect_cycles();
}
?>
5. Optimize Database Queries
<?php
// ✅ Use LIMIT and OFFSET for large result sets
$stmt = $pdo->prepare("SELECT * FROM large_table LIMIT 1000 OFFSET ?");
$stmt->execute([$offset]);
while ($row = $stmt->fetch()) {
// Process row
}
?>
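For deep pagination, `OFFSET` forces the database to scan and discard every skipped row, so page N gets slower as N grows. Keyset (seek) pagination avoids that by filtering on the last seen key instead. The sketch below assumes the table has an indexed, monotonically increasing `id` column, and uses an in-memory SQLite database as a stand-in so the example is self-contained:

```php
<?php
// Keyset (seek) pagination sketch — assumes an indexed, monotonically
// increasing `id` column. SQLite in-memory stands in for the real database.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec("CREATE TABLE large_table (id INTEGER PRIMARY KEY, name TEXT)");

$insert = $pdo->prepare("INSERT INTO large_table (name) VALUES (?)");
for ($i = 1; $i <= 2500; $i++) {
    $insert->execute(["Record $i"]);
}

$lastId = 0;
$processed = 0;
do {
    // Seek past the last processed id — no OFFSET, so each page is equally fast
    $stmt = $pdo->prepare(
        "SELECT * FROM large_table WHERE id > ? ORDER BY id LIMIT 1000"
    );
    $stmt->execute([$lastId]);
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
    foreach ($rows as $row) {
        $processed++;
        $lastId = $row['id']; // remember where this page ended
    }
} while (count($rows) === 1000); // a short page means we reached the end

echo "Processed $processed rows via keyset pagination\n";
```

The trade-off is that keyset pagination cannot jump to an arbitrary page, but for sequential batch processing that is rarely needed.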
Common Mistakes to Avoid
1. Loading Entire Files into Memory
<?php
// ❌ Don't do this with large files
$contents = file_get_contents('huge_file.txt'); // ❌ Memory intensive
// ✅ Do this instead
$handle = fopen('huge_file.txt', 'r');
while (($line = fgets($handle)) !== false) {
// Process line
}
fclose($handle);
?>
2. Creating Massive Arrays
<?php
// ❌ Don't create huge arrays unnecessarily
$hugeArray = [];
for ($i = 0; $i < 1000000; $i++) {
$hugeArray[] = processData($i); // ❌ Memory intensive
}
// ✅ Process one at a time or in chunks
for ($i = 0; $i < 1000000; $i += 1000) {
$chunk = range($i, min($i + 999, 999999)); // range() already yields exactly one chunk
processChunk($chunk);
}
?>
3. Not Managing Circular References
<?php
// ❌ Circular references can cause memory buildup
// NOTE: 'Parent' is a reserved word in PHP and cannot be used as a class name
class ParentNode {
public $child;
}
class ChildNode {
public $parent;
}
$parent = new ParentNode();
$child = new ChildNode();
$parent->child = $child;
$child->parent = $parent; // ❌ Circular reference
// ✅ Break circular references when done — PHP's cycle collector reclaims
// them eventually, but breaking them frees the memory immediately
$parent->child = null;
$child->parent = null;
?>
4. Infinite Recursion
<?php
// ❌ Never ending recursion
function badRecursion($n) {
return badRecursion($n + 1); // ❌ Will exhaust memory
}
// ✅ Proper recursion with termination
function goodRecursion($n, $max = 1000) {
if ($n >= $max) return $n;
return goodRecursion($n + 1, $max);
}
?>
Debugging Steps
Step 1: Check Current Memory Usage
<?php
echo "Current memory usage: " . formatBytes(memory_get_usage(true)) . "\n";
echo "Peak memory usage: " . formatBytes(memory_get_peak_usage(true)) . "\n";
echo "Memory limit: " . ini_get('memory_limit') . "\n";
?>
Step 2: Identify Memory-Intensive Operations
<?php
// ✅ Monitor specific operations
$startMemory = memory_get_usage(true);
$largeOperation = performLargeOperation();
$endMemory = memory_get_usage(true);
$memoryUsed = $endMemory - $startMemory;
echo "Operation used: " . formatBytes($memoryUsed) . "\n";
?>
Step 3: Use Memory Profiling Tools
<?php
// ✅ Use tools like the Xdebug profiler/tracer
// (Xdebug 3 requires xdebug.mode=trace or xdebug.mode=profile in php.ini)
// xdebug_start_trace('/tmp/php_trace.xt');
// Your code here
// xdebug_stop_trace();
?>
Step 4: Implement Memory Monitoring
<?php
// ✅ Monitor memory during long operations
function processWithMonitoring($data, $callback) {
$total = count($data);
$processed = 0;
foreach ($data as $item) {
call_user_func($callback, $item);
$processed++;
if ($processed % 1000 === 0) {
$usage = (memory_get_usage(true) / parseMemoryLimit(ini_get('memory_limit'))) * 100;
echo "Progress: " . round(($processed / $total) * 100, 2) . "%, Memory: " . round($usage, 2) . "%\n";
if ($usage > 80) {
gc_collect_cycles();
}
}
}
}
?>
Performance Considerations
1. Optimal Chunk Sizes
<?php
// ✅ Choose appropriate chunk sizes
// Too small: overhead from processing
// Too large: memory pressure
$optimalChunkSize = 1000; // Adjust based on your data size
?>
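Rather than hard-coding a number, the chunk size can be derived from the memory actually left to the script. The sketch below is illustrative: the `estimateChunkSize` helper, the per-item byte estimate, and the 25% head-room figure are assumptions, not part of any standard API.

```php
<?php
// Hypothetical helper: pick a chunk size from the memory still available.
// Assumes each record costs roughly $bytesPerItem bytes while buffered.
function estimateChunkSize(int $bytesPerItem, float $headroom = 0.25): int {
    $limit = ini_get('memory_limit');
    if ($limit === '-1') {
        return 10000; // arbitrary cap when memory is unlimited
    }
    // Convert shorthand like "128M" into bytes
    $units = ['K' => 1024, 'M' => 1048576, 'G' => 1073741824];
    $suffix = strtoupper(substr($limit, -1));
    $bytes = isset($units[$suffix])
        ? (int)$limit * $units[$suffix]
        : (int)$limit;

    $free = $bytes - memory_get_usage(true);
    // Spend only a fraction of the remaining memory on any one chunk
    $budget = (int)($free * $headroom);
    return max(100, intdiv(max($budget, 0), $bytesPerItem));
}

$chunkSize = estimateChunkSize(2048); // assume ~2KB per record
echo "Using chunk size: $chunkSize\n";
```

Re-estimating between batches lets long-running jobs shrink their chunks as memory pressure rises instead of failing outright.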
2. Memory vs Speed Trade-offs
<?php
// ✅ Balance memory usage with performance
// More memory = faster processing
// Less memory = slower but more sustainable
ini_set('memory_limit', '256M'); // Balance between performance and sustainability
?>
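The trade-off is easy to measure directly. This sketch compares the memory cost of materializing 100,000 items as an array against iterating the same items from a generator; the exact numbers depend on your PHP build, but the array side should dominate:

```php
<?php
// Compare memory cost: materialized array vs. generator, same 100k items.
function asArray(int $n): array {
    $out = [];
    for ($i = 0; $i < $n; $i++) {
        $out[] = "item-$i";
    }
    return $out; // every element alive at once
}

function asGenerator(int $n): \Generator {
    for ($i = 0; $i < $n; $i++) {
        yield "item-$i"; // one element alive at a time
    }
}

$before = memory_get_usage(true);
$all = asArray(100000);
$arrayCost = memory_get_usage(true) - $before;
unset($all);

$before = memory_get_usage(true);
foreach (asGenerator(100000) as $item) {
    // process $item
}
$generatorCost = memory_get_usage(true) - $before;

echo "Array: " . round($arrayCost / 1048576, 1) . " MB, "
   . "Generator: " . round($generatorCost / 1048576, 1) . " MB\n";
```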
3. Caching Strategies
<?php
// ✅ Use disk-based caching for large datasets
// Instead of keeping everything in memory
function cacheToDisk($data, $cacheKey) {
// md5() keeps user-supplied keys from becoming path traversal
file_put_contents('/tmp/cache_' . md5($cacheKey), serialize($data));
}
function getFromDiskCache($cacheKey) {
$file = '/tmp/cache_' . md5($cacheKey);
// Only unserialize data your own application wrote — unserializing untrusted input is unsafe
return file_exists($file) ? unserialize(file_get_contents($file)) : null;
}
?>
Security Considerations
1. Validate Input Size
<?php
// ✅ Validate file sizes before processing
$maxFileSize = 10485760; // 10MB
if ($_FILES['upload']['size'] > $maxFileSize) {
die('File too large');
}
?>
2. Limit Processing Resources
<?php
// ✅ Set reasonable limits on processing
$maxRecords = 100000;
if ($recordCount > $maxRecords) {
die('Too many records to process');
}
?>
3. Sanitize Processed Data
<?php
// ✅ Escape data before output to prevent injection
// NOTE: FILTER_SANITIZE_STRING is deprecated as of PHP 8.1 — escape explicitly instead
$cleanData = htmlspecialchars($input, ENT_QUOTES, 'UTF-8');
?>
Migration Checklist
- Review all file processing code for memory efficiency
- Implement chunked processing for large datasets
- Add memory monitoring to long-running operations
- Optimize database queries for large result sets
- Use generators for memory-intensive iterations
- Implement proper garbage collection
- Set appropriate memory limits for different operations
- Test with realistic data volumes
- Monitor memory usage in production
- Document memory optimization strategies
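Several checklist items — chunked processing, releasing buffers, and garbage collection — can be combined in one reusable helper. This is a minimal sketch; the `processInChunks` name and signature are illustrative, not from an existing library:

```php
<?php
// Illustrative helper combining chunking, buffer release, and GC.
// Accepts any iterable (arrays and generators alike), so it composes
// with the generator patterns shown earlier.
function processInChunks(iterable $items, callable $handler, int $chunkSize = 1000): int {
    $processed = 0;
    $buffer = [];
    foreach ($items as $item) {
        $buffer[] = $item;
        if (count($buffer) >= $chunkSize) {
            foreach ($buffer as $bufferedItem) {
                $handler($bufferedItem);
                $processed++;
            }
            $buffer = [];        // release the chunk
            gc_collect_cycles(); // reclaim any cycles created while processing
        }
    }
    foreach ($buffer as $bufferedItem) { // flush the final partial chunk
        $handler($bufferedItem);
        $processed++;
    }
    return $processed;
}

$total = processInChunks(range(1, 2500), function ($n) { /* process $n */ }, 1000);
echo "Processed $total items\n"; // Processed 2500 items
```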
Conclusion
The ‘Fatal error: Allowed memory size exhausted’ error is a critical PHP issue that occurs when scripts exceed memory limits. By following the solutions provided in this guide—implementing proper memory management, using generators, optimizing database queries, and following best practices—you can effectively prevent and resolve this error in your PHP applications.
The key is to understand PHP’s memory management mechanisms, implement efficient processing patterns, use modern PHP features like generators, and maintain clean, well-optimized code. With proper memory management, your PHP applications will handle large datasets efficiently and avoid critical memory exhaustion errors.
Remember to test your changes thoroughly, follow PHP memory-management best practices, implement proper monitoring, and review your memory usage patterns regularly so your applications maintain optimal performance and avoid memory-related fatal errors.