MemoryError in Python: Causes and Fixes
MemoryError means Python tried to create or grow an object, but there was not enough memory available.
This usually happens when a program creates very large lists, strings, dictionaries, or reads too much data into memory at once. The fix is often not just “use more RAM.” In many cases, the better solution is to change how the program handles data.
Quick fix
A common fix is to avoid creating very large lists when you only need to loop through values.
numbers = range(10_000_000)
# Better than: numbers = list(range(10_000_000))
for n in numbers:
    if n > 5:
        break
range(10_000_000) is memory-efficient because it does not store all ten million numbers in a list.
If you want to understand this idea better, see generators in Python explained.
What MemoryError means
MemoryError happens when Python cannot get enough memory for an object.
In simple terms:
- Your code asked Python to store too much data
- Python could not fit that data into available memory
- The program stopped with a MemoryError
This often appears when:
- Creating a huge list
- Building a very large string
- Loading a large file all at once
- Growing a data structure in a loop for too long
Common situations that cause it
Some common causes are:
- Building a huge list with list(range(...))
- Reading a very large file all at once
- Creating many copies of large objects
- Using nested lists or dictionaries that grow too much
- Infinite loops that keep appending data
Here are some typical examples:
# Huge list
numbers = list(range(10**8))
# Reading the whole file into memory
with open("big_file.txt", "r") as file:
    data = file.read()
# Infinite growth
items = []
while True:
    items.append("hello")
In each case, memory use keeps growing until Python can no longer continue.
Example that causes MemoryError
A classic example is trying to create a very large list:
numbers = list(range(10**10))
This is a problem because Python tries to store every value in memory at the same time.
For a very large number like 10**10, that can require far more RAM than most computers have.
By comparison, this is much lighter:
numbers = range(10**10)
print(numbers[0])
print(numbers[1])
print(numbers[2])
Expected output:
0
1
2
range() does not build the full list in memory. It produces values as needed.
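The difference is easy to measure with sys.getsizeof, which reports the size of the container object itself (note the list is kept to 100,000 elements here only so the comparison runs quickly):

```python
import sys

# A range object stores only start, stop, and step,
# so its size stays tiny no matter how large the range is.
lazy = range(10_000_000)
print(sys.getsizeof(lazy))

# A list of numbers stores a reference to every element,
# so its size grows with the number of items.
eager = list(range(100_000))
print(sys.getsizeof(eager))
```

On a typical CPython build, the range object is a few dozen bytes while the much shorter list is already hundreds of kilobytes.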
How to fix it
Use iterators or generators instead of large lists
If you only need to loop through values, avoid creating a full list.
for number in range(10_000_000):
    if number > 5:
        break
You can also use a generator expression:
squares = (n * n for n in range(1_000_000))
for value in squares:
    print(value)
    if value > 100:
        break
This creates values one at a time instead of storing them all. For more on this idea, see generators in Python explained.
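The same idea works with a generator function, which uses yield to hand back one value at a time. This is a minimal sketch; the name first_squares is made up for the example:

```python
def first_squares(limit):
    """Yield square numbers one at a time instead of building a full list."""
    for n in range(limit):
        yield n * n

# Only the values actually requested are ever computed,
# so the large limit costs almost no memory.
for value in first_squares(1_000_000):
    print(value)
    if value > 100:
        break
```

A generator function is often clearer than a generator expression once the logic needs more than one line.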
Process files line by line instead of reading the whole file
This is a very common fix.
Instead of this:
with open("big_file.txt", "r") as file:
    data = file.read()
Use this:
with open("big_file.txt", "r") as file:
    for line in file:
        print(line.strip())
This reads one line at a time, which uses much less memory. See how to read a file line by line in Python and Python file handling basics: read and write.
Break work into smaller chunks
If your input is large, process part of it, then continue.
numbers = range(1_000_000)
chunk_size = 100_000
for start in range(0, 1_000_000, chunk_size):
    end = start + chunk_size
    chunk = range(start, end)
    print(f"Processing {start} to {end - 1}")
This approach is useful when working with large datasets.
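The same chunking idea applies to files: file.read(size) returns at most size characters, so you can process a large file in fixed-size pieces. This sketch writes its own small sample file (the name sample.txt is made up for the demo) so it can run anywhere:

```python
# Create a small sample file so the example is self-contained.
with open("sample.txt", "w") as f:
    f.write("x" * 1000)

chunk_size = 256
total = 0
with open("sample.txt") as f:
    while True:
        chunk = f.read(chunk_size)  # at most chunk_size characters
        if not chunk:               # empty string means end of file
            break
        total += len(chunk)         # process the chunk here

print(total)
```

Only one chunk is in memory at a time, no matter how large the file is.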
Remove data you no longer need
If a large object is finished, do not keep it around.
data = list(range(1_000_000))
print(len(data))
# Finished using data
del data
Deleting variables can help in some cases, especially when large objects are no longer needed.
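One common pattern is to reduce a large object to the small result you actually need, then delete the original so the interpreter can reclaim the space:

```python
data = list(range(1_000_000))
total = sum(data)  # keep only the summary value, not the whole list
del data           # the million-element list can now be garbage-collected
print(total)
```

After del, nothing references the list, so its memory becomes available again.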
Use more memory-efficient data structures when possible
Sometimes the problem is not just the amount of data, but how you store it.
For example:
- Use range() instead of list(range(...)) when possible
- Avoid unnecessary copies
- Avoid building large nested structures unless you really need them
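As one illustration, the standard-library array module stores numbers as compact C values instead of full Python objects:

```python
import array
import sys

numbers = list(range(100_000))
packed = array.array("i", numbers)  # 4-byte C ints instead of Python objects

# The list's reported size counts only the references it holds,
# not the int objects themselves, yet it is still larger than the array.
print(sys.getsizeof(numbers))
print(sys.getsizeof(packed))
```

For numeric data, array (or a library like NumPy) can cut memory use substantially compared with a plain list.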
Debugging steps
When you get a MemoryError, try these steps.
1. Find the line where large objects are created
Look for lines that:
- Build large lists
- Read full files
- Copy data
- Append in loops
2. Print sizes or lengths as data grows
These simple checks can help:
print(len(my_list))
print(type(data))
print(my_list[:5])
import sys
print(sys.getsizeof(my_list))
What these do:
- len(my_list) shows how many items are in the list
- type(data) confirms what kind of object you are using
- my_list[:5] shows the first few items without printing everything
- sys.getsizeof(my_list) shows the size of the container object in bytes
3. Check for loops that append forever
A loop may be adding data without stopping.
items = []
count = 0
while count < 5:
    items.append(count)
    count += 1
print(items)
Expected output:
[0, 1, 2, 3, 4]
If you forget to update the loop condition, the list may keep growing.
4. Test with a much smaller input first
If this fails:
numbers = list(range(10**9))
Try a much smaller version:
numbers = list(range(1000))
print(len(numbers))
If the small version works, the problem is likely memory usage, not syntax.
5. Replace eager loading with streaming or chunking
“Eager loading” means loading everything at once. A better approach is often to process data as it comes in.
This is especially useful when working with files, CSV data, or API results.
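With CSV data, the standard-library csv module already streams: csv.reader yields one row per loop iteration rather than loading the whole file. A minimal sketch (it writes its own small file, named sample.csv here only for the demo, so it can run anywhere):

```python
import csv

# Write a small sample CSV so the example is self-contained.
with open("sample.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["name", "score"])
    writer.writerows([["a", "1"], ["b", "2"], ["c", "3"]])

# csv.reader yields one row at a time, so memory use stays flat
# even for files with millions of rows.
rows_seen = 0
with open("sample.csv", newline="") as f:
    reader = csv.reader(f)
    header = next(reader)  # skip the header row
    for row in reader:
        rows_seen += 1     # process each row here

print(rows_seen)
```

The same streaming mindset applies to paginated API results: fetch and process one page at a time instead of collecting everything first.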
When the problem is system memory
Sometimes the program is simply asking for more memory than the machine can provide.
A few things to know:
- Closing other programs may help
- Very large datasets may require a different algorithm, not just more RAM
- A 32-bit Python installation can hit memory limits sooner than 64-bit Python
So even if your code is correct, it may still fail if the approach needs too much memory.
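You can check whether you are running 64-bit Python with sys.maxsize:

```python
import sys

# sys.maxsize is 2**63 - 1 on 64-bit Python and 2**31 - 1 on 32-bit builds.
is_64bit = sys.maxsize > 2**32
print("64-bit Python:", is_64bit)
```

A 32-bit process is typically capped at a few gigabytes of address space regardless of how much RAM the machine has.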
Common causes
Here are the most common reasons beginners see MemoryError:
- Creating a huge list instead of using range() or a generator
- Using file.read() on a very large file
- Keeping unnecessary copies of data
- Appending inside an infinite or very long loop
- Loading a large JSON or CSV file fully into memory
A good habit is to ask:
- Do I need all this data at once?
- Can I process one item at a time?
- Can I stop storing old data?
FAQ
Is MemoryError the same as my computer running out of memory?
Usually yes. It can also happen when Python hits the memory limit available to the process, even if the machine itself still has free RAM.
Can I fix MemoryError by using range() instead of list(range())?
Often yes, because range() does not store all numbers in memory at once.
Why does reading a file cause MemoryError?
If you read the whole file at once, Python may try to store too much data in RAM.
Will deleting variables help?
Sometimes. It can help if large objects are no longer needed, but better program design is usually the real fix.
See also
- Generators in Python explained
- How to read a file line by line in Python
- Python file handling basics: read and write
- OverflowError: numerical result out of range
- RuntimeError in Python: causes and fixes
Learning to fix MemoryError is a good step toward writing more memory-efficient Python code.