Why can a 352GB NumPy ndarray be used on an 8GB memory macOS computer?
import numpy as np
array = np.zeros((210000, 210000))  # dtype defaults to numpy.float64
array.nbytes
When I run the above code on my 8GB MacBook running macOS, no error occurs. But running the same code on a 16GB Windows 10 PC, a 12GB Ubuntu laptop, or even a 128GB-memory Linux supercomputer, the Python interpreter raises a MemoryError. All the test environments have 64-bit Python 3.6 or 3.7 installed.
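As a sanity check on the size in the title, the array's footprint can be computed by hand (plain Python, no NumPy required; the 8-byte item size is that of numpy.float64):

```python
# Back-of-the-envelope size of the requested array: 210000 x 210000 float64s.
rows = cols = 210_000
itemsize = 8  # bytes per numpy.float64

nbytes = rows * cols * itemsize
print(nbytes)            # 352800000000 bytes, i.e. ~352.8 GB (decimal)
print(nbytes / 1024**3)  # ~328.6 GiB (binary)
```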
python macos numpy memory
MacOS extends memory with virtual memory on your disk. Check your process details with Activity Monitor and you'll find a Virtual Memory: 332.71 GB entry. But it's all zeros, so it compresses really, really well.
– Martijn Pieters♦
5 hours ago
@MartijnPieters but Windows 10 and Linux also have similar mechanisms. Windows 10 has virtual memory and Linux has swap. Activity Monitor doesn't show a VM entry of 332.71 GB. I use sysctl vm.swapusage to see the real VM usage and got 1200 M
– Blaise Wang
5 hours ago
But they don't compress.
– Martijn Pieters♦
5 hours ago
@MartijnPieters The problem is that Windows 10 added support for RAM compression since build 10525, but it still cannot run the above code.
– Blaise Wang
2 hours ago
edited 17 mins ago by Boann
asked 5 hours ago by Blaise Wang
1 Answer
You are most likely using Mac OS X Mavericks or newer, so 10.9 or up. From that version onwards, MacOS uses virtual memory compression, where memory requirements that exceed your physical memory are not only redirected to memory pages on disk, but those pages are compressed to save space.
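The lazy, zero-filled nature of such a request can be illustrated with the standard library's mmap module (a sketch of the general OS behaviour, not of NumPy's internals, though a large np.zeros() allocation is ultimately backed by the same kind of anonymous zero-page mapping):

```python
import mmap

# An anonymous 1 GiB mapping: the OS hands back zero-filled virtual address
# space immediately; physical pages are only faulted in when touched.
m = mmap.mmap(-1, 1024**3)

# Reads see zeros without committing (much) real memory...
print(m[123_456_789])  # 0

# ...whereas writes dirty pages and start consuming real (or, on macOS,
# compressed) memory, one page at a time.
m[0] = 1
m.close()
```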
For your ndarray, you may have requested ~332GB of memory, but it's all a contiguous sequence of NUL bytes at the moment, and that compresses really, really well:
That's a screenshot from the Activity Monitor tool, with the process details of my Python process where I replicated your test (use the (I) icon on the toolbar to open it); this is from the Memory tab, where you can see that the Real Memory Size column is only 9.3 MB used, against a Virtual Memory Size of 332.71GB.
Once you start setting other values for those indices, you'll quickly see the memory stats increase to gigabytes instead of megabytes:
while True:
index = tuple(np.random.randint(array.shape[0], size=2))
array[index] = np.random.uniform(-10 ** -307, 10 ** 307)
or you can push the limit further by assigning to every index (in batches, so you can watch the memory grow):
array = array.reshape((-1,))
for i in range(0, array.shape[0], 10**5):
array[i:i + 10**5] = np.random.uniform(-10 ** -307, 10 ** 307, 10**5)
The process is eventually terminated; my MacBook Pro doesn't have enough swap space to store hard-to-compress gigabytes of random data:
>>> array = array.reshape((-1,))
>>> for i in range(0, array.shape[0], 10**5):
... array[i:i + 10**5] = np.random.uniform(-10 ** -307, 10 ** 307, 10**5)
...
Killed: 9
You could argue that MacOS is being too trusting, letting programs request that much memory without bounds, but with memory compression, memory limits are much more fluid. Your np.zeros() array does fit your system, after all. Even though you probably don't actually have the swap space to store the uncompressed data, compressed it all fits fine, so MacOS allows it and terminates processes that then take advantage of the generosity.
If you don't want this to happen, use resource.setrlimit() to set limits on RLIMIT_STACK to, say, 2 ** 14, at which point the OS will segfault Python when it exceeds the limits.
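A concrete way to get fail-fast behaviour is a resource-limit sketch like the following (hedged: on Linux the limit that actually bounds heap and mmap allocations is RLIMIT_AS, the total address space, rather than the stack limit; macOS largely ignores RLIMIT_AS, which is part of why it is so permissive here):

```python
import resource

# Inspect the current address-space limit; RLIM_INFINITY means "unbounded".
soft, hard = resource.getrlimit(resource.RLIMIT_AS)
print(soft == resource.RLIM_INFINITY)

# Capping the address space makes oversized allocations fail immediately
# with a MemoryError instead of being granted lazily (enforced on Linux;
# macOS mostly ignores RLIMIT_AS).  Uncomment to try:
# resource.setrlimit(resource.RLIMIT_AS, (4 * 1024**3, hard))
# import numpy as np
# np.zeros((210000, 210000))  # would now raise MemoryError up front
```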
Memory compression should only matter after allocation has already succeeded. The problem here is probably either memory limits (ulimits on Linux, for example) or, more likely, that the allocator doesn't find a 300GB-sized chunk. If you split it up into 100 3GB pieces it would probably work on Windows or Linux (with a big enough swap) as well.
– inf
5 hours ago
@inf: I don't have 300GB free on my SSD. I do run out of memory when I start filling the array, randomly.
– Martijn Pieters♦
4 hours ago
Define "run out of memory", do you get aMemoryError
or just start filling RAM, swapping and get OOMed?
– inf
4 hours ago
@inf: I'm a little reluctant to actually let it run... As the memory has been allocated by the OS (tracemalloc confirms Python has been given the memory allocation), there won't be a MemoryError, so it'll start swapping and eventually get OOMed. But before that point this laptop will be hard to use for a while as everything else is swapped out first.
– Martijn Pieters♦
4 hours ago
I understand :) But that's what I mean. The allocation doesn't even succeed on Ubuntu and Linux, and hence the MemoryError.
– inf
3 hours ago
edited 1 hour ago
answered 5 hours ago by Martijn Pieters♦