Blender out of memory error

Only 25 million faces and lots of particles in 6k resolution… why would you be running out of memory?

As pointed out in the comments, you have no RAM left in your system.

When working in 3D it is very easy to create situations where your scene demands far more than the resources your computer has.

You really need to reassess where to optimize. The trick is to decide where you need great detail and where you can get away with less. The art of 3D is part precision, but mostly optimization.

For example, textures that will occupy just a few pixels on the screen can be rougher than those that will be featured in close-up.
Do you need that many particles, or would a surface with a bump map give you the same result?

Not every strand of grass in the distance has to be a particle. The ones you don’t see don’t have to exist, the statues don’t have to be as detailed if they are going to appear so small, and so on.

When rendering on the GPU you have even fewer resources than when you render on the CPU, so it is unlikely that you will be able to use it to render such an ambitious project.

You need to slice the project into different layers that your computer can handle and composite them later.

Here’s a link that details how to do this: https://www.blenderguru.com/articles/how-to-render-a-complex-scene-without-crashing
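To make the layered approach concrete, here is a minimal Python (bpy) sketch for Blender 2.8+ view layers; the collection name "Foreground" is a placeholder for whatever split makes sense in your scene:

    import bpy

    scene = bpy.context.scene

    # Create a second view layer for the heavy background geometry.
    bg_layer = scene.view_layers.new("Background")

    # Exclude the foreground collection from this layer, so each layer
    # only pays memory for the objects it actually renders.
    bg_layer.layer_collection.children["Foreground"].exclude = True

Each view layer can then be rendered on its own and the results recombined in the compositor.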



I’ve been struggling a little with Cycles and my GeForce GTX 750 graphics card. Most of the time it works well, but when I try to work with scenes that have custom shaders (like a skin shader with multiple textures, including one for displacement), I get an error, both in the viewport and at render time:

Cancel | CUDA error: Out of memory in cuLaunchKernel(cuPathTrace, xblocks, yblocks, 1, xthreads, ythreads, 1, 0, 0, args, 0)

Or something like that. Here’s a screenshot so you can check it out:

[screenshot of the viewport showing the CUDA error]

I’m using a PC running Windows 7, with 8 GB of RAM. I can’t render this scene on the GPU, but on the CPU it renders fine.

My question is: what is causing this issue? I have the latest drivers installed for my graphics card, so I have no idea what this is.

asked Oct 31, 2014 at 18:51 by Diego de Oliveira


The short answer is that SSS on the GPU eats up a lot of memory, so much so that it is recommended to have more than 1 GB of memory on your GPU. This was mentioned in one of the videos from the Blender Conference (unfortunately I can’t remember which one). Updating your drivers won’t really help, as that can’t add more memory, so for now you are stuck with rendering on the CPU. If you have multiple objects with SSS shaders on them, you could try rendering them one at a time and then compositing them back together.

answered Oct 31, 2014 at 20:01 by BlendingJake


I assume you are using the same graphics card for both display and compute/rendering. If so, Windows and other applications may already be taking hundreds of megabytes of it for display purposes.

If you then attempt to render with Cycles, it needs to allocate another chunk of memory to run a program (Cycles) on your GPU, and on top of that all the scene data and textures.

This first bit of memory we currently cannot measure accurately. But running an experimental kernel does use significantly more memory than the normal one (again, this can be several hundred megabytes).

So if you are unlucky and have a 2 GB graphics card, you might only be able to use 700 MB of RAM for textures plus the scene data (this is what Blender measures and reports). If you go over this you might put the card into a mode where it can no longer allocate enough memory for display, which can result in artefacts like the ones in your screenshot.

answered Nov 2, 2014 at 21:05 by Martijn Berger


I found that reducing the tile size (under the Performance panel in the Render tab) will reduce memory usage. Rendering will be slightly slower (https://youtu.be/8gSyEpt4-60?t=671) but will consume less RAM and allow the render to complete.
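For reference, a minimal sketch of setting the tile size from Python; the property names below apply to Cycles in Blender up to 2.93, with the 3.0+ equivalent noted in a comment:

    import bpy

    scene = bpy.context.scene

    # Smaller tiles lower the peak memory needed per tile,
    # at some cost in render speed (Blender <= 2.93).
    scene.render.tile_x = 128
    scene.render.tile_y = 128

    # In Blender 3.0+ the tiling system changed; the rough
    # equivalent knob is: scene.cycles.tile_size = 1024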

answered Nov 18, 2018 at 2:58 by Neil Macneale


Sounds like the same issue as I have with cuCtxCreate. I found out that it has something to do with the memory transfer rate of the graphics card; the workaround I used was to lower the memory transfer rate offset. I am using Linux (Fedora 25) and had to tweak some files in order to lower it in the NVIDIA X Server settings.

  • my current transfer rate is 810 MHz, both min and max

My proof is that I can now run GPU rendering without the error, though what I’m rendering right now might not even come close to what you’re rendering, graphics-card-wise.

My temperature reached up to 61 °C in this render, with an electric fan cooling down my laptop.

I have a dedicated NVIDIA GT 640M with 2 GB.
By the way, have you tried checking Blender’s console? You might find something useful there that can help you.

The idea wasn’t mine; I’m only sharing what I experienced in the hope that it helps someone. Lowering your GPU’s memory transfer rate offset is up to you. To me it felt like playing two AAA games on a frying pan disguised as a laptop: my GPU temperature reached 81 °C on the first try, with no electric fan to cool the laptop.

  • GPU Rendering issue “Cuda error at cuCtxCreate: Illegal Address”
    this is the link to my previous issue, which got answered.

answered Jan 30, 2017 at 4:13 by Donato Esparas Queyquep


I solved the problem by rendering with CPU+GPU in Blender 2.8 (Linux, Ubuntu). You have to install the NVIDIA driver from NVIDIA via the update menu.
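For what it’s worth, a minimal sketch of enabling hybrid CPU+GPU rendering from Python in Blender 2.8+; it assumes a CUDA-capable card and simply enables every device Cycles reports:

    import bpy

    prefs = bpy.context.preferences.addons['cycles'].preferences
    prefs.compute_device_type = 'CUDA'
    prefs.get_devices()  # refresh the device list

    # Tick both the GPU and the CPU entries so Cycles uses them together.
    for dev in prefs.devices:
        dev.use = True

    bpy.context.scene.cycles.device = 'GPU'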

answered Nov 25, 2018 at 9:39 by koen


My previous message was only a temporary solution; as my file got bigger I had to use lots of tricks to avoid this CUDA error and stop the program from crashing during rendering. I only have problems when rendering in render image mode (F12); rendering in the viewport never causes any problems. Unfortunately, for animation and fewer fireflies I need the render mode. My file is 1 GB now, and an 11 GB video card plus 16 GB of system memory is not enough to render without using the hard disk as memory.

Steps to improve the situation:
— Use Alt+D instead of Shift+D to make copies of objects and keep the file smaller; Alt+D creates a linked duplicate, which doesn’t increase file size.
— Don’t use an HDR; the best option is a plain RGB input as the background.
— Use the Decimate modifier for objects with lots of vertices (a sketch follows this list). I could reduce the vertex count of some objects by 70% with hardly any visible loss of quality. You can also bake normals for a bigger decrease in object size with less loss in quality.
— Even after doing all this I ran into problems again, and solved them by increasing the swap file size to 32 GB on Ubuntu.
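A minimal sketch of the Decimate step above, applied to the active object; the 0.3 ratio is just an example value:

    import bpy

    # Add a Decimate modifier that keeps roughly 30% of the faces.
    obj = bpy.context.active_object
    mod = obj.modifiers.new(name="Decimate", type='DECIMATE')
    mod.ratio = 0.3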

If I run into problems again, I guess I will try a rendering service, where you send in what you want to render and download the result when it’s ready.

answered Feb 17, 2019 at 11:01 by koen


I am running three graphics cards in my system: one 2 GB and two 4 GB. I kept getting the CUDA out-of-memory error. After reading through these postings I realized that disabling my 2 GB card (the primary display) in the CUDA stack fixes it, and it works fine now. My primary display card was running out of memory and Blender couldn’t utilize the other two.

So, try disabling your primary display card in the CUDA stack and see if that helps.
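A minimal sketch of doing this from Python; “GTX 750” below is a stand-in for whatever your display card is called:

    import bpy

    prefs = bpy.context.preferences.addons['cycles'].preferences
    prefs.compute_device_type = 'CUDA'
    prefs.get_devices()  # refresh the device list

    for dev in prefs.devices:
        # Untick the display card so only the dedicated cards render.
        if dev.type == 'CUDA' and 'GTX 750' in dev.name:
            dev.use = False
        print(dev.name, dev.type, dev.use)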

answered Mar 3, 2019 at 16:25 by kaber13


r/blender: System is out of GPU memory? How can this be fixed?

level 1

Your scene is too big: too many faces, or textures larger than your card can handle. Make sure you are using wireframe or solid shaded view, and don’t use Cycles in the active viewport, because that can eat a good chunk of memory. Good luck 👍

level 2

Thank you for this, it actually made a huge difference; I can load up to 6 GB of VRAM just by switching to wireframe.

level 2

5 months later and still helpful. Thanks!

level 2

It worked for me too. Hope you get everything you want in life, sir!

level 2

Thank you very much! It worked!

level 1

I don’t really know, but I guess the solution is getting more GPU memory, which in almost all cases means a more expensive graphics card.

level 2

Not necessarily. A 3080 costs more than a 3060 but has 2 GB less VRAM. OP will need to optimize the scene so it uses less VRAM. High-res textures are often to blame here.


level 1

Edit > Preferences > System > change OptiX to CUDA.

level 1

If you have lots of copies of the same thing, use instances rather than duplicates. Also, don’t use Subdivision Surface modifiers at high levels. If you have a high-poly sculpt, you’re going to need to learn to bake the details into normal, bump and/or displacement maps, retopologize the sculpt to a much lower resolution, then apply the baked textures to it.
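A minimal sketch of instancing from Python: Object.copy() duplicates the object but leaves the mesh datablock shared, which is what Alt+D does in the UI:

    import bpy

    src = bpy.context.active_object

    # A linked duplicate: the new object shares src's mesh datablock,
    # so the geometry itself is stored only once.
    inst = src.copy()  # note: we deliberately do NOT copy src.data
    bpy.context.collection.objects.link(inst)
    inst.location.x += 2.0  # offset the instance so it is visible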

level 1

Alternatives to buying a new GPU with more VRAM:

  • Optimize the scene by reducing the resolution of textures, HDRIs, etc., reduce render settings such as Light Bounces, and experiment with tile size.

  • Use a render farm. They can be very cost-effective if you’re not in a hurry, and way cheaper than a new GPU if it’s just a one-off project that has this issue.

level 2

You can also render in layers and recombine them in compositing.

level 1

You can render in layers, set texture limits in the settings, etc.

Also, don’t leave your viewport set to rendered mode; that kills your VRAM.
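A minimal sketch of those texture limits from Python; in Cycles they live under the Simplify settings, and the enum values are strings:

    import bpy

    scene = bpy.context.scene
    scene.render.use_simplify = True

    # Clamp textures to 1024 px in the viewport and 2048 px at render
    # time; Blender downscales anything larger on the fly.
    scene.cycles.texture_limit = '1024'
    scene.cycles.texture_limit_render = '2048'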

level 1


Along with the other suggestions…

Try command-line rendering. When you use Blender’s render window you’re using VRAM to display the image; command-line rendering doesn’t, but you don’t get to see the render as it happens.
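For illustration, launching a headless render from Python; the .blend path and output pattern are placeholders, while -b, -o and -f are standard Blender command-line flags:

    import subprocess

    # Render frame 1 with no UI (-b = background mode), so no VRAM
    # is spent on displaying the image while it renders.
    subprocess.run([
        "blender", "-b", "/path/to/scene.blend",
        "-o", "/tmp/render_####",  # output path pattern
        "-f", "1",                 # frame number to render
    ])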

level 2

Ohhhh thank you will try this out

level 1

Restarting your PC can sometimes help, because memory can be held by other things that just haven’t been cleared.

level 2

Yes, restarting also helps, but in this case it seems like the scene is taking more memory.

level 1

Two key add-ons for this are Polycount, which lets you see which objects are misbehaving, and Material Utilities, which lets you deduplicate materials. If you have materials with .001, .002, etc. suffixes, they are likely duplicates. Even if they’re identical in every way, they still need their own chunk of VRAM. With high-res textures this can rapidly cripple your scene.
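If you’d rather not install an add-on, here is a rough sketch of the same deduplication from Python; it assumes the .001-suffixed materials really are identical to their base material, so check before running it:

    import bpy
    import re

    # Remap users of "Name.001"-style materials onto the base "Name"
    # material; the duplicates are then left orphaned.
    for mat in list(bpy.data.materials):
        base_name = re.sub(r"\.\d{3}$", "", mat.name)
        if base_name != mat.name and base_name in bpy.data.materials:
            mat.user_remap(bpy.data.materials[base_name])

    # Orphans can then be removed via File > Clean Up > Purge,
    # or bpy.ops.outliner.orphans_purge() from a suitable context.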




Error: System is out of GPU and shared host memory

As the error message implies, you are out of memory. When trying to render this scene on my computer, also with 64 GB of RAM, it filled all of that plus an extra 30 GB of swap (pagefile) before I decided to close Blender. That means your scene takes up 90 GB or more when rendering, which easily exceeds the 8 GB of VRAM + 64 GB of RAM and probably the pagefile you have, and that’s why you get that error.

As for what’s causing it, I did some testing and this is what I observed.

The cause of this issue is the materials used by objects, specifically the StarlightAtmosphere node group from the Physical Starlight and Atmosphere add-on. The node group is rather complex, and as such it takes up a bit of RAM. However, this isn’t the issue; the amount of RAM involved is probably measured in tens of megabytes at most.

The issue is that you’ve re-used the node group a lot. You’ve given it to every material in the scene, meaning it now has to be stored for every material, which means more RAM is taken up. But once again, this probably isn’t the issue; it’s probably only going to take up hundreds of megabytes at most.

The real issue is that you gave this node group to the “Grasswald” objects, the ones scattered onto your landscape with a particle system. The node group takes information about object position, distance from the camera and the like, which needs to be re-computed every time the Grasswald objects are scattered. That is 472,024 times, and these calculations have to be stored in RAM. So although the node group itself only takes up a couple of megabytes at most, it is effectively duplicated 472,024 times, which takes up lots of RAM and results in the issue you’re experiencing.

Removing the StarlightAtmosphere node group from the Grasswald objects resolves the out-of-memory issue. As such, I do not believe this is a bug, and this report can probably be closed; I will wait for a developer to decide.
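A rough sketch of that fix from Python, stripping the group from every node-based material; the group name comes from the report above, and in practice you would limit the loop to the Grasswald materials:

    import bpy

    GROUP_NAME = "StarlightAtmosphere"

    # Remove the heavy node group from every node-based material.
    for mat in bpy.data.materials:
        if not mat.use_nodes:
            continue
        for node in list(mat.node_tree.nodes):
            if (node.type == 'GROUP' and node.node_tree
                    and node.node_tree.name.startswith(GROUP_NAME)):
                mat.node_tree.nodes.remove(node)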

In theory, Cycles or parts of Blender could be optimized to reduce the RAM usage, but I believe that falls under the category of “accidental feature request”, which we usually respond to with this message:

Thanks for the report, but the issue reported here is a request for modified/improved behavior and not a bug in current behavior. Closing as this bug tracker is only for bugs and errors.

For user requests and feedback, please use other channels: https://wiki.blender.org/wiki/Communication/Contact#User_Feedback_and_Requests

For more information on why this isn’t considered a bug, visit: https://wiki.blender.org/wiki/Reference/Not_a_bug




When rendering, it keeps saying “Out of memory”. Help!

Hello! I’m a beginner with Blender. I was following some instructions and made this.

(This is in the “Rendered” view. Takes around 106 MB.)

When I tried to render it, it kept saying “Out of memory” for some reason. After seeing some posts about this, I’m considering upgrading my PC. Below is some information about my PC:

Any recommendations on what I should upgrade, and how? I’m planning to use it to model some interior projects once my skills are good enough. My family’s budget is neither too tight nor too large.
