Thread: Page file usage

  1. #11
    Join Date
    Dec 2017
    Posts
    117

    Default

    So, still dealing with this. I upped my page file settings to see if that was simply the problem, and it sort of helped? I did still get one weird out-of-memory crash when opening the NamedView panel without iRay even on, like it's trying to spawn multiple instances of the whole render model or something.

    With 32GB of RAM installed I've upped my maximum page file to 64GB, and at present on this file, while rendering and with Chrome open, my total RAM commit is at 51.1/60 GB. My physical RAM usage is only at 50%, so my question is: why is it not using more of it? What are those tens of GB of page file actually representing? Looking at disk activity, there seems to be virtually none while rendering, so what is it for? If I upgrade to 64GB, is it actually going to make a difference in speed, and is that even going to be enough to stop it from shoving that data to the page file?

  2. #12
    Join Date
    Dec 2017
    Location
    Melbourne, Australia
    Posts
    310

    Default

    It's a little tricky for us to know what Rhino is doing internally with its memory allocation. However, when it runs out of memory, it is because it has asked for an amount that it is unable to get from the operating system. Usually memory is allocated in relatively small chunks; when we see something trying to allocate a very large amount of memory, we tend to suspect a specific cause, hence my earlier suggestion to progressively remove elements from the scene to see whether a particular element is causing the issue. This would likely be the first test we would perform ourselves.

    The reason I would strongly suggest this is that if, for example, there was an attempt to allocate a huge amount of memory for an invalid reason (i.e., due to some problem with a specific object), then no amount of extra memory may help, so before investing in additional RAM you'd want to eliminate that possibility. If that isn't the issue, it may well be that more memory would help.
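    If it would help, here's a very rough sketch (my own, not an official tool) of how you could bisect the scene by layer from Rhino's built-in Python editor, where rhinoscriptsyntax is available: hide half the layers, re-run the render, and repeat on whichever half still reproduces the problem.

    Code:
    # Rough sketch only: hide the second half of the layer list so the scene can
    # be bisected. Re-run the render after each pass and narrow down from there.
    import rhinoscriptsyntax as rs

    layers = sorted(rs.LayerNames() or [])
    half = layers[len(layers) // 2:]           # second half of the layer list

    for name in half:
        if name != rs.CurrentLayer():          # Rhino won't hide the current layer
            rs.LayerVisible(name, False)

    print("Hidden %d of %d layers" % (len(half), len(layers)))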

  3. #13
    Join Date
    Dec 2017
    Posts
    117

    Default

    Okay, I tracked down the culprit: it's the environment texture. A 16384x8192 HDR translates into close to 20GB of space in memory all by itself (not counting the 1.5GB Rhino uses before I turn iRay on). I guess the question is, does that make sense? The smallest HDR I have, 1000 pixels square, with a single surface in the file, still takes up 10GB.

  4. #14
    Join Date
    Dec 2017
    Location
    Melbourne, Australia
    Posts
    310

    Default

    No, that seems excessive. The full Iray logs might be helpful here as they report various statistics. A 1000px square environment should definitely not take up 10GB of memory.
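    For a rough sense of scale (this is a back-of-the-envelope estimate only, assuming uncompressed 32-bit float RGBA with a full mip chain, not necessarily how Iray stores the texture internally), a 16384x8192 HDR would be expected to sit somewhere in the 2-3GB range:

    Code:
    # Back-of-the-envelope estimate only; 32-bit float RGBA and a full mip chain
    # are assumptions here, not necessarily how Iray stores the texture.
    width, height = 16384, 8192
    channels, bytes_per_channel = 4, 4          # RGBA, 32-bit float

    base = width * height * channels * bytes_per_channel
    with_mips = base * 4 / 3.0                  # a full mip chain adds roughly 1/3

    print('Base level     : %.2f GB' % (base / 2**30))       # ~2.0 GB
    print('With mip chain : %.2f GB' % (with_mips / 2**30))  # ~2.7 GB

    By the same estimate a 1000x1000 HDR would be around 16MB, so 10GB for that file points at something other than the raw texture data.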

  5. #15
    Join Date
    Dec 2017
    Posts
    117

    Default

    Okay, will send the log file.

  6. #16
    Join Date
    Dec 2017
    Location
    Melbourne, Australia
    Posts
    310

    Default

    There is no obvious reason visible from the logs. Can you test with OptiX Prime disabled (very bottom of the settings page)?

    I also notice from the logs that you have some emitting geometry (something with a self-illuminating material assigned). It's not much, less than 2k triangles, but if disabling OptiX Prime doesn't help, I'd be interested in whether it still happens without that material assigned.

  7. #17
    Join Date
    Dec 2017
    Posts
    117

    Default

    No difference with OptiX Prime disabled, and the huge commit size is still seen in an empty file.

    On the large file I'm working on, the actual Working Set of Rhino + iRay is right now about 8GB, but the Commit is 26.3GB. I understand a bit better today what that means, and... yeah, that still seems a bit excessive?

  8. #18
    Join Date
    Dec 2017
    Location
    Melbourne, Australia
    Posts
    310

    Default

    Well, the committed memory is for the entire system, not the individual process. If you temporarily unload the Iray plugin, what kind of usage do you see?
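    If you want to see the per-process figures outside of Task Manager, here's a minimal sketch using the third-party psutil package (not something shipped with Rhino, and I'm assuming the process is named Rhino.exe). On Windows, memory_info().rss corresponds to the working set and .vms to the process's commit charge, while virtual_memory() gives the system-wide picture.

    Code:
    # Minimal sketch: compare one process's working set and commit charge against
    # system-wide RAM. Assumes the third-party psutil package and that the Rhino
    # process is named "Rhino.exe" (an assumption on my part).
    import psutil

    proc = next(p for p in psutil.process_iter(['name'])
                if p.info['name'] == 'Rhino.exe')
    mem = proc.memory_info()
    print('Working set : %.1f GB' % (mem.rss / 2**30))   # physical RAM in use
    print('Commit size : %.1f GB' % (mem.vms / 2**30))   # pagefile-backed commit

    vm = psutil.virtual_memory()
    print('System RAM  : %.1f / %.1f GB in use'
          % ((vm.total - vm.available) / 2**30, vm.total / 2**30))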

  9. #19
    Join Date
    Dec 2017
    Posts
    117

    Default

    With the file open, meshes generated, and iRay not yet started, the Working Set is 2.6GB and the Commit is 3.6GB. That's via Resource Monitor, so it's just that one process, as was the 26GB figure with iRay on (total "Committed" system memory is 42.5GB with iRay running).

    The Commit gradually doubles after starting iRay, but the massive spike seems to happen at the moment the display actually starts drawing.

  10. #20
    Join Date
    Dec 2017
    Location
    Melbourne, Australia
    Posts
    310

    Default

    I don't think this is really indicating the actual memory used by Iray. When I run our stand-alone Iray application I also see fairly high commit spikes on my system here, around 12GB, when the Iray process itself is only using 2GB. I believe the commit figure may include various other elements relating to how GPU memory is pinned to system memory and so on, so I don't think there is anything fundamentally wrong with the amount. One thing I would be curious about: do you see any difference if you disable three of your GPUs? You can select which ones to use in the Iray settings.
