The Designer's Guide Community
Forum
Virtuoso Slow Down after Large Memory exhaustion
Andrew Beckett
Re: Virtuoso Slow Down after Large Memory exhaustion
Reply #15 - Feb 20th, 2018, 3:28am
 
Thanks.

When you have a moment, I'd really appreciate some more details on the characteristics of your data size (as I requested), as well as whether gc() helps - this will help us understand why automatic garbage collection is not working in your case. Of course, the real-life design work comes first!
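For the avoidance of doubt, this is the sort of thing I mean - just an illustrative sketch, where the results directory, result name, signal names and the average() post-processing are placeholders for whatever your script actually does; the only point is the explicit gc() call after each chunk of work:

Code:
;; illustrative OCEAN fragment only - substitute your own results and processing
openResults("./psf")
selectResult('tran)
foreach(sig list("/out" "/net1" "/net2")
    printf("%s average = %g\n" sig average(v(sig)))
    gc()    ; force a SKILL garbage collection after each signal
)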

Regards,

Andrew.
Andrew Beckett
Re: Virtuoso Slow Down after Large Memory exhaustion
Reply #16 - Feb 23rd, 2018, 6:27am
 
A bit of further information: I decided to try to reproduce a scenario similar to yours (with a bit of guesswork on the data sizes that would be enough to show the problem). I picked about 1000 points in the time-domain results, and 20 points in the sweeps around the PSS.

With this, I see the problem you're facing. In this case gc() does help a bit, but not hugely (it reduces the memory footprint to 75-85% of what it is without the gc() calls). I do see evidence of garbage collection being performed during the script run, but the memory continued to grow (up to about 12 GB in my example; the amount would depend on the number of sweep points and time points). I was using a script that was a modified version of yours.

Anyway, I've passed this on to R&D - we need to identify what in the ViVA result management is not re-using memory, and this example is probably good enough to do that. Simply calling gc() as part of closeResults() won't solve the problem - it really helped in my other example, which had a relatively small number of very large signals, but clearly doesn't help so much in the situation where you have a lot of medium-sized signals.
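In case anyone wants to experiment with something similar, below is a rough skeleton of the kind of loop structure that shows the growth. This is not the actual script - the result name 'pss_td, the directories, the signals and the ymax() post-processing are just assumptions for illustration, and I'm assuming closeResults() takes the same directory argument as openResults():

Code:
;; illustrative skeleton only - many medium-sized waveforms read inside
;; nested loops, with explicit clean-up attempted at the end of each pass
foreach(resDir list("./psf_run1" "./psf_run2" "./psf_run3")
    openResults(resDir)
    selectResult('pss_td)    ; PSS time-domain results (name assumed)
    foreach(sig list("/out" "/outb")
        ;; pull each waveform into memory and post-process it
        printf("%s %s peak = %g\n" resDir sig ymax(v(sig)))
    )
    closeResults(resDir)     ; argument assumed to mirror openResults()
    gc()                     ; explicit garbage collection per iteration
)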

It would still make sense to log a case with Cadence customer support and have them file a duplicate of CCR 1881844 so that it has the weight of a customer request behind it.

Kind Regards,

Andrew.
cheap_salary
Re: Virtuoso Slow Down after Large Memory exhaustion
Reply #17 - Apr 9th, 2018, 6:53am
 
Andrew Beckett wrote on Feb 19th, 2018, 5:00am:
"I'd be interested in hearing if calling gc() in your code improves matters."

gc() is not helpful at all.

Code:
\# Memory report: using	   301 MB, process size 1,405 MB at UTC 2018.04.09 00:50:43.056
\# Memory report: using	   830 MB, process size 1,933 MB at UTC 2018.04.09 00:50:50.018
\# Memory report: using	 1,368 MB, process size 2,472 MB at UTC 2018.04.09 00:50:50.465
\# Memory report: Maximum memory size now 47,998 MB at UTC 2018.04.09 00:50:51.508

\w *WARNING* Low Memory: Less than 1903 megabytes of system memory remain available to this program (3.9456% of a maximum of 47.1 gigabytes).
\w *WARNING* Low Memory: No further low memory warnings will be output.

\# Available memory:	    3,971 MB at UTC 2018.04.09 06:55:25.431

\# Available memory:	    2,785 MB at UTC 2018.04.09 07:08:12.687
\# Memory report: using	47,612 MB, process size 48,715 MB at UTC 2018.04.09 07:08:12.863
\# test ***warning***: memory usage seems to be dangerously high.
\# Memory report: using	47,612 MB, process size 48,716 MB at UTC 2018.04.09 07:09:57.267
\# test ***warning***: memory usage seems to be dangerously high.

\w *WARNING* Swap Activity: Excessive swap activity has been detected, which can cause decreased performance.  The application will continue to run, but if excessive swapping continues
\w *WARNING* Swap Activity: due to the memory requirements of all the programs currently running on this system, the performance of this program may continue to be negatively affected.
\# Memory report: using	48,070 MB, process size 49,174 MB at UTC 2018.04.09 07:25:37.283

\# test ***warning***: memory usage seems to be dangerously high.
\# Available memory:	    1,909 MB at UTC 2018.04.09 07:34:01.053
\# Memory report: using	48,315 MB, process size 49,418 MB at UTC 2018.04.09 07:34:01.053
\# test ***warning***: memory usage seems to be dangerously high.
\# Memory report: using	48,322 MB, process size 49,425 MB at UTC 2018.04.09 07:34:02.100
\# test ***warning***: memory usage seems to be dangerously high.
\# Memory report: using	48,331 MB, process size 49,434 MB at UTC 2018.04.09 07:34:06.139
\# test ***warning***: memory usage seems to be dangerously high. 

Ocean died at this point.

So I have to use SKILL in IC5 instead.
Andrew Beckett
Re: Virtuoso Slow Down after Large Memory exhaustion
Reply #18 - Apr 17th, 2018, 12:46am
 
Thanks for confirming. That doesn't surprise me, given that my attempt to replicate what I guessed was your structure (from what you had described) led to a similar conclusion: I found that repeatedly calling gc() during the loops slowed the memory growth a bit, but not massively.

It would still be best if you report this to Cadence customer support, referencing the CCR I mentioned above - having to use a long-unsupported version of the tools is not a sustainable solution. Then we can work out how best to optimise the ViVA memory usage so that it doesn't exhaust the memory (as I said before, this is a trade-off between avoiding unnecessarily re-reading results over and over again and consuming memory, but that trade-off needs to be adjusted based on what you and I are seeing here).

Kind Regards,

Andrew.