
I wrote a branch-and-price algorithm in Python 2.7 that solves several small LP and MIP models with Gurobi in each node. The algorithm runs fine at first: it produces both lower and upper bounds and solves all of the mentioned models many times. However, after roughly 200 nodes, Gurobi suddenly returns an Out of Memory error!

I have 128 GB of RAM, of which about 70% remained free throughout the algorithm's execution. The models are built in separate files and functions and are called (imported) by the main file when necessary.

Is there any hidden option that limits Gurobi's RAM usage? Is it possible that previously built models affect the current one, given that all of them are in separate files?

Update: I am using Windows 10, 64-bit Python 2.7.10, and 64-bit Gurobi 8.0.1. I tried solving another single model with Gurobi and it could use the RAM without any problem, so I guess there is something wrong with my implementation.

Update 2: I have a class called Node. During the algorithm I generate many instances of this class and assign three gurobi.Model instances to each of them as attributes. This obviously leads to excessive memory use. Even frequently deleting Node instances with the del statement did not solve the problem, so I had to reorganize the algorithm so that it does not store any of these gurobi.Model instances, which significantly reduced the memory usage and avoided the Out of Memory error. However, I am still not sure what was causing the error; a sketch of the change is given below.
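
For illustration, here is a minimal sketch of the two patterns (with a hypothetical build_model helper and simplified Node classes, not my actual branch-and-price code); the first is roughly what I had, the second is what I switched to:

    from gurobipy import GRB

    # Pattern that caused the memory growth: every node keeps live
    # gurobi.Model objects alive for the rest of the run.
    class Node(object):
        def __init__(self, lp_model, pricing_model, mip_model):
            self.lp_model = lp_model            # gurobi.Model kept in RAM
            self.pricing_model = pricing_model  # gurobi.Model kept in RAM
            self.mip_model = mip_model          # gurobi.Model kept in RAM

    # Pattern after the reorganization: solve, keep only plain Python data,
    # and explicitly free each model before moving on.
    class LeanNode(object):
        def __init__(self, bound, solution):
            self.bound = bound        # a float, not a gurobi.Model
            self.solution = solution  # e.g. a dict of variable values

    def solve_node(build_model):
        model = build_model()         # hypothetical model-building helper
        model.optimize()
        bound, values = None, None
        if model.status == GRB.OPTIMAL:
            bound = model.ObjVal
            values = {v.VarName: v.X for v in model.getVars()}
        model.dispose()               # release the model's native memory now
        return LeanNode(bound, values)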

Mehdi
  • Did you install the 32-bit or 64-bit version? – Nat Jun 23 '19 at 08:09
  • @Nat My workstation runs 64-bit Windows 10. Python version: Python 2.7.10 (default, May 23 2015, 09:44:00) [MSC v.1500 64 bit (AMD64)] on win32. Gurobi version: 8.0.1 win64. – Mehdi Jun 23 '19 at 08:50
  • If you're running Gurobi under 32-bit Windows then you simply won't be able to use 128 gigabytes of RAM. – Brian Borchers Jun 23 '19 at 15:31
  • @BrianBorchers or Rob, maybe move your comment to an answer? – LarrySnyder610 Jun 24 '19 at 01:54
  • It's unclear to me exactly what OS and which versions of Python and Gurobi are being used. My comment could have been more properly written as a query: what do you mean by "on win32 Gurobi version: 8.0.1 win64"? – Brian Borchers Jun 24 '19 at 03:38
  • @Rob Thanks! I did what you suggested and it says I have 128 GB of RAM. I updated my question, so it is not a compatibility issue. – Mehdi Jun 24 '19 at 06:35
  • @BrianBorchers I updated my question to clarify this. I use the 64-bit version of Gurobi 8.0.1 and 64-bit Python 2.7.10 on Windows 10, again 64-bit. I tested another model and it could use all the RAM without any problem. – Mehdi Jun 24 '19 at 06:41
  • @LarrySnyder610 It's a request for clarification; everything being said doesn't align. So he's done some testing and updated the question; now you can answer it. – Rob Jun 24 '19 at 07:16
  • It sounds like you're able to use all of your RAM, except with that specific model? That sounds like a bug to report, I guess. A detailed error message copy/pasted into this question, along with what does and doesn't work, could potentially help here. If it's truly running out of memory when you have about 70% of 128 GB free, and assuming that it's a proper memory allocator throwing that error (rather than some code that preemptively checks whether there's RAM before requesting it), then I'd guess that the model might be trying to allocate, say, an absurdly large array. – Nat Jun 24 '19 at 15:17
  • We need an MCVE to be able to recreate your setup and offer a helpful answer. Our Meta hasn't asked for a magic link for MCVE, so I've used SO's link instead. We do have "make question relevant to others", where it's suggested that asking something of general interest (and not a 'somewhere something is wrong' question) will generate more answers. It's not that we don't want to help; it's how could we help with this? – Rob Jun 25 '19 at 08:47
  • @Rob I solved my problem by doing some memory management. I believe the problem was caused by storing a growing number of Gurobi model instances. I agree an MCVE should be provided to give a better understanding of the problem. I will try to provide one for this question and for my future questions. – Mehdi Jun 26 '19 at 12:13
  • @Mehdi if you solved your problem, you can (should?) write up your solution in an answer, and you can accept it if you want. That way the solution will be easier for others to find in the future (comments are not ideal for this), and also you can get some rep if people upvote your answer. – LarrySnyder610 Jun 26 '19 at 15:28
  • This seems to be about the setup of the software and the implementation of the algorithm, a sample of which is not included. Is this useful for future visitors? I think not. Exactly what caused the problem in the first place isn't well explained, and how it was fixed is unclear. See also: Non-OR Q about software OT? and "needs reprex". – Rob Jul 27 '19 at 18:10
  • I am sure that explaining the algorithm or providing an example wouldn't help anyone understand the origin of the problem, since it was probably caused by keeping those model instances in RAM, and the algorithm structure wouldn't have anything to do with this error. – Mehdi Aug 05 '19 at 10:08

2 Answers


For anyone reading this after the fact: it is extremely likely that the OP was using 32-bit Python on 64-bit Windows, which restricts the process's memory space to 2 GB (see here: https://stackoverflow.com/questions/18282867/python-32-bit-).
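
If you are unsure which build you are running, a quick check like the following (a minimal sketch using only the standard library, not part of the original diagnosis) will tell you:

    import struct
    import sys

    # Prints 64 for a 64-bit interpreter and 32 for a 32-bit one.
    print(struct.calcsize("P") * 8)

    # Equivalent check: sys.maxsize exceeds 2**32 only on 64-bit builds.
    print(sys.maxsize > 2**32)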

Connor

The problem was that I stored too many Gurobi model instances during the algorithm's execution, which increased the RAM usage by the second! I still don't know why Gurobi returned the "Out of memory" error when there was plenty of memory available.

To future readers: pay attention to what you actually store in your RAM. Keep it simple and store just what you need (for example, a float instead of a whole model object). RAM usage can add up very quickly! A small sketch of the idea follows.
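
As a toy illustration (a minimal sketch with a made-up LP and a hypothetical solve_and_extract helper, not the actual branch-and-price code), the pattern is to extract the plain numbers you need and dispose of the model right away:

    from gurobipy import Model, GRB

    def solve_and_extract(c1, c2):
        """Solve a tiny LP, keep only a float, and free the model."""
        m = Model("toy")
        m.setParam("OutputFlag", 0)          # keep the solver log quiet
        x = m.addVar(lb=0.0, name="x")
        y = m.addVar(lb=0.0, name="y")
        m.setObjective(c1 * x + c2 * y, GRB.MINIMIZE)
        m.addConstr(x + y >= 1.0, name="cover")
        m.optimize()
        obj = m.ObjVal if m.status == GRB.OPTIMAL else None  # plain float
        m.dispose()                          # release native memory immediately
        return obj                           # no gurobi.Model is kept alive

    # Calling this in a loop keeps memory flat instead of growing per node.
    results = [solve_and_extract(1.0, float(i)) for i in range(1, 201)]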

Mehdi