get_connections mem leak #230

Closed · giampaolo opened this issue May 23, 2014 · 15 comments

From tiberius...@gmail.com on November 18, 2011 19:44:12

import psutil
import time
from pympler import muppy

def monitorConnections():
    # call get_connections() on every running process and throw the result away
    for proc_obj in psutil.process_iter():
        conn = proc_obj.get_connections()
        del conn

while True:
    monitorConnections()
    muppy.print_summary()   # object counts should stay flat if nothing leaks
    time.sleep(2)

The function get_connections() leaks memory: it keeps creating tuple references
that are never released, so memory grows at every iteration (after a while there is no RAM left).

script output:
                       types |   # objects |   total size
============================ | =========== | ============
                        dict |         979 |      2.28 MB
                         str |       19279 |      1.24 MB
is increasing ---->    tuple |        8050 |    615.80 KB
                        code |        2040 |    239.06 KB
                        type |         252 |    220.50 KB
          wrapper_descriptor |        1103 |     77.55 KB
  builtin_function_or_method |         894 |     55.88 KB
                        list |         297 |     39.97 KB
                         int |        1536 |     36.00 KB
                     weakref |         430 |     33.59 KB
           method_descriptor |         413 |     25.81 KB
           member_descriptor |         289 |     18.06 KB
         <class 'abc.ABCMeta |          19 |     16.63 KB
         function (__init__) |         116 |     12.69 KB
                         set |          57 |     12.47 KB

Original issue: http://code.google.com/p/psutil/issues/detail?id=230

giampaolo self-assigned this May 23, 2014

From jlo...@gmail.com on November 18, 2011 10:48:03

Thanks for the report! Can you clarify which platform and Python version this
is seen on? The C code for each platform is independent, so we'd need to know
which one to look at.


From tiberius...@gmail.com on November 20, 2011 05:27:47

Windows XP and 7, both 32- and 64-bit; Python 2.6, both 32- and 64-bit; latest psutil version.
Thank you.


From g.rodola on November 20, 2011 08:10:42

This should now be fixed as of r1228.
We have a similar problem on OSX.
The patch below should fix it but I don't have an OSX box to test against right now.
Jay, could you do it?


Index: psutil/_psutil_osx.c
===================================================================
--- psutil/_psutil_osx.c    (revisione 1224)
+++ psutil/_psutil_osx.c    (copia locale)
@@ -989,6 +989,7 @@
             char lip[200], rip[200];
             char *state;
             int inseq;
+            int _family, _type;

             fd = (int)fdp_pointer->proc_fd;
             family = si.psi.soi_family;
@@ -1002,10 +1003,15 @@
                 continue;

             // apply filters
-            inseq = PySequence_Contains(af_filter, PyLong_FromLong((long)family));
+            _family = PyLong_FromLong((long)family);
+            Py_DECREF(_family);
+            inseq = PySequence_Contains(af_filter, _family);
             if (inseq == 0)
                 continue;
-            inseq = PySequence_Contains(type_filter, PyLong_FromLong((long)type));
+
+            _type = PyLong_FromLong((long)type);
+            Py_DECREF(_type);
+            inseq = PySequence_Contains(type_filter, _type));
             if (inseq == 0)
                 continue;

Labels: -Priority-Medium Priority-High OpSys-Windows OpSys-OSX Milestone-0.4.1


From jlo...@gmail.com on November 23, 2011 14:04:42

Ok, I got a chance to look at this, noted a few things: 

1) I can't reproduce a tuple leak on OS X even before the changes. List count 
goes up with each iteration however... not sure if this is the actual problem 
we should be looking at? 

2) your patch above has a couple of issues (a corrected sketch follows below):  
2a) _family and _type should be type PyObject* not ints if you're going to pass 
them to PySequence_Contains like that 
2b) I think you want Py_DECREF() called AFTER you call PySequence_Contains() - 
it doesn't make any sense to dereference the variable before you're done using it.
2c) I thought we had agreed to avoid the if statement form without brackets? It 
makes future changes more likely to result in a logic bug and/or makes code 
harder to read, and is generally not recommended. 

3) I tried the changes anyway and, regardless, I see no difference in behavior:
the number of tuple objects remains stable and only the list count keeps climbing.
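
For reference, here is a minimal sketch of what the filter check could look like once
2a/2b/2c are addressed. It is only a fragment of the per-fd loop: the names
(af_filter, type_filter, family, type, inseq) mirror the patch above, and NULL checks
plus the -1 error return of PySequence_Contains() are left out, so treat it as an
illustration of the pattern rather than the exact code that ended up in r1229.

PyObject *py_family;
PyObject *py_type;

/* check the address family against the caller-supplied filter */
py_family = PyLong_FromLong((long)family);
inseq = PySequence_Contains(af_filter, py_family);
Py_DECREF(py_family);   /* drop the temporary only after it has been used */
if (inseq == 0) {
    continue;
}

/* same pattern for the socket type */
py_type = PyLong_FromLong((long)type);
inseq = PySequence_Contains(type_filter, py_type);
Py_DECREF(py_type);
if (inseq == 0) {
    continue;
}

This is the same DECREF-after-use discipline as the Windows fix in r1228.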


From g.rodola on November 23, 2011 14:20:09

> I can't reproduce a tuple leak on OS X even before the changes. 
> List count goes up with each iteration however... not sure if 
> this is the actual problem we should be looking at? 

Note that for reproducing this issue I haven't used the OP's script. I used 
this one instead:

import os, socket, psutil
s = socket.socket()
s.bind(('', 0))
s.listen(1)
p = psutil.Process(os.getpid())
while 1:
    p.get_connections()
    print p.get_memory_info()

If the memory info printed on stdout keeps increasing it means there's a leak.
For Windows this was due to two factors:
- Py_DECREF wasn't called against the address tuple
- the object returned by PyLong_FromLong((long)type) wasn't Py_DECREFed

With these two fixes the script above stopped showing increasing values (see the sketch below).
Therefore, for starters I'd check what the script above shows.
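
For the record, the Windows side amounts to the same ownership rule applied to the
tuples being returned. The sketch below is illustrative only (names such as conns_list,
py_laddr, py_raddr, lport and rport are placeholders, not the literal Windows source,
and error handling is omitted): every object built inside the per-connection loop has
to be released once the result list holds its own reference, otherwise each call leaks
the address tuples, which is exactly the tuple growth the reporter measured with muppy.

PyObject *py_laddr, *py_raddr, *py_conn;

/* build (ip, port) tuples for the local and remote endpoints, then the
   connection tuple itself */
py_laddr = Py_BuildValue("(si)", lip, lport);
py_raddr = Py_BuildValue("(si)", rip, rport);
py_conn  = Py_BuildValue("(iiiOOs)", fd, family, type, py_laddr, py_raddr, state);

/* PyList_Append() takes its own reference, so the loop must drop the ones it
   created above; without these Py_DECREF calls every tuple built here is
   leaked on each get_connections() call */
PyList_Append(conns_list, py_conn);
Py_DECREF(py_laddr);
Py_DECREF(py_raddr);
Py_DECREF(py_conn);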


> 2b) I think you want Py_DECREF() called AFTER you call 
> PySequence_Contains() - it doesn't make any sense to 
> dereference the variable before you're done using it.

Yeah, right. 

> 2c) I thought we had agreed to avoid the if statement 
> form without brackets?

My bad. Feel free to fix it.


From jlo...@gmail.com on November 23, 2011 14:32:14

Ok, I tested with your script and the changes, and memory usage is not
increasing. Looks OK from my end, so I've committed it as r1229.


From jlo...@gmail.com on November 23, 2011 14:32:45

Status: FixedInSVN


From jlo...@gmail.com on November 23, 2011 14:33:21

Note: I only checked in the OS X side; if you didn't commit the Windows changes,
you'll want to do that as well.


From g.rodola on November 23, 2011 14:37:01

You mean the memory wasn't increasing before your change?


From jlo...@gmail.com on November 23, 2011 14:48:50

> You mean the memory wasn't increasing before your change?

I didn't test with your script until after I applied the patch changes, but it
looks good with the patch applied, so I've committed it in r1229.


From g.rodola on November 23, 2011 14:55:03

It would have been good to figure out, for the record, whether it affected OSX as well.
Never mind... since we weren't DECREFing the object returned by PyLong_FromLong, I assume
it almost certainly did.


From jlo...@gmail.com on November 23, 2011 14:59:07

Just for completeness' sake, I checked with the code reverted to the old 
version and yes, there was a memory leak prior to this patch.


From g.rodola on December 14, 2011 15:18:36

Status: Fixed


From g.rodola on December 14, 2011 15:51:42

This is now fixed in version 0.4.1.


From g.rodola on March 02, 2013 04:05:42

Updated csets after the SVN -> Mercurial migration: r1228 == revision 0adfedd760c3,
r1229 == revision abd221389d83.
