Memory Usage Issues with CV2.imread()

Question:

I have built a Faster R-CNN model and am in the process of labeling images with a bunch of bounding boxes. My issue is that the Python kernel eventually crashes while labeling the first ~3,000 out of 70,000 images. I have narrowed the issue down to image = cv2.imread(). Regardless of using del image or image = None, the excessive memory usage persists. Is there a known memory leak issue with the cv2 library? I am using version 3.3. Is there any way to clear the memory? Thanks.

import gc
import glob

import cv2
import matplotlib.pyplot as plt

gc.enable()
images_path = glob.glob('../..../*.jpg')
bi = 0
image_batch = []
for p in images_path:
    image = cv2.imread(p)  # this is the memory leak
    name = p.split('/')[7]        # file name component of the path
    name = name.split('.jpg')[0]  # strip the extension
    image_batch.append((name, image))
    image = None  # does not release the copy still referenced by image_batch
    if bi % 100 == 0:
        print(len(image_batch))
        for pair in image_batch:
            image = pair[1]
            name = pair[0]
            # ....................................... (redacted)
        image_batch = []
    print("----------------------------")
    plt.close("all")
    print(gc.collect())
    bi += 1
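
For reference, the usual way to keep memory bounded in a loop like this is to make sure the decoded images never outlive their batch. A minimal sketch of that pattern, assuming the redacted section is per-image work (process_item below is a hypothetical placeholder for it, not from the original post):

import glob
import os

import cv2

def iter_image_batches(pattern, batch_size=100):
    # Yield lists of (name, image) pairs; at most batch_size decoded
    # images are alive at any moment.
    batch = []
    for p in glob.glob(pattern):
        name = os.path.splitext(os.path.basename(p))[0]
        image = cv2.imread(p)
        if image is None:
            continue  # cv2.imread returns None for unreadable files
        batch.append((name, image))
        if len(batch) == batch_size:
            yield batch
            batch = []  # drop the references so the pixel arrays can be freed
    if batch:
        yield batch  # flush the final partial batch

for batch in iter_image_batches('../..../*.jpg'):
    for name, image in batch:
        process_item(name, image)  # hypothetical stand-in for the redacted work

Because each batch is yielded and then discarded, peak memory stays at roughly batch_size decoded images rather than growing with the full 70,000-image dataset.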

Comments:

Doesn't your image_batch simply grow continuously without releasing any of its contents?

I set image_batch = [] within the for loop. – James Chartouni, 27 mins ago

OK, that's not in the code you show.

Sorry, I just have some confidential code I have to redact. – James Chartouni, 25 mins ago

That's fine. Have you tried using PIL instead of cv2? Also, can you explain what you have done to narrow down the memory leak, so we don't have redundant suggestions?
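
On the PIL suggestion above: Pillow opens files lazily, so it can be a quick way to test whether the decode step itself is what leaks. A sketch of the equivalent read, assuming the same glob pattern as the question (none of this is from the original thread):

import glob

import numpy as np
from PIL import Image

for p in glob.glob('../..../*.jpg'):
    with Image.open(p) as im:   # lazy: only the header is read here
        image = np.asarray(im)  # forces the actual decode into a NumPy array
    # the context manager closes the file handle; note PIL yields RGB
    # channel order, whereas cv2.imread returns BGR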

Original source:

https://stackoverflow.com/questions/47756176/memory-usage-issues-with-cv2-imread
