Message 238683

Author: wolma
Recipients: bkabrda, ethan.furman, georg.brandl, ncoghlan, paul.moore, python-dev, sYnfo, serhiy.storchaka, vstinner, wolma
Date: 2015-03-20 14:38:15
Message-id: <1426862296.08.0.221969878889.issue23700@psf.upfronthosting.co.za>

Content:
So let's look at this step by step (and I hope I have fully understood this myself):
- Calling fileobj.__iter__ creates a generator, because the method uses yield from.
- That generator is not assigned to any reference, so it will be garbage-collected.
- When the generator is garbage-collected, the subiterator specified to the right of the yield from is finalized (that is PEP 380-mandated behavior), and, in this case, that subiterator is iter(self.file).
- For an io module-based file object, iter(f) is f, and finalizing it means that its close method will be called.
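The last two points can be checked directly: io objects are their own iterators, so the "subiterator" the yield from delegates to is the file object itself. A minimal sketch (using io.StringIO as a stand-in for a real file object):

```python
import io

f = io.StringIO("x\n")

# io module objects return themselves from __iter__
print(iter(f) is f)   # True

# so finalizing the subiterator means calling close() on f itself
f.close()
print(f.closed)       # True
```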
So this is not about the file object getting garbage-collected; it is about it getting closed.
Since PEP 380 explicitly mentions this problem with yield from and a shared subiterator, I don't think you can call it a bug, but I think it is very problematic behavior, as illustrated by this issue, because client code would have to know whether a particular generator uses yield from or not.
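The whole chain described above can be reproduced in a few lines. This is a hedged sketch, not the actual class from the issue: Wrapper is a hypothetical stand-in for the file object class under discussion, and io.StringIO stands in for the real underlying file.

```python
import io

class Wrapper:
    """Hypothetical stand-in for a file object whose __iter__ delegates
    to the underlying file via yield from."""
    def __init__(self, file):
        self.file = file

    def __iter__(self):
        # Delegating generator: PEP 380 close-propagation applies here.
        yield from iter(self.file)

f = io.StringIO("a\nb\nc\n")   # iter(f) is f for io objects
w = Wrapper(f)

gen = iter(w)        # calling __iter__ creates the delegating generator
first = next(gen)    # advance it once, so it is suspended at the yield from
del gen              # drop the only reference: CPython finalizes the
                     # generator, which calls gen.close(); the resulting
                     # GeneratorExit makes the yield from machinery call
                     # close() on the subiterator, i.e. on f itself
print(first)         # 'a\n' -- one line was read successfully
print(f.closed)      # True -- the shared file is now closed, not collected
```

Note that f is still perfectly reachable through w here; it has not been garbage-collected, only closed out from under any other code still holding it.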