# ElementTree.iterparse "leaks" file descriptor when not exhausted · Issue #101438 · python/cpython
https://github.com/python/cpython/issues/101438
{"@context":"https://schema.org","@type":"DiscussionForumPosting","headline":"ElementTree.iterparse \"leaks\" file descriptor when not exhausted","articleBody":"The PR https://github.com/python/cpython/pull/31696 attempts to fix the \"leak\" of file descriptors when the iterator is not exhausted. That PR fixes the warning, but not the underlying issue that the files aren't closed until the next tracing garbage collection cycle.\r\n\r\nNote that there isn't truly a leak of file descriptors. The file descriptors are eventually closed when the file object is finalized (at cyclic garbage collection). The point of the `ResourceWarning` (in my understanding) is that waiting until the next garbage collection cycle means that you may temporarily have a lot of unwanted open file descriptors, which could exhaust the global limit or prevent successful writes to those files on Windows.\r\n\r\n\r\n```python\r\n# run with ulimit -Sn 1000\r\nimport xml.etree.ElementTree as ET\r\nimport tempfile\r\n\r\nimport gc\r\ngc.disable()\r\n\r\ndef run():\r\n with tempfile.NamedTemporaryFile(\"w\") as f:\r\n f.write(\"\u003cdocument /\u003ejunk\")\r\n\r\n for i in range(10000):\r\n it = ET.iterparse(f.name)\r\n del it\r\n\r\nrun()\r\n```\r\n\r\nOn my system, after lowering the file descriptor limit to 1000 (via `ulimit -Sn 1000`) I get:\r\n\r\n```\r\nOSError: [Errno 24] Too many open files: '/tmp/tmpwwmd9gp6'\r\n```\n\n\u003c!-- gh-linked-prs --\u003e\n### Linked PRs\n* gh-114269\n* gh-114499\n* gh-114500\n\u003c!-- /gh-linked-prs --\u003e\n","author":{"url":"https://github.com/colesbury","@type":"Person","name":"colesbury"},"datePublished":"2023-01-30T20:47:39.000Z","interactionStatistic":{"@type":"InteractionCounter","interactionType":"https://schema.org/CommentAction","userInteractionCount":4},"url":"https://github.com/101438/cpython/issues/101438"}