Wikipedia:Articles for deletion/Garbage Collected Filesystem
- The following discussion is an archived debate of the proposed deletion of the article below. Please do not modify it. Subsequent comments should be made on the appropriate discussion page (such as the article's talk page or in a deletion review). No further edits should be made to this page.
The result was delete. Sandstein 18:37, 28 November 2008 (UTC)
- Garbage Collected Filesystem – (View log)
Note: Prodded, but contested by an anonymous editor, so taking it here instead.
Article does not indicate that this sort of filesystem is notable in any way whatsoever. The theory given sounds very speculative, and I can't imagine production code being written to the vague notions described. The only example given is an obscure research project, which seems more than likely to be connected to the original author of the article.
Furthermore, I don't see any need or tangible benefit for a file system to be garbage collected, even without taking into account the probably hairy and delicate implementation of a garbage collector on a fragile block-level mud brick. An interesting idea, but hardly practical... ~ Jafet•work•play•watch 12:36, 18 November 2008 (UTC)
- Note: This debate has been included in the list of Computing-related deletion discussions. -- • Gene93k (talk) 16:26, 18 November 2008 (UTC)
- Delete for notability reasons. Nobody uses GCFS, and the author (David Madore) only wrote it for fun anyway. NTFS and Unix file systems do use reference counting garbage collection, but the term "garbage collected filesystem" is never used to describe them, to my knowledge. I don't think this article would serve any purpose even as a redirect. -- BenRG (talk) 18:49, 18 November 2008 (UTC)
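The reference counting BenRG mentions can be made concrete. The following is an illustrative sketch only (hypothetical Python model, not any real filesystem's code): in Unix-style filesystems an inode's storage is reclaimed as soon as its link count drops to zero, which is the limited sense in which traditional filesystems already "garbage collect".

```python
# Hypothetical model of Unix-style link-count reclamation.
# Not the code of any real filesystem; names are illustrative.

class Inode:
    def __init__(self, data):
        self.data = data
        self.nlink = 0          # number of directory entries pointing here

class Directory:
    def __init__(self):
        self.entries = {}       # name -> Inode

    def link(self, name, inode):
        self.entries[name] = inode
        inode.nlink += 1

    def unlink(self, name):
        inode = self.entries.pop(name)
        inode.nlink -= 1
        if inode.nlink == 0:    # last link gone: reclaim the storage
            inode.data = None
            return True         # freed
        return False            # still reachable via another hard link

d = Directory()
f = Inode(b"hello")
d.link("a", f)
d.link("b", f)                  # hard link: one inode, two names
freed_first = d.unlink("a")     # False: still reachable via "b"
freed_last = d.unlink("b")      # True: link count hit zero
```

Reference counting is cheap and incremental, which is why filesystems use it; its well-known limitation (it cannot reclaim cycles) only matters once hard links to directories are allowed.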
- This is an idea that has not escaped from its creator and gained traction in the world at large, to become part of the general corpus of human knowledge. Per our Wikipedia:No original research policy, things don't belong here unless they have been through a process of fact checking and peer review, and have been published. Reading the LKML thread from when Madore announced this idea, back on 2000-01-10, it is clear that it failed at the peer review and fact checking stages (with one person pointing out that Madore had his facts wrong, for example, and others observing several flaws). The idea was peer reviewed, fact checked, and found wanting. This is a novel idea that didn't catch on, and there are no reliable sources documenting it that have been peer reviewed and fact checked. Delete. Uncle G (talk) 21:28, 18 November 2008 (UTC)
- Relisted to generate a more thorough discussion so that consensus may be reached.
Please add new comments below this notice. Thanks, Black Kite 18:23, 23 November 2008 (UTC)
- Delete. Interesting idea, but the theory has very little notability outside of an eight-year-old research paper on someone's website. The author admits that it's experimental and may not be efficient; that alone wouldn't disqualify it from an encyclopedia, though. It's the lack of peer review and notability that does. --Amwestover (talk|contrib) 18:49, 23 November 2008 (UTC)
- Strong delete. Based on a single primary source, poorly written, and misleading. Due to the lack of discussion in third-party sources, no criticism is presented. This is exactly why we shouldn't have articles on marginal topics. I'm going to indulge in some of my own criticism here, but adding this to the article would be WP:OR. The original presentation [1] gives (i) hard links to directories, and (ii) background deletion of entire trees as the advantages. Regarding (i), it incorrectly claims that these are not possible in traditional Unix (read: FFS and derivatives) file systems. Actually, hard links to directories are possible in most Unix file systems, but there's no mechanism for preventing loops in the graph. So, in most Unix systems only root can create hard links to directories. GCFS is a complicated mechanism for enabling something useless: loops in the file system. The second claim to novelty, (ii), is also not really new (except for doing it on graphs that have loops). Practically all versioning/checkpointing file systems that eventually delete old versions (e.g. WAFL) also have to garbage collect entire trees, except they do this for a much better reason. Pcap ping 12:40, 27 November 2008 (UTC)
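Pcap's point about loops can be illustrated with a hedged sketch (hypothetical Python model, not GCFS's actual algorithm): once hard links to directories are allowed, two directories that link to each other keep each other's link counts nonzero even after becoming unreachable, so reference counting alone never frees them. A tracing collector that marks everything reachable from the root and sweeps the rest does reclaim such cycles, which is the one job a "garbage collected filesystem" would do that link counting cannot.

```python
# Hypothetical model: directories as nodes in a possibly cyclic graph.
# mark_and_sweep is a generic tracing collector, not GCFS's algorithm.

class Dir:
    def __init__(self, name):
        self.name = name
        self.entries = {}       # name -> Dir (hard links to directories)

def mark_and_sweep(root, all_dirs):
    reachable = set()
    stack = [root]
    while stack:                # mark phase: walk the directory graph
        d = stack.pop()
        if id(d) in reachable:
            continue            # already visited, so cycles terminate
        reachable.add(id(d))
        stack.extend(d.entries.values())
    # sweep phase: anything unmarked is garbage, loops included
    return [d for d in all_dirs if id(d) not in reachable]

root, a, b = Dir("/"), Dir("a"), Dir("b")
root.entries["a"] = a
a.entries["b"] = b
b.entries["a"] = a              # loop: a <-> b
del root.entries["a"]           # a and b unreachable, but still linked
garbage = mark_and_sweep(root, [root, a, b])
```

After the deletion, each of `a` and `b` still has an incoming link, so a link-count check would never free either; the mark phase from the root identifies both as garbage.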
- The above discussion is preserved as an archive of the debate. Please do not modify it. Subsequent comments should be made on the appropriate discussion page (such as the article's talk page or in a deletion review). No further edits should be made to this page.