
Hardly. "insane uncontrolled nesting" seems quite a natural way to manage complexity an a automated way. I can see no other principle but the Miller's law (the 7±2 "magic number") to label some level of nesting as "insane" and it doesn't apply for automated processing. What are, if any, good reasons to limit a path length so severely other than human readability? If long paths puts non-negligible penalty on performance I'd say the file system is defective by design.


Good reasons not to have many modules: compile time, website load times and sizes, and project maintainability.

Nested dependencies have negative effects on all of those. They encourage the uncontrolled addition of modules, even of the same module in multiple versions, and they lead to wrong "isolationist" thinking.

In other words, they do not manage complexity but produce unneeded complexity.
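
To see the "multiple versions" problem for yourself, here is a rough TypeScript sketch, not a real npm command: it walks a nested node_modules tree and reports every package that appears in more than one version. The "node_modules" path and the traversal logic are illustrative assumptions:

  // Walk a nested node_modules tree and collect name -> versions.
  import * as fs from "node:fs";
  import * as path from "node:path";

  const versions = new Map<string, Set<string>>();

  function scan(dir: string): void {
    if (!fs.existsSync(dir)) return;
    for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
      if (!entry.isDirectory()) continue;
      const pkgDir = path.join(dir, entry.name);
      if (entry.name.startsWith("@")) { // scoped packages nest one level deeper
        scan(pkgDir);
        continue;
      }
      const manifest = path.join(pkgDir, "package.json");
      if (fs.existsSync(manifest)) {
        const { name, version } = JSON.parse(fs.readFileSync(manifest, "utf8"));
        if (name && version) {
          const set = versions.get(name) ?? new Set<string>();
          set.add(version);
          versions.set(name, set);
        }
      }
      scan(path.join(pkgDir, "node_modules")); // recurse into nested deps
    }
  }

  scan("node_modules");
  for (const [name, vs] of versions) {
    if (vs.size > 1) console.log(`${name}: ${[...vs].join(", ")}`);
  }

Any package it prints is installed, compiled, and shipped more than once.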


Not having many modules means having big modules. The bigger a module is (anything beyond half a screen), the lower my productivity. Every module should fit into your mind easily.



