
..and time.


TLDR: Psychological Egoism

Think of egoism like a single-threaded algorithm assuming all actions optimize for self. Altruism suggests a multi-threaded model where some processes prioritize others’ well-being. Data from user behavior (empathy-driven actions) and system design (evolutionary efficiency) supports a hybrid model—humans aren’t just “selfish” codebases.

Debated since Hobbes, it’s challenged by:

  - Butler’s Stone: Pleasure is a byproduct, not the goal.
  - Science: Biology (altruism aids survival), neuroscience (motivation ≠ pleasure), and psychology (empathy drives genuine care) suggest mixed motives.
  - Analogy: Egoism is a single-threaded “selfish” algorithm; altruism adds threads for others’ benefit. Data leans toward a hybrid model.
Whenever I dive into a creative project, whether it's freelancing for a client or tinkering on my own stuff, I know exactly what's coming. The dopamine hit from shipping code is unreal, like a high from solving a puzzle that's been nagging me for days. The rep boost and the financial payoff (or the potential of it, on my own projects) don't hurt either.

But deep down, it’s not just about me. Crafting something that users love or that makes their lives easier? That’s the real magic. It’s a mix of selfish thrill and selfless impact, like a perfectly balanced commit.


Level 5: Rotation was very satisfying.


@jayspiel

> As A designer I was trying to skate to where the puck was going technically.

Resonates big time! At the end of the day, this isn't a foolproof science - it's an art. Req-con-fin (requires continuous finessing).

I also assume the project involved many roles and, as you mentioned, was built iteratively around user feedback.

NLM disrupted the space, and just like in the early days of Bard/Gemini, I know this will only get insanely better - the UI/UX especially.

Dey Well


I use a Linux distro as my daily driver (I'm a Linux bro, and my phone is still awaiting repairs). I opened my Firefox Developer Edition, tried opening YouTube, and found I had been logged out. Tried Gmail, same thing - my other Gmail accounts too. Gmail redirected me to the URL at [0] to re-authenticate.

Anybody else?

[0] https://workspace.google.com/gmail/


> Also, could you consider making each fourth (or first) column a very slightly lighter grey

This could be component logic: a row of dropdowns for customizing the UI, with color and grid count as good examples. It could even be a TOML/JSON config file that can be imported/exported.

My own addition would be the ability to import samples from my own device.
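
Something like this rough sketch could cover both ideas - the key names are made up, just to show the shape:

  # Hypothetical import/export-able UI config (all key names are invented).
  import json

  default_config = {
      "grid": {"columns": 16, "highlight_every": 4},    # e.g. lighten every 4th column
      "colors": {"background": "#1e1e1e", "highlight": "#2a2a2a"},
      "samples": ["kick.wav", "snare.wav", "hat.wav"],  # user-imported samples
  }

  # Export / import round-trip
  with open("ui_config.json", "w") as f:
      json.dump(default_config, f, indent=2)

  with open("ui_config.json") as f:
      config = json.load(f)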


Importing samples is on my TODO list. Thank you for the feedback!


> From the official SQLite Database File Format page.

The maximum size database would be 4294967294 pages at 65536 bytes per page or 281,474,976,579,584 bytes (about 281 terabytes).

Usually SQLite will hit the maximum file size limit of the underlying filesystem or disk hardware long before it hits its own internal size limit.
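
Quick sanity check on that figure:

  # Maximum page count times maximum page size (numbers from the SQLite limits doc).
  max_pages = 4294967294        # 2**32 - 2
  max_page_size = 65536         # 64 KiB
  print(max_pages * max_page_size)  # 281474976579584 bytes, i.e. about 281 TB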


The Kioxia LC9 is sold in capacities up to 245TB, so we're probably no more than a year away from a single disk holding more than 281TB.


"Usually"? I'm not saying there are literally no computers in existence that might have this much space on a single filesystem, but...has there ever been a known case of someone hitting this limit with a single SQLite file?


That's just 10 30TB HDDs. Throw in two more for redundancy and pool them in a single ZFS raidz2 (a fancy RAID6). At about $600 per drive that's just $7200, or half that if you go with 28TB refurbished drives (plus another drive to make up for the lost capacity). That is in the realm of lots of people's hobby projects (mostly people who end up on /r/datahoarder). If you aren't into home-built NAS hardware, you can even do this with stock Synology or QNAP devices.

The limit is more about how much data you want to keep in sqlite before switching to a "proper" DBMS.

Also, the limit above assumes the foresight to know your database will be huge. In practice most SQLite files use the default page size of 4096 bytes, or 1024 if the file was created with a version from before 2016. That limits your file to 17.6TB or 4.4TB respectively.
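
In numbers, with the same page-count cap:

  # Same maximum page count, different page sizes (a quick check).
  MAX_PAGES = 4294967294                # 2**32 - 2
  print(MAX_PAGES * 4096 / 1e12)        # ~17.6 TB with the 4096-byte default
  print(MAX_PAGES * 1024 / 1e12)        # ~4.4 TB with the old 1024-byte default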


Last week I threw together an 840TB system to do a data migration. $1500 used 36-bay 4U, 36 refurbished Exos X28 drives, 3x12 RAIDz2. $15000 all in.
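
Rough capacity math, for anyone checking:

  # 3 vdevs, each a 12-wide raidz2 of 28 TB drives (ignoring ZFS overhead).
  vdevs, width, parity, drive_tb = 3, 12, 2, 28
  print(vdevs * (width - parity) * drive_tb)  # 840 TB usable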


Where did you source the drives?


Never underestimate an organization's ability to throw money at hardware and run things _far_ past their engineered scale, as long as performance stays good enough to avoid critical infrastructure changes that, while necessary, would take real engineering.

Though, to be fair to those organizations, it's amazing the performance you can get out of a quarter million dollars of off-the-shelf server gear. Just imagine how much RAM and enterprise-grade flash that budget buys alongside AMD's or Intel's highest-bin CPUs!


Poking around for only a minute, the largest SQLite file I could find is 600GB https://www.reddit.com/r/learnpython/comments/1j8wt4l/workin...

The largest filesystems I could find are ~1EB and 700PB at Oak Ridge.

FWIW, I took the ‘usually’ to mean usually the theoretical file size limit on a machine is smaller than theoretical SQLite limit. It doesn’t necessarily imply that anyone’s hit the limit.


Wondered the same thing. That's a lot of data for just one file!

Did a full-day deep dive into SQLite a while back; funny how one tiny database runs the whole world—phones, AI, your fridge, your face... and like, five people keep it alive.

Blows my mind.


> I'm not saying there are literally no computers in existence that might have this much space on a single filesystem

I don't use it for sqlite, but having multi-petabyte filesystems, in 2025, is not rare.


With block level compression you might manage it. But you'd have to be trying for it specifically.


I've seen bigger files on HPC systems. Granted, they were not generated intentionally - but still, they existed.


Kudos to you and the journey. I appreciate your honesty about giving up on the open-source alternative; a fairly cheap paid option for something you get to keep as your own is not a bad tradeoff.

Your story resonates. I am a self-taught creative and I get stubborn at times about wanting to use/bend a specific technology/tool to achieve a task; maybe it's a sunk cost fallacy OCD thing.

Your site design has character.

PS: Bookmarked your site in my Firefox browser with the tags: fonts, developer-blog, creative-sites, boutique-designs...


> They say that you remember more when handwriting than when typing

I believe that too.

Writing is one of my tools for taming my ADHD tendencies. I have journals of different sizes and when I am in the zone, I capture the moment in my own words and in my own way. I draw lines and art on my notes and just scanning a few lines on another day instantly immerses me back in the moment when the ideas/words/thought hit me.

For those who type better than they write, I don't see any reason to not do that, even though for me, it's pen on paper.

We're all different, and I am a strong believer in the idiom that goes "One man's food..."


@Terretta > So close.

What do you think is missing from Prince’s pay-per-crawl to fully bridge the gap between user experience and sustainable content monetization?

I'm curious to hear your take on this. Do you think new, simple sustainable content models will emerge to support the diverse content the current web offers, or are we heading for a significant consolidation of information where only the largest players can survive?

I am thinking along the lines of a new open protocol for for-profit web content; content creators could even be paid at the DNS level. Technical users could roll their own SSCP (simple sustainable content protocol) linked to a blockchain wallet or similar, and every user's browser would carry an SSCP wallet for content spend.
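
Purely hypothetical, but the shape I'm imagining is something like this (every name below is made up):

  # Hypothetical SSCP-style flow: a creator publishes payment terms (say via a DNS
  # TXT record or a .well-known endpoint), and the browser's wallet decides to pay.
  from dataclasses import dataclass

  @dataclass
  class SSCPTerms:           # invented record format
      wallet_address: str    # where the micropayment goes
      price_per_page: float  # in whatever settlement unit the wallet uses

  def fetch_terms(domain: str) -> SSCPTerms:
      # A real protocol would look this up from the domain; hard-coded here.
      return SSCPTerms(wallet_address="creator-wallet", price_per_page=0.001)

  terms = fetch_terms("creator.example")
  if terms.price_per_page <= 0.01:  # the user's per-page spending cap
      print(f"pay {terms.price_per_page} to {terms.wallet_address}, then load the page")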

Always enjoy your thoughtful perspectives.


Not OP, but I think one way things shake out (at least for news) is this:

Step 1: News publishers offer an LLM-friendly private API to OpenAI and others. Maybe they create some special templated markdown articles.

Step 2: OpenAI offers a subscription add-on: Want up to date news for your chat sessions? Pay $5/month, and we will give a bundled "News" tool from various paywalled journals. Maybe NYTimes and Economist and whatnot.

Step 3: Now, you can ask ChatGPT to give you current events in the morning.

Now you can imagine this sort of market for other useful things as well. Basically, selling curated information to chat bots.

You can play around with the exact arrows here. Maybe you instead buy an LLM access key directly from NYTimes, and then you can plug it into multiple chat providers.

The idea is that you create a market for first-party LLM MCP or similar. Like maybe one for YouTube tooling bundled together with YouTube premium.

I'm not sure what people would pay for. But I think it should be centred around curation and new information somehow.
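
Concretely, the plumbing could be as thin as this - none of these names or fields are real, it's just to show the shape of selling curated info to chat bots:

  # Hypothetical first-party "news tool" a chat provider could call on a user's
  # behalf once they hold a publisher subscription/key. Everything here is invented.
  from datetime import date

  def news_tool(publisher_key: str, topic: str) -> list[dict]:
      """Stand-in for a paywalled publisher API returning LLM-friendly articles."""
      return [{
          "source": "Example Journal",
          "date": str(date.today()),
          "topic": topic,
          "body_markdown": "## Headline\nTemplated summary the model can cite...",
      }]

  articles = news_tool(publisher_key="subscriber-key", topic="morning briefing")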

