
Yeah, and everyone will know you did it for the money.


Isn't pretty much everyone working at OpenAI already clearly motivated by money over principle? OpenAI had a very public departure from being for-good to being for-money last year...


Everyone works for money unless you are refusing to take your salary.


Lots of people working for AI labs have other AI labs they could work for, so their decisions will be made based on differences of remuneration, expected work/role, location, and employer culture/mission.

The claim above is that OpenAI loses to other labs on most of the metrics (obviously depends on the person) and so many researchers have gone there based on higher compensation.


That's not what the phrase means: when you decide to take a vastly less lucrative offer, you're working for something other than money.


How many people do that out of the working population?


Millions of people take a noticeable pay cut, and it suppresses wages in many fields.

It’s one of the reasons so many CEOs hype up their impact. SpaceX would’ve needed to pay far higher compensation if engineers weren’t enthusiastic about space, etc.


Probably most people working for non-profits or any level of government.


Arguably anyone who's working on something they're "passionate" about.


Well - only if they had alternatives.


Obviously this is not the case, and you're deliberately choosing to misunderstand the point.


> OpenAI had a very public departure from being for-good to being for-money last year...

Were they ever “for good”? With Sam “let’s scam people for their retinas in exchange for crypto” Altman as CEO? I sincerely question that.

There was never a shift, the mask just fell off and they stopped pretending as much.


It was originally called "open" and run as a not-for-profit and a lot of people joined - and even joined the board - on that understanding.


I'm not sure that's an answer to the question of whether or not it was ever for good.


It’s not like tech companies have a playbook for becoming “sticky” in peoples’ lives and businesses by bait and switch.

They still call it “open” by the way. Every other nonprofit is paying equivalent salaries and has published polemics about essentially world takeover, right?


Who would have believed it in the first place? Not I.


There are options other than money and virtue signaling for why you'd work a given job.

Some people might just like working with competent people, doing work near the forefront of their field, while still being in an environment where their work is shipped to a massively growing user base.

Even getting 1 of those 3 is not a guarantee in most jobs.


While your other comment stands, there is no separating yourself from the moral implications of who you're working for.

If your boss is building a bomb to destroy a major city but you just want to work on hard technical problems and make good money at it, it doesn’t absolve you of your actions.


I don't see how this counters my point.

If you worked at OpenAI post "GPT-3 is too dangerous to open source, but also we're going to keep going", you are probably someone who is more concerned with the optics of working on something good or world changing.

And realistically, most people I know well enough who work at OpenAI and wouldn't cite the talent, the shipping culture, or something similar are people who love the idea of being able to say they're going to solve all humanity's problems with "GPT 999, Guaranteed Societal Upheaval Edition."


> There are options other than money and virtue signaling for why you'd work a given job.

Doing good normally isn't for virtue signaling.


Working at an employer that says they're doing good isn't the same as actually doing good.

Especially when said employer is doing cartoonishly villainous stuff like bragging about how they'll need to build a doomsday bunker to protect their employees from all the great evi... er, good, their ultimate goal would foist upon the wider world.


Good point. I was thinking of the "actually doing good" case. Absolutely there's a lot of empty corporate virtue signalling, and also some individuals like that. But there are still individuals who genuinely want to actually do good.


Are people sacrificing 40 hours of their lives every week to mega corps for anything other than money???


40?!? That's not hardcore at all!


I think most of us work for money ;)


As opposed to?


I'm really confused by this comment section. Is no one considering the people they'll have to work with, the industry, the leadership, the customers, the nature of the work itself, the skillset you'll be exercising... literally anything other than TC when selecting a job?

I don't get why this is a point of contention, unless people think Meta is offering $100M to a React dev...

If they're writing up an offer with a $100M sign on bonus, it's going to a person who is making comparable compensation staying at OpenAI, and likely significantly more should OpenAI "win" at AI.

They're also people whom two major players in the space have now judged capable of individually influencing who will win at AI.

At that point, even if you are money motivated, being on the winning team is extremely lucrative when winning the race has unfathomable upside. So it's still not worth taking an offer that puts you on a less competitive team.

(In fact it might backfire, since you do probably get some jaded folks who don't believe in the upside at the end of the race anymore, but will gladly let someone convert their nebulous OpenAI "PPUs" into cash and Meta stock while they coast.)


> even if you are money motivated, being on the winning team when winning the race has unfathomable upside

.. what sort of valuation are you expecting that's got an expected NPV of over $100m, or is this more a "you get to be in the bunker while the apocalypse happens around you" kind of benefit?


$100M doesn't just get pulled out of thin air; it's a reflection of their current compensation. It's reasonable that their current TC is probably around 8 figures, with a good portion that will 10x on even the most miserable timelines where OpenAI manages to reach the promised land of superintelligence...

Also at that level of IC, you have to realize there's an immense value to having been a pivotal part of the team that accomplished a milestone as earth shattering as that would be.

-

For a sneak peek at what that's worth, look at Noam Shazeer: founded an AI chatbot app, fought his users on what they actually wanted, and let the product languish... then Google bought the flailing husk for $2.7 billion just so they could have him back.

tl;dr: once you're bought into the idea that someone will win this race, there's no way that the loser in the race is going to pay better than staying on the winning team does.


Imagine!! I would never live down the humiliation of getting a $100m signing bonus (I'd really like the opportunity to try though).


This isn't punk, nobody cares if you're a ""sellout"".


I believe the Sex Pistols were quite happy to take the man's money! Maybe hippies would have more scruples in that area.


Ehh. I think much less of people who “sellout” for like $450k TC. It’s so unnecessary at that level yet thousands of people do it. $100M is far more interesting



