
OpenAI employees did not want to go work for Microsoft

273 points · 7 months ago · businessinsider.com
greenyoda7 months ago
xazzy7 months ago

Am an employee, signed the letter.

I can't read the article so maybe the content is more nuanced, but the framing irks me.

This all happened really fast and the offer was informal. I can only speak for myself but I do have a lot of respect for modern MS, and would have seriously considered the move if that's what kept the team together and comp was reasonable. I would be surprised if most people felt differently.

mrtksn7 months ago

Obviously MS does some great things, but I wonder what you think about their culture?

MS is trying to make me use Edge Browser by randomly refusing to let me use Bing chat in any other browser. This makes me think that MS is still evil and will force me to do things the moment they can. I gave up Bing Chat because of this and was using Perplexity.ai instead, until Bing got integrated into ChatGPT.

Another thing is that the feel of MS products is very different from OpenAI's. For example, Bing chat again would have a strange UX where it counts down my message allowance on the topic as I keep asking follow-up questions, as if it were designed for their convenience and not mine. OpenAI products, on the other hand, feel much more pleasant to use and don't stress me out even when things don't work as expected, which makes me think that the approach to product design is very different from MS's.

Same tech running on the Azure servers, completely different product experiences. IMHO this points to completely different mindsets.

scarface_747 months ago

OpenAI doesn't have any meaningful "products"; they have good technology, and they charge $20 a month for a subscription to a technology demo that doesn't come close to covering their costs.

You think the employees at “Open” AI care about anything more than their large paychecks no matter what they say?

zombiwoof7 months ago

* this *

Notice how the one OpenAI employee posted he'd "consider it" if the comp was right. I.e., it's about the money.

andrewjf7 months ago

Yep. Someone is going to get the money, I’d like it to be the employees.

lumost7 months ago

I'd be curious whether $20/mo is enough to cover their costs. It's opaque what their inference setup is, or what the final economics of LLMs will be. With 30-100 billion parameter models becoming available that are within spitting distance of GPT-4, and dedicated ASIC implementations… OpenAI might be charging too much.
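
As a rough sanity check, here's a minimal back-of-envelope sketch; every input below is a made-up assumption for illustration, not a real OpenAI figure:

    # Does $20/mo cover serving costs? All inputs are hypothetical assumptions.
    cost_per_1k_tokens = 0.002      # assumed blended inference cost, $ per 1K tokens served
    tokens_per_message = 1_000      # assumed prompt + completion tokens per exchange
    messages_per_month = 1_500      # assumed heavy-user message volume

    monthly_tokens = tokens_per_message * messages_per_month
    monthly_cost = cost_per_1k_tokens * monthly_tokens / 1_000

    print(f"Assumed serving cost: ${monthly_cost:.2f}/user/mo vs. a $20/mo subscription")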

a_bonobo7 months ago

It is implied in OP's reply too, right? No word about 'hey, we're giving powerful tech to a hyper-monopolist'; it's only about your friends and the money.

staunton7 months ago

> MS is still evil

All big corporations are "evil", as in, their decisionmaking is scaled and institutionalized enough to effectively implement the goal of "maximizing shareholder value" above what's good for you or society.

mrtksn7 months ago

No, not all companies try to exploit their strength in one area to make me use some other product that I have no interest in.

I'm not talking about "hey, we have this product that you might like"; I'm talking about "if you want to use this product, you must also use this other product". There's also no technical reason for it: sometimes they do let me use it.

Not O.K.

osigurdson7 months ago

What organizations are interested in "what is good for you or society"? Even if they start out that way, eventually corruption sets in. I'd rather have companies with a clear profit motive instead of a series of fake motives and increased regulatory capture.

jrockway7 months ago

I don't think any of that really affects the day-to-day at OpenAI if they got bought. There are plenty of counter-examples to forcing you to use Edge or ads in the Start menu. They made TypeScript. They made VS Code. Those teams are just doing their thing, and it's very likely that OpenAI would be like that. Microsoft is past the world of "everyone must use our browser and OS"; they really just want to sell any of their products if you'll buy them. Like, I use Outlook on my iPhone because my organization uses Office 365. That would have been unheard of a couple decades ago; Outlook and Windows were their moneymakers, and if you wanted one you got both. But now, it's just teams doing their own thing.

The likely effect on day-to-day for OpenAI would have been a different promotion process, their stock being worth $0, and having to use a different videoconferencing solution. I worked at a startup that got bought by a huge company, and that's basically all that changed for us. Our team continues to work on our product, and I'm only vaguely aware of what the rest of the company even does. (I think they make servers? Like solder together chips and put them in the box.) Day-to-day doesn't change much; it's the same people, the same code, and the same goals. I didn't get bought by Microsoft, but I can't imagine that these strategic acquisitions are much different across the Fortune 500.

wkat42427 months ago

> Microsoft is past the world of "everyone must use our browser and OS"

OS maybe but browser? They're going to great lengths to push Edge. Including harassing customers trying to download Chrome or Firefox.

anuraaga7 months ago

I suspect many developers, including OpenAI engineers, are using VS Code, maybe TypeScript for frontend engineers, and GitHub for sure, which had no real features like pull request approvers before Microsoft money came in. Less so at OpenAI most likely, but C# and .NET are going strong, with bleeding-edge tech like .NET chiseled container support. And I've seen great support from their engineers working on OSS, best of FAANG.

Google created the meme that a company could be evil or not, but we're past that age. Let's focus on the experiences and, while not forgetting the past (yes, the 90s do not paint MS well), be forgiving. MS seems to continue to develop and even innovate on a lot of this tech; painting them as an evil dinosaur seems frankly ridiculous. Yes, as with any large company, there will be parts that are good and parts that are not.

Disclaimer: I have never worked at Microsoft and own no stock directly, though I do have ETFs. And through OSS I have some friends there whom I strongly respect and think are awesome, and I always think of them any time I see hate talk towards MS, which is unfortunately common...

devjab7 months ago

Most of OpenAI's tech stack is AI tooling, but basically everything you interact with is written in Python. Nobody uses C# for anything serious, because why would you?

It's not a bad language, it's just not a very good language either. Need efficiency? Rust/C++. Need an all-round language? Node or Python, which are less efficient but still powerful enough to power Instagram (and, well, OpenAI) as far as Python goes, and LEGO as far as TypeScript goes. Realistically you're looking to choose between C#, Java and Go. Both Java and Go are miles ahead of C# in terms of concurrency, I mean, C# is still stuck with "await" after all, and while I guess you can have a lengthy debate on C# vs Java, 9 gazillion large companies use Java while 0 use C#.

It’s not that C# is bad, like I said. It’s never really been better than it is now, it’s more a question of why would you ever use it? Even the C# developers at Microsoft admit to prototyping using Python because it’s just so much faster to build things with it, and while they do move things to C#, you have to wonder if they would if they weren’t working for Microsoft.

ParetoOptimal7 months ago

> painting them as an evil dinosaur seems frankly ridiculous.

Hard to understand this point of view with:

- vscode being "open source", conveniently leaving out flagship features like live share and Python extension

- Increase of "in your face, fuck you" dark patterns like forcing an account for Windows setup, forcing ms authenticator, etc

vthallam7 months ago

> >> and comp was reasonable

There's no way you guys would get the same comp though? Like, not even close. MSFT, irrespective of how much it wants you, is not going to honor the 10X increase in valuation from the upcoming $90 billion valuation.

Even if they do, you will miss on the upside. MSFT stock is not going to 10X but OpenAI's might.

blagie7 months ago

I think if things went that far, OpenAI's valuation would very quickly pop to zero.

Real MSFT stock beats a theoretical could-have-been with a 10x upside.

gretch7 months ago

> Real MSFT stock beats a theoretical could-have-been with a 10x upside.

As someone who has done startups, I highly agree with this statement.

But a lot of people doing startups for the first time (including my younger self) don't understand the headache of private-market equity, and thus do not apply the correct discount.

As they say, a bird in the hand is worth two in the bush.

zamalek7 months ago

> there's no way you guys would get the same comp though?

Nadella was directly involved, and he's way smarter than that. Comping one of the best ML teams in the world correctly is child's play compared to undoing Ballmer's open source mess.

capableweb7 months ago

LinkedIn data says ~44% of the OpenAI team is "Engineering", the rest is operations, human resources and sales.

So most likely, if most of the employees moved over to Microsoft, they wouldn't get the same comp, at most 44% of the company would.

zamalek7 months ago

> 44% of the OpenAI team is "Engineering"

Yes, the "you guys" being the average HN demographic, per the GP comment.

HWR_147 months ago

Why wouldn't MSFT honor the increased comp? They are still investing money in OpenAI at that multiple.

They would have to create some kind of crazy structure to avoid wrecking their levels. But of course it could be done.

ponector7 months ago

Don't forget that with new rounds and new valuation there is also a dilution of shares.

If the valuation goes up 10x after the next round, the average developer will probably have the same money locked in RSUs/options.

aetherson7 months ago

Uh, what? Are you claiming that if a company experiences a 10x (!) increase in valuation, that the dilution fully destroys that upside and the average developer experiences no increase in their comp? That is not even vaguely close to true in my experience.

filoleg7 months ago

> Are you claiming that if a company experiences a 10x (!) increase in valuation, that the dilution fully destroys that upside and the average developer experiences no increase in their comp?

Not the person you are replying to, but it depends on how diluted it gets. If they print 9x of currently outstanding shares as the value goes 10x, it would result in those original shares being worth exactly the same as before the 10x jump.
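
A quick sketch of that arithmetic, with made-up share counts purely for illustration:

    # Dilution sketch: hypothetical numbers, not OpenAI's actual cap table.
    valuation_before = 9e9                            # assumed pre-round valuation
    shares_before = 1e9                               # assumed shares outstanding
    price_before = valuation_before / shares_before   # $9.00 per share

    valuation_after = 10 * valuation_before           # valuation jumps 10x
    shares_after = 10 * shares_before                 # 9x new shares get printed
    price_after = valuation_after / shares_after      # still $9.00 per share

    print(price_before == price_after)                # True: the dilution fully offsets the 10x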

But I agree with you overall, in terms of the actual reality. I don’t think any company with half a brain would do that.

IshKebab7 months ago

> Even if they do, you will miss on the upside. MSFT stock is not going to 10X but OpenAI's might.

I'm pretty sure it doesn't work like that due to the existence of leverage. You can make MSFT have the potential to 10x (or ÷10) if you want.
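
To make that concrete, here's a tiny sketch; the leverage ratio and price moves are arbitrary assumptions, ignoring financing costs:

    # Leverage sketch: all numbers are arbitrary, for illustration only.
    equity = 1.0
    leverage = 10
    position = equity * leverage     # hold 10 of MSFT, financed by 9 of debt
    debt = position - equity

    def equity_after(move):
        # value of your equity after the underlying moves by `move` (0.9 = +90%)
        return position * (1 + move) - debt

    print(round(equity_after(0.90), 6))    # 10.0: roughly 10x your equity on a +90% move
    print(round(equity_after(-0.10), 6))   # 0.0: a -10% move wipes the levered position out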

anupsurendran7 months ago

100% vthallam. The upside for OpenAI is much higher.

smileysteve7 months ago

.... Unless Sam Altman, major research leaders and 50%+ of the employees had transferred to Microsoft

scarface_747 months ago

How much of that valuation do you really think would stand once OpenAI doesn't get the massive subsidies from MS on compute? OpenAI couldn't even continue as a going concern.

andsoitis7 months ago

The conversation in this thread makes me wonder whether the right incentive for OpenAI should be to make tons of money.

Personally, I prefer to make tons of money, but as someone who will not have a say in whether to participate in an AI-driven world, I would prefer if there were non-profit counterweights. Perhaps governments will have the wherewithal to steer things but I am somewhat concerned.

scarface_747 months ago

You really think “non profits” are any less power hungry than for profits?

rurp7 months ago

I'm not the original poster, but yes I do. Sure there are plenty of examples of power hungry non-profits and once in a while a for-profit company acts altruistically, but on average I think that there's a difference. If we randomly picked some executives from either type of company I would bet that the people in the non-profit group would more often have motivations beyond pure power and money accumulation.

sirspacey7 months ago

Having worked in non-profits and with a community of people who do, there’s not a difference.

Large non-profits optimize for revenue, small ones for impact.

The big fish eat the little ones or turn them into impact sharecroppers.

andsoitis7 months ago

> You really think “non profits” are any less power hungry than for profits?

It is human nature for people to want power, whether it is power over others or the power to have autonomy, or anything in-between.

trompetenaccoun7 months ago

I'm not a fan of "as an X" posts, unless there is evidence the user actually is X. With an anon account, not much history on HN and no apparent link to OpenAI, for all we know this might as well be someone from the Microsoft PR team.

VirusNewbie7 months ago

If you're ok with sharing, did you sign it because you are more aligned with the productizing of GPT etc, or is it that you truly believe Sama is the CEO to follow? Or a combination of both?

clwg7 months ago

I can't even imagine what it was like to experience this from the inside, with the entire world speculating on this drama, especially considering that the day before, everyone was celebrating what you and your team were creating.

I have to assume that the environment there was pretty exciting as well, and I hope the energy wasn't ruined by this. I like being surprised by technology and I love seeing people innovate because of what you and your team have pushed to the forefront. Godspeed.

strangattractor7 months ago

Smart - If it ain't written down it is not an offer.

wintogreen747 months ago

>> and comp was reasonable

A lot of complex issues and perspectives packed into those few words...

replwoacause7 months ago

I thought the offer from MSFT, albeit unofficial, was that anyone who made the switch kept their current salary.

shwaj7 months ago

A big part of the comp is in equity, and since OpenAI has an uncommon equity structure it is unclear how that would translate to Microsoft stock.

dmazzoni7 months ago

What about the non-salary part of comp?

For most employees, their OpenAI stock would have been worth even more than their salary at its current valuation, and it has the potential to be worth quite a bit more in the future.

Replacing it with Microsoft stock would have made it a sure thing - but also with much less growth potential.

I'd be really curious to hear if Microsoft actually got so far as to figure out what to offer OpenAI employees in terms of an equity offer.

Atotalnoob7 months ago

OpenAI doesn’t have stock.

They have PPUs, which are similar to profit sharing.

rgbrgb7 months ago

I would guess a lot of OpenAI employees are sitting on some pretty valuable stock/options if the company doesn't implode.

imjonse7 months ago

Hence the overwhelming number of hearts in the Twitter messages asking for the return of Sam Altman.

6gvONxR4sf7o7 months ago

> Given the absence of interest in joining Microsoft, many OpenAI employees "felt pressured" to sign the open letter, the employee admitted. The letter itself was drafted by a group of longtime staffers who have the most clout and money at stake with years of industry standing and equity built up, as well as higher pay. They began calling other staffers late on Sunday night, urging them to sign, the employee explained.

What a clusterfuck. I feel bad for anyone who supported the board.

staunton7 months ago

Anyone who supported the board was in a situation where they deemed they could afford to go against the peer pressure. This is a combination of acting according to their beliefs/values and economic security. Those people should be envied, not pitied.

kbenson7 months ago

We should envy people that stand firm on their beliefs and are vindicated, not those that may experience backlash in the future because people that had a different view won in the end. Suffering hardship for doing what you think is right is not something to envy.

Don't envy martyrs, wish for a world without the need for them.

scarface_747 months ago

How many people will stand by their “beliefs” at the cost of millions of dollars?

kbenson7 months ago

Plenty, but I suspect most of them already have millions of dollars so it's relatively much less of a sacrifice.

But for someone who's not a millionaire that looks at the situation and doesn't know how it will shake out, and maybe believes it's 60/40 or 75/25 against what they think is right, it's possible that principle can make up the difference in those assessments and make the difference in what they choose.

Just because someone isn't passing up a "sure thing" doesn't mean they aren't sacrificing something, especially when most sure things aren't actually sure at all. In reality it's all a matter of statistical likelihoods (and we either think of them that way, or like most people use a different model of thinking that amounts to the same thing while convincing us it's not), so I don't see this situation as being all that different.

6gvONxR4sf7o7 months ago

Sorry I meant the people who privately supported the board but were peer pressured into signing.

Tenoke7 months ago

This isn't the board though. This is the against-the-board side.

what_ever7 months ago

That's why they feel bad for those who supported the board, as the ones that opposed the board may not have done it just based on the board's actions.

ghaff7 months ago

Is it actually news that 70% (or whatever) of the employees at a hot startup wouldn't go to work for Microsoft even if they kept their compensation packages (which would probably have a lot of asterisks attached) because of executive suite drama?

andy997 months ago

It's "Business Insider". Don't underestimate how poorly management understands "technical resources". I can certainly see lots of leadership just assuming that pay is the only relevant variable and ignoring culture entirely during an acquihire, assuming people are fine doing whatever as long as they're paid.

JohnFen7 months ago

To be fair, there are a lot of devs for whom compensation is the only thing that really matters -- they are overrepresented here on HN, even. It would be pretty easy to assume that's the majority point of view.

It also may be that's exactly the sort of person that Microsoft prefers, too, but I don't know.

ghaff7 months ago

It's an open question whether moving to Microsoft would have been a good deal or a bad deal. But I'm pretty sure that most people seriously contemplating a move were definitely considering the dollars, even if not exclusively. (Although I expect most people signing a petition were not actually serious.)

GVIrish7 months ago

Microsoft pay isn't the best in the industry so if compensation is the only thing that matters to someone, Microsoft shouldn't even be in their top five.

ElevenLathe7 months ago

Even from a compensation point of view, they presumably have a better chance for a big exit of some kind at a startup vs as a Microsoft employee.

scarface_747 months ago

A startup that was going to implode? And if you think random “some kind of startup” will have a realistic chance of some type of exit, I got a startup idea where you can be an unpaid employee for “equity”.

In what world, in today's market, do you really think OpenAI would ever see its value at an exit based on their current revenue model?

ElevenLathe7 months ago

The choice we're talking about is between:

1. Bail to Microsoft. Your salary is matched but your equity (in whatever OpenAI entity) is forfeit. You have to move to Washington potentially, and definitely have to deal with job switching stress. If you're lucky, you will get the same corporate RSU grant that the Office team got for shipping on time.

2. Stay at OpenAI. You don't have to move or have the stress of starting a new job. Maybe it implodes and you have to take your very-marketable skills elsewhere. Maybe it all works out and your equity is worth millions.

I know which I would pick.

rob747 months ago

It's not just the compensation however, the article explains that by switching to Microsoft they would have lost their equity packages, which are worth even more than their salary...

ghaff7 months ago

Potential equity packages. But hence my comment about asterisks. Maybe moving to Microsoft would be a good financial deal, maybe a bad deal, but certainly a different deal. Especially in the absence of a formal deal.

hardlianotion7 months ago

I dunno, the company had already demonstrated its ability to drastically affect its value to the downside.

reducesuffering7 months ago

Either OpenAI leadership was paying lip service to its non-profit mission in hopes of riches and fame, or they are profoundly naive. Google learned this a long time ago: they tried not to care about stock price or monetary incentives, but the most pressing thing Googlers always cared about was that their stock price went up, when they had $200k - $10m of compensation on the line. OpenAI hired people with huge pay packages, and surprise surprise, the primary thing on those people's minds is making their stock go up, the very thing OpenAI was founded to prevent.

reso7 months ago

I am still so confused by this whole saga. No one has explained how or why the entire board decided to do a coup, and then simply changed their minds a few days later.

I have to assume that the individuals involved are under a mix of social and legal pressure to not talk about the circumstances.

tivert7 months ago

> I am still so confused by this whole saga. No one has explained how or why the entire board decided to do a coup, and then simply changed their minds a few days later.

I don't know how anyone can call the board's action a "coup." Calling it that seems to be a propagandistic abuse of the term.

The board was in charge, and it's not a coup if it fires a subordinate (the CEO). If anything, the coup was getting the board ousted.

PepperdineG7 months ago

>I don't know how anyone can call the board's action a "coup." Calling it that seems to be a propagandistic abuse of the term.

>The board was in charge, and it's not a coup if it fires a subordinate (the CEO). If anything, the coup was getting the board ousted.

The CEO was a member of the board and the ones that fired the CEO also fired the Chairman of the Board, so the board went from 6 to 4. So far there's been even less of an explanation for the firing of the Chairman of the Board - who they offered to let remain as a regular employee - than there has been for Altman, though I see the removal of the Chairman as potentially the most egregious and coup-like.

nicce7 months ago

Is it still a coup?

Why do we have voting on boards and democracy in the first place if one cannot use it for its intended purpose without fear of retaliation?

They clearly could have been more transparent, but that still does not make it a coup.

If you think so, that is why populism is considered a threat to democracy.

reso7 months ago

Sure, it was not my point to suggest it wasn’t legal, just that it was a bold action which has gone completely unexplained.

darkerside7 months ago

I think the coup is on the part of other employees who advocated for the board to fire Altman. No judgement on whether that was justified or not.

tivert7 months ago

> I think the coup is on the part of other employees who advocated for the board to fire Altman. No judgement in whether that was justified or not.

The GP was pretty clear that he thought "the entire board decided to do a coup," which does not fit that interpretation.

But even the scenario you describe isn't something that can be properly described as a "coup." In that case, the employees are just appealing to a legitimate higher authority, which is a totally OK thing to do (e.g. it's not wrong to report a bribe-taking boss to the company ethics hotline). IMHO, a coup is where subordinates illegitimately usurp and depose the highest authority from below.

darkerside7 months ago

I probably missed that in the GP comment.

I don't think invoking a higher power necessarily makes it not a coup. To me, a coup is when you angle to relieve someone from office above you by any means. But reasonable people can disagree on that, I suppose.

richbell7 months ago

The speculation seems to be that Sam was being duplicitous and trying to oust a board member he didn't like. Members of the board compared notes and realized he had misrepresented his conversations with other members in an attempt to build consensus. Throw in the rapid expansion of OpenAI (commercialization being in conflict with the board's vague mandate to do what's best for humanity) and the rumored (now confirmed) deal he was arranging with a company that he had a financial stake in, and they felt like they needed to remove him.

However, by trying to do so swiftly and not allowing him a chance to retaliate they pissed off Microsoft and the employees. At that point, they were basically forced to reinstate him or the entire company would collapse — and if they do that, they can't also go on record clarifying why he's bad.

* this is my vague recollection based on reading past discussions. I'm on my phone right now and unfortunately don't have any sources I can link, take this with a grain of salt.

Edit: a few links

https://news.ycombinator.com/item?id=38559770

https://news.ycombinator.com/item?id=38548404

ethbr17 months ago

Thanks for the citations.

That was my guess... but only because it was the only scenario I could think of where the board being curiously and obviously intentionally vague about 'why' in the announcement, but still saying more than nothing, made sense.

JumpCrisscross7 months ago

Their communication strategy was also juvenile at best.

nicce7 months ago

> No one has explained how or why the entire board decided to do a coup, and then simply changed their minds a few days later.

Board’s job is to hire or fire CEO. Technically CEO made the coup since he managed to throw his bosses out of their positions.

pixelmonkey7 months ago

This recent TIME article lays out the saga pretty straightforwardly and makes it a bit less confusing.

https://time.com/6342827/ceo-of-the-year-2023-sam-altman/

At least, that was how I felt after reading it.

Basically, within the span of a year, OpenAI transformed from a research lab inside a non-profit that was pursuing a seemingly quixotic dream of artificial general intelligence (AGI)... into one of the fastest-growing for-profit software companies of all time via its creation of the chat-based generative AI category (aka ChatGPT) and its consumer/enterprise SaaS and API offerings.

The board -- or, at least, its 4 remaining non-CEO members -- thought that this was too much, too fast, and too soon, and that there was a narrow window of time where they could slow things down and refocus on the original non-profit mission. They also felt that Altman was a bit of a force of nature, had his own ideas about where OpenAI was going, and treated "board management" as one of his CEO skills to route around obstacles.

Once a board loses trust of their CEO, unfortunately, there is usually only one blunt and powerful tool left: firing the CEO.

And this happens pretty often. As the investor Jerry Neumann once put it, "Your board of directors is probably going to fire you."[1] Boards have very few ways to actually take action when they are worried about a company or institution; firing management is one of the few "course correction" actions they can take quickly.

In OpenAI's case, if they had a for-profit board, that board would probably have been ecstatic with Altman and the company's progress. But this was not a for-profit board. It was a non-profit (mission-oriented) board meant to oversee the safe rollout of AGI. Those board members weren't sure the best way to do that was to become one of the world's largest for-profit software companies.

I'd speculate that it was probably an emotional decision and the full implications were not entirely thought through until it was too late. I'd also speculate that this explains why Ilya Sutskever felt some immediate regret, because his goal wasn't to destroy OpenAI (or inspire an employee revolt) but to put its non-profit mission back into focus. I like to practice the principle of charity[2], and, in this case, I think the non-profit board was not acting maliciously but simply did not realize the knock-on effects of trying to replace a CEO when everything at the company seems to be "going right."

I suspect Altman thought the best way to roll out AI was via iterative product development and fast revenue growth to finance the GPU demands, utilizing corporate partnerships (Microsoft), viral word-of-mouth marketing, and SaaS/API fees (ChatGPT). Running out of data center compute started to become a primary concern, so it wouldn't surprise me if safety took a backseat to this. Remember, all this growth happened in the span of a year. Perhaps Altman thought he was satisfying the safety concerns simply by talking to regulators, making iterative releases, and going on a speaking tour about it, but the board thought the only way to go safely was to go slower. I'm sure we'll learn more after some books are written about the episode.

[1]: https://reactionwheel.net/2021/11/your-boards-of-directors-i...

[2]: https://en.wikipedia.org/wiki/Principle_of_charity

dmazzoni7 months ago

I think that all makes sense.

Things would have played out very differently if the board was more experienced and thoughtful. Their hearts might have been in the right place, but their actions were reckless and ultimately backfired.

JoshTko7 months ago

Interesting how you don't place any blame on Altman for not understanding and addressing board concerns. A more experienced CEO would have read the tea leaves.

Clubber7 months ago

>their actions were reckless and ultimately backfired.

What should they have done to accomplish their goal?

alecst7 months ago

That's a great summary, and I feel less confused after having read it. Thanks.

marvin7 months ago

A majority of the OpenAI board members were heavily sympathetic to the AI doomsday cult mostly known through MIRI. A novel discovery spooked them, made them believe that The End Is Near, and that continuing on the path led by Altman would lead to imminent disaster. OpenAI is designed so that the board can do pretty much whatever they deem necessary if this happens, and the board interpreted the necessary action as unilaterally halting AI development -- a staple objective of the AI doomsday cult.

So the board did this. They jumped the gun - this extreme move might have become justified at a later point in time, but definitely not yet. In a pedestrian understanding of game theory typical of AI doomsday cult members, the board failed to anticipate that other actors may not agree with their interpretation of reality, and perform their own actions in response. This response largely boiled down to scattering the institutional knowledge of OpenAI to the winds, in their eyes greatly exacerbating the safety concerns that led to firing Altman in the first place.

Altman, meanwhile, as a strong believer in OpenAI's mission, attempted to salvage the situation by brokering a deal to transfer OpenAI's institutional knowledge to Microsoft.

After having their shitty options explained to them - exacerbate a potential safety risk by seeding 750 new AI companies, leave control of the situation to Microsoft's board, or step down and retain control under the OpenAI non-profit - they folded like a wet paper bag. The whole saga was a waste of their board seats and influence on humanity's first deployment of AGI. They all got the outcome that their competence deserved.

fireflash387 months ago

This doesn't sound biased at all

marvin7 months ago

It's obviously a hypothesis, stitched together from extremely incomplete data. I've said from day one that I'm looking forward to read about this in the history books in 20 years. Maybe Altman's eventual bio will contain the truth.

bogomipz7 months ago

There was a bit of insight today from a board member on her perspective. See:

https://archive.is/Sy3Xm

Clubber7 months ago

That certainly doesn't leave a good taste in my mouth.

huytersd7 months ago

Honestly who cares what Helen has to say. She didn’t belong on a board in the first place.

bogomipz7 months ago

Maybe try to follow the thread where the OP states:

>"No one has explained how or why the entire board decided to do a coup ..."

To which I responded that there was in fact a very recent and relevant update to this. Your opinion on whether she belonged on the board is completely irrelevant to this fact and also childish.

Laaas7 months ago

> This person called the company "the biggest and slowest" of all the major tech companies

Could not be further from the truth.

tfehring7 months ago

My impression is that it's pretty true of Microsoft in general, but not of the AI research teams I'm familiar with. As with any tech company of its size, there's lots of variation from team to team.

gwern7 months ago

It doesn't need to be true, just needs to be what a nontrivial number of OAers think. And given how many people I still see putting down 'M$', I can absolutely believe a dislike of Microsoft is widespread.

Clubber7 months ago

It was in vogue to make fun of M$ in the late 90s and aughts; the Gates/Ballmer era. Even for people using M$ products for a living. I think the moniker just stuck. I don't dislike M$ more than I dislike Google or any other similar tech leviathan.

baz007 months ago

Yeah, I mean, I can think of a fair few better adjectives that are applicable to MSFT than "biggest" and "slowest"...

soulbadguy7 months ago

why ?

hughesjj7 months ago

IMHO, today, Google takes the cake for slowest. Meta is still probably the most agile, and Amazon is super hit or miss (so most variance)

Idk where Microsoft would fit in that hierarchy, like Amazon it's kind of hit or miss but with less variance and extrema. From what I've seen and heard, both Gaming and Azure are pretty darn competent these days from an engineering and product perspective. Not perfect of course, but nothing is.

VirusNewbie7 months ago

Do you think Microsoft would have a competitive foundational model if not for OpenAI? Cause Google and Meta seem like they’re top of the pack there, with AWS at the very least releasing some.

a_wild_dandan7 months ago

Huh, I've heard several people refer to Microsoft as a "retirement home." Maybe that's just an ageism thing.

hughesjj7 months ago

I've heard the same about Google. It's definitely not untrue at MSFT today either.

My hot take is that 2018 Google was late 90s MSFT more or less (juggernaut resting on their laurels and cashing in on their monopoly). Today, they're more like mid 2000s MSFT. Here's hoping 'tomorrow' they'll have a similar comeback.

Nthringas7 months ago

because IBM is both bigger and slower?

satvikpendem7 months ago

IBM is not "major" anymore in the sense of major tech companies that the person above meant.

soulbadguy7 months ago

I don't think IBM is a "major tech company" in the modern lingo.

nostrebored7 months ago

I don't think IBM is something that comes to mind when people are talking "Major tech companies" anymore.

airstrike7 months ago

Hard to argue IBM is bigger

eikenberry7 months ago

Not after the Redhat acquisition.

valine7 months ago

Sounds about right to me. They missed the entire smartphone revolution because they were too slow to adapt their OS (literally their main product) to run on mobile devices.

Laaas7 months ago

They didn't miss the cloud revolution, and certainly not the AI revolution either.

Microsoft is doing absolutely great under Satya Nadella.

soulbadguy7 months ago

> and certainly not the AI revolution either.

The AI revolution is still underway; we don't know who missed what yet. In terms of research output, most of the groundbreaking work did "not" come from MSFT. They just bought their way in. Valid strategy, but definitely not novel.

nijave7 months ago

>They didn't miss the cloud revolution

Imo they barely kept pace largely by leveraging existing software and repurposing it for cloud. They still seem to be playing catch-up with multiple solutions to the same problem--some deprecated, some preview.

Take for instance Postgres. Azure had Single Server built on Windows containers. They acqui-hired Citus to build out distributed and they released Flexible Postgres as a Linux-based replacement for SS. Flexible still doesn't have feature parity with Single Server and doesn't have feature parity with other cloud vendors (pg_wait_sampling roll-up is missing last time I checked a few months back).

To make matters worse, their data migration tools are sorely lacking as well.

Clubber7 months ago

They're the #2 player in cloud, more than double the size of #3 Google, according to this.

https://www.knowledgehut.com/blog/cloud-computing/top-cloud-...

valine7 months ago

That’s more down to dumb luck partnering with OpenAI.

You have a point with the cloud computing. I’d hesitate to call it a revolution though. I’d never willingly use a microsoft cloud product if I wasn’t forced to by my employer.

scarface_747 months ago

And until you get into a position where you can control the direction of a large corporation, it doesn't matter. Microsoft doesn't care about the random idealistic developer.

ipaddr7 months ago

They were not first in either area. They leveraged existing products like Office to catch up.

Infinitesimus7 months ago

Apple has taught us time and time again that you rarely have to be first to the market. Executing well and keeping people locked in are pretty important

zlg_codes7 months ago

Weird, Microsoft's been invisible to me since the early 2000s. I don't think they could sell me anything. Their whole corporate image is just slimy. Like that one friend that wants you to set up an account on something he's got going, or wants you to join him on an investment. You can totally trust him. You'll have a site, some cloud storage, office software, the works. And the investment? Yeah dude it'll rocket any year now.

There's just a feeeew things. Yeah, we're gonna tell you when to restart, we'll update it for you. We'll tell you what you can and can't run on your own machine. We'll "protect you" by trying to keep you inside the Microsoft Store, so you can adjust to buying all of your software instead of getting anything for free. Freedom is bad!

What can Microsoft do for a programmer who self-hosts and doesn't trust proprietary software? They co-opted GitHub but that really just reduced trust in GitHub more than anything. And GitHub itself is proprietary software built on top of Git. VS Code is a laggy mess. Azure has nothing to hook me. LLMs are toys or another privacy invasion vector to "analyze". I don't trust them to store my e-mail.

Everything MS touches dies a slow death. Look at Skype.

huytersd7 months ago

I don’t know if you intended this to be hilarious but it is.

chimeracoder7 months ago

> Sounds about right to me. They missed the entire smartphone revolution because they were too slow to adapt their OS (literally their main product) to run on mobile devices.

Remember the whole antitrust thing? Microsoft was under a consent decree that expired between 2007-2009 (different provisions expired at different times).

That decree, combined with the material threat of additional action during that time window, limited their ability to compete (because that's, well, the entire point of a consent decree motivated by an antitrust settlement).

There's a reason that you see a notable difference in Microsoft's market position and strategy from 2012-present compared to the previous decade (2001-2012). The timing of the Ballmer-to-Nadella transition is not coincidental; it's indicative of the larger shift the company made as they repositioned themselves to aggressively expand again.

scarface_747 months ago

This is completely untrue. Microsoft very much had a mobile phone operating system way before 2007 and before the iPhone.

It was simply a failure of execution. Can you point to one thing in the consent decree that stopped them from competing better in mobile?

baz007 months ago

Actually they did build a really competent smartphone operating system that was as good as iOS at the time quite frankly and was affordable.

The issue is they went and rewrote a chunk of it, breaking APIs and fucked all the developers off, then abandoned it.

The only thing they are is fucking stupid morons.

duped7 months ago

Windows was running on smartphones back in 2002.

The failure of Windows Mobile, and later Windows phone, had little to do with being "slow."

John238327 months ago

You realize that was literally 20 years ago?

valine7 months ago

More like 10 years ago, that’s when Microsoft dropped the ball.

Crosseye_Jack7 months ago

Well, that's the benefit: you then have multiple entities bidding for your labor. You just need to say "Company A will give me the same terms (if not better)"; no need to list A, B, C, D, etc. Even if you just want to stay where you are, you are still in a position to cherry-pick who you want to work for.

Heck, I've done it myself. I was happy where I was but wanted a pay bump, shopped around, went back to my employer, and said: if you pay (highest bid + a percentage) I'll stay, but otherwise I'm out the door. They paid up.

It's the gamble you take. Granted, when most of the staff have the same demands, it puts the company more on the back foot.

Just a note: I've also taken work for less pay but (IMO) better working conditions. It all just depends on what you want your working conditions to be and what you're willing to accept in terms of comp.

neilv7 months ago

> A scheduled tender offer, which was about to let employees sell their existing vested equity to outside investors, would have been canceled. All that equity would have been worth "nothing," this employee said.

> The former OpenAI employee estimated that, of the hundreds of people who signed the letter saying they would leave, "probably 70% of the folks on that list were like, 'Hey, can we, you know, have this tender go through?'"

If that one person's speculation is true, does the non-profit have an alignment problem, with employees who are doing the technical work -- that the employees are motivated more by individual financial situations, than by the non-profit's mission?

(Is it possible to structure things such that the people doing the work don't have to think about their individual financial situations, and can focus 100% on the actual mission? Can they hire enough of the best people for their mission that way? And maybe also keep key technical talent away from competitors that way?)

dragonwriter7 months ago

> If that one person's speculation is true, does the non-profit have an alignment problem, with employees who are doing the technical work -- that the employees are motivated more by individual financial situations, than by the non-profit's mission?

Yes, and moreover they've created a compensation structure which actively creates incentives that are contrary to the charity's mission.

This was probably the easiest way to attract talent that had high-paying alternatives and wasn't particularly interested in the charity's mission, but that was always a fundamental problem with choosing a for-profit entity with those kinds of needs as the primary funding vehicle for the charity, and also the primary means by which it would achieve research, etc., directed at its charitable purpose.

The problem -- taking OpenAI's stated charitable mission at face value [0] -- is that there was nowhere close to enough money available from people concerned with that mission to pay for it, and OpenAI's response was to go all-in on the most straightforward path of raising sufficient funds given what resources it had and what the market looked like without sufficiently considering the alignment of its fundraising mechanism with the purpose for which it was raising funds.

[0] which I should emphasize that I do for the sake of argument, not because I necessarily believe that it represents the genuine purposes of the people involved even initially.

CaveTech7 months ago

I think it's relatively easy to prove that the main motivation for the majority of employees is not mission alignment, simply by the fact that salaries within OpenAI are in the top few percentile of the field.

I don't believe this is necessarily in conflict with the mission though. Employees are mercenaries, it's up to management/leadership to enforce the mission and make sure employees contribute to it positively. The employee becomes forcefully aligned with the mission because that is the key to their personal enrichment. They are paid to contribute, their personal beliefs are not all that important.

neilv7 months ago

> Employees are mercenaries, it's up to management/leadership to enforce the mission and make sure employees contribute to it positively. The employee becomes forcefully aligned with the mission because that is the key to their personal enrichment. They are paid to contribute, their personal beliefs are not all that important.

I think that might be the norm, but it sounds like an awful dynamic.

It's also unfortunate if you want to do something better. We have many mercenary companies that have appropriated some of the language we might use to characterize something better.

So, say you're trying to found a company with grand ideals, made up of people who care about the mission, where you actively want a diversity of ideas, etc., and almost every sentence you can think of to communicate that gets a bunch of candidates nodding: "Yeah, yeah, whatever, we've heard this a hundred times, just tell me what the TC is for the 18 months I'll stay here".

JohnFen7 months ago

> it's up to management/leadership to enforce the mission and make sure employees contribute to it positively.

But surely, the primary tool that leadership has to do that is by selecting employees who are on the same page as them. A purely mercenary workforce is very undesirable unless the company is also mercenary.

ghaff7 months ago

>Is it possible to structure things such that the people doing the work don't have to think about their individual financial situations, and can focus 100% on the actual mission?

Mostly no. People may not only care about money. But money is pretty important--at least until you get into pretty large numbers. And then it's still a pretty important score-keeping metric.

fwungy7 months ago

As if OpenAI employees would have any problem landing a new gig...

Going from an agile startup environment to an F50 is a huge leap culturally. It's like going from Summer science camp to the army.

wolverine8767 months ago

> As if OpenAI employees would have any problem landing a new gig...

They'd have problems landing new gigs where they worked on OpenAI, with OpenAI's resources, team, etc.

Der_Einzige7 months ago

It's the opposite though. MSFT is the classic rest-and-vest company. OpenAI likely has bad WLB. Like going from the army to summer camp.

simplypeter7 months ago

Why would OpenAI employees want to jump ship to Microsoft, especially when Microsoft's been slashing jobs and freezing pay? Doesn't really add up for me.

phillipcarter7 months ago

Being the CEO's pet AI lab is one of the best positions to be in, as it's actually the rare combination of being able to do innovative work but getting all the resources of big tech to do it. Satya is no muppet; he knows that if he were to absorb them he'd have to keep anyone from messing with them. They would be the most protected class of employee at MSFT.

The main reason why you'd not want to do it is that Startup Funny Money stock options won't be as high as they might be if you'd stay independent.

slantedview7 months ago

Even in normal times, Microsoft does stack ranking. It's not great.

wvenable7 months ago

I might be wrong but Microsoft ended stack ranking in 2013.

MicolashKyoka7 months ago

Anything from Business Insider should be taken with a mountain of salt. The MSFT deal would've created a separate org for OpenAI, sorta like how LinkedIn & GitHub are operated. So much of the context disappears when you view things through the lens of someone not familiar with the domain.

x86x877 months ago

Shoulda, woulda, coulda. That move was purely to force them to reinstate the ousted CEO. Microsoft would have done anything to protect their investment, but that does not mean that everyone at OpenAI would have benefited from this move - far from it.

Havoc7 months ago

Not sure that part actually matters?

The message of the letter was clear, and the gun being held at the board's head was credible. And what, 90%+ supported it on paper?

Even if many of the 90% are lukewarm & half-arsed, from a leadership perspective that's conclusively game over.

kvee7 months ago

pg talks about how Sam Altman is the most powered person he's ever met. Seems we have a super powerful psychopath running perhaps the most important company in human history.

I do think he legitimately believes he's doing the right thing though all throughout, which maybe makes it more scary.

Sorta like how Mark Zuckerberg seemed to truly believe in Facebook's mission and wound up having all sorts of negative externalities for the world. Mark Zuckerberg just isn't quite as effective as Sam Altman, and it's easier to be suspicious of his motives.

Not to say that psychopaths are necessarily bad. Peter in Ender's Shadow turned out great!

But it does seem dangerous for 1 person to hold so much power over the future of humanity.

Sam Altman's reasoning for him having all the power, I think, is that “short timelines and slow takeoff is likely the safest quadrant of the short/long timelines and slow/fast takeoff matrix.”

If you believe that and believe that Sam Altman having complete control of OpenAI is the best way to accomplish that, everything seems fine.

I'd personally have preferred trying to optimize for long timelines and a slow takeoff too, which I think might have been doable if we'd devoted more resources to neglected approaches to AI alignment–like enhancing human capabilities with BCI and other stuff like that.

wintogreen747 months ago

One old guy in a bubble says another young guy in the same bubble (who he just happened to mentor) is "the most powered person he's ever met."

zlg_codes7 months ago

That whole first part disgusted me. "most powered person he's met"? Good lord does that come off as tone deaf, almost groveling.

And the most important company in human history? The hell is that guy smoking, because I've got good shit and that's some serious hyperbole.

Is the hype machine in the room with us right now?

notahacker7 months ago

The other PG hyperbolic comment about Sam that springs to mind is when he said that meeting Sam felt like what it must have been like meeting young Bill Gates. That's a throwaway comparison from a journalist, but from a bloke who barely interviews anyone who isn't a self-confident workaholic nerd convinced he'll change the world and get rich doing so... it's a bit more of an extravagant comparison.

But then, considering the reputation of young Bill and the one Sam seems to be acquiring, maybe the "powered" traits that apparently set him apart from other YC candidates weren't so positive after all...

Clubber7 months ago

I would guess 90% of tech today is hype, that's what you're reading, the hype machine in practice.

zlg_codes7 months ago

It certainly seems that way my dude. I can't remember the last time I saw a new piece of tech or software and thought "fuck yeah this is revolutionary".

Maybe Git...? I thought that was pretty cool back in 2006.

JohnFen7 months ago

> I do think he legitimately believes he's doing the right thing though all throughout, which maybe makes it more scary

I really think the opposite. I think he's after the biggest payday/most power he can get, and anything else is a secondary consideration.

gkoberger7 months ago

I think you can fairly ascribe a lot of negative attributes to Sam, but an unnatural thirst for money isn't it. Nothing about anything he does makes me think he's motivated by increasing his personal net worth.

JohnFen7 months ago

I don't claim to know what motivates him. I don't know him and have no view into his thinking. I'm just going by what his actions look like to me.

I can't distinguish between a thirst for money and a thirst for power because above a certain level, they're essentially the same thing.

johneth7 months ago

> Nothing about anything he does makes me think he's motivated by increasing his personal net worth.

Not even the weird shitcoin with the eye scanner he's been pushing (WorldCoin)?

Based on the last 5 years of crypto hype and failures across the industry, the only motivating factor to get involved in it seems to be 'increase their personal net worth'.

kvee7 months ago

He has said in podcasts he is motivated not by the money but by the power he has at OpenAI

encoderer7 months ago

Everything else aside - in what world is Sam Altman “more effective” than Zuck? How do you even define effective?

kvee7 months ago

In this case I think I just mean more effective at seeming good to others.

I think they both believe they are good and doing good.

People tend to be more suspicious of Mark Zuckerberg's motives than Sam Altman's.

Sam Altman himself even said he can't be trusted, but that it was OK because of the company structure; and then, when he needed to, he overpowered that structure he claimed was necessary: https://x.com/tobyordoxford/status/1727624526450581571?s=20

tester7567 months ago

>perhaps the most important company in human history.

holy shit, hype is unreal :D

superb_dev7 months ago

There’s a lot to be said about Altman, but calling him a “psychopath” is just wrong. It’s a legitimate medical term and should not be used for hyperbole

miraculixx7 months ago

Look up Annie Altman.

replwoacause7 months ago

Are you saying this because of the diddling accusations or for some other reason?

miraculixx7 months ago

Look up Annie Altman. Be seated.

ben_w7 months ago

I think you're using the word "psychopath" when you're talking about something different, though I can't guess what.

Psychopathy is a personality disorder indicated by a pattern of lying, cunning, manipulating, glibness, exploiting, heedlessness, arrogance, delusions of grandeur, sexual promiscuity, low self-control, disregard for morality, lack of acceptance of responsibility, callousness, and lack of empathy and remorse.

(Which, now I read it, is disappointingly pattern matching the billionaire who invested in both OpenAI and also a BCI startup currently looking for human test subjects).

I can see arguments for either saying Altman has delusions of grandeur or lack of acceptance of responsibility depending on if you believe OpenAI is going too fast or if it's slowing things down unnecessarily, but they can't both be true at the same time.

kvee7 months ago

You may be right here.

However, there seems to be a decent amount of evidence that Sam has done exactly what you're talking about.

He manipulated and was "not consistently candid" with the board, he got all the OpenAI employees to support him in his power struggles, he made them afraid to stand up to him (https://x.com/tobyordoxford/status/1727631406178672993?s=20), he exhibited delusions of grandeur (though I guess they were correct) with pg, with a glint in his eye making clear to pg that he wanted to take over YC, and he did little things like making it seem that he was cool with Eliezer Yudkowsky with a photo op without really chatting with him, etc.

Again, I am not sure this perspective is necessarily right (and I may be convinced just because he's such an effective psychopath).

In any case, I think this is a pretty good explanation of this perspective: https://x.com/erikphoel/status/1731703696197599537?s=20

gwern7 months ago

> (Which, now I read it, is disappointingly pattern matching the billionaire who invested in both OpenAI and also a BCI startup currently looking for human test subjects).

Elon Musk actually matches several of those poorly, and matches bipolar disorder much better (most of those are also bipolar or billionaire symptoms, while psychopathy is inconsistent with many Musk symptoms like catatonia): https://gwern.net/note/musk

ben_w7 months ago

Thanks; I certainly hope that's closer to the truth.

(Since my original comment, I've remembered that even professionals in this field don't remote-diagnose people like I kinda did).

zer00eyz7 months ago

During that whole mess it seemed to slip out that MS has access to all the IP up to AGI. And that has a definition that might be "replacing people at work", so not passing the Turing test fully but close enough.

There are some problems that need to get cleared up for that to happen. The system needs to lose the cutoff date, and be a bit more deterministic while still functioning. The whole quagmire around copyright needs to get resolved. (Because it looks like the output of LLMs is immediately in the public domain)

If I worked at OpenAI, I would be looking for that contract and reading it myself. Because giving away all the IP on a half-assed runway where you have to get to AGI... doesn't sound like it ends in a massive payday. MS may have cleaned up its public image in recent years, or been displaced by things people hate and fear more. But there is this: https://foundation.mozilla.org/en/campaigns/microsoft-ai/ and the underlying ToS looks a lot like old-school M$ and shady dealings.

the__prestige7 months ago

TFA talks about a tender offer, which allows employees to sell their shares at almost a 3x valuation compared to earlier this year. This already is a "massive pay day".

zer00eyz7 months ago

The only time that 3x trade would be a good deal is if you think that's the best you're going to do. If you think you're going to be the next Amazon/Facebook/Google, then selling is foolish. The MS deal may limit or wreck that possibility.

scarface_747 months ago

In what world do you see OpenAI's *technology* being behind massively profitable products? What is their moat?

joe_the_user7 months ago

Your implication that OpenAI has "AGI" is unsupported and implausible imo.

LLMs are impressive and can increase productivity for certain workers in certain industries, yes, but please avoid reaching like that.

zer00eyz7 months ago

I don't imply they have it. I imply that their deal with MS has a definition that we would not call "AGI". One that has implications that may make cutting them (MS) off impossible.

From: https://openai.com/our-structure ...

"While our partnership with Microsoft includes a multibillion dollar investment, OpenAI remains an entirely independent company governed by the OpenAI Nonprofit. Microsoft has no board seat and no control. And, as explained above, AGI is explicitly carved out of all commercial and IP licensing agreements."

joe_the_user7 months ago

Sorry,

I see how your post could be read as you state but the simplest reading is what I said.

ausbah7 months ago

they didn’t want to leave because of OpenAI’s great compensation packages ($300k+)

I do think it is a little unfair to characterize MSFT as “slow and boring” when they’ve been the ones to make the fastest pivot to supporting generative AI as a product line

soulbadguy7 months ago

I think the deal was for employees to keep their OpenAI compensation even after moving to MSFT

jandrese7 months ago

Even if they did you know the annual performance review would go something like:

"You did a great job this year and met or exceeded all of the metrics we measured, but your compa ratio is just too high so instead of a raise we are going to give you a lump sum instead."

Big company culture vs. startup culture is a known issue. People choose to work for startups to avoid that big company culture, so if a big company buys you out then it's time to move on.

scarface_747 months ago

How is that culture at OpenAi working out?

gwern7 months ago

OP discusses this and how it was a hollow promise. Even if it was kept in the full spirit of the nonbinding verbal agreement (which would utterly infuriate MS staff and demoralize them), it would be a bad deal to swap ultra-hot private OA PPUs for the same nominal (but low-growth) amount of MS stock and then have to work at MS.

andy997 months ago

These are all people who could make good money anywhere. Few are presumably there solely because it's the best-paying job. Part of it is certainly identity, OpenAI or $hot_startup sounds way cooler than Microsoft to a lot of people. And part would be wanting to work at a startup and not a legacy SaaS company and all the baggage that entails. Whatever carve-out they were going to get, there's no way you'd be as unconstrained working at MS as you would at OpenAI. It's precisely because the money isn't that important that a lot would have probably bailed if they all were absorbed into Microsoft.

boringg7 months ago

Have you looked at MSFT corporate record and the current state of their bread and butter products?

gweinberg7 months ago

How does crap like this get published? Not a shred of evidence is given for its assertions, and they sound pretty preposterous. Nobody actually wanted to go to Microsoft, and they didn't even think Altman was that great as a CEO. So why did they all sign the letter threatening to quit? Mysterious unspecified "pressure". Why did Microsoft claim it was willing to hire the OpenAI team at their current compensation levels, pissing off their own employees, if it really wasn't? Umm, no reason.

ghaff7 months ago

>So why did they all sign the letter threatening to quit?

I probably wouldn't in an employment situation where it could come back to bite me. But lots of people sign petitions, virtually or otherwise, in the heat of the moment because of emotions, or because it's just the path of least resistance. And "at current compensation levels" wasn't a contractual promise and would probably have had plenty of strings attached.

gwern7 months ago

> and they didn't even think Altman was that great as a CEO.

There are plenty of people who think that. Even the OA executives apparently weren't nearly as enthused with Altman as all those social-media hearts might lead one to assume. See the Time article yesterday: https://news.ycombinator.com/item?id=38550240 specifically

    The board expected pressure from investors and media. But they misjudged the scale of the blowback from within the company, in part because they had reason to believe the executive team would respond differently, according to two people familiar with the board’s thinking, who say the board’s move to oust Altman was informed by senior OpenAI leaders, who had approached them with a variety of concerns about Altman’s behavior and its effect on the company’s culture.
mschuster917 months ago

> How does crap like this get published?

That is easy to answer: BI belongs to the infamous German media conglomerate Axel Springer [1], hosting one of Europe's most vile, disgusting and scandal-ridden [2][3] tabloids called "BILD" and its barely veiled sister publication "WELT".

[1] https://en.wikipedia.org/wiki/Business_Insider

[2] https://www.swr.de/swr2/wissen/70-jahre-bild-zeitung-zwische...

[3] https://de.statista.com/infografik/2588/publikationen-mit-de...

sgift7 months ago

Lol, really? BI is Axel Springer .. wow. And I always wondered why I felt iffy about their articles. That makes it clear. Into the "ignore this garbage" list they go.

badrabbit7 months ago

You know, I was just thinking how if I were a Google exec I'd poach OpenAI and/or attempt to gain some controlling shares of the org by throwing money at Sam/the board.

VirusNewbie7 months ago

"No one wanted to go to Microsoft." This person called the company "the biggest and slowest" of all the major tech companies — the exact opposite of how OpenAI employees see their startup.

Lol. I was wondering about this...

dataangel7 months ago

They already work for MS; their paychecks come out of the $10B it gave them.

autotune7 months ago

Billionaires care very little about where their employees want to work.

zombiwoof7 months ago

nothing says messed up more than OpenAI starting as a non-profit, via Elon Musk, and ending up in the hands of the monopolistic (guilty as charged) Microsoft, who would love to have us all go back to bowing down to their empire of shit.

sylware7 months ago

... and after the latest openai episode, we now know that msft was pulling the strings in the "shadows" (the "nah! I am just supporting without doing anything...").

It is not a conspiracy theory to think the same thing is happening elsewhere.

And there is so much critical open source stuff on github... that said, github is at least noscript/basic (x)html friendly for its core functions.

chollida17 months ago

> ... and after the latest openai episode, we now know that msft was pulling the strings in the "shadows" (the "nah! I am just supporting without doing anything..."

Do we?

What evidence is there that Microsoft was "pulling the strings in the shadows"?

As far as we know Microsoft only found out a minute before Sam did that he was being fired.

rossdavidh7 months ago

Well, we found out that Microsoft is able to reverse essentially any decision they feel strongly about, which is essentially what "pulling the strings" means, in common usage.

chollida17 months ago

> Well, we found out that Microsoft is able to reverse essentially any decision they feel strongly about, which is essentially what "pulling the strings" means, in common usage.

What specific decision did Microsoft reverse?

We already know that they had no say in Sam's firing or any specific pull in his rehiring.

Or is there any proof we have that Microsoft forced the reversal of Sam's firing?

whatshisface7 months ago

Microsoft is a trillion-dollar company; they could hire away everyone at a McDonald's restaurant and buy the location if they didn't like how it was being run. They arguably have less power over OpenAI than they do over the average startup because they could probably buy any given startup outright, but they ended up with nonvoting shares in OpenAI.

sylware7 months ago

Yep, kind of obvious. And it is reasonable to think this is only the tip of the iceberg.

schemescape7 months ago

Sadly, viewing source code no longer appears to work without JavaScript…

Edit: tested in w3m

sylware7 months ago

Indeed I have to git clone.

Well, it is getting worse and worse.

Racing04617 months ago

Everyone already knew that. It was just to get sam on the board with as little chaos as possible.

catchnear43217 months ago

> We all left these big corporations to move fast and build exciting things…

sama found a flock. this will go poorly.

port5157 months ago

Microsoft will be the leader in AI with or without OpenAI. Mark my words, you can take that to the bank. Come back to this post in 5 years, you'll see I predicted the future.

miraculixx7 months ago

Apple more likely. Or some other co we don't know about yet. OpenAI is going to crumble. Microsoft will be challenged for trust. We'll see what happens.

replwoacause7 months ago

Apple? They seem to be asleep at the wheel as far as AI goes. I thought their focus was hardware. Siri is a piece of junk.

catchnear43217 months ago

apple only has one focus. ecosystem. that is the reason for the slow movements and the appearance of sleep. different scale. different game.