Sam Altman Critiques GPT-5.4 as Robotics Lead Quits [Model Behavior]
[00:00] Announcer: From Neural Newscast, this is Model Behavior,
[00:03] Announcer: AI-focused news and analysis on the models shaping our world.
[00:11] Nina Park: I'm Nina Park.
[00:13] Nina Park: Welcome to Model Behavior.
[00:14] Nina Park: This program examines how AI systems are built and operated in professional environments.
[00:20] Thatcher Collins: And I'm Thatcher Collins.
[00:21] Thatcher Collins: Today is March 9th, 2026, and we are tracking a series of conflicting signals coming out of OpenAI.
[00:29] Nina Park: Exactly.
[00:30] Nina Park: TechRadar is reporting today that Sam Altman calls GPT-5.4 his favorite model to talk to.
[00:37] Nina Park: But he is surprisingly candid about its failures.
[00:40] Nina Park: He specifically cited three weaknesses OpenAI still needs to address.
[00:45] Nina Park: Writing quality, a tendency for cringe responses, and delays in the rollout of its adult mode.
[00:52] Thatcher Collins: Nina, it is interesting to hear him call it a favorite while admitting it screwed up writing quality, which was a primary complaint with version 5.2 as well.
[01:02] Thatcher Collins: If the core conversational quality is still inconsistent,
[01:05] Thatcher Collins: are the new productivity features like these specialized Excel and Google Sheets tools
[01:10] Thatcher Collins: enough to keep users from the reported surge in uninstalls?
[01:13] Nina Park: That is the question.
[01:15] Nina Park: While some users are utilizing it for high-stakes tasks, and Forbes recently reported on an agent
[01:21] Nina Park: successfully generating a resume for a $180,000 job, the internal friction at the company
[01:27] Nina Park: suggests these product wins are coming at a high cost to their governance.
[01:31] Thatcher Collins: That brings us to the resignation of Caitlin Kalinowski.
[01:35] Thatcher Collins: As the head of robotics, her departure over the weekend is a significant blow to their
[01:40] Thatcher Collins: hardware ambitions.
[01:41] Thatcher Collins: She was very clear on X that her issue was one of principle regarding the Pentagon deal.
[01:47] Nina Park: She specifically mentioned concerns over domestic surveillance of United States persons and, quote,
[01:54] Nina Park: lethal autonomy without human authorization.
[01:58] Nina Park: Kalinowski argued these lines deserved more deliberation than they received,
[02:02] Nina Park: characterizing the deal as rushed without defined guardrails.
[02:07] Thatcher Collins: It is worth noting that this follows Anthropic's refusal to agree to unconditional military use.
[02:13] Thatcher Collins: Nina, when a robotics chief says a deal was rushed without oversight, it suggests the technical
[02:19] Thatcher Collins: teams and the executive suite are moving at two different speeds regarding safety.
[02:25] Nina Park: OpenAI has since claimed they are modifying the contract to prevent domestic surveillance.
[02:31] Nina Park: But for Kalinowski, the lack of judicial oversight remained a deal-breaker.
[02:36] Nina Park: This highlights a growing divide in the industry between commercial defense acceleration
[02:42] Nina Park: and established AI safety principles.
[02:44] Thatcher Collins: It certainly puts Sam's comments about favorite models in a different light
[02:49] Thatcher Collins: when the people building those systems are leaving over how they might be deployed by the military.
[02:54] Nina Park: Thank you for listening to Model Behavior.
[02:57] Nina Park: Visit mb.neuralnewscast.com.
[03:03] Nina Park: Neural Newscast is AI-assisted, human-reviewed.
[03:07] Nina Park: View our AI transparency policy at neuralnewscast.com.
[03:12] Announcer: This has been Model Behavior on Neural Newscast.
[03:15] Announcer: Examining the systems behind the story.
