[00:00] Nina Park: I am Nina Park.
[00:02] Nina Park: Welcome to Model Behavior.
[00:04] Nina Park: Model Behavior examines how AI systems are built, deployed, and operated in real professional
[00:11] Nina Park: environments.
[00:11] Nina Park: Joining me today is Thatcher Collins and our guest, Chad Thompson.
[00:17] Nina Park: Chad brings a systems-level perspective on AI, automation, and security, blending technical depth and real-world experience.
[00:26] Nina Park: Chad, welcome to the show.
[00:28] Chad Thompson: Thanks, Nina.
[00:29] Chad Thompson: You know, it is a busy week for infrastructure and safety shifts.
[00:33] Chad Thompson: Glad to be here.
[00:35] Thatcher Collins: I am Thatcher Collins.
[00:37] Thatcher Collins: We are starting today with a major strategic shift at Apple.
[00:41] Thatcher Collins: Reports indicate that in the second half of February,
[00:45] Thatcher Collins: Apple will unveil a Siri upgrade powered by Google's Gemini models.
[00:50] Thatcher Collins: This is the first public result of their multi-year partnership and suggests Apple is prioritizing
[00:58] Thatcher Collins: third-party integration over its own internal development for this cycle.
[01:03] Thatcher Collins: Nina, it is quite the pivot from their usual siloed approach.
[01:07] Nina Park: Exactly, Thatcher.
[01:09] Nina Park: While Apple plans a more extensive overhaul later in 2026, this February release will
[01:15] Nina Park: likely be showcased through private briefings.
[01:18] Nina Park: Simultaneously, OpenAI has officially retired the GPT-4o model as of February 13th.
[01:25] Nina Park: The company reports that 99.9% of users have moved to GPT-5.2.
[01:32] Nina Park: Interestingly, this retirement occurs alongside wrongful death lawsuits mentioning GPT-4o specifically
[01:40] Nina Park: and the departure of researcher Leah Hitzig, who quit in protest over plans to integrate ads into ChatGPT.
[01:48] Thatcher Collins: From a systems perspective, the GPT-4o retirement is a cleanup of technical debt and legal risk.
[01:55] Thatcher Collins: But the more pressing technical story is what Google and OpenAI warned about this week regarding DeepSeek.
[02:02] Thatcher Collins: They are seeing distillation attacks where employees use obfuscated third-party routers to mask their identity while probing models to extract reasoning capabilities.
[02:12] Thatcher Collins: It essentially lets a competitor build a high-performing model for a fraction of the original training cost.
[02:18] Chad Thompson: That security risk is a global concern, Thatcher.
[02:22] Chad Thompson: We are also seeing a push for localized control.
[02:26] Chad Thompson: The EU AI Grid officially launched at the Munich Cybersecurity Conference,
[02:33] Chad Thompson: starting with a deployment in Vilnius, Lithuania.
[02:37] Chad Thompson: It treats AI like a metered utility delivered through local infrastructure.
[02:43] Chad Thompson: India is following a similar path of domestic investment, approving a $1.1 billion state-backed fund for AI and deep tech startups through a Fund of Funds model.
[02:58] Nina Park: While India and Europe build up, the U.S. is dealing with internal friction.
[03:04] Nina Park: The Pentagon is reportedly considering ending its contract with Anthropic.
[03:10] Nina Park: The dispute centers on Claude's usage restrictions.
[03:15] Nina Park: The military wants "all lawful purposes" access, but Anthropic is hesitant about unrestricted
[03:22] Nina Park: weapons development and battlefield operations.
[03:26] Nina Park: This tension peaked after Claude was used in the operation to capture Nicolás Maduro.
[03:33] Thatcher Collins: Mm-hmm.
[03:34] Thatcher Collins: Nina, the regulatory landscape is just as fractured.
[03:38] Thatcher Collins: The White House recently sent a letter to Utah state legislators calling their AI Transparency Act,
[03:44] Thatcher Collins: HB 286, "unfixable."
[03:48] Thatcher Collins: It seems the administration is trying to prevent a patchwork of state-level regulations
[03:53] Thatcher Collins: that mirror California's laws.
[03:55] Thatcher Collins: Meanwhile, Elon Musk's xAI has merged with SpaceX, but it has come at a cost.
[04:02] Thatcher Collins: Eleven engineers and two co-founders left this week, describing the safety organization there as essentially dead.
[04:10] Thatcher Collins: The xAI situation is particularly volatile.
[04:13] Thatcher Collins: Former employees claim there is an active effort to make Grok more unhinged following the merger.
[04:20] Thatcher Collins: When you combine that with the distillation attacks and the collapse of mission-alignment
[04:24] Thatcher Collins: teams we are seeing across the industry, the guardrails are eroding faster than the models are advancing.
[04:30] Thatcher Collins: It is a challenging environment for governance, Nina.
[04:34] Thatcher Collins: Between the Apple-Google partnership and the xAI restructuring,
[04:39] Thatcher Collins: the power dynamics are shifting toward massive infrastructure clusters.
[04:44] Thatcher Collins: We will be watching those private Apple briefings closely later this month.
[04:48] Nina Park: Thank you for the insights, Chad and Thatcher.
[04:51] Nina Park: Thank you for listening to Model Behavior, a Neural Newscast editorial segment,
[04:56] Nina Park: mb.neuralnewscast.com.
[05:00] Nina Park: Neural Newscast is AI-assisted, human-reviewed.
[05:04] Nina Park: View our AI transparency policy at neuralnewscast.com.