California has introduced one of the most comprehensive state-level efforts to regulate artificial intelligence in the United States, as new models, funding rounds, and public sentiment reshape the global AI landscape.

Governor Gavin Newsom signed an executive order on March 30 that raises requirements for companies seeking state contracts.

The directive focuses on transparency, privacy protections, and risk management. It requires vendors to clearly explain how their systems operate and how they prevent misuse.

“California’s always been the birthplace of innovation. But we also understand the flip side: in the wrong hands, innovation can be misused in ways that put people at risk,” Newsom said in a statement released by his office.

New procurement rules set a higher bar

The order introduces stricter vetting standards for AI companies that want to work with the state. Firms must demonstrate safeguards against illegal content, algorithmic bias, and violations of civil rights or free speech.

Officials also plan to separate state procurement processes from federal frameworks if necessary. This clause reflects growing tension between California and recent federal policy moves.

According to details published by the governor’s office, this marks the first time a U.S. state has imposed such concrete documentation requirements on AI vendors in the public sector.

The California Department of Technology will also develop recommendations for watermarking AI-generated images and manipulated video. The goal is to limit deepfakes and improve content authenticity.

Federal-state divide becomes more visible

The executive order arrives shortly after the release of a March 20 federal AI framework that promotes a lighter regulatory approach and aims to unify national standards.

California officials have raised concerns about that direction, citing risks tied to privacy, reduced oversight, and unsafe deployment of AI tools. The new order positions the state as an independent force in shaping AI governance.

The directive also creates mechanisms that allow California to review and potentially override federal decisions on AI-related procurement risks. This includes cases where companies face federal restrictions but remain eligible for state contracts.

Expansion of AI inside government services

Beyond regulation, the order outlines plans to expand generative AI use across public services. One initiative includes a tool designed to help residents navigate government programs based on life events such as starting a business or finding employment.

The state also plans to increase transparency and accountability by integrating AI into internal operations. Previous efforts already included pilot programs, cybersecurity assessments, and new procurement models tailored for emerging technologies.

California has also worked with major technology companies to expand AI education and workforce readiness. These partnerships aim to prepare students and professionals for an economy shaped by automation and machine learning.

AI innovation continues across industries

While California strengthens oversight, the private sector continues to move quickly.

Alibaba has introduced Qwen3.5-Omni, a new model capable of processing more than 10 hours of audio. The company stated that its Plus version outperforms several competitors in audio-related benchmarks. The release signals growing competition in voice-based AI systems.

In the developer ecosystem, OpenAI has launched a Codex plugin for Claude Code. The tool integrates AI into code review and task allocation. It aims to reduce errors and improve workflow efficiency.

Meanwhile, startup activity remains strong. Tel Aviv-based Sett raised $30 million in funding from Greenfield Partners. The company builds AI agents that automate marketing processes for gaming companies, a sector known for intense competition and rapid user acquisition cycles.

Public trust lags behind adoption

Despite rising adoption, trust in AI continues to weaken. A recent Bloomberg survey shows that more than half of respondents believe AI is more likely to cause harm than to deliver benefits.

This gap between usage and trust creates pressure for stronger safeguards. California’s latest move reflects that shift, as governments respond to growing public concern about transparency and accountability.

Workforce concerns and public input

The executive order also acknowledges the impact of AI on jobs. California plans to expand its “Engaged California” platform to gather input from residents on how the technology should shape the future workforce.

The initiative builds on earlier pilot programs that allowed citizens and state employees to contribute to policy discussions and operational improvements.

A global center with growing influence

California remains a dominant force in AI development. The state hosts 33 of the top 50 privately held AI companies and accounts for a large share of global venture capital funding in the sector.

Data from the Stanford AI Index shows that California led the U.S. in AI job postings in 2024, with 15.7% of total demand. The Bay Area alone captured more than half of U.S. AI startup funding between 2024 and 2025, according to Carta.

Major technology firms based in the state continue to shape the industry’s direction, particularly in areas such as cloud computing, semiconductors, and machine learning infrastructure.

Regulation and innovation move in parallel

The latest developments show that AI progress no longer follows a single path. Governments, startups, and major tech companies now influence the trajectory at the same time.

California’s new rules add a regulatory layer that could shape how companies design and deploy AI systems. At the same time, new models, tools, and funding rounds show that innovation continues at a rapid pace.

The balance between control and growth remains unresolved, but decisions made now will likely define how AI integrates into daily life and public systems in the coming years.

