The Hidden Price Tag of California's AI Oversight Bill
The topline of my LLM-derived analysis of AB 1018: individual firms could face $2 million to $6 million in regulatory burdens over a decade.
The California State Legislature is rushing to pass bills before the session adjourns on September 12, 2025. One bill I have been closely tracking is AB 1018, Automated Decision Systems, which would regulate all automated decision systems that make consequential decisions. The philosophy underlying this bill and similar ones is that mandating human oversight automatically produces better outcomes, regardless of cost or context.
But AB 1018's definitions cast such a wide net that they would regulate virtually any computational process used in business operations. Even California's State Water Board warned that Excel workbooks could trigger regulatory requirements.
Using four large language models to estimate compliance costs, I found that individual firms could face between $2 million and $6 million over a decade. Even assuming only 5 percent of businesses comply, the total ten-year cost for the entire economy could reach the billions to low trillions of dollars. To be fair, these estimates vary widely, suggesting substantial uncertainty, and the economy-wide numbers are constructed in a way that likely counts more companies than would actually be regulated.
Still, the compliance cost estimates for AB 1018 reveal a broader pattern emerging across proposed AI regulation: there is an enormous hidden price tag to mandating human oversight in automated decision-making systems. While human oversight in high-stakes decisions has clear value, at what point do the costs of mandating it across entire sectors outweigh the benefits? The rough estimates for AB 1018 suggest we may already be approaching that threshold, and they serve as a cautionary tale for lawmakers considering similar sweeping mandates elsewhere.
AB 1018 Automated Decision Systems
In May, the California Privacy Protection Agency (CPPA) finalized regulations addressing the use of automated decision-making technology (ADMT) under authority granted by the California Consumer Privacy Act (CCPA). Now, the California Legislature is weighing AB 1018, which would significantly expand this regulatory framework by addressing a broad spectrum of covered automated decision systems (ADS).
Under AB 1018, any process "derived from machine learning, statistical modeling, data analytics, or artificial intelligence that issues simplified output, including a score, classification, or recommendation, that is designed or used to assist or replace human discretionary decisionmaking and materially impacts natural persons" would face regulatory oversight.
AB 1018 casts a notably wider net than the CCPA regulations in defining consequential decisions. It would apply to any decision that impacts the cost, terms, quality, or accessibility of employment-related decisions; education and vocational training; housing and lodging; utilities; family planning, adoption services, and reproductive services; health care and health insurance; financial services; the criminal justice system; legal services; arbitration; mediation; elections; access to government benefits or services; places of public accommodation; insurance; and internet and telecommunications access.
The proposed rules distinguish between the developer (the entity that produces an automated decision system) and the deployer (the entity that uses a covered ADS to make or facilitate a consequential decision).
To be compliant, developers will need to:
Conduct impact assessments before deployment and annually thereafter;
Provide detailed documentation to deployers about system capabilities and limitations;
Submit to third-party audits (starting in 2030); and
Designate compliance officers.
For deployers, the rules would require:
Transparency: Provide clear disclosures to people subject to automated decisions, explaining what system is being used and how;
Opt-out rights: Allow people to request human decision-making instead (with some exceptions for financial services and medical emergencies);
Appeal process: Give people the right to correct errors in their data and appeal adverse decisions; and
Audits: Large-scale deployers must conduct regular third-party audits.
Government Estimates of the Cost of AB 1018
Whatever human-centered benefits AB 1018 might deliver, the price tag will be staggering. California's fiscal analysis shows the bill would impose significant costs on nearly all state agencies: "In the aggregate, for all affected state agencies, costs may be in the hundreds of millions of dollars annually ongoing." These estimates reflect substantial financial impacts across virtually every sector of state government, from individual departments facing millions in compliance costs to the judicial system potentially requiring hundreds of millions annually.
For example:
The State Controller’s Office (SCO) estimated compliance at $3,553,000 from 2025 to 2028, which would include “5–7 information security staff—including a dedicated supervisor—for the initial implementation of AB 1018, with 3–4 (including the supervisor) required for ongoing security and compliance operations related to automated decision systems.”
The Department of Justice (DOJ) estimates a fiscal impact of $2.5 million or less.
The Civil Rights Department (CRD) estimates a fiscal impact over $2 million.
The State Water Board estimates a significant fiscal impact (General Fund, special funds), likely in the millions of dollars per year. The Board reports that, as written, the bill's definition of automated decision system is vague, ambiguous, and could encompass many tools currently in use, like Excel workbooks. These tools are used broadly across Water Boards programs, and many inform actions that could be considered consequential actions under the bill.
“UC Health estimates a fiscal impact of $42 million in one-time costs and approximately $24 million in ongoing costs to implement AB 1018.”
“The California Department of Health and Human Services (CalHHS) reports a significant fiscal impact (General Fund, special funds)...in the low millions up to low tens of millions (Total Fund) annually as solutions applicable under the bill are expected to grow in use over the coming years.”
“Unknown, potentially significant costs to the Judicial Council, likely in the hundreds of millions annually.”
AB 1018 extends far beyond government. It covers countless industries: banks and credit unions; mortgage companies and brokers; insurance companies; investment firms and financial advisors; credit card companies; Software-as-a-Service (SaaS) providers; social media platforms; e-commerce platforms; HR technology vendors; educational technology companies; hospitals and health systems; health insurance companies; medical device manufacturers with AI components; dental and vision care providers; staffing and recruiting agencies; gig economy platforms (Uber, DoorDash, etc.); companies using AI for hiring, performance reviews, or scheduling; property management companies; real estate platforms and brokerages; home appraisal companies; public housing authorities; educational institutions (universities, colleges, K-12 schools); student loan servicers; educational assessment companies; electric utilities; water companies; telecommunications providers; waste management companies; law firms using AI tools; legal technology companies; arbitration services; mediation services; online marketplaces; retailers using AI for pricing or recommendations; hotels, restaurants, and entertainment venues; government contractors; and benefits administration companies.
The sheer breadth of industries that would fall under AB 1018's umbrella creates a unique challenge for policymakers: How do you estimate the economic impact of a bill that could affect virtually every sector of the economy?
AB 1018 Cost Estimates
I've been experimenting with large language models (LLMs) to estimate AI bill compliance costs. In March, I published “How much might AI legislation cost in the U.S.?,” which explained the pluses and minuses of this method. What was surprising is that the LLMs were usually close on first-year estimates, but they tended to predict much higher ongoing annual costs, suggesting that ongoing burdens may be systematically underestimated.
In June, I published a piece in City Journal on New York's RAISE Act, applying the same method. As I wrote,
I asked the leading LLMs to read the RAISE Act and estimate the hours needed to comply with the law in the first year and in every year after that for a frontier model company. The results, displayed in the table below, suggest that initial compliance might fall between 1,070 and 2,810 hours—effectively requiring a full-time employee. For subsequent years, however, the ongoing burden was projected to be substantially lower across all models, ranging from 280 to 1,600 hours annually.
The wide range in estimates underscores the fundamental uncertainty with the RAISE Act and other similar bills. The fact that sophisticated AI models are not converging on consistent compliance costs suggests just how unpredictable this legislation could prove in practice. The market is moving quickly. We need laws that prioritize effective risk mitigation over regulatory theater.
During a hackathon in May, I coded up a first version of a prompt script, which I plan to iterate on. It is built on persona prompting, where you direct the LLM to take on a role and then answer questions from that perspective. I set up the script to vary the personas across different industries, resources, legal teams, and levels of familiarity with the law.
The benefit of this kind of scripting is that you can run many estimates simultaneously and then summarize the results. In the future, I am going to match these personas with what we know about the market to create more accurate predictions. The code is still messy, but I intend to come back to it when I work on the full paper this fall.
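To make the idea concrete, here is a stripped-down sketch of what persona prompting looks like in code. The persona fields, prompt wording, and names below are illustrative placeholders, not my actual script:

```python
import itertools

# Illustrative persona dimensions; the real script varies more fields than this.
INDUSTRIES = ["community bank", "hospital system", "HR technology vendor"]
LEGAL_RESOURCES = ["one part-time outside counsel", "a dedicated compliance department"]
FAMILIARITY = ["has never read AB 1018", "already tracks California AI bills"]

PROMPT_TEMPLATE = (
    "You are the compliance lead at a {industry} in California with {legal}. "
    "Your organization {familiarity}. Read the attached text of AB 1018 and estimate "
    "the labor hours needed to comply in the first year and in each year after that, "
    "broken down by task."
)

def build_prompts() -> list[str]:
    """Generate one prompt per persona so many estimates can be run and summarized in batch."""
    return [
        PROMPT_TEMPLATE.format(industry=i, legal=l, familiarity=f)
        for i, l, f in itertools.product(INDUSTRIES, LEGAL_RESOURCES, FAMILIARITY)
    ]

if __name__ == "__main__":
    prompts = build_prompts()
    print(f"{len(prompts)} personas generated")
    print(prompts[0])
```

Each generated prompt is then sent to a model and the returned hour estimates are logged for summary statistics.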
To continue this work, I thought I would push the LLMs further in my analysis of AB 1018. What is especially interesting, and worrying, about this law is that it will apply to so many businesses. Could LLMs help figure out the scope of the law?
I followed the typical method for running a cost calculation in public policy. First, you estimate the hours of compliance and multiply them by market labor rates to calculate an economic cost for a firm. Then you multiply that figure by the number of impacted businesses. Because counting impacted businesses tends to be a blunt measure, I used a second set of scripts to estimate which industries are likely to be affected by the law. All of the data can be found in this spreadsheet.
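In rough notation (mine, introduced here only for clarity), the standard calculation looks like this:

$$
\text{Firm cost} = \text{Compliance hours} \times \text{Blended wage rate}, \qquad
\text{Total cost} = \sum_{i} \text{Firm cost}_i \times N_i
$$

where N_i is the number of impacted businesses in industry i. The rest of this section walks through each piece: the hours, the wage rates, the discounting, and finally the business counts.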
I tested four models (Claude, ChatGPT, Gemini, and Grok) using the July 17, 2025, version of AB 1018. Complete chat transcripts are available here, with notes where calculations were incorrect. The consolidated labor-hour estimates from all four LLMs are summarized in the table below:
A couple of items worth noting from these estimates:
ChatGPT had low to moderate estimates across the board. Its developer costs jump in 2030 when audits begin but still stay below Claude's and Grok's numbers. Like Claude and Gemini, ChatGPT estimates that deployers are less burdened than developers.
Claude had the highest estimates among all LLMs, projecting substantial compliance burdens for both developers and deployers. Developers especially face steep increases after audits begin in 2030.
Gemini has the lowest estimates across all categories; they are less than half of ChatGPT's and a fraction of Claude's.
Grok originally estimated costs based on two ADSs, so these numbers were adjusted to just one ADS. Given this change, Grok provides middle-of-the-road compliance costs for developers but the highest burden for deployers. Grok stands out by flipping the assumption: it estimates that deployers will face heavier costs than developers, especially on an ongoing basis.
It is worth noting that, without prompting, each of the LLMs broke out developers' labor costs into three buckets: the first year of compliance, an annual cost from 2027 to 2030, and an annual cost from 2030 onward, when third-party audits are slated to begin. However, Gemini didn't give an estimate for 2030 onward, saying, “This will add significant hours and cost, involving contracting with an auditor, providing necessary information (with redactions for trade secrets), and making a high-level summary of feedback publicly available. The time commitment for this is highly variable.”
Taking it one step further, I asked the LLMs to translate the compliance hours into costs using typical wage rates, allowing each model to select its own rates so I could see how they would vary. Each LLM chose a slightly different method:
Gemini assumed a standard 50/50 blended rate: “A blended rate of $150 to $250 per hour is used to account for the mix of senior technical and legal staff required for these tasks.”
ChatGPT varied this blended ratio: “Legal/Compliance staff: $150/hr; Technical staff (engineers, data scientists, IT): $100/hr; Roughly 60% of work = technical, 40% = legal/compliance (developers skew a bit more technical, deployers a bit more compliance-heavy).”
Claude offered a range of wage assumptions, but didn’t explain which ratio it selected: “Senior Legal/Compliance Officer: $150-200/hour; Mid-level Compliance Analyst: $75-100/hour; Technical Documentation Specialist: $60-80/hour; Administrative/Coordination: $40-60/hour; Blended average for compliance work: ~$85-110/hour”
Grok had the clearest breakdown: “~40% data scientist ($80/hour for evaluations, ~216-356 hours), ~30% compliance officer ($70/hour for oversight/setup, ~162-267 hours), ~20% legal ($115/hour for reviews, ~108-178 hours), ~10% admin ($35/hour for records, ~54-89 hours).” Still, these components don't add up to the $70/hr and $65/hr rates used in Grok's final cost calculation.
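A quick weighted average of Grok's own breakdown confirms the mismatch:

$$
0.4(\$80) + 0.3(\$70) + 0.2(\$115) + 0.1(\$35) = \$79.50 \text{ per hour}
$$

which is noticeably above the $70/hr and $65/hr figures it ultimately applied.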
The variation in wage rates is charted below.
Combining the labor hours with the wage rates yielded the estimated first-year and ongoing annual costs, which are displayed below.
Looking at these results, some trends emerge:
ChatGPT produced moderate estimates overall. Still, the estimates suggest developers face higher burdens than deployers, especially post-2030.
Claude offers the highest potential cost; its first-year developer estimate could reach $132,000. Both developers and deployers face very large costs, with developers' post-2030 costs nearly doubling their annual burden.
Gemini shows the widest spreads, with first-year developer costs ranging from $27K to $105K. It consistently gives the lowest ongoing annual costs, especially for deployers. As in the labor-hour analysis, Gemini makes compliance look far less expensive in the long term.
Grok sits in the middle-to-low range for developers, but suggests costs will be higher for deployers. It reinforces Grok’s distinctive assumption: Deployers, not developers, bear the heavier financial load.
From here, I projected these costs over a ten-year period and applied standard economic discounting methods, which I explained back in March:
Similar to discounted cash flow (DCF) analysis, we can think of regulation as a stream of future costs that a regulation is expected to generate over the 10 year period. By summing them all up and then discounting them back to present value using a discount rate that reflects the costs’ long-term time horizon, we can estimate the current market value of the regulation.
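In formula terms (my notation, not the bill's or the earlier piece's), the ten-year discounted regulatory cost for a single firm is:

$$
\mathrm{DRC} = \sum_{t=1}^{10} \frac{C_t}{(1+r)^t}
$$

where C_t is the estimated compliance cost in year t (the first-year setup cost, then the 2027–2030 annual cost, then the higher post-2030 cost once third-party audits begin) and r is a discount rate chosen to reflect the costs' long-term time horizon.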
Estimates of ten-year discounted regulatory costs (DRC) are collected below for each regulation type and LLM:
These ten-year totals, which are common in cost-benefit analysis, represent the present value of sustained compliance costs that firms would need to factor into their long-term business planning and automated-system adoption decisions. The estimates span a 13.4-fold difference, from Gemini's most conservative deployer projection of $620,000 to Claude's highest developer estimate of $8.3 million. Most projections cluster in the $2 million to $6 million range for the decade-long compliance period.
Still, the substantial range in estimates reflects the fundamental uncertainty about how AB 1018's broad definitions would be interpreted and implemented in practice.
NAICS Matching and Discounted Regulatory Costs
To translate the LLM-generated industry classifications into quantifiable economic impacts, I developed a systematic approach to match potentially affected industries with California's official industry statistics.
I asked all of the LLMs to create a list of businesses that would likely be regulated by the bill and then had them match those guesses to the corresponding NAICS codes. The responses from ChatGPT, Claude, Grok, and Gemini are collected in this Google Doc.
Each LLM produced different approaches to industry identification. ChatGPT generated 65 business classifications, Claude produced 92, Gemini identified 19, and Grok classified 30 distinct business types. These classifications varied significantly in their specificity. Some LLMs provided detailed 6-digit NAICS codes (like 325412 for pharmaceutical companies), while others used broader industry ranges (like Gemini's "31-33" for manufacturing).
The fundamental problem lies in matching all of these classifications to the limited California data. The most up-to-date public business information for California is published quarterly, on a delay. More importantly, these numbers don't offer the specificity needed to match businesses to detailed industry classifications. So I used a hierarchical matching function to place each classification in the most specific bin possible, while retaining whether the businesses would face deployer, developer, or both sets of regulations. The code systematically mapped the LLM-generated regulatory predictions to official state business statistics. This approach successfully matched 96 of the 98 LLM classifications (a 98 percent match rate) to California establishment data. Match data is located here.
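The matching logic itself is simple. Here is a simplified sketch of the hierarchical fallback idea; the function name, data layout, and counts are illustrative, not the actual script:

```python
def match_establishments(naics_code: str, ca_counts: dict[str, int]) -> tuple[str, int] | None:
    """Match an LLM-supplied NAICS code to California establishment counts,
    trying the most specific prefix first and falling back to broader ones."""
    code = naics_code.strip()
    # Range-style sector codes like "31-33" (manufacturing) use the first sector
    # in the range (a simplification).
    if "-" in code:
        code = code.split("-")[0]
    # Walk from the full code down to the 2-digit sector level.
    for length in range(len(code), 1, -1):
        prefix = code[:length]
        if prefix in ca_counts:
            return prefix, ca_counts[prefix]
    return None  # unmatched; only 2 of the 98 classifications fell through in my run

# Illustrative counts keyed by NAICS prefix (not real California figures).
ca_counts = {"5241": 1200, "52": 45000, "31": 30000}
print(match_establishments("524126", ca_counts))  # -> ('5241', 1200)
print(match_establishments("31-33", ca_counts))   # -> ('31', 30000)
```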
Finally, I took the per-firm cost estimates from the section above and multiplied them by the number of businesses in each category, accounting for the regulatory type (developer, deployer, or both). If an industry faced both types of regulation, I simply added the two costs together. The total discounted regulatory costs are collected below.
I also reran the estimates assuming that only 5 percent of companies will need to comply. It is a blunt adjustment that I intend to refine in future work. But even with that low number, as the table below shows, the costs may well be in the trillions.
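For concreteness, the aggregation step amounts to a few lines of arithmetic. The sketch below uses placeholder per-firm costs and counts, not the actual spreadsheet values, with a single compliance-rate scalar standing in for the 5 percent adjustment:

```python
# Placeholder per-firm ten-year discounted regulatory costs (illustrative, not the
# actual LLM-derived figures from the spreadsheet).
DRC_DEVELOPER = 4_000_000
DRC_DEPLOYER = 2_500_000

def industry_total(n_firms: int, role: str, compliance_rate: float = 1.0) -> float:
    """Total discounted cost for one matched industry category."""
    per_firm = {
        "developer": DRC_DEVELOPER,
        "deployer": DRC_DEPLOYER,
        "both": DRC_DEVELOPER + DRC_DEPLOYER,  # both obligations are simply summed
    }[role]
    return n_firms * per_firm * compliance_rate

# Example: an industry with 1,000 establishments facing both sets of obligations,
# assuming only 5 percent of them actually comply.
print(f"${industry_total(1_000, 'both', compliance_rate=0.05):,.0f}")  # $325,000,000
```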
I don’t want to dwell too much on these numbers because they are rough estimates. The LLM classifications represent informed predictions rather than definitive legal interpretations of AB 1018's scope. More importantly, the NAICS matching process necessarily involves aggregation. Specific business types identified by LLMs are matched to broader industry categories in the official data. This means that the estimates are sure to be higher than actual impact.
Despite these limitations, this approach provides the first systematic attempt to quantify AB 1018's potential economic scope using computational methods, offering policymakers concrete estimates to anchor discussions about regulatory costs and benefits.
Commentary on All the Cost Estimates
Unlike targeted regulations, AB 1018 forces regulatory convergence across disparate sectors. Healthcare AI systems, bank credit algorithms, retail pricing models, and utility grid management would all face similar compliance requirements despite vastly different contexts and risk profiles.
This one-size-fits-all approach ignores the reality that automated systems in different industries serve different functions, carry different risks, and require different safeguards. A scheduling algorithm at a restaurant poses fundamentally different risks than an AI system screening loan applications, yet AB 1018 would subject both to similar regulatory frameworks.
This approach effectively penalizes mathematical efficiency. Businesses that spent years optimizing operations through data-driven automation face an impossible choice: abandon those efficiency gains or navigate complex compliance regimes that could cost millions per system. Neither option is especially palatable.
Still, these rough estimates represent just the tip of the iceberg. They capture only the direct compliance costs of hiring staff, conducting audits, implementing new processes, and maintaining documentation. What they don't account for are the cascading economic effects that would ripple through entire sectors. Every dollar spent on regulatory overhead is a dollar not invested in innovation, service improvements, or competitive pricing. For the economy as a whole, it would represent a massive shift of resources from productive activities to regulatory compliance. When California's State Water Board warns that the bill's definitions are so broad that Excel workbooks could trigger regulatory requirements, we're looking at a fundamental restructuring of how modern organizations operate.
Consider the practical implications. A community bank using automated credit scoring to expand lending to underserved populations might abandon the technology rather than spend $3 million on compliance infrastructure. The result isn't more human oversight; it's less access to credit. A healthcare system using AI to prioritize patient triage might revert to manual processes that are slower and more error-prone, not because human judgment is superior in this context, but because the regulatory burden makes the technology economically unviable.
Start-ups and smaller companies face particularly acute challenges. While established corporations can spread compliance costs across large revenue bases, emerging businesses may find regulatory requirements consume resources that would otherwise fund product development, hiring, or market expansion.
The choice isn't between perfect algorithms and perfect human judgment. Neither exists. The choice is between regulatory frameworks that preserve beneficial innovation while addressing legitimate risks, and those that impose such crushing compliance burdens that they eliminate both the risks and the benefits of technological progress.
Despite its human-centric rhetoric, AB 1018 may ultimately prove deeply inhumane to the Californians it claims to protect.