American International Group (AIG) has reported faster-than-expected gains from its use of generative AI, with significant implications for underwriting capacity, operating cost, and portfolio integration.

Increased Processing Capacity

AIG claims that generative AI has increased submission processing capacity. The company has implemented its internal tool, AIG Assist, in most commercial lines of business. Lexington Insurance, AIG's excess and surplus unit, is targeting 500,000 submissions by 2030, having already surpassed 370,000 in 2025. AIG uses generative models to extract and summarise incoming data, and has developed an orchestration layer to coordinate AI agents.
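
The intake step described above can be sketched as a two-stage pipeline: a model extracts structured fields from a free-text broker submission, then produces a short summary. Everything here is an illustrative assumption, not AIG's actual schema or tooling: the field names (`insured`, `limit`, `line`) are invented, and the regex extractor merely stands in for a generative model call.

```python
import re

def extract_fields(submission: str) -> dict:
    """Stand-in for an LLM extraction call: pull structured
    fields out of a free-text broker submission."""
    insured = re.search(r"Insured:\s*(.+)", submission)
    limit = re.search(r"Limit:\s*\$?([\d,]+)", submission)
    line = re.search(r"Line:\s*(.+)", submission)
    return {
        "insured": insured.group(1).strip() if insured else None,
        "limit": int(limit.group(1).replace(",", "")) if limit else None,
        "line": line.group(1).strip() if line else None,
    }

def summarise(fields: dict) -> str:
    """Stand-in for an LLM-generated summary of the extracted record."""
    return f"{fields['insured']}: {fields['line']} at ${fields['limit']:,} limit"

raw = """Insured: Acme Manufacturing
Line: Excess casualty
Limit: $5,000,000"""
record = extract_fields(raw)
print(summarise(record))  # Acme Manufacturing: Excess casualty at $5,000,000 limit
```

The point of the pattern is that extraction and summarisation are separate, checkable steps: the structured record can be validated before any downstream agent consumes it.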

AI Agents as Decision Support

The CEO of AIG describes AI agents as "companions that operate with our teams", providing real-time information, drawing on historical cases, and evaluating underwriting decisions. The company attributes these gains to its ability to process incoming data in a fraction of the time and to orchestrate agents so they analyse information without bias.
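
A decision-support agent of the kind described might begin by retrieving comparable historical cases for the underwriter. The sketch below uses keyword overlap (Jaccard similarity) as a deliberately simple stand-in for whatever retrieval AIG actually runs; the case records and scoring are illustrative assumptions.

```python
def jaccard(a: set, b: set) -> float:
    """Keyword-overlap similarity between two token sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def similar_cases(query: str, history: list, top_k: int = 2) -> list:
    """Rank historical cases by token overlap with a new submission."""
    q = set(query.lower().split())
    return sorted(
        history,
        key=lambda c: jaccard(q, set(c["text"].lower().split())),
        reverse=True,
    )[:top_k]

history = [
    {"id": "C-101", "text": "excess casualty manufacturing plant fire loss"},
    {"id": "C-102", "text": "marine cargo theft claim pacific route"},
    {"id": "C-103", "text": "excess casualty warehouse fire sprinkler failure"},
]
matches = similar_cases("excess casualty fire exposure at manufacturing site", history)
print([c["id"] for c in matches])  # ['C-101', 'C-103']
```

In production this ranking step would typically use embeddings rather than token overlap, but the agent's role is the same: surface precedents alongside the live submission rather than decide on its own.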

Workflow Optimization

AIG links orchestration to compression of the end-to-end workflow, integrating intake, risk assessment and claims handling more tightly. The company states that multiple agents, coordinated through an orchestration layer, streamline repetitive and previously lengthy processes.
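
The coordination pattern described can be sketched as an orchestrator that routes one submission record through specialised agents in sequence, each enriching a shared record. The stage names and agent logic are assumptions for illustration; in a real system each agent would wrap a model call rather than a plain function.

```python
from typing import Callable

# Each "agent" is a function that enriches a shared submission record.
Agent = Callable[[dict], dict]

def intake_agent(record: dict) -> dict:
    record["summary"] = record["raw"][:40]  # stand-in for LLM summarisation
    return record

def risk_agent(record: dict) -> dict:
    # Stand-in risk rule: flag very high limits for manual referral.
    record["referral"] = record["limit"] > 10_000_000
    return record

def claims_agent(record: dict) -> dict:
    record["claims_checked"] = True  # stand-in for a historical claims lookup
    return record

def orchestrate(record: dict, agents: list) -> dict:
    """Minimal orchestration layer: run agents in order over one record."""
    for agent in agents:
        record = agent(record)
    return record

result = orchestrate(
    {"raw": "Excess casualty submission for Acme", "limit": 5_000_000},
    [intake_agent, risk_agent, claims_agent],
)
print(result["referral"], result["claims_checked"])  # False True
```

The workflow compression claim corresponds to the fact that intake, risk assessment and claims checks run as one coordinated pass over the record instead of separate hand-offs between teams.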

Concrete Applications

AIG has applied its generative AI stack in specific transactions. During the conversion of Everest's retail commercial business, account renewals were completed in a fraction of the usual time. The company built an ontology of Everest's portfolio and combined it with its own, allowing the two portfolios to be integrated. The launch of Lloyd's Syndicate 2479, in partnership with Amwins and Blackstone, extended this ontology-based approach to a special purpose vehicle. In conjunction with Palantir, AIG used LLMs to assess whether Amwins' programme portfolio aligned with the syndicate's stated risk appetite.
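
The appetite-alignment check in the Syndicate 2479 example can be sketched as rules applied to each programme in a portfolio. In the actual deployment an LLM would presumably judge free-text programme descriptions against a written appetite statement; the appetite fields, thresholds and programme data below are illustrative assumptions.

```python
def within_appetite(programme: dict, appetite: dict) -> bool:
    """Check one programme against a stated risk appetite."""
    return (
        programme["line"] in appetite["allowed_lines"]
        and programme["limit"] <= appetite["max_limit"]
    )

appetite = {"allowed_lines": {"property", "casualty"}, "max_limit": 25_000_000}
portfolio = [
    {"name": "P1", "line": "property", "limit": 10_000_000},
    {"name": "P2", "line": "aviation", "limit": 5_000_000},
    {"name": "P3", "line": "casualty", "limit": 30_000_000},
]
aligned = [p["name"] for p in portfolio if within_appetite(p, appetite)]
print(aligned)  # ['P1']
```

Expressing appetite as explicit, machine-checkable criteria is what makes portfolio-level screening fast; the LLM's role in the reported case was to map unstructured programme descriptions onto checks of this kind.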

For those evaluating on-premise deployments, there are trade-offs to consider; AI-RADAR offers analytical frameworks at /llm-onpremise for assessing them.