2002 Cat Rate Comparison

June 30, 2002 | Last updated on October 1, 2024
7 min read
Exhibits 1, 2, 3 | Exhibits 4 & 5 | Exhibits 6 & 7

Catastrophe reinsurance terms have been finalized for 2002, after one of the most intense and difficult renewal seasons in recent memory. The terrorist attack of September 11th, 2001 occurred just as negotiations were getting underway, and significantly slowed the renewal process. Many reinsurance underwriters, given the new uncertainty of their own capital positions, and the increased risk of man-made events, were unwilling or unable to quote rates or offer capacity until they had reviewed their risk appetite and retrocession arrangements.

When terms were finally negotiated, 2002 placements were influenced by a “flight to quality”, as clients looked at their reinsurers’ solvency and claims-paying ability. The total cost of reinsurance – premiums paid plus the cost of uncollectable reinsurance – became more relevant for cedents, and the most common question evolved from simply “how much?” to “how much, for what kind of security?”

SOFT RESULTS

The catastrophe reinsurance market is based on global diversification, and when a large global event erodes industry capital, prices are driven up worldwide. Yet, while this was clearly a major factor in 2002, rising Canadian premiums in 2002 were not based on global events alone. Although Canada escaped with only minor natural catastrophe activity in 2001 (and has not seen a major natural event since the 1998 ice storm), the extended soft market has caused results to deteriorate badly in both the Canadian primary and reinsurance markets.

In 2001, the property and casualty insurance industry’s combined ratio climbed once again, to 110.3% from 108.4%, while return on equity dropped by half to a historic low of 3%. This deterioration occurred despite an increase in direct written premium of 11.6%, as noted by Paul Kovacs, the chief economist of the Insurance Bureau of Canada (IBC).

The Canadian reinsurance market suffered even more, with the combined ratio for Reinsurance Research Council (RRC) companies soaring from 112.9% to 119.1%, despite the absence of natural catastrophe activity. This very poor underwriting result from “normal losses”, together with stagnating investment income, caused many reinsurers to erode their capital bases once more, instead of accumulating funds for the next large loss. Reinsurers were finally forced to face up to the fact that they had allowed under-pricing and weak underwriting to continue for too long, and that their returns were insufficient to cover the cost of capital.

THE ANALYSIS

The accompanying charts tell the story. Chart-1 and Chart-2 show how rates have changed for the three categories tracked by Swiss Re for the Canadian market. The category into which a layer falls depends only on the limit and deductible for the layer: a small company’s entire program may be categorized as low, while a very large company’s entire program may be categorized as high. The graph displays the simple average of rate changes, not weighted by premium.
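The categorization rule described above – by the layer’s attachment and limit alone, regardless of company size – can be sketched as follows. The dollar thresholds are purely illustrative assumptions; the study does not state the actual boundaries Swiss Re uses between the three bands.

```python
def categorize_layer(deductible: float, limit: float,
                     low_top: float = 25e6, high_bottom: float = 100e6) -> str:
    """Assign a cat layer to a band using only its attachment and limit.

    The thresholds (25M / 100M) are hypothetical stand-ins for whatever
    boundaries actually separate the low/medium/high categories.
    """
    top = deductible + limit  # exhaustion point of the layer
    if top <= low_top:
        return "low"
    if deductible >= high_bottom:
        return "high"
    return "medium"

# A small company's whole program can fall entirely in "low" ...
print(categorize_layer(deductible=5e6, limit=10e6))     # low
# ... while a very large company's whole program can be "high".
print(categorize_layer(deductible=150e6, limit=250e6))  # high
```

The point of the rule is that the band reflects the layer itself, not the size of the company buying it.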

Note that the sample includes all comparable covers for which Swiss Re knows the terms for two consecutive years, and hence includes many layers on which Swiss Re does not participate.

As expected, the data shows that the upswing, which began three years ago, accelerated in 2002. With these increases, rates for nominally equivalent layers have now generally reached the peak that occurred in 1994, in the hard market that followed Hurricane Andrew.

However, this comparison does not take into account the fact that “equivalent layers” are now usually more exposed than in 1994, both because of the effect of inflation on loss costs, and because the underlying portfolios and exposures have typically grown as well. Chart-3 and Chart-4 show the data in terms of percentage rate changes from the prior year.

The most significant year-on-year change remains the 49.5% increase from 1992 to 1993. This followed Hurricane Andrew, the most costly natural catastrophe on record, with insured losses of US$20.2 billion. For comparison, the six large natural catastrophe losses that occurred in 1999 totaled US$18.7 billion (both figures at 2001 prices).

The upswing in rates from 1999 to 2000, after the severe global losses of 1999, was dampened by the proliferation of multi-term treaties written at the start of 1999 – either to cross over any “Y2K problems” on January 1, 2000, or to lock in prices in a soft market. With more treaties coming up for renewal in 2001, we saw a more severe increase in rates than in 2000, yet even here the average increase was restrained by those multi-term treaties that continued for a third term.

The hardening of rates did, however, continue with a significant 44.1% increase from 2001 to 2002. While we believe that rates were clearly on the increase in any case, there is little doubt that the terrorist attack of September 11 caused the curve to rise sharply.

Measuring the average change in rate on gross net premium income (GNPI) is only one part of the picture, since the rate alone does not fully reflect the premium charged per layer. Another measure is the premium as a percentage of the limit, or “rate on line” (ROL).
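The two measures can be illustrated with a toy layer; the figures below are invented for illustration only and are not drawn from the study’s data.

```python
# Illustrative numbers only; not from the study.
premium = 1_200_000   # reinsurance premium charged for the layer
gnpi    = 80_000_000  # cedent's gross net premium income (subject premium)
limit   = 30_000_000  # layer limit

rate_on_gnpi = premium / gnpi   # premium expressed against the subject portfolio
rate_on_line = premium / limit  # premium expressed against the limit (ROL)

print(f"rate on GNPI: {rate_on_gnpi:.2%}")  # 1.50%
print(f"rate on line: {rate_on_line:.2%}")  # 4.00%
```

Tracking both measures matters because, as the article notes, the rate on GNPI moves when the underlying portfolio grows even if the premium charged per dollar of limit does not.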

Chart-5 shows the rate change compared to the ROL change, both as a simple arithmetic average of comparable layers, and as an average weighted by the premium charged for each layer.

The simple average changes in ROL are larger than the changes in the rate on GNPI, implying that the underlying portfolio premium has also grown. This growth may result from two factors: true growth in the portfolio’s size (which increases the exposure to the reinsurance cover), or rate increases at the primary level (which increase a reinsurer’s premium, but not its exposure). When looking at the average weighted by premium, we see that the increases are more significant. The implication is that bigger, more exposed programs (with more premium) were subject to proportionately larger increases than programs with a smaller GNPI.
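The gap between the simple and the premium-weighted average can be seen in a small sketch; the layers and rate changes below are hypothetical.

```python
# Hypothetical layers: (premium charged, year-on-year rate change)
layers = [
    (200_000, 0.30),    # small program, modest increase
    (500_000, 0.40),
    (2_000_000, 0.60),  # large, heavily exposed program, bigger increase
]

# Simple average: every layer counts equally.
simple_avg = sum(change for _, change in layers) / len(layers)

# Weighted average: each layer counts in proportion to its premium.
weighted_avg = (sum(p * change for p, change in layers)
                / sum(p for p, _ in layers))

print(f"simple average:   {simple_avg:.1%}")    # 43.3%
print(f"weighted average: {weighted_avg:.1%}")  # 54.1%
```

The weighted average exceeding the simple one mirrors the pattern described above: when the largest programs see the largest increases, weighting by premium pulls the average up.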

THE MULTI-YEAR ANALYSIS

Several multi-term layers, typically coming off three-year terms, renewed in 2002. Chart-6 compares the 2002 rate changes for all layers that renewed with the changes for those that renewed after a single term. Chart-7 then separates the layers that renewed after a single term from those that renewed after a multi-year term. The differences are as expected: after locking into rates just as the market began to tighten in 2000 and 2001, and escaping some of the increases that were common, multi-year contracts faced a much more dramatic adjustment when market forces again came into play.

Multi-year terms clearly carry increased risk for both parties. The reinsurer must commit capital for a longer period, and risks missing annual price adjustments that may occur. Conversely, the client runs the risk that large events anywhere in the world over the multi-year term may diminish the reinsurer’s capital base, and thereby reduce the value of the reinsurance purchased. Few, if any, layers will continue into 2003 on a multi-year term.

NATURAL CAT EXPOSURE

By some measures, the year 2001 was one of the worst ever for the p&c insurance industry in Canada, and this was without any major natural catastrophes. Premiums were insufficient to build a bank for future events, and instead were used to pay for the rising claims ratios. Reinsurers, in particular, have not priced their product adequately. Once again in 2001, they did not provide a sufficient return on capital for Canadian exposures.

The focus this past renewal was clearly on the largely new and unknown threat of man-made events. However, we should not lose sight of the need to underwrite and price for the exposure from natural catastrophes – particularly storm and earthquake. It is important that on every program reinsurers work with their clients to understand the aggregate exposures, as well as the myriad pricing, deductible and contract terms that a primary company uses to manage the risk.

Swiss Re is particularly concerned with the accurate measurement of aggregate exposures for Vancouver earthquake risk, and with the level of pricing charged for that risk at the primary level. Without quality measurement of aggregates, clients may not be aware of how exposed they truly are to an earthquake loss. Moreover, without good data, reinsurers are forced to make assumptions that may work to the client’s disadvantage, in terms of both the reinsurance price and the capacity offered.

Perhaps most importantly, the lack of solid exposure measures prevents clients from understanding their own exposure, and therefore from accurately charging for the risk in front-line pricing. In our view, primary pricing for Vancouver earthquake continues to be weak in many cases, with the result that companies – both primary and reinsurance – are allowing other lines of business and other regions to subsidize this exposure.

This study shows that catastrophe pricing continues to increase, after a prolonged period of decline. Are the increases enough to provide an adequate return on the large amount of capital needed for earthquake, storm, and man-made risks? To what extent are primary markets working to increase the quality of their reinsurance capacity to reflect new realities? Both of these questions, as always, are issues for the markets to decide.

This study compares average changes in market rates. There is, and should be, a considerable range of increases depending on an individual client’s book. Portfolio growth, line-of-business shifts, regional shifts, primary pricing, earthquake deductibles, and inuring reinsurance changes all play a role.