The objectives of this study are to develop and test a framework for estimating the cost of enhancing agriculture's potential as a carbon sink through the implementation of reduced tillage production systems. A major factor inhibiting the adoption of reduced tillage practices is the additional risk perceived by farmers and its effect on net revenues. The expected utility model provides a useful means of evaluating the risk-return tradeoffs of agricultural production systems. Incentive levels required to induce adoption of reduced tillage practices are calculated from estimated certainty equivalents. The framework is applied to eight case studies across the United States. Results indicate that in the Corn Belt, the required incentive payment for corn/soybeans would be $10.20 and $8.30 per acre for poorly drained and well-drained soils, respectively. For continuous corn, the required payments were $40.40 per acre on poorly drained soils and $26.70 on well-drained soils. In the central Great Plains, an incentive level of $14.60 per acre was required for continuous sorghum; although mean yields for conventional and reduced tillage are fairly close in this case, the higher costs associated with no-till sorghum drive the high incentive level. In the wheat/fallow and wheat/sorghum/fallow rotations of the western Great Plains, no-till yields exceeded those of intensive tillage, so incentive levels near $6.00 per acre reflect the higher costs of no-till in these rotations. In the Mississippi River Corridor region, results show that switching from a corn/soybean rotation to the same rotation under no-till would require an incentive of $7.90 per acre.
- Adoption incentive estimation
- Cost of sequestering carbon
- Expected utility model
- Reduced tillage adoption
- Risk premium
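The certainty-equivalent logic behind the incentive calculation can be sketched as follows. This is a minimal illustration, not the study's implementation: it assumes a mean-variance (CARA-style) approximation of expected utility, CE = mean − (r/2) × variance, and the required incentive is the gap between the certainty equivalents of the conventional and reduced tillage net-revenue distributions. All function names, the risk-aversion coefficient, and the dollar figures below are hypothetical.

```python
def certainty_equivalent(mean, variance, risk_aversion):
    """Mean-variance approximation of the certainty equivalent ($/acre).

    CE = mean - (r/2) * variance, where r is an absolute
    risk-aversion coefficient (an assumed functional form here).
    """
    return mean - 0.5 * risk_aversion * variance


def required_incentive(ce_conventional, ce_reduced_till):
    """Per-acre payment making reduced tillage at least as attractive
    as conventional tillage for a risk-averse producer."""
    return max(0.0, ce_conventional - ce_reduced_till)


# Illustrative net-revenue distributions ($/acre); not study data.
ce_ct = certainty_equivalent(mean=120.0, variance=400.0, risk_aversion=0.01)
ce_rt = certainty_equivalent(mean=115.0, variance=600.0, risk_aversion=0.01)

print(required_incentive(ce_ct, ce_rt))  # incentive in $/acre
```

Note that the incentive reflects both the mean-revenue gap and the risk premium: even if mean revenues were equal, the higher variance of the reduced tillage system would still require a positive payment.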