Opportunity cost is a core enterprise-risk and economics concept: when an organization allocates limited resources to one activity, it reduces what is available for other priorities. Strengthening cybersecurity typically requires money, skilled personnel time, executive attention, tooling, and operational capacity. Those resources could otherwise fund revenue-generating work such as new product features, customer experience improvements, system modernization, market expansion, or process automation. That tradeoff is exactly what option D describes, making it the correct answer.
Cybersecurity standards and governance guidance stress that risk treatment decisions must balance risk reduction against cost, feasibility, and business impact. While stronger security can reduce the likelihood and impact of incidents, it can also introduce friction (extra approval steps, stronger authentication, segmentation), slow delivery when changes require additional reviews, and demand ongoing operational effort (monitoring, patching, vulnerability remediation, access recertification, incident response testing). These impacts are not arguments against security; they are the reason governance processes prioritize controls based on the most critical assets, highest-risk threats, and compliance requirements.
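To make that balancing concrete, one standard quantitative tool is annualized loss expectancy (ALE = single loss expectancy × annualized rate of occurrence): a control is economically justified when the risk reduction it buys exceeds its annual cost. The sketch below illustrates the comparison; all dollar figures and probabilities are hypothetical placeholders, not figures from the question.

```python
def ale(single_loss_expectancy: float, annual_rate_of_occurrence: float) -> float:
    """Annualized Loss Expectancy: expected yearly loss from one threat scenario."""
    return single_loss_expectancy * annual_rate_of_occurrence

# Hypothetical figures for one threat scenario.
ale_before = ale(single_loss_expectancy=500_000, annual_rate_of_occurrence=0.10)  # $50,000/yr
ale_after = ale(single_loss_expectancy=500_000, annual_rate_of_occurrence=0.02)   # $10,000/yr
annual_control_cost = 25_000  # licensing, staffing, operational friction

# Net benefit of the control: risk reduction minus what the control costs to run.
net_benefit = (ale_before - ale_after) - annual_control_cost
print(f"Risk reduction: ${ale_before - ale_after:,.0f}/yr, net benefit: ${net_benefit:,.0f}/yr")
```

In this illustrative case the control nets $15,000/yr and would be prioritized; with a higher control cost or a smaller risk reduction, the same arithmetic could favor accepting the residual risk instead.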
Option A may be true in some cases, but it describes a direct cost, not the broader economic concept of opportunity cost. Option B is a trend observation, not a definition of opportunity cost. Option C is incorrect because security spending is not always less than the expected cost of a breach; organizations must evaluate cost-benefit and acceptable residual risk rather than assume a universal rule.