Postulate: No mandate, and no rational policy decision whose outcome penalizes -- or promotes -- a given path, is made from a scientific approach.
Scientific approaches to a problem always carry uncertainty in their measurements and results. You're familiar with this if you've read any of the papers from the last few years on our most recent little set of scientific claims -- confidence intervals, p-values, absolute and relative risk and the like.
An engineering approach to a problem does not have uncertainties. It has limits. The "Load limit 40 tons" sign before a bridge is an example; that is not an uncertainty, it is a boundary. Do not exceed it, because the structure was engineered to perform as expected within that limit, and bad things may happen beyond it.
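To make the distinction concrete, here is a minimal sketch of the idea in code. The capacity and safety factor are illustrative assumptions, not data for any real bridge; only the posted 40-ton figure comes from the example above.

```python
# A minimal sketch of how a posted limit differs from a probabilistic estimate.
# The capacity and safety factor below are assumed numbers, not real bridge data.

ultimate_capacity_tons = 160.0   # assumed load at which the structure would fail
design_safety_factor = 4.0       # assumed margin demanded by the design code

posted_limit_tons = ultimate_capacity_tons / design_safety_factor

def may_cross(vehicle_weight_tons: float) -> bool:
    """Inside the limit the answer is yes; outside it the answer is no.
    There is no 'probably'."""
    return vehicle_weight_tons <= posted_limit_tons

print(posted_limit_tons)   # -> 40.0, the number on the sign
print(may_cross(38.0))     # -> True
print(may_cross(45.0))     # -> False
```

The sign does not tell you the odds of failure at 41 tons; it tells you where the engineered guarantee ends.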
You do not -- and never will -- accept a 1 in 500, or even a 1 in 5,000, risk of a bridge collapsing and injuring or killing you. You expect, and modern society relies on, any number of vehicles being able to go over that bridge without it collapsing into the river, and you expect the process dictated by that engineering to detect any potential failure before it occurs. Nothing less is acceptable.
If you want to know why so many people got it wrong over the last three years, it is because they allowed what is an engineering circumstance to be substituted with a scientific one, which can at best produce odds. Odds, while useful when looking retrospectively at outcomes, are pointless for an individual.
Take a railroad crossing a roadway where the track curves around some obstruction, so you cannot see an approaching train, and the trains have no headlight or horn. If a train is already across the road when you approach, you are safe because you will stop. If there is no train nearby, you can cross without risk. The danger is the case in between: a train that is approaching, unseen, and will reach the crossing while you are on it.
The scientist can model this and (truthfully) claim that your risk of being smushed by the train and killed is 1 in 10,000, arrived at by any number of possible methods: the percentage of time a train is in the zone to hit your car but not yet visible, an empirical count of how many cars cross before one gets smushed, and so on.
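For the curious, here is a back-of-the-envelope sketch of how such an estimate could be assembled. Every number in it is an assumption invented for illustration, not a measurement from any real crossing, and the result merely lands in the same order of magnitude as the figure above.

```python
# A minimal sketch of the "scientific" estimate for the blind crossing.
# All inputs are illustrative assumptions, not measurements.

trains_per_day = 2        # assumed traffic on this quiet line
danger_window_s = 5.0     # assumed seconds during which a train is still hidden
                          # behind the curve but will arrive before a car that is
                          # entering the crossing right now has cleared it

SECONDS_PER_DAY = 24 * 60 * 60

# Probability that a randomly timed crossing attempt falls inside a danger window
p_smushed = trains_per_day * danger_window_s / SECONDS_PER_DAY

print(f"Per-crossing odds of being hit: about 1 in {1 / p_smushed:,.0f}")
# -> about 1 in 8,640 with these made-up inputs, roughly the 1-in-10,000
#    order of magnitude cited above
```

That number is perfectly honest as a population statistic, and perfectly useless to the driver who happens to arrive during the danger window.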
The engineering answer is that this risk of being smushed, no matter how small, is unacceptable. A sensor is therefore placed well back on the track and connected to a bell, lights and often a gate that comes down when an approaching train is detected, even though you cannot yet see it, obstructing the road and compelling you to stop. That sensor distance, bell, lights and, at a busy or obstructed-vision crossing, gate are all chosen so you will always have sufficient warning and will not get smushed except through extraordinary (grossly negligent, in other words) lack of attention or intentional act. Then, just in case all of that fails even though it should not, we demand the engine on the train have a very bright headlight and we erect a sign telling the operator to blow a very loud horn at a given distance from the crossing as a last-ditch defense against failure.
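A sketch of that engineering calculation, again with assumed inputs (the line's speed limit, the required warning time and the margin are all made up here), shows how the answer is driven by limits rather than odds:

```python
# A minimal sketch of the engineering approach: work from limits, not odds.
# The speed limit, warning time and margin are illustrative assumptions.

max_train_speed_ms = 40.0    # assumed line speed limit, metres per second
required_warning_s = 25.0    # assumed minimum warning the crossing must provide
safety_factor = 1.5          # assumed margin on top of the bare requirement

# The detection point must be far enough up the track that even a train running
# at the line's speed limit cannot reach the crossing before the warning ends.
sensor_setback_m = max_train_speed_ms * required_warning_s * safety_factor

print(f"Place the sensor at least {sensor_setback_m:,.0f} m before the crossing")
# -> 1,500 m with these assumptions; within the line's limits the warning is
#    guaranteed, not merely probable
```

Nowhere in that calculation does a probability appear; the design question is only whether the limit can ever be violated, and if it can, the design is changed until it cannot.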
There is a place for scientific inquiry; in fact, it is how we learn things over time. But those of us who looked at this as an engineering problem proved to be right, and those who went the other way, relying on alleged "science", were wrong.
We weren't right because we were lucky.
We were right because we applied the correct analytical system and structure to the problem and the others did not.