Can we learn something from the success that is modern science, and apply it to how we govern ourselves?
Some have taken the lesson that science/technology works, therefore scientists/technologists can do good work, therefore government should be run by scientists/technologists. This idea is known as "technocracy", and it is foolish, unworkable, and maybe even a little bit evil.
But I have a better idea, which doesn't require replacing democracy, elections, or any of that stuff. It's based on the observation that a lot of legislation doesn't work -- in a particular sense. Many laws are passed on promises of curing some ill, and then deliver no cure, or even create a worse disease. The consequence is new law after law, regulation after regulation, each trying to patch up the mistakes of the prior ones.
To those who take pleasure in the growth of the state, this does not represent a problem. The more laws and regulations on the books, the more controlled the population becomes, and the greater the governmental Leviathan enforcing it all. If someone around me admitted to being a jerk who thinks this way, I'd give them a lifetime supply of loathing right then and there.
OK ok, so what's a better plan? Let's borrow a central idea of the scientific method: falsifiability. The basic idea is that a scientific theory must not only make a claim; that claim must be empirically testable. There must exist some kind of experiment or observation that could refute it, if the theory were wrong.
For example, "there is an invisible sky god" would not qualify as a scientific proposition unless the claimant defined the terms, and produced a test ("go pee on the tree, say "woohoo", and she will pop up as a 1cm blue sphere uttering obscenities"). One can pee & peek, and if no blue sphere, then the theory is false. For another example, "who spits against the wind, fouls his beard" (with more precise qualifications) is easily tested, and am happy to report a personal inability to falsify.
Over time, theories that fail their tests are rejected and become superstition; good theories that haven't failed for a long time become known as laws. Thus science grows into a web of propositions, many mutually reinforcing, but each forever subject to eviction upon adverse observation.
How this applies to politics might start to become clear. Take as a basic theory that a law which fails to meet its own standards is a bad law, and that having no law is generally better than having a bad law. Let's have acts of government that purport to ameliorate some problem actually include a falsifiable prediction within the statute. If the test fails, the statute is automatically cancelled.
Some examples. A hypothetical Bill to Improve Rich People's Lives, a laudable goal, would have to include a statement of how/when its success is to be measured. Maybe something like "in each of the following ten years, the glorious top percentile of annual incomes must grow faster than the standard rate of inflation". A Bill to Feed the Poor (but Reduce Their Number) could state that after the bill is in operation, the number of Big Macs served on welfare cards will be 10% lower than before. Or: This Random Act of Congress shall reduce health insurance costs over the aggregate population by 10% by 2018. Or: This New Firearms Regulation will help reduce gun murders in Chicago by 10% next year.
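To make the mechanism concrete, here is a minimal sketch in Python (purely illustrative; the Statute type, the review function, and all the numbers are hypothetical) of a statute carrying its own falsification test and sunsetting automatically when the test fails:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Statute:
    """A law bundled with its own falsifiable success prediction."""
    name: str
    # Given measured outcomes, did the law meet its own stated standard?
    prediction: Callable[[dict], bool]
    in_force: bool = True

def review(statute: Statute, observed: dict) -> Statute:
    """Scheduled review: if the statute's own prediction fails,
    it is automatically cancelled -- no repeal vote required."""
    if statute.in_force and not statute.prediction(observed):
        statute.in_force = False
    return statute

# The hypothetical Random Act of Congress from above: aggregate health
# insurance costs must fall 10% by 2018 (numbers below are invented).
act = Statute(
    name="Random Act of Congress",
    prediction=lambda d: d["costs_2018"] <= 0.90 * d["costs_baseline"],
)

act = review(act, {"costs_baseline": 100.0, "costs_2018": 97.0})
print(act.in_force)  # False -- only a 3% reduction, so the statute sunsets
```

The point of the sketch is only that the test is mechanical: once the prediction is written into the statute, cancellation follows from the observed numbers, not from anyone's political willingness to admit failure.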
If you can't make a strong prediction about a law's effects, don't make the law.
Would it still be possible to game the system? Certainly; there are one or two deviously clever people in politics. But it would discourage a certain type of wishful-thinking-oriented law, whose likely outcomes are so grotesquely disconnected from the dreams that writing down a falsification clause would make it obvious even to the believers. It would discourage monster omnibus bills, since lawmakers wouldn't want to risk one failed prediction ripping down the whole kaboodle. It could make government slower and more hesitant. Sounds good to me!
Nearly a decade ago, when I started to learn to fly, I had a series of incidents with my then-instructor where his hands-on teaching style conflicted with my taking mental ownership/responsibility for the flight, leading to mistakes. Nothing unsafe, but still a hindrance to training. I wrote about it earlier.
A few years ago, the dual situation occurred, this time with me as a passenger.
A local pilot offered a ride in his fine-looking steed. From what I heard, he had considerable experience as a former commercial (airline?) pilot, but was now retired and flew with his family. His plane and ours must have been maintained at the same shop at the same time for us to make contact. Mutual curiosity led from one thing to another, and before long, there we were at the runway threshold, ready to fly.
I was sitting in the front-right seat, where an instructor/co-pilot would sit - but in small aircraft that role is not required, so passengers may sit there too and enjoy the scenery. I certainly did not ask for any piloting powers/responsibilities for the trip, since I was just curious about the plane.
He throttled up to get moving. The airplane was just coming out of maintenance, which is, ironically, a riskier time to fly than hours later, so we were all paying plenty of attention to noises, instruments, the feel of it. Sure enough, a few seconds into the takeoff roll, I felt an odd vibration, and the engine monitor indicated that one of the twelve cylinders was more than a little off. I asked the pilot whether this was normal - for all I knew, it could have been. As we were still accelerating, he said "no".
What happened next was a shock. He asked me what we should do.
It was ridiculous. We both knew I was not familiar with the aircraft type, let alone his particular one. We both knew who was in charge as legal pilot-in-command (he was), and I quickly reminded him that it was his decision.
Still, while we were still accelerating, nearly at lift-off speed now, he hesitated and asked me again what to do. This time I simply told him something like "I would stop." This he understood: he pulled back the throttles, and we safely aborted the takeoff. (Chances are very good that if we had gone flying, nothing bad would have happened; at worst a partial engine failure that this plane would have been able to handle.)
In this case, the pilot abdicated his responsibility, maybe due to a lack of confidence in single-pilot operations. In my original "two cooks" case, I (a student at the time) abdicated my partial responsibility, due to conflicting signals from my instructor. Both situations sucked. On the other hand, maybe something similar happened with Asiana Flight 214 in San Francisco. Maybe all three (!) qualified pilots in the cockpit were deferring to each other for resolution of their problem (too low & slow), and this time there was hell to pay.
There is one argument against spying-upon-everyone programs like those of the NSA & pals that I haven't heard suggested elsewhere. It has to do with marginalized people's totalitarianism-sensors.
John Ross' book Unintended Consequences is an odd beast. It's not a mainstream title, but as a rough 20th century history of guns & aviation, plus a political cautionary tale, it's interesting. Its philosophical apex is given around the 40% mark of its near-800 pages:
"Exactly, Mr. Hagner. Hobbes' Leviathan is just one more scholarly justification for forfeiting your rights and allowing yourself to be subjugated by the State. Learned, reasoned, articulate, and wrong. Thomas Hobbes has merely-Mr. Bowman," the professor said suddenly, "you are shaking your head. That usually means you disagree with something that's been said. What is it?"
"Professor Arkes, I don't disagree with the basic principle, but it's not enough just to say, Totalitarian regimes are wrong, so don't let the State enslave you'. That's like saying, 'Don't get sick'. The important question is, when do you know it's going to become enslavement? When is the proper time to resist with force?"
"Please elaborate, Mr. Bowman." Henry took a deep breath.
"The end result, which we want to avoid, is the concentration camp. The gulag. The gas chamber. The Spanish Inquisition. All of those things. If you are in a death camp, no one would fault you for resisting. But when you're being herded towards the gas chamber, naked and seventy pounds below your healthy weight, it's too late. You have no chance. On the other hand, no one would support you if you started an armed rebellion because the government posts speed limits on open roads and arrests people for speeding. So when was it not too late, but also not too early?"
As an answer, the character proposes the point at which people are about to forfeit their future ability to resist. Whether this could come about through corruption of the judicial system / police, disarming of the citizenry, martial law, suppression of effective elected power, or whatever, is not exhaustively explored in the story. Thank goodness that we're not close to any of these hereabouts, isolated examples to the contrary notwithstanding.
But an uncomfortably close trigger could be the sort of comprehensive surveillance that is becoming public knowledge, where governments (and not just the USA's!) routinely ingest approximately all telephone, location, internet, and even postal traffic into computers. Edward Snowden's leaks, certain famous works of fiction, and mathematical analysis make some of the risks obvious. But consider it from the point of view of a patriot who is looking for signs of her nation's slide toward totalitarianism. She may conclude that her ability to resist is about to disappear if her personal privacy is lost. After all, if one can't communicate securely, one can't organize. If such people exist, I hope they stay calm while waiting for these surveillance programs to be rolled way back, or for adequate security technologies to be developed/deployed to work around them.
The whole line of thinking suggests a peculiar irony that a free nation's government may want to keep in mind:
In order to make insurrection unnecessary, one must keep insurrection possible.