The Nature of Progress

During the industrial revolution, when iron and steel machines were just being invented, people probably said: "These machines could change the world - they will make it faster and stronger, but they will take away our individuality. They can be used for peace, but they will also be used for war."

When computers were first being worked on, there was probably a similar sentiment. Same for the wheel. Every technological leap forward comes with a moral and ethical cost.

What if you were offered a job which, if successful:

What if you suddenly realized this future was happening anyway, and the only real choice was whether or not to participate?

After some thought on the matter, I have decided that, morally, you can take the job based on its immediate purpose. Any useful tool can be (and will be) abused, so any development that pushes the capacity of humanity will inherently raise this question. If the work you are doing isn't pushing ethical boundaries, chances are it isn't really anything new.

That isn't to say that you should go off and start the next Manhattan Project or build a handheld laser rifle. There are technologies that I do object to ethically - ones whose only intended purpose is harm. But do you stop SpaceX from developing their rockets because every functional orbital rocket system is fundamentally an ICBM?