Antifragile - N. N. Taleb

The idea of the book, as explained by the author, is to build upon the foundations laid by The Black Swan. I wholeheartedly agreed with the theses of the latter when I read it, and I was even a bit frustrated while reading it - the book kept trying to convince me of something I already believed, without ever really moving forward with the argument. Moving forward with the argument is what this book does.

The idea: the world is unpredictable (an assumption discussed in The Black Swan, not here). Rather than trying to predict what is going to happen, which wouldn’t be possible anyway given that assumption, one should focus on things that react well to unpredictability.
Follow-up question: do these things exist? What are they? Answer: since the world is unpredictable, everything that has managed to stick around for a long time must be well suited to unpredictable events; one could say even more, namely that evolution must have favoured organisms that not only tolerate unpredictable events but are even able to profit from unpredictability.
Definition: Things that suffer from unpredictability and thrive in stability are called fragile; things that suffer from stability and thrive in unpredictability are called antifragile. The main point is that you should be aware of which is which and act accordingly.

The human organism is antifragile - it continually adapts to the external world and fine-tunes itself according to external stimuli. If we take a music box and keep it in a perfectly sanitized room for a year, the music box will at the end of the year be perfectly conserved (a stable environment without external stressors); the same environment will weaken the human organism. Everything biological works like this. This means that it’s an extremely bad idea to try to “stabilize” the natural world - it might seem like a good idea at the beginning, and maybe it is, as long as you stay in the perfectly sanitized room; but if some day, for some reason, you get out of the room or something gets in, things are worse in proportion to how long and how effectively you were enclosed.
This something that gets in or kicks you out is the unlikely event which you couldn’t have predicted, because the world is unpredictable; since by construction it is going to happen and you are not going to be able to predict it, the only reasonable thing to do is not to close yourself into the room to begin with. Why would you do something that will end up killing you?

So the first thing to do is to embrace the via negativa, i.e. learn not to do things. Not only should you not fix things which are not broken; you should also not fix things which are mildly broken - nature takes care of itself, and intervention is more likely to cause harm than benefit (iatrogenics). As for artificial things, the best thing is to let them go, subtracting (via negativa) the artificial from the natural world or, better said, letting the artificial flow into the natural and live out its destiny. Keeping things in place by fragilizing the environment will not help.
The second thing to do is to try to understand how nature handles unpredictability, so that you can do the same. The short answer is that you don’t want to put all your eggs in the same basket: a single big lung would probably be more energy-efficient than two, and one can in fact survive with only one normal-sized lung - but it has proven evolutionarily correct to carry two, in case the unthinkable happens. You don’t want to - and can’t - be sure that you won’t need a Plan B; you need a Plan B, and then you can be happy if you don’t end up needing it. You accept a small, known, sure loss in exchange for avoiding a very large, unsure one - like insurance.
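The redundancy trade can be put in numbers. A minimal sketch with invented figures (a 1% yearly chance of a shock that destroys one "copy", a small carrying cost for the spare, a ruinous cost for losing your only copy) - the numbers are assumptions, not anything from the book:

```python
# Illustrative assumptions (invented numbers): a rare shock destroys
# one "copy" with probability 1% per year; carrying a redundant spare
# costs 2 units per year; losing your only copy costs 1000.
P_SHOCK = 0.01
SPARE_COST = 2.0    # known, sure loss: the "premium" of redundancy
RUIN_COST = 1000.0  # large, unsure loss

cost_with_spare = SPARE_COST         # the shock is survivable
cost_without = P_SHOCK * RUIN_COST   # expected loss per year

print(cost_with_spare, cost_without)  # 2.0 10.0
```

Even on a plain expected-value comparison the spare wins here; the argument is stronger still if, as the book claims, the probability of the shock tends to be underestimated.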
The principle of optionality and the barbell are also important. If you invest all of your money in a very, very risky business, you will probably end up bankrupt with your life virtually over - or, with a small probability, you win very, very big; if you invest 5% of your money in the very, very risky business, you will at most lose 5% of your money, or you can win 5% of very, very big, which is still big enough. So you have a small, known loss or a virtually huge win. The idea of the barbell is to invest only at the extremes - very safe or very risky; that of optionality is to exchange a small, known loss for a potentially large win. The point is that the probability of winning - the probability of the unlikely event happening - gets systematically underestimated in complex systems. This of course wouldn’t work with the lottery, which is “laboratory” chance.
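The barbell arithmetic can be simulated. A minimal sketch, again with invented numbers (a bet that pays 100x its stake with probability 2%, a 95/5 split between perfectly safe cash and the bet):

```python
import random

random.seed(0)

# Illustrative assumptions (invented numbers): the risky bet pays
# 100x the stake with probability 2%, otherwise the stake is lost.
P_WIN, PAYOFF = 0.02, 100.0

def barbell(wealth, risky_fraction=0.05):
    """95% kept perfectly safe, 5% staked on the risky bet."""
    safe = wealth * (1 - risky_fraction)
    stake = wealth * risky_fraction
    return safe + (stake * PAYOFF if random.random() < P_WIN else 0.0)

trials = [barbell(100.0) for _ in range(10_000)]
print(min(trials), max(trials))  # the loss is capped at 5, the win is 495
```

The all-in alternative has only two outcomes, 0 or 10000: a larger upside, but the downside is ruin, and ruin ends the game.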
Nature is a vague concept; one should maybe say “complex systems”. One can find antifragile and fragile human jobs - the antifragile being the ones that thrive on and adapt to local volatility and (as a consequence) have been around more or less forever, the fragile being the locally stable ones that get wiped out. You have waiters, and you have the guy who lit the street lamps every evening - one day there are more clients in the restaurant, one day fewer, and one day the fragile lamplighting technology flies out of the window.

This is what I remember off the top of my head; now for the points I marked while reading: