
Battle of Waterloo and Trading

I often talk about the Battle of Waterloo and how it relates to trading in general and to strategy development in particular. If you're not familiar with the battle (which I recommend reading about if you have time), just listen to this once popular country song and you'll get a sense of why I think this is so important.

While I'm no historian, I do think traders can learn a lot by studying important battles in history. The Battle of Waterloo is a great example, as it holds many lessons for us to consider:

  1. Make your planning and risk analysis commensurate with the size of your project. For major endeavors, contingency plans are critical.

  2. Know when to cut your losses if necessary. Don’t let your desire to succeed be the enemy of good judgment.

  3. Be sure that the justification is clear for your project, and that your entire team is sold.

  4. Don’t become over-confident, especially after many successes. Remember the basic principles.

  5. Never attempt an unpopular endeavor in isolation.

  6. Don’t make enemies. You are only as good as your allies.

  7. Adopt a leadership style of politics, not a Machiavellian one. Look for the win-win.

Many of these lessons apply to good trading, especially having contingency plans, knowing when to cut losses, having clear justifications for your trades, avoiding overconfidence, and attacking from a strong position, such as having plenty of capital and cash reserves.

Needless to say, every trading strategy has its own weaknesses. So, what is the most common weakness I've found? That's easy: human error. That's right, most strategies that have been backtested and proven to work will continue to work well unless we deviate from the plan and/or apply leverage, rendering them extremely vulnerable. Fairly often I see traders come forward with a hot strategy they've been using and are in the process of levering up, creating havoc and exposing themselves to great risk. There is a good reason for the expression: leverage always kills. In my experience, that has been true. Beyond that, many strategies are built on assumptions that don't account for the constantly evolving nature of the market.
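To make the leverage point concrete, here is a minimal sketch in Python. It uses made-up, randomly generated daily returns rather than any real strategy, and simply applies the same return stream at different leverage levels to show how the worst peak-to-trough drawdown deepens as leverage rises:

```python
import numpy as np

def max_drawdown(equity):
    """Largest peak-to-trough decline of an equity curve (negative number)."""
    peaks = np.maximum.accumulate(equity)
    return ((equity - peaks) / peaks).min()

rng = np.random.default_rng(42)
# Hypothetical daily returns: a modest positive edge with realistic volatility.
daily_returns = rng.normal(0.0004, 0.01, 252 * 5)  # ~5 years of trading days

for leverage in (1, 2, 4):
    # Compound the levered daily returns into an equity curve starting at 1.0.
    equity = np.cumprod(1 + leverage * daily_returns)
    print(f"{leverage}x leverage: final equity {equity[-1]:.2f}, "
          f"max drawdown {max_drawdown(equity):.1%}")
```

In runs like this, the drawdowns deepen sharply with leverage, because losses compound against a shrinking equity base; a strategy that survives comfortably at 1x can, during one bad stretch at 4x, take equity close enough to zero that recovery is practically impossible. That is the sense in which leverage kills.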

12 ways the world could end

Since the dawn of civilisation people have speculated about apocalyptic bangs and whimpers that could wipe us out. Now a team from Oxford University's Future of Humanity Institute and the Global Challenges Foundation has come up with the first serious scientific assessment of the gravest risks we face.

Although civilisation has ended many times in popular fiction, the issue has been almost entirely ignored by governments. “We were surprised to find that no one else had compiled a list of global risks with impacts that, for all practical purposes, can be called infinite,” says co-author Dennis Pamlin of the Global Challenges Foundation. “We don’t want to be accused of scaremongering but we want to get policy makers talking.”

The report itself says: “This is a scientific assessment about the possibility of oblivion, certainly, but even more it is a call for action based on the assumption that humanity is able to rise to challenges and turn them into opportunities. We are confronted with possibly the greatest challenge ever and our response needs to match this through global collaboration in new and innovative ways.”

There is, of course, room for debate about risks that are included or left out of the list. I would have added an intense blast of radiation from space, either a super-eruption from the sun or a gamma-ray burst from an exploding star in our region of the galaxy. And I would have included a sci-fi-style threat from an alien civilisation either invading or, more likely, sending a catastrophically destabilising message from an extrasolar planet. Both are, I suspect, more probable than a supervolcano.

But the 12 risks in the report are enough to be getting on with. A few of the existential threats are “exogenic”, arising from events beyond our control, such as asteroid impact. Most emerge from human economic and technological development. Three (synthetic biology, nanotechnology and artificial intelligence) result from dual-use technologies, which promise great benefits for society, including reducing other risks such as climate change and pandemics — but could go horribly wrong.

Assessing the risks is very complex because of the interconnections between them, and the probabilities given in the report are very conservative. For instance, extreme global warming could trigger ecological collapse and a failure of global governance.

The authors do not attempt to pull their 12 risks together and come up with an overall probability of civilisation ending within the next 100 years, but Stuart Armstrong of Oxford's Future of Humanity Institute says: "Putting the risk of extinction below 5 per cent would be wildly overconfident."
