There seems to be a persistent perception in the UK and perhaps everywhere that things are bad and getting worse. But what evidence is there to support this?
Of course, the papers carry reports of murders, rapes and other violent crimes. But what evidence is there that such crimes are widespread, or that they are more frequent than before?
So let me ask you a question: Which country in Western Europe has seen the largest fall in violent crime over the past few years? The answer is: England and Wales. (See this BBC report.)
And here are some more statistics: Between 2003 and 2012, the homicide rate in the UK fell from 1.99 to 1.00 homicides per 100,000 people, roughly halving in a decade. And the number of people treated in hospital for injuries caused by violent crime fell by 14% in 2012. (See this BBC report.)
So why does everyone seem to believe the opposite? It seems to me that the popular press delights in reporting bad news, and that, in the absence of any major wars in the world at the moment (with the possible exception of Syria), they are now focussing on bad news in the UK.
I really think we have to stop believing the popular press. The distorting effect they have on people's perceptions is stunning.