Ideally there would be no difference. Finance – the operation of banks, central banks, insurance companies, investment funds and ‘shadow banks’, and their impact on households, businesses and governments – is an integral part of today’s economies, and should therefore be an essential component of economics.
Unfortunately the two were long artificially separated, analytically and institutionally. Finance still tends to be taught separately from economics, and much that deserves a place in core macro- and microeconomics is compartmentalised into ‘financial economics’ options. Central banks were for decades left in charge of ‘finance’ while governments managed the ‘real economy’ – which was then knocked off course by left-field tremors from the financial sector. Bailing out banks after the last financial crisis left governments with so much debt that they put central banks in charge of reviving the economy, via ‘monetary expansion’. So in Europe (though fortunately not in the US) we’ve witnessed an unsuccessful attempt to generate real expansion entirely by printing more money, while governments withhold the expansionary budgets that would get this money spent.
Mainstream economics has often been embarrassed by its artificial finance/economy separation. It struggled for decades to perfect a ‘general equilibrium’ theory which had no place for money, except as a temporary medium of exchange, and no need for a financial system. Essentially, relative prices were determined by the quantities of products, labour and capital that people wanted to buy and sell. The actual price level depended on the amount of money in circulation (and its rate of circulation), but that was an afterthought that didn’t affect the ‘real’ quantities and prices in the system.
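That ‘afterthought’ is the classical quantity equation, a standard statement of which (not spelled out in the text) is:

M × V = P × Q

where M is the money stock, V its velocity of circulation, P the price level and Q real output. With V stable and Q fixed by ‘real’ forces, a change in M was held to move only P – which is why money could be treated as neutral.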
The separation of ‘real’ from ‘financial’ economies rested on the assumption that money was exogenous, introduced for buyers’ and sellers’ convenience by a government (or central bank) which determined its quantity. That belief lulled economists in the 1980s into the ‘monetarist’ idea that you could squeeze inflation out of the system by restricting money supply, without disrupting real activity or causing a recession. We’re still living with the long-term effects – on production capacity, public services and public health – of the devastating downturns that followed.
It took until 2014 for the UK central bank to admit that money is integral to real activity, created by banks whose lending expands when businesses invest more in anticipation of rising demand. ‘Endogenous’ money had been identified at the start of the 20th century, yet is still too ‘new’ to feature in most textbooks.
Ironically, the effects of this endogeneity were enhanced by financial deregulation inspired by free-market economics that had largely neglected it. Left to compete among themselves, banks lend on the ‘security’ of assets whose value rises as banks step up their lending. Credit creation begets more credit creation, until an asset-price fall suddenly shrinks the ‘security’ and causes the system to implode, as it did worldwide in 2008.
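The feedback loop described here can be sketched as a toy simulation. This is an illustration only, not a model from the source: the loan-to-value ratio, the price-feedback strength and the size of the crash are invented parameters chosen to make the mechanism visible.

```python
def simulate_credit_boom(rounds=10, ltv=0.9, price_feedback=0.05, crash=0.3):
    """Toy sketch of the credit-collateral spiral: lending pushes asset
    prices up, which raises the collateral value banks will lend against,
    until a price fall leaves loans under-collateralised.
    All parameters are illustrative assumptions, not empirical estimates."""
    asset_price = 100.0
    credit = ltv * asset_price
    history = []
    for _ in range(rounds):
        # New lending bids asset prices up...
        asset_price *= 1 + price_feedback
        # ...which expands the 'security' banks lend against.
        credit = ltv * asset_price
        history.append((round(asset_price, 2), round(credit, 2)))
    # A sudden asset-price fall shrinks the collateral backing the loans.
    asset_price *= 1 - crash
    shortfall = credit - ltv * asset_price
    return history, round(shortfall, 2)

history, shortfall = simulate_credit_boom()
```

Running the sketch shows credit and asset prices rising in lockstep each round, and a positive collateral shortfall after the crash – the point at which, in 2008, the system imploded.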
The artificial separation of finance from economics is gradually being overcome, as the role of central and commercial banks in creating money and credit – and the interaction between ‘financial’ and ‘real’ circulations – is assimilated into a re-worked macroeconomic analysis. But the long divide is hard to overcome, because most practitioners still view economics as a ‘natural science’ and borrow natural-science techniques. Modern mainstream (neoclassical) economics was largely pioneered by engineers and mathematicians, who liked to explore the interaction of ‘real’ quantities and left the determination of prices (including the wages that decided income distribution) to a second, subsidiary stage.
Other, dissenting economists recognised early on the inherent connections between finance and the ‘real’ economy. These were extensively spelt out by (among others) John Maynard Keynes in the 1920s, enabling him to identify the key flaws in mainstream macroeconomics in his now equally forgotten General Theory of 1936. Informed by this discovery, Keynes was the principal architect of the post-war Bretton Woods system, which deliberately sheltered national economies from international capital movements so that their governments could pursue independent monetary policies. The next 20 years were unique in recording no serious crises in the international economy.
Bretton Woods’ collapse in the 1970s led to Anglo-American ‘big bang’ deregulations which removed the real economy’s protections from financial market fluctuations. But the consequent boom-bust risks were ignored by an economics that stayed wilfully ignorant of how finance really works. Sadly, on the eve of the next asset-bubble bursting, that’s still the case.