Health insurance in the United States of America
Austrian Perspectives on Interventionism before ObamaCare
The U.S. health care system is dominated by private and public (governmental) insurance, which makes it difficult for many Americans to imagine access to health care without it. The U.S. health care system is sometimes mistakenly described as market-based, partly because insurance was not compulsory before ObamaCare.
March 2025, approx. 86 pages, Routledge Focus on Economics and Finance, English
Taylor and Francis
978-1-032-96120-0