What Doctors Won’t Tell You

By: Vic

Most people reading this are probably, at least to some extent, cynical about the Western medical profession. In the West, medicine is ruled by money: above all the enormous sums made from drugs, but also the cancer industry and the vast amounts spent on hospital supplies and the like.

A few years ago, after a trip to the Middle East, I suffered from a range of skin problems. It was initially diagnosed as impetigo, a disease that usually affects children. The doctor prescribed antibiotics, and after the first course failed, he prescribed a stronger one. The initial symptoms went away …