Wednesday 14 June 2017

The story of leprosy: from 2000 BCE to now

Throughout human history, very few diseases have carried the level of stigma and shame that leprosy brings upon its sufferers. One of the oldest known illnesses, Mycobacterium leprae is thought to have been with mankind in one form or another for well over four millennia, travelling along the migration routes through East Africa and the Near East.
Leprosy takes its name from the Latin word lepra, meaning "scaly", as the predominant symptoms of infection are raised skin lesions. However, in modern medical circles it is more commonly referred to as Hansen's disease, after G. H. Armauer Hansen, the Norwegian physician who identified Mycobacterium leprae as the causative agent.

Girls in the Philippines identified as having leprosy. Image: National Museum of Health and Medicine
Contrary to popular belief, leprosy does not cause body parts to fall off. Infection with M. leprae instead leads to severe nerve damage in the skin, limbs and eyes. This can leave sufferers vulnerable to secondary infection if they do not receive adequate medical attention, and gangrene may then set in.
Neither is leprosy particularly infectious. In fact, 95% of the world's population has a natural immunity to M. leprae (although they may still be able to carry it). Even then, it takes prolonged exposure to a sufferer for infection to pass. The idea that just brushing past a leper would make you somehow "unclean" is yet another myth.
In 2009, a mummified skeleton from circa 2000 BCE was found in northwest India. DNA analysis of the remains found evidence of M. leprae infection within the skeleton, the oldest known case of leprosy yet identified. An Egyptian papyrus written around 1550 BCE mentions a disease that may be leprosy, and Hippocrates discusses the infection in his writings around 460 BCE. In China, the Feng zhen shi manuscript, written about 250 BCE, describes a condition that leads to destruction of the nasal septum as well as anaesthesia, which is also very likely leprosy.
The Bible makes multiple references to "tzaraath", a disfiguring disease which most translations render as leprosy. DNA found in a tomb in the Old City of Jerusalem, dated 1–50 AD, confirmed the presence of the disease in the area at the time. In Ancient Rome, both Pliny the Elder and Aulus Cornelius Celsus wrote about the presence of lepers in and around Rome and the Near East.
Historical accounts of leprosy can be hard to verify, as writers may have mistaken psoriasis, vitiligo or other skin conditions for leprosy. Descriptions of highly infectious skin conditions are more likely fungal infections than leprosy, and after 1492, when Columbus returned from the New World having exchanged smallpox for syphilis, that disease gets added to the mix too.
In many writings from the mid-16th century onwards, descriptions of "leprosy" begin to resemble congenital or tertiary-stage syphilis more closely – patients with collapsed noses and rotting ears, loss of skin on the palms of the hands and soles of the feet, deafness and loss of teeth. This may also explain the sudden increase in the number of "leprosy" sufferers seen in the 16th century.
Leprosy has long been associated with a curse, or believed to be a punishment from God for sins committed in this life or a previous one. This link has been made across many cultures and continents. Because of this, sufferers throughout history were often denied treatment and care. Instead they were shunned by society and quarantined in purpose-built asylums or leper colonies.
Unlike many infectious diseases, leprosy does not have a long history of ridiculous cures and treatments. This is probably because of its association with being unclean or cursed in some way. Doctors or priests would refuse to treat the afflicted, not wanting to be associated with the illness. There was also the general belief that sufferers had "brought the disease upon themselves" in some way.
The Ancient Egyptians, writing circa 1550 BCE, recommended bathing in blood to alleviate the symptoms of leprosy, and this seems to have been the go-to treatment across Europe, the Middle East and into China for millennia. There is some variation in the recommended source of the blood, from child to dog to lamb, and in whether or not it should also be drunk, but the theme remains.
Pliny the Elder makes mention of snake's blood, but also notes that he doesn't think it will help. However, Philippe Gaucher, a 19th-century French doctor, stepped it up to applying cobra venom to the skin lesions, and as late as 1913 a Dr Boinet tried increasing doses of bee stings (up to 4,000, apparently) to no avail. Bear in mind that the bacterium causing the infection had been identified in 1873.
As of 2016, worldwide, two to three million people were estimated to be permanently disabled because of Hansen’s disease. Although the disease is now entirely treatable with multidrug antibiotic therapy, the stigma that is associated with leprosy infection still prevents some sufferers from seeking or receiving treatment.
The disease is traditionally associated with poverty and sadly that link still remains. The necessary multidrug treatment is comparatively expensive; however, Novartis, the company that produces it, offers it for free. Several leprosy charities exist with the goal of treating those infected. Some also offer reconstructive surgery and artificial limbs, which can help sufferers reintegrate into society.

The inhabitants of the Kalaupapa leper colony in 1905. Image: Wikipedia
There are still many leper colonies in the world – over 800 in India alone – and one remains in Europe: Tichileşti, in Romania, which had 19 inhabitants in 2011. Although the people living there have been given the necessary treatment, most are now very elderly and feel they are unable to leave.
Approximately 150 people are diagnosed with leprosy each year in the USA, although due to the long dormancy period of the bacterium (from two years to over a decade) it is hard to identify where they contracted the initial infection. There have been two cases reported in the last three years in the UK, both in men who had recently moved from the Indian subcontinent.
M. leprae has a natural reservoir in armadillo populations and in red squirrels. Although there have been no known cases of animal-to-human transmission from squirrels, there is some evidence that it may be possible to catch leprosy from an armadillo – if you were unlucky enough to be both genetically vulnerable to M. leprae and in prolonged contact with an infected animal.
Cases of Hansen's disease worldwide have dropped rapidly over the last 30 years, from 5.2 million in 1985 to about 210,000 a year now. In 2000 the WHO declared it was no longer a public health problem. Both the WHO and Novartis believe that, through new screening techniques that especially target at-risk children, by 2020 child sufferers will no longer develop deformities. There is the potential to fully wipe out leprosy, as we did with smallpox, with early detection of infection being key.

Published Nouse 13/06/2017

Monday 22 May 2017

A brief history of the hangover

For the last 12,000 years (at least) humans have been purposefully brewing fermented grain drinks, possibly even before the development of bread as a dietary staple. Almost every settled community across the globe has developed a form of alcohol and ceremonies based around drinking it. And ever since mankind discovered the wonders of fermented plant matter, humanity has also suffered the indignity of the hangover.
Egyptian hieroglyphics depict the pouring out of beer. Credit: WikiCommons

The term hangover only came to commonly apply to the aftereffects of a night of drinking around the turn of the 20th century. Before then, phrases such as the “morning fog” or “blue-devils” were used colloquially, and the phrase “bottle-ache” appears in several doctors’ notes from the mid-1800s.

In the UK, alcohol and the working days lost to hangovers have been calculated to cost almost £2.55 billion in lost wages. In Finland, over 1 million workdays are thought to be lost each year to hangovers, which is impressive considering the population is only 5 million.

One of the oldest and most persistent cures for a hangover has been to have another drink. The phrase “hair of the dog that bit you” comes from a Middle Ages remedy for a rabid dog bite, in which the patient drinks a concoction of honey, wine and the hair of the suspect canine. Not unexpectedly, this is an ineffective rabies treatment. However many will swear by it as a hangover remedy.
Dionysus, Greek god of wine, beer and inebriation. Credit: WikiCommons

In Ancient Egypt, the goddess of beer was known as Nephthys. She would answer drunken revellers' prayers with the gift of vomiting, apparently allowing them to be hangover-free the next day. The Greeks had several gods of inebriation, from the most famous, Dionysus, and his son Comus, to the more beer- and grain-specific such as Sabazios and Demeter. Drinking vessels can be found etched with prayers to Dionysus asking for the drinker to be blessed with a hangover-free morning after.
Pliny the Elder took some time to advise his fellow Romans on what to do after a night of overindulgence. His go-to cure was raw owls' eggs, though he also recommends a nice fried canary with salt and pepper to taste. He advises, however, that drinkers are better off lining their stomachs with a solid meal of roasted sheep's intestine before a night on the town.
Sticking with the avian theme, ancient Assyrians would mix ground swallows' beaks and myrrh, whilst in Mesopotamia a mixture of myrrh, liquorice, cardamom, beans, oil and, of course, wine was recommended to help shake off a particularly bad morning after. In Mongolia, pickled sheep's eyes were the go-to restorative.

From a medicinal standpoint, eggs are a good source of the amino acid cysteine, which is key to the liver's metabolism of alcohol by-products. Meanwhile, myrrh does appear to have some analgesic effects, at least on toothache, although its effects taken internally are somewhat less clear. There is, as yet, no medical opinion on pickled sheep's eyes.

In Medieval England, normally the source of some of the worst medical advice on record, raw eel and boiled cabbage were the breakfast of choice after a night on the tiles. A spot of sushi and some veg doesn’t sound so bad.

By the end of the 17th century, branding began to come into medicine. One of the more famous cure-alls on the market at the time was Goddard's Drops. Developed by the physician Jonathan Goddard, they contained, among other things, ammonia, the crushed skull of a hanged man, dried viper and ivory. Goddard claimed his drops could cure everything from a hangover to bladder stones.

The trope of throwing a bucket of cold water over the sufferer seems to appear in the early 1800s, along with drinking a glass of warm water and soot. A Victorian suggestion of rubbing vinegar on the temples seems particularly ineffective. Meanwhile, out in the Wild West, cowboys developed something called "jackrabbit tea", made from rabbit droppings.
Hall’s Coca Wine: The Elixir of Life. Credit: WikiCommons

At the turn of the 20th century cocaine was in, and no good hangover cure was complete without it. A popular British drink called Hall's Coca Wine combined the hair of the dog with a hefty dose of coca leaves to make a "great restorative". Also introduced, in 1905, was a seasickness treatment called Mer-Syren, sold as a mystical herbal remedy from India, but six years later the British Medical Association discovered it was nothing but powdered potatoes.

PG Wodehouse describes a drink made up of Worcestershire sauce, raw egg, tabasco and pepper, fed to a suffering Bertie Wooster by the manservant Jeeves in his 1916 short story Jeeves Takes Charge. This is practically a Prairie Oyster (minus the vodka), a cocktail developed in New York back in 1878 as a pick-me-up for those having a particularly rough morning.

Famous drinker Kingsley Amis published his book On Drink in 1972. In it he described the metaphysical aspects of a hangover (i.e. nagging feelings of guilt and self-loathing) as much worse than the physical. He recommends getting your mental house in order to better sort out your physical self. For the body, though, he suggests trying beef paste and vodka, or baking soda and vodka, or tomato juice and vodka. You may be noticing a trend. He also suggests vigorous sex, if you can find a willing partner.

During the Cold War, both the KGB and the CIA were rumoured to be developing pills that would allow their agents to drink without either getting drunk or suffering a hangover. Now several versions of these apparently anti-hangover pills, such as Chaser and Alcohol-X, are on the market. They are based around the concept of activated charcoal, mostly using vegetable carbon to bind and filter "toxins". None of these pills has been through any sort of thorough clinical trial, so whether or not they are effective is yet to be proven.

In 2009, a study by Newcastle University concluded that a full English breakfast is the best cure for a hangover. However, the subjective nature of the beast means that how a hangover plays out is hard to control from person to person and night to night. Obviously the only way to avoid a hangover entirely is not to drink. But if, for whatever reason, that seems unlikely, then staying hydrated, replacing electrolytes and getting lots of rest is the tried and tested option.

Published Nouse Online 18/05/17

Saturday 6 May 2017

The sad story of Typhoid Mary Mallon

In the summer of 1906, the Warren family were summering on Long Island when six of the eleven people staying at the rented house came down with typhoid fever. Charles Warren wanted to know who to blame for the outbreak in his family, so he hired an investigator named George Soper. Soper soon decided that the most likely culprit was the cook, Mary Mallon.
Typhoid fever is caused by the bacterium Salmonella Typhi, which is easily transmitted through contaminated food and poor public sanitation conditions. One of the first outbreaks of typhoid is thought to have occurred in 430 BC Athens, killing a third of the population and helping to end the Age of Pericles.
In 1838, the English doctor William Budd realised that contaminated water was a key factor in the spread of the infection. Even so, during the American Civil War more soldiers were killed by typhoid than died in battle, and by the mid-to-late nineteenth century the annual typhoid death rate in Chicago was 174 per 100,000.
Once Karl Eberth had identified the bacterium behind typhoid fever, it took only 16 years to develop a vaccine. The initial version of the vaccine was first used by British soldiers during the Second Boer War, and then 10 million doses of a more developed version were given to troops during WWI.
But for Mary Mallon, being accused of having typhoid in 1906 New York was insulting and degrading. Typhoid was now thought of as the disease of the dirty, unwashed masses. Not something that clean, proper folks caught.

Public opinion of "Typhoid Mary" was swayed by pamphlet campaigns. Image: Flickr
From her childhood in poverty in County Tyrone, Ireland, Mary had immigrated to the United States alone at just 15. Her career as a cook for various wealthy New York families was impressive and well paid. By her mid-thirties she had built herself a small reputation around Manhattan for her fine peach ice-cream.
As Soper went through her employment records, he found that she had worked for seven families since 1900. In those households, 22 people had become sick with typhoid and one had died. Soon he had tracked her to her new workplace and the investigator confronted the cook.
Mallon was unimpressed by the strange man in her kitchen accusing her of infecting her employers, and chased him out with a meat fork. Soper then brought the New York Health Department and police officers to Mallon’s house.
Confused and scared, Mallon ran, and then put up a fight. Eventually she was taken into custody and placed in a quarantine ward on North Brother Island. There was no long-term plan for Mary; she was kept in virtual isolation and treated poorly by staff and other patients.
Over 163 samples of various bodily fluids were taken from Mary, mostly against her will, and 120 tested positive for typhoid. Doctors did not understand at the time how she was able to shed the bacteria and yet show no symptoms of the disease.
Soon public opinion was moving in Mary's favour. By 1910 it was perceived as unfair that a woman who had done no wilful wrong could be held against her will. After three years in quarantine she was eventually allowed to leave the hospital, on the understanding that she would no longer work as a cook.
Authorities eventually lost track of Mary, but in 1915 a typhoid outbreak at a maternity hospital left 25 people sick and, sadly, two dead. A woman named Mary Brown was working in the kitchens, and it didn't take long for her to be recognised as Mary Mallon.
Once again, Mary was hunted down and arrested. It seems that she truly didn’t understand that she was capable of infecting people with this disease as she had never had typhoid symptoms herself.
None of the doctors had ever taken the time to explain to her how the bacteria could lie dormant in her body; they had just demanded that she give up the only line of work she had ever known. However, the public had no sympathy for Mary now, and she spent the remaining 23 years of her life in quarantine.

Mary Mallon was kept in quarantine for over 23 years. Image: WikiCommons
By the time Mary Mallon died, 400 other asymptomatic typhoid carriers like her had been identified in the USA. None of the other carriers was quarantined like Mary – not even Alphonse Cotils, who worked as a baker, or Tony Labella, who is thought to have caused over 100 cases of typhoid fever and at least five deaths.
Why Mary alone was forced to spend her life in isolation is unknown. It’s possible she completely refused to comply with demands to change her profession, that the trust was broken after the incident at the maternity hospital, or that doctors thought she was incapable of understanding her situation.
Mary died of pneumonia after suffering a stroke in the Riverside Hospital in 1938. Samples taken during her autopsy found evidence of typhoid still in her system. Mary was cremated and her name became synonymous with anyone who, knowingly or not, spreads sickness and disease.

Tuesday 25 April 2017

World Penguin Day: Overfishing and climate change impact penguin populations

World Penguin Day is celebrated on the 25th April every year. This coincides roughly with the start of the annual northward migration of the five Antarctic species of penguins.

Image: Wikimedia Commons
The State Of Antarctic Penguins 2017 report, released today by non-profit organisation Oceanites, calculated the number of penguins in Antarctica at 12 million. The report uses satellite technology to help assess the bird numbers over huge areas and across seasons.
Adélie and chinstrap penguin numbers in Antarctica have declined rapidly over the last few years. Because global warming is affecting the sea ice on which the birds depend, they are particularly vulnerable.
Outside of the Antarctic, penguin populations are not faring that much better. Twelve out of the 18 penguin species in the world are facing population decline. According to the International Union for Conservation of Nature’s Red List, 10 of the 18 are either endangered or vulnerable.
In South Africa, the penguin population has fallen by 70 per cent in the last twelve years. Penguins compete with commercial fishermen, and as sardine and anchovy stocks are depleted they struggle to feed their young.
Overfishing, especially dragnet fishing to make fishmeal that goes on to feed farmed salmon, chickens and other livestock, affects many marine species. The decline in penguin numbers is yet another side effect of the emptying of the oceans.
Glynn Davies, WWF's executive director of global programmes, said: "The decline of species is reaching a critical point and we cannot ignore the role of unsustainable livestock production. If nature is to recover, we need to work together and encourage sustainable farming systems which will limit pollution, reduce habitat loss and restore species numbers."
Penguins live across the Southern Hemisphere, from the southern Australian coast to the Galapagos Islands and Peru. They occupy a huge variety of habitats, from the extreme frozen wastelands of the Subantarctic islands to urbanised beaches in South Africa and forests in New Zealand. Penguins are among the most popular of birds, thanks to their friendly nature and cuddly appearance, and zoos and popular culture have kept them at the forefront of the public's imagination, from Chilly Willy in the 1950s to Happy Feet.
However, the threat of environmental destruction, including oil spills and illegal egg harvesting, hangs over all species of penguin. With World Penguin Day 2017, WWF, Bird Life Europe and Compassion in World Farming hope to highlight the impact humans have on these birds and other marine species.

Published Nouse Online 25/04/2017

Warm winters drive the spread of Lyme disease

As climate change drives the spread of Lyme disease carrying ticks across new areas of the Northern Hemisphere, we still have no human vaccine against the infection. Neither is there a medical consensus about the existence of “chronic Lyme disease”, the symptom complex that some sufferers experience even after being treated for the initial infection.

Lyme disease, or Lyme borreliosis, is a bacterial infection caused by Borrelia burgdorferi and spread to humans by the bites of infected ticks. The bite can leave a bulls-eye-shaped rash, and initial infection causes fever, headache and exhaustion. If left untreated, Lyme disease may develop over several weeks, months or years, leading to inflammatory arthritis, heart problems, issues of the nervous system and even meningitis.

Image: Flickr
Lyme borreliosis is the most common disease spread by ticks in North America and is estimated to affect at least 65,000 people a year in Europe alone. The autopsy of Ötzi the Iceman, a 5,300-year-old mummy found in the Ötztal Alps between Austria and Italy, discovered the DNA of Borrelia burgdorferi, making him the earliest known human to have suffered from Lyme disease.

Since 2001, cases of Lyme disease in the UK have increased ten-fold. In the US, the Centers for Disease Control has called the increased incidence of infection "a major US public health problem". This seems to be linked to the warmer winters we have been experiencing over the last decade.

As global warming increases world average temperatures, Lyme disease is spreading outside its usual territories. Whilst those who live in established Lyme disease zones know how to avoid contact with ticks, the spread of the arachnids to new areas means those less familiar with tick awareness will begin to encounter the disease.

When a tick initially bites, it is only the size of a poppy seed and some people do not display the typical “bulls-eye” rash that would otherwise signify infection with Lyme disease. Likewise, initial flu-like symptoms are easily misdiagnosed and because antibodies to Lyme disease can take weeks to develop, early tests may miss it.

The best approach to combating Lyme disease would be vaccination of those at risk, and in 1998 GSK released Lymerix, an FDA-approved human vaccine against Borrelia burgdorferi. However, some recipients of this vaccine claimed it had caused them to suffer autoimmune arthritis, and, under pressure from various anti-vax groups, Lymerix was withdrawn from the market.

The FDA has since confirmed there was no link between the Lyme vaccine and any autoimmune side effects, but the damage to the vaccine's image was done and Lymerix was declared unmarketable by GSK. There are, however, various animal-approved vaccines for Lyme disease, so most farm animals and pets can be protected.

The French biotech company Valneva is working on a new human Lyme vaccine that aims to improve on Lymerix by immunising recipients against all five strains of the disease; however, it is still only in the early stages of human trials.

In the meantime, if you plan on hiking in a "high-risk" area, such as the North York Moors, read up on your anti-tick precautions. Lyme disease can be treated with a two- or four-week course of antibiotics, and if you suspect you have been bitten by a tick you should seek medical advice.

Published Nouse Online 21/04/2017

Tuesday 18 April 2017

Marmite is good for the brain, York study suggests

Eating a spoonful of Marmite every day could have a positive impact on your brain's health, researchers at York have found.
A group from the University of York’s Department of Psychology have identified a potential link between Marmite and the increase of gamma-amino butyric acid (GABA) – a chemical messenger which is associated with healthy brain function.
Marmite contains a high concentration of vitamin B12. Image: Flickr
Dr Daniel Baker, Lecturer in the Department of Psychology and senior author of the paper, said: “The high concentration of Vitamin B12 in Marmite is likely to be the primary factor behind results showing a significant reduction in participants’ responsiveness to visual stimuli.”
The study looked at 28 healthy volunteers who were split into two groups: half ate a teaspoon of Marmite every day for a month, whilst the control group were given peanut butter.
The participants' responses to visual stimuli were measured while the researchers recorded electrical activity in the brain using electroencephalography (EEG). Subjects who had been eating the "yeast extract product" saw a reduction of around 30% in the brain's response to the visual stimuli.
GABA is an inhibitory neurotransmitter: it reduces the excitability of neurons in the brain, and a delicate balance of excitation and inhibition is needed for healthy brain function. Abnormal levels of GABA are associated with epilepsy, autism and depression.
Anika Smith, PhD student in York’s Department of Psychology and first author of the study, said: “These results suggest that dietary choices can affect the cortical processes of excitation and inhibition – consistent with increased levels of GABA – that are vital in maintaining a healthy brain.
“As the effects of Marmite consumption took around eight weeks to wear off after participants stopped the study, this suggests that dietary changes could potentially have long-term effects on brain function.
“This is a really promising first example of how dietary interventions can alter cortical processes, and a great starting point for exploring whether a more refined version of this technique could have some medical or therapeutic applications in the future. Of course, further research is needed to confirm and investigate this, but the study is an excellent basis for this.”
Published Nouse Online 5/4/17

Thursday 9 March 2017

New approaches to childhood obesity

METABOLIC SYNDROME, sometimes referred to as insulin resistance syndrome, is a term used for a cluster of medical conditions including obesity, elevated blood pressure and high fasting blood glucose levels. It is associated with the risk of developing cardiovascular disease and type 2 diabetes. Approximately 25 per cent of the world's adult population exhibit the cluster of risk factors that make up metabolic syndrome, and one of the strongest risk factors for adult metabolic syndrome is childhood obesity.
Childhood obesity has reached epidemic levels in many developing countries as well as in most developed countries. In the UK, 28 per cent of children aged 2 to 15 are at least overweight and half of those are obese. Childhood obesity can have a serious impact on a child’s health, both physically and mentally. It may affect their social and emotional well-being, and obesity is associated with lower self-esteem and bullying, despite it becoming more common.

Image: Pexels
A recent report by the NHS has shown that many parents of overweight children wrongly thought their child was a healthy weight – 91 per cent of mothers of overweight children and 48 per cent of mothers of obese children. Factors that impact obesity, both in childhood and adulthood, are complex. They include environmental factors, socioeconomic status, lifestyle preferences and the surrounding culture. The study looked at 60,000 children and their parents from all over the country, and tried to identify problems in tackling childhood obesity. Although the root causes of obesity can be hard to challenge, spotting the problem early is key in healthcare.
A 2007 report in the New England Journal of Medicine showed how obesity can spread through a social circle. Both adults and children are less likely to acknowledge their own weight gain if those around them are obese. There is also the key factor of denial: parents don't want to accept that there may be a problem. Simply telling a child they are fat is not the solution, however, as this can lead to stress, comfort eating or ill-informed diet options. In fact, children are likely to be aware that they are overweight even when their parents fail to notice. A better approach is education. Lessons in cooking and nutrition are becoming more common in schools, along with better food labelling standards. A study from the University of Michigan last year taught children and teenagers how fast food companies use advertising to manipulate their desires and disguise products' poor nutritional content.
It turns out that being informed about your choices leads to better decision making. The junk food market is manipulative and targets young children in its ad campaigns. Simply informing young people of the unfair practices of the food industry can lead them to make their own decisions in an almost social justice-oriented way, rather than focusing efforts on diet or weight loss. Healthy habits are important at a young age, and of course parents should lead by example. Early intervention to prevent childhood obesity can have a very positive impact on future health.
Published in Nouse 08/03/2017

Tuesday 21 February 2017

The spotted history of chicken pox

Most British children will contract chicken pox during their childhood. The disease is almost a rite of passage – the itching, the discomfort and, of course, the distinctive rash. Many people have somewhat nostalgic memories of chamomile-soaked days off school, with their equally spotty siblings and an exhausted parent.

Image:Pexels. A child with the typical “head downwards” chicken pox rash
Chicken pox is caused by the varicella zoster virus, one of eight herpes viruses known to infect humans, and is thought to have been infecting humans for millennia. The virus is spread through the air, and can also be spread via contact with the blisters on the skin before they heal over. Chicken pox presents as a very distinctive rash that spreads from the head downwards. In patients aged 1 to 15 years, symptoms, although uncomfortable, are rarely serious.
One in five of those who contract chicken pox as a child will go on to suffer from shingles, known as herpes zoster. This is more likely if you are elderly or immunocompromised, for example by HIV or cancer. Shingles occurs because the varicella zoster virus remains dormant in the body's nerve tissues, where it is suppressed by the immune system. Typically a shingles rash will occur as a single painful stripe down one side of the body, following the line of the nerves.
Chicken pox is surprisingly sparsely documented in antiquity. It can be hard to decipher references to diseases in ancient texts, especially when the main symptom is spots – are they describing smallpox, herpes, syphilis or just a spot? One approach is to look at the treatments offered. The more serious the disease, the more extreme the treatment is likely to be, from bloodletting (popular from ancient Greece to the Middle Ages) to animal dung ointments, noted in the Ebers Papyrus from c. 1550 BCE Egypt as having healing properties and warding off bad spirits.
Probably the earliest description of chicken pox can be credited to an ancient Babylonian text from over 2,000 years ago, although its description of a "yoke around the [..] abdomen or pelvis" is more compatible with shingles. The ancient Egyptians suggested treating a particular rash with an oatmeal bath, which is still a popular method of relieving the chicken pox itch today.
According to the Devi Mahatmya, a Hindu religious text from c. 400 CE, chicken pox patients should be treated by placing a jar of water at the head of the bed, spreading neem leaves around the doorways of the house and over the bed, and extinguishing all the lamps in the house. Lamp smoke would irritate the rash, causing itching and possible bacterial infection. Neem leaves deter mosquitoes, and the bundles at the doorway would act as a warning to guests that there was disease in the house. The water by the bedside saves the infected from wandering off to look for a drink. All in all, a very effective treatment and quarantine procedure.
The ancient Greeks identified shingles and called it zoster, after their word for a girdle, as the most common place for the rash to appear is along the peripheral nerves of the back that wrap around the abdomen. Similarly, the Romans referred to the disease as cingulus (belt), which is where we get the word "shingles" from.
The source of the English name "chicken pox" for the childhood rash is more disputed. There are theories that the name arose because the blisters made the skin look as if it had been pecked by a chicken or, as proposed by Dr Samuel Johnson in the 18th century, that it was the coward's form of smallpox. More likely the name arises from the Old English word "giccan", to itch – the itchy pox.
In 1767, the English physician William Heberden demonstrated that chicken pox was not a lesser form of smallpox and that a patient who had had chicken pox remained immune to the disease. It took over a hundred years for another scientist, Rudolf Steiner in 1875, to identify that chicken pox was caused by an infectious agent. He did so by extracting fluid from the blisters of an infected person and rubbing it onto the skin of healthy volunteers, who then developed an itchy, blistering rash of their own.
When Dr James von Bokay proposed that chicken pox and shingles were, in fact, caused by the same virus, he tested the hypothesis by rubbing fluid extracted from shingles blisters into the skin of healthy children. When they contracted chicken pox, it appeared to confirm his suspicions, although it wasn't until 1953 that Thomas Huckle Weller isolated the virus from both illnesses and confirmed that they were indeed the same.
Michiaki Takahashi, a Japanese virologist, developed a live attenuated varicella zoster vaccine in 1972, and Japan became one of the first countries to routinely vaccinate against chicken pox. The USA followed soon after, and cases of chicken pox dropped from approximately 4 million per year to fewer than 400,000 in 2005. The vaccine has been adopted into routine childhood immunisations in Canada and Australia and is gaining wider acceptance across Europe.

Image: Pixnio. A chicken pox vaccine has been available for decades but has not been made part of the NHS childhood vaccination schedule.
The UK recommends the vaccine only for targeted groups, such as healthcare workers and those in regular contact with immunocompromised people. By 2005 all NHS workers had had their immunity determined and had been immunised if they were non-immune.
Whilst chicken pox is rarely fatal, in a pregnant woman the complications can be severe. Risks to the fetus include encephalitis, damage to the development of the eyes and hypoplasia of the extremities, amongst other complications. Newborns who develop symptoms are at high risk of pneumonia and other serious complications of the disease. Meanwhile, in the USA, vaccine coverage for chicken pox currently stands at almost 95% among adolescents aged 13 to 17 years. Each year, varicella vaccination prevents an estimated three and a half million cases of the disease, 9,000 hospitalisations and 100 deaths in the United States, according to the CDC.
So why does the NHS not currently offer the vaccine to the whole population? There is a worry that, by reducing the amount of the less harmful childhood chicken pox in circulation, we would lose herd immunity for adults. Adults who suffer from chicken pox are much more likely to suffer complications such as pneumonia, and the natural boosting of immunity through exposure to infected children is thought to keep cases of adult chicken pox and shingles down.
In the US, overall rates of herpes zoster (shingles) appear to be increasing, but whether this is linked to the increased rate of varicella vaccination is yet to be determined. Many factors can play into whether or not someone develops shingles, from smoking to obesity to age.
Chicken pox vaccines are still provided on the NHS where there is a clinical need, for example if a child has a sibling or parent with a weakened immune system. Many parents in the UK will make sure their child contracts chicken pox while young, but if you are worried you do not have immunity, your GP can carry out a blood test to check.
Treatment for chicken pox hasn't progressed much since the days of neem leaves and oatmeal baths. As long as the young patients can be prevented from scratching the blisters, there will likely be no permanent scarring; the fever will pass and the rash will last only a few days. However, some argue that there is no need for this disease at all in modern times and that it should go the way of smallpox. If we are capable of preventing a disease, should we take the steps to eradicate it?
Adding chicken pox to the MMR vaccine (marketed as the MMRV vaccine) has been proposed in the UK since 2007. However, uptake of MMR is far from universal and some parents are still unwilling to vaccinate their children against the more frequently deadly measles, an outbreak of which hospitalised 88 people and killed one in Swansea during 2013. No child needs to suffer from a vaccine-preventable disease, from diphtheria to meningococcemia. However, chicken pox will remain a childhood "tradition" for a few more generations to come.
Published on Nouse (online) on 21/02/2017

Wednesday 15 February 2017

Treating diabetes mellitus: from ancient Egypt to the NHS

Over the past thousand years of medical progress, mankind has seen a slow but steady increase in human longevity. Though the occasional plague, famine or war will lead to a mortality peak in a generation, by and large each new wave of humanity is healthier than the last.

Image:WikiCommons
But this trend seems to be about to change. A study published in 2015 revealed that middle-aged white Americans are dying at younger ages than their parents for the first time in decades; and as with all trends, where the US leads, the UK and Europe are sure to follow soon after. In fact, there are many studies suggesting that today's children may lead shorter lives than their parents.

To explain these trends experts have looked to two main factors – firstly, "deaths of despair" such as opioid overdoses, suicides and the complications of long-term alcohol abuse. In 2015, 52,000 Americans died of drug overdoses alone, more than died per year of HIV/AIDS during the epidemic's peak in the mid-90s. Almost half of these deaths were due to opioid-based drugs, such as heroin or the much stronger synthetic opioid fentanyl.

Secondly, a more recent study has linked diabetes to the increase in American mortality. Whilst in 1958 only 0.93% of the US population had been diagnosed with diabetes, 7.02% of the country (nearly 30 million people) now live with the disease. The number has grown three-fold since the early 1990s, rising with ever-increasing obesity rates. Approximately 368 million people on Earth were living with the disease in 2013.

Most of these cases are diabetes mellitus Type 2. This is what used to be known as "adult onset diabetes", to differentiate it from Type 1 diabetes, which involves the autoimmune destruction of the insulin-producing beta cells in the pancreas and usually begins in childhood. Type 2 diabetes now makes up 90% of all diabetes diagnoses in Europe and is seen increasingly in young adults and children.

Type 2 diabetes is associated with a ten-year reduction in life expectancy, and is thought to be an under-reported cause of death, likely affecting life expectancy trends. People with diabetes often have multiple co-morbidities such as obesity, high blood pressure, cardiovascular disease and even cancer.

Diabetes is one of the first diseases we can recognise in the historical record, described in an Egyptian manuscript from c. 1500 BCE. The manuscript mentions "too great emptying of the urine" and notes that the urine would attract ants, a result of the high levels of glucose in the urine seen in untreated diabetics. These first cases are believed to all have been Type 1.

Type 1 and type 2 diabetes were described as separate conditions a thousand years later, in India, by the doctors Sushruta and Charaka, with Type 1 being associated with youth and Type 2 with obesity. The name “diabetes” was given by the Greek doctor Apollonius of Memphis in 250 BCE, meaning “to pass through”.

Throughout historical times, then, both types of the disease were recognised, although they were rare and treatments were generally unavailable. Aretaeus of Cappadocia offers a list of symptoms of diabetes, though no treatments, and notes that "life (with diabetes) is short, disgusting and painful".

However, by the late 19th century the value of a low-carbohydrate diet had been recognised. Whilst under rationing in Paris during the Franco-Prussian War, the French physician Bouchardat realised his diabetic patients were faring somewhat better. This led to some doctors going so far as to keep their patients under lock and key to prevent them from breaking particularly restrictive diets.

In Germany in 1889, Oscar Minkowski and Joseph von Mering removed the pancreas from a dog and saw that the poor animal developed diabetes. The protein insulin was eventually identified as the key to blood sugar control in 1921. Sir Frederick Banting and Charles Best went on to purify insulin from cows and successfully treated a 14-year-old boy with Type 1 diabetes in 1922.

Advances were made rapidly: in 1936 the two types of diabetes were distinguished from a treatment perspective, and in 1944 a standard insulin syringe was developed. The structure of insulin was first determined in 1951, and the first genetically engineered, synthetic human insulin for use in patients was produced by recombinant expression in E. coli in 1978.

Since then, there has been a huge amount of progress in the treatment of diabetes, both Type 1 and Type 2, including the introduction of the blood glucose meter and the insulin pump. Short- and long-acting insulin derivatives that stem from work done within the York Structural Biology Laboratory at the University of York are now standard treatment for many Type 1 diabetes patients worldwide.

Researchers at the University of Pennsylvania looked at the prevalence of Type 2 diabetes in the US population and at the increased risk of death in adults aged 30 to 84. They calculated that, while diabetes was listed as the cause of death in 3.7% of cases, it was more likely to be the underlying cause in almost 12% of the total deaths. Amongst the obese cohort alone, the death rate from diabetes was closer to 19%.

There are now many drug treatments available for Type 2 diabetes; however, many have complicated side effects. Most disease management regimens focus on lifestyle interventions to lower various risk factors and maintain a healthy blood sugar level.

The NHS currently spends £8.8 billion a year (over 8% of its budget) treating Type 2 diabetes and its complications – from outpatient services to amputations. On a societal level too, Type 2 diabetes has a huge impact on levels of absenteeism and early retirement as the various complications of the disease affect sufferers' lives.
Image:Pixabay

Prevention of the onset of Type 2 diabetes is the ideal solution from a healthcare perspective, and it can be achieved with both lifestyle changes and medication. Patients with prediabetes who make lifestyle changes alone (weight loss, increased physical activity and quitting smoking) can reduce their risk of developing Type 2 diabetes by 50 to 60%.

Although it has been known for some time that obesity and the associated co-morbidities are a leading factor in reduced life expectancy, researchers are hopeful that a focus on diabetes and specifically the control of blood sugar might help healthcare workers and policy makers to combat the trends in mortality statistics. 

An abridged form of this article appeared in Nouse 14/02/2017

Thursday 2 February 2017

Food, your mood and how we choose

Get short tempered before lunch? Snap at people if you’ve skipped breakfast? Perhaps you are suffering from hanger – the combined effects of hunger and anger.

Self-control requires energy. When our energy levels are low, it follows that our control over our temper is reduced too.

Image:Pexels
As blood glucose levels drop, the stress hormones cortisol and adrenaline are released to drive us to find our next meal. Along with a chemical identified as neuropeptide Y, these combine to make people more aggressive to those around them.

The effects of blood sugar on aggression were measured in a 2014 trial involving 107 married couples. In the first part, the couples used voodoo dolls and up to 51 pins to express the level of anger that they felt towards their partner at that time, and the blood glucose levels of both were measured.

In the second half, the couples played a competitive game, after which the winner could blast the loser with a loud noise through a set of headphones. As expected, the lower the blood glucose, the more pins and the longer the noise the partners received.

A 2012 study at Columbia University looked at case sentencing by judges and saw they tended to be more lenient first thing in the morning and right after lunch. On the other hand, this may have been more to do with the ordering of the caseload (shorter cases vs longer and more complicated ones) than the timing of meals.

The hunger hormone, ghrelin, which is produced in the stomach prior to meals and during fasting, has been seen to have a negative impact on the brain’s ability to make rational decisions. During an experiment at the University of Gothenburg in 2016, rats with a higher level of ghrelin (mimicking hunger conditions) behaved more impulsively and erratically.

However, this study was done in rats; although they can be a good model for humans, more research is needed to confirm that the effect holds true in us. For now, perhaps be careful when making decisions on an empty stomach.

Wednesday 1 February 2017

Launching a solution to the lithium-ion problem

Image:Flickr
YOU OFTEN find rechargeable lithium-ion batteries in phones, laptops, hoverboards and even planes and electric cars. They are light-weight, highly efficient and rechargeable; this makes them ideal for all sorts of gadgets.
In comparison to nickel-cadmium batteries, lithium-ion batteries are more reliable, hold charge for longer and can be built to be much smaller and thinner.

The components of a lithium-ion battery are much less toxic than those of other battery types, which may contain lead or cadmium. The iron, copper, nickel and cobalt of a lithium-ion battery are safe for landfill or incinerators. Yet, 25 years after their introduction to the market, there are still occasional reports of lithium-ion batteries causing fires in all sorts of devices, including a fire on-board a Boeing 747 flight in 2010 which killed two people.
In 2016, 2.5 million Samsung Galaxy Note 7 smartphones were recalled due to a problem with their batteries, which caused fires and injuries to many users. The recall was down to an engineering fault, the theory being that one part inside the battery was coiled incorrectly, leading to excess stress on another part. As more demands are made of the battery in any given device, engineers try to pack more power into smaller spaces.
Within a lithium-ion battery there are three main components: the positively charged cathode (a metal oxide), the negatively charged anode (graphite) and the liquid electrolyte, a solvent carrying lithium ions. The cathode and anode must be physically separated by a permeable wall, and in very slim batteries this separator can be a polymer as thin as ten microns. If this wall is breached, it can lead to a process called thermal runaway: the battery gets hotter, leading to further degradation of the polymer, which causes the battery to heat even more. The flammable electrolyte can reach 500°C, at which point it may ignite or even explode.
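To make that runaway feedback loop concrete, here is a minimal, purely illustrative Python sketch. It is not a model of any real cell: the heating and cooling constants are invented assumptions chosen only to show the shape of the process, comparing a cell with an intact separator against one whose separator has been breached.

import math

# Toy model of thermal runaway in a lithium-ion cell (illustrative only).
# All numbers below are invented assumptions, not measurements of a real cell;
# the point is the positive feedback: more heat -> faster breakdown -> more heat.
def simulate(breached, ambient=25.0, minutes=60.0, dt=0.01):
    """Crude Euler integration of cell temperature (degrees C) over time (minutes)."""
    temp = ambient
    # Self-heating grows roughly exponentially with temperature; a breached
    # separator is modelled simply as a much larger heating rate constant.
    heat_rate = 40.0 if breached else 0.5   # arbitrary degC per minute at ambient
    cooling = 0.8                           # arbitrary heat-loss coefficient per minute
    t = 0.0
    while t < minutes:
        heating = heat_rate * math.exp((temp - ambient) / 60.0)
        temp += (heating - cooling * (temp - ambient)) * dt
        if temp >= 500.0:                   # roughly where the flammable electrolyte
            return t, temp                  # may ignite or explode
        t += dt
    return t, temp

for breached in (False, True):
    t, temp = simulate(breached)
    label = "breached separator" if breached else "intact separator"
    print(f"{label}: {temp:6.1f} degC after {t:5.1f} minutes")

Running it, the intact cell settles just above room temperature, while the "breached" cell races past 500°C within a few minutes – it is the feedback loop, not the specific numbers, that matters.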
Simply adding flame retardant to the electrolyte solution would lessen the chance of fire, but it would also massively reduce the efficiency of the battery. To reduce the risk of a catastrophic fire, researchers at Stanford University have devised an automatic fire extinguisher for lithium-ion batteries. Yi Cui and his team have produced a thin polymer capsule that contains a fire retardant. If the battery overheats to the point that the polymer shell melts, the miniature fire extinguisher is automatically set off and the retardant is released into the battery.
If these safety devices can be shown to work at scale in a real-world setting, it opens lithium-ion batteries to more widespread use in electric cars and aircraft. Currently a safer alternative to the lithium-ion battery is the solid-state battery, where the liquid electrolyte is replaced with a far less flammable solid. However, the inherent problem with the solid-state battery is that it takes an incredibly long time to charge – negating most of its useful potential in cars and electronic devices.
Still, it is important to note that lithium-ion batteries are generally extremely safe. The probability of the lithium-ion battery in your phone failing is less than one in a million, whereas the probability of you being struck by lightning stands at around 1 in 13,000 – on those figures, a lightning strike is roughly 77 times more likely – meaning that lithium-ion batteries remain a relatively safe and efficient option.

Published Nouse 24 Jan 2017

Friday 20 January 2017

Skeleton Tree - Nick Cave and the Bad Seeds

I wrote this album review back in autumn 2016 for Circulation Magazine. I have shared it here as a variation in the blog style. Perhaps I shall include some more music here later. Imogen.



Nick Cave writes about death. He always has done. Death, destruction and mortality are ongoing themes across the decades of his work. Yet here in Skeleton Tree there is something more. Although all but one of the songs were written before the loss of Cave’s fifteen-year-old son (in a fall from a cliff near the family’s home in Brighton), the album recording was completed in the aftermath.
It is hard to separate the album from the tragedy that surrounds it, although this is what Cave himself asks us to do in the accompanying documentary film 'One More Time With Feeling'. But even the opening line brings your mind snapping back to focus: "You fell from the sky / Crash landed in a field / Near the River Adur". Words written before the tragic event, but somehow so painfully foreshadowing.
There is, however, a logical progression from the earlier work. The Cave motifs continue: mermaids, the sea, Gothic horror and the exploration of mortality. There are tracks where Cave sings from the view of an unknown female character and then goes on to sing about the same character from an outsider's perspective, watching the woman become unrecognisable.
Whilst, for the most part, the lyrics are standard Nick Cave fare, it is the delivery that really makes the record bite. Gone is the rage and threatening creepiness that rang out in Cave's voice on 2013's Push the Sky Away. Here the speech-singing feels vulnerable and lost over the floating, looping texture of the music. He sounds like he has aged so much in the intervening years. On the seventh track, 'Distant Sky', he is joined by the Danish soprano Else Torp as he searches for peace, and the soothing lullaby of her voice just sharpens the bite of grief.
The Bad Seeds are as experimental and hypnotic as they have ever been. The old jangling rock of the early albums is long forgotten, and Warren Ellis leads the fascinating and at times unsettling waves of sound that underlie Cave's piano and voice. This is a band known for impressive energy and wildness during live performances, producing tracks like 'From Her to Eternity'; the stripped-back effect here is disquieting, though there is still the occasional snap of discord to keep you hooked.
This is an album you will listen to from start to finish and then find yourself sitting in the dark thinking about for hours afterwards. Perhaps, unlike his earlier work, it is not one you will return to again and again; the emotional drain it exacts, even on an outsider, is exhausting. And perhaps this is not an album for outsiders to enjoy as such. But as Nick Cave explains in One More Time With Feeling, life goes on around tragedy, and for an artist, life is making art.