Global cases of cholera are on the rise

CHOLERA

CASES of cholera are increasing, with 22 countries around the world experiencing an outbreak. After many years of decline, case numbers rose in 2022 owing to vaccine shortages, climate change and escalating conflict, a trend that is expected to continue.

Some 26,000 cholera cases were reported in Africa during the first 29 days of January 2023. This is already 30 per cent of the continent’s total for 2022. At the end of February, the World Health Organisation (WHO) said that more than 1 billion people across 43 countries were at risk.

Overall, Malawi appears to be the worst-hit country, with the highest number of deaths. It reported just under 37,000 cholera cases and 1,210 fatalities from 3 March 2022 to 9 February 2023.

The outbreak was triggered by a cyclone that hit the country in March 2022, which led to wastewater contaminating drinking water supplies.

Cholera is spread by the ingestion of food or water that is contaminated with the bacterium Vibrio cholerae. Once inside the body, some types of V. cholerae release a toxin that interacts with the cells lining the intestine, leading to diarrhoea.

In some cases, this can result in severe dehydration and death. In Malawi, 3.3 per cent of people with cholera die of the infection; with treatment, the death rate is typically around 1 per cent.

In 2022, Malawi vaccinated millions of people in districts that were facing cholera outbreaks, but the cyclone has allowed the disease to spread to all of its districts, putting unvaccinated people at risk.

Extreme weather, driven by climate change, means many more countries are at risk of wastewater contamination. Cyclone Freddy, which hit Mozambique on 24 February, is expected to exacerbate the country’s cholera outbreak.

Climate change-driven droughts in countries such as Kenya and Ethiopia have also forced people to rely on water sources that may be contaminated with V. cholerae, according to UNICEF. Many people in these regions are malnourished, which affects their immune health, leaving them more vulnerable to severe cholera complications.

Displacement, whether due to conflict in countries like the Democratic Republic of the Congo or disasters such as the earthquake that hit part of Syria on 6 February, can also play a role in cholera outbreaks if people are forced to move to less sanitary areas, or if already infected people take the bacteria with them.

The destruction of health facilities and of the infrastructure that brings water to people in Syria could lead to more cases. According to the United Nations, the country reported more than 37,700 suspected cases in the cities of Idlib and Aleppo from 25 August 2022 to 7 January 2023 – 18 per cent of which were among people living in displacement camps.

The unprecedented scale of the cholera outbreaks in 2022 – with 30 countries reporting cases, compared with an average of fewer than 20 in the previous five years – has also depleted global vaccine supplies. Only 37 million doses are available.

The International Coordinating Group on Vaccine Provision, which manages the WHO’s global vaccine stockpile, therefore recommends that at-risk people be vaccinated with a single dose of a cholera vaccine rather than the typical two doses. The one-dose regimen gives only about one year of protection, compared with three years with two doses. If the outbreaks continue as they are, this year of protection might not be enough time to get them under control.

Cholera has long been a problem, which prompted the UN to publish a road map in 2017 aiming to cut cholera deaths globally by 90 per cent by 2030.

Several countries have made progress. The fact that Malawi has detected cholera outbreaks so quickly points to the work that officials have done to increase health surveillance.

But with just seven years to go until 2030, many aren’t convinced that the UN’s target will be reached. They say there hasn’t been enough investment in water infrastructure around the world to reach those goals.

Psychology: Choice

POSITIVE PSYCHOLOGY

The more alternatives, the more difficult the choice.

It goes without saying that some choice is good and that more choice is even better. The freedom to choose lies at the heart of any democratic, equal and healthy society based on a free market, ranging from choices as important as which school our children attend and who to vote for, to choices as mundane as what to eat from the canteen menu, what to wear and which TV programme to watch this evening. The flipside of having choice is that we also have to take responsibility for the decisions we make and for the consequences that may arise.

Various studies suggest that feeling that we can control our destiny is vital to our psychological well-being, and that limiting personal choice reduces well-being. There is no doubt that over the past 20 or 30 years we have been seduced by the power of choice, to the point that most of us take it for granted, and don’t really give it a second thought. Choice means we have freedom. It means we can express who we are as individuals and it’s central to our identity. Denying or restricting choice is considered something to be avoided at all costs. Choice is now central in every domain of our lives.

But is having greater and greater personal choice really better for us? Some psychologists believe not, and their research has shown that increased choice can leave us unable to make decisions and reduces our well-being. Barry Schwartz, an acknowledged world expert on the psychology of choice, states that the fact that some choice is good doesn’t necessarily mean that more choice is better. Schwartz refers to this as “the tyranny of choice”.

Four decades ago, sociologist Alvin Toffler described a psychological reaction to constant change and too much choice as “future shock”. He theorised that faced with too much choice – which he called “overchoice” – in too short a period of time, decisions would be harder and take longer to make as we’d have to process much more information. This would lead to slower reactions and decisions, and ultimately to psychological issues such as depression, distress and neurosis.

Recent research in psychology backs this up, suggesting that there are a number of problems associated with having too much choice. For example, in order to make a choice you’ll have to make some form of comparison between the different alternatives, which means sifting through an increasingly large amount of information about each one.

Some parts of the NHS appointments service in the UK use a “choose and book” system. In years gone by, patients would have gone directly to their local hospital; now there are pages of statistics from several hospitals within a 30-mile radius to wade through, including details on infection and mortality rates, car-parking availability and staff satisfaction rates. In situations like this, even if the majority of the available pieces of information are irrelevant to the choice you’re making, you still have to decide whether or not to take each one into account. Inevitably, the volume and complexity of the information you have to deal with increase the likelihood of making the “wrong” choice or making a mistake. In short, having too much choice causes you to worry, and is likely to lead to lower rather than higher well-being.

Findings from various experimental studies challenge the implicit assumption that having more options is better than having fewer. For example, shoppers are more likely to buy gourmet jams or chocolates and students are more likely to complete an optional class essay when they’re offered a limited array of six choices rather than an extensive array of 24–30 choices. What’s more, the shoppers reported greater subsequent satisfaction with their selections, and the students wrote better essays when their original set of choices was limited.

Psychology researchers conclude from these studies that having too much choice can have significantly demotivating effects. In relatively trivial contexts, not making a decision, such as going home without buying a pot of jam or a box of chocolates, is neither here nor there. More worryingly, choice overload may hinder decision-making in other more serious contexts, such as choosing medical treatment, especially where there are (or are perceived to be) costs associated with making the “wrong” choice, and where it takes the chooser a significant amount of time and effort to make an informed decision.

Are you a maximiser or a satisficer?

Back in the 1950s, Nobel prize-winning social scientist Herbert Simon introduced the distinction between maximising and “satisficing” as decision-making strategies. A maximiser is someone who wants to make the best possible choice, and so they complete an exhaustive study of all the available options before making their decision. A satisficer, on the other hand, is someone who is looking to make a “good enough” choice, so they keep looking at options only until they find one which meets their minimum requirements.
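
Read as decision procedures, the two strategies differ mainly in their stopping rule. The sketch below is a minimal illustration in Python; the jam scores and the “good enough” threshold are invented for the example and don’t come from Simon’s work. The maximiser inspects every alternative before committing, while the satisficer stops at the first option that clears its minimum bar.

```python
# Minimal sketch: maximising vs. satisficing as stopping rules.
# Option scores and the "good enough" threshold are made up for illustration.

def maximise(options, score):
    """Examine every option and return the single best one."""
    return max(options, key=score)

def satisfice(options, score, good_enough):
    """Return the first option whose score meets the minimum requirement."""
    for option in options:
        if score(option) >= good_enough:
            return option
    return None  # nothing was good enough

jams = {"strawberry": 6, "apricot": 8, "quince": 9, "damson": 7}

print(maximise(jams, jams.get))      # scans all four jars -> quince
print(satisfice(jams, jams.get, 7))  # stops at the second jar -> apricot
```

The maximiser’s answer is never worse, but the effort needed to reach it grows with the number of options – which is exactly the trade-off the choice-overload studies describe.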

It’s unlikely you’re a 100 per cent maximiser or 100 per cent satisficer, although you’ll lean more towards one than the other. If you agree with statements such as “I never settle for second best,” and “Whenever I’m faced with a choice, I try to imagine what all the other possibilities are, even ones that aren’t present at the moment” you’re more likely to be a maximiser than a satisficer.

Although studies show that people who maximise tend to get better, higher-paying jobs than satisficers, they also take longer to settle in and are more stressed, anxious and frustrated! Maximisers are also more prone than satisficers to be affected by social comparisons and to doubt their own abilities relative to others.

Can I raise my IQ?

IQ SCORE

Intro: We revere intelligence and award “smart” people a high IQ score, but are we actually born with our intelligence, and can we learn to be smarter?

THE IQ (Intelligence Quotient) score is an internationally accepted measure of total brain power, calculated via a series of tests. These tests usually involve numeracy, spotting patterns, and logic – so if your forte lies in practical problem solving, negotiating with others, or creativity, the chances are you won’t excel at an IQ test.

Intelligence has no clear definition. Like beauty or personality, it’s relatively subjective, and this is one reason why IQ scores are problematic.

Another issue is that IQ tests have historically been created by (mostly) men in Europe and North America – and are skewed in favour of people from Western culture. Someone from a community that values storytelling, for example, may have great verbal reasoning and memory, yet their overall IQ score might be low because they flopped on the number puzzles.

So, although scientists have worked hard to make tests reliable and relevant across cultures, IQ tests are of limited use. For mentally taxing jobs such as programming, a test is an effective barometer for picking the best candidate. However, if we were patients given the choice between a 22-year-old novice surgeon with a genius IQ and a 55-year-old expert with countless surgical operations under their belt, we would all know who to choose.

Experience, knowledge, social skills, drive, and conscientiousness – all of which could be considered intelligence – are not accounted for in the conventional IQ test. Even though young adults tend to get the best overall IQ scores, just like a fine wine, many of our abilities continue to improve with age.

Good IQ score or not, everyone can improve their cognitive powers, regardless of their age, schooling, and past experience. Don’t be sold on quick fixes: brain training games and programs will help you get better at those games and tasks, but this rarely translates into practical thinking power. To get really good at something – be it memorising place names, coding software, or crafting musical compositions – you’ll need to practise that particular skill for many hours.

IQ’s disturbing history

The first intelligence tests were devised by French psychologist Alfred Binet in the early 1900s as a benevolent way to find the least able children who needed special schooling. A strong believer in the idea that intellectual prowess was not set in stone and could be improved with teaching, practice, and discipline, Binet insisted that intelligence tests should never be used “for ranking [people] according to mental worth”.

Soon after his death in 1911, however, Binet’s test, later renamed the “IQ” test, was seized upon by scientists in the eugenics movement, who reformulated it for adults, labelling those who scored poorly as “feeble-minded” or “degenerate”. In the early 1900s, 30 US states passed laws that meant low-scoring people could be forcibly sterilised. By the middle of the 20th century, around 60,000 people had been sterilised against their will.

Adolf Hitler also espoused the IQ test and created his own stylised version. Hundreds of thousands of low-scoring people were duly sterilised or executed in Nazi Germany.
