• Building a bright future for science journalism

    Editor in Chief Nancy Shute is ready to produce top-quality science journalism and investigate digital innovations.

    in Science News on February 22, 2018 03:46 PM.

  • Readers weigh in on human gene editing and more

    Readers debated feeling morally obligated to edit their kids' genes and had questions about exoplanets.

    in Science News on February 22, 2018 03:39 PM.

  • 50 years ago, early organ transplants brought triumph and tragedy

    In 1968, the liver transplant field had its first small successes. Now, more than 30,000 patients in the U.S. receive a donated liver each year.

    in Science News on February 22, 2018 12:00 PM.

  • Wisdom is a journey

    By Alex Fradera

    From the beginning of recorded time, humanity has been fascinated by the figure of the wise person, wending their path through the tribulations of life, and informing those willing to learn. What sets them apart? Maybe that’s the wrong question. In a new review in European Psychologist, Igor Grossman of the University of Waterloo argues that understanding wisdom involves taking the wise off their pedestal, and seeing wisdom as a set of processes that we can all tap into, with the right attitude, and in the right context.

    Wisdom might seem to be a difficult concept for psychologists to study, but Grossman says there is some agreement now that it comprises:

    • Intellectual humility
    • Appreciating broader perspectives
    • Knowing that social relations can change
    • A willingness to compromise or integrate different opinions together

    Seen this way, wisdom is distinct from things like creativity and intelligence, while being associated with related phenomena like emotion regulation, cooperative intentions, and political even-handedness (e.g. not condemning an opponent’s change of view as flip-flopping).

    Psychologists who research wisdom tend to measure it using questionnaires administered at a single point in time, treating it as a static property. But Grossman argues that this doesn’t fit with the variability with which wisdom manifests. Lab experiments that asked participants to work through hypothetical situations have found that people’s wisdom on one topic (say, the “Meaning of Life”) gives only a limited indication of their wisdom on another, such as adjudicating a family issue. And diary studies in which participants reflect on the most difficult issue they faced each day show that wise thinking can fluctuate wildly.

    In Grossman’s view, this suggests that although wisdom has some stability, context is king. And changing the context of thinking can encourage wiser outcomes. For instance, researchers observe wiser thinking when participants are asked to consider a problem as if it occurred a year ago rather than now, or as if it were happening to a third party rather than to themselves. These distancing techniques help us take a less egocentric approach, instead of staying trapped in the urgent “here, now, me”, which pushes us to look for certainty and closure, often prematurely.

    In addition, people explain an issue they care deeply about in a wiser, more balanced manner when they imagine that the recipient is a 12-year-old, casting them in the role of teacher. Again, it seems that seeing yourself as a teacher of someone in need of insight (rather than facing a peer in need of convincing) makes us less egocentric and more focused on the other person, allowing us to let go of static certainties.

    If different contexts can encourage wisdom, does this mean that wisdom can be taught? There’s good reason to be skeptical: in a classic study, John Darley and Daniel Batson found that seminary students who had been preparing a talk about the Good Samaritan were no more likely to stop and help someone in need, suggesting that learning about a wise approach to living doesn’t mean that we will internalise it. And we know that moral philosophers are no more moral than the rest of us, suggesting that even deep training doesn’t add up to real world practical wisdom.

    But Grossman is optimistic, citing the fact that even undergraduate study – including in psychology – can lead to better reasoning about statistics, methodology, and practical aspects of logic. While these don’t map directly onto wisdom, they share something in their recognition of uncertainty and error in the ways we know the world.

    Proponents of teaching wisdom suggest that this teaching could be framed around exemplars – wise figures from the past. Grossman is in favour of this, but suggests that the road to wisdom comes not from treating these figures as peerless exemplars, but as successes within a social context, and by comparing and contrasting where their wisdom apparently reached its limits and why. This is captured by the Solomon Paradox, named after the great Jewish king who also boasted of his riches and ill-prepared his son to succeed him. Grossman argues that exploring these foibles and failings – and whether, in their historical context, they were as great a failing as we now assume – leads to intellectual humility and a sense of a bigger, more complicated picture.

    Putting the social context into wisdom isn’t an entirely new idea. Taoism teaches us to go beyond an individual point of view, and the Socratic method involves gaining wisdom through discussion, rather than solitary rumination. So consider this new paper a reminder of what these traditions have said: that wisdom is a way, a path along which our thinking becomes proportionate, balanced and expansive, and that it’s a journey enriched by company.

    Wisdom and how to cultivate it: Review of emerging evidence for a constructivist model of wise thinking

    Alex Fradera (@alexfradera) is Staff Writer at BPS Research Digest

    in The British Psychological Society - Research Digest on February 22, 2018 09:30 AM.

  • How does the microbiome influence brain health? (Part 2)

     

    This is the second post in a series exploring the neurobiology of the gut-brain connection published in partnership with neuroscientist Dr Amy Reichelt.


    Our intestines are home to an ecosystem of trillions of microbes including bacteria, viruses, phages, fungi, protists, and nematodes. Dominant are bacteria from the phyla Firmicutes and Bacteroidetes. As I talked about in my last blog, the collective community of these microbes in the intestines is the gut ‘microbiome’.

    How does the microbiome influence our health?

    The gut microbiome responds to its surrounding environment including to the foods we eat. Healthy versus unhealthy diets have a profound effect on the microbiome including which species dominate.

    For example, eating an unhealthy diet full of sugar and saturated fat alters the balance between ‘good’ Bacteroidetes and ‘bad’ Firmicutes bacterial species. Firmicutes are needed to digest fats, and eating a high-fat diet encourages their population to increase, leading in turn to weight gain. Bacteroidetes digest soluble fibre, so people who consume fibre-rich diets have more Bacteroidetes than Firmicutes.

    Research in germ-free mice, which lack gut microbes, showed that altering the balance between Bacteroidetes and Firmicutes dictated how obese or lean the mice were. A propensity for obesity could be transmitted from mouse to mouse via gut microbes alone.

    But to date, the link between changes in the microbiome and obesity in humans is “weak”.

    The brain regulates appetite, feeding behaviours, and energy balance, and although the gut microbiome contributes to obesity and metabolic syndromes such as diabetes, the exact mechanisms underlying this relationship are uncertain.

    How might gut bugs speak to the brain?

    The gut provides an environment for bacteria to ferment digested food into all kinds of chemicals and molecules that inform the brain of our current nutritional state. As mentioned in the previous blog, the microbiome affects the brain via mechanisms such as vagal nerve stimulation, immune and hormone systems and also via the production of bioactive microbial metabolites.

    Short-chain fatty acids (SCFAs) are metabolites produced in the gut by microbial fermentation of undigested dietary fibre and complex carbohydrates. SCFAs can cross the blood-brain barrier and modulate brain activity to regulate appetite and food intake.

    The SCFA butyrate protects the brain. Butyrate acts as an anti-inflammatory agent in the brain and the gut, reducing harmful toxins that can damage neurons. Butyrate fortifies the blood–brain barrier by tightening connections between cells, which prevents harmful toxins from entering the brain and causing neuron-damaging inflammation. A leaky blood-brain barrier is often seen in obesity and neurodegenerative disorders such as Alzheimer’s disease. Butyrate also boosts the production of molecules that are important for neuroplasticity, including brain-derived neurotrophic factor (BDNF). Boosted levels of BDNF lead to enhanced memory performance in lab rats. Scientists have found that eating lots of junk food reduces SCFA levels, so it is important to eat a healthy diet to maintain optimal brain function.

    How can we change our gut microbiome?

    I think of the microbiome like the different soils that plants grow in — certain plants grow well in nutrient-rich compost, but they won’t survive in dry, sandy soil. Western diets, especially those with low levels of fibre, have possibly reduced the diversity of microbiota over generations. By contrast, traditional diets high in fibre and low in sugar and fat increase microbiota diversity.

    So what can we do to ensure our microbiome supports our health?

    Researchers suggest the microbiota and its function are established by your genetic background and external factors, including how you were born, environmental elements, exercise, and nutrition. It’s suggested that the key determinant affecting the composition and activity of the gut microbiota is diet, which explains about 57% of the variation in total gut microbiota.

    We can influence the populations of gut microbes by making our gut environment as hospitable for them as possible by eating probiotic and prebiotic foods.

    Probiotics are foods or dietary supplements containing specific gut-beneficial microbes (such as Lactobacillus or Bifidobacterium) to stimulate the growth of these microorganisms in the gut. Probiotic-rich foods include live yoghurt, kombucha, kimchi, and fermented vegetables such as sauerkraut.

    Prebiotics are foods that stimulate the growth of beneficial bacteria. I think of them as the nutrient-rich fertiliser that helps plants grow. Prebiotic-rich foods include bananas, onions, wheat, flaxseeds, legumes and leafy green vegetables, e.g. spinach, kale, and broccoli.

    Fermented foods contain microbes that aid gut health and are also prebiotic. The live microbes have already worked on digesting these foods, unlocking extra nutrients.

    Harnessing the gut microbiome to boost the brain

    A healthy gut microbiota is crucial for proper metabolic function and homoeostasis. It’s a win-win for us and our microbes — in exchange for living and proliferating in the gut, they keep us healthy.

    Changing our gut microbiome using probiotics and prebiotics has exciting potential to improve brain function. In my next blog, I’ll talk about a whole new field called ‘psychobiotics’ that could provide a novel therapy for people suffering from mood disorders.

    The post How does the microbiome influence brain health? (Part 2) appeared first on Your Brain Health.

    in Yourbrainhealth on February 22, 2018 05:41 AM.

  • Human Chains: "Prayer Camp" Psychiatry Study Raises Ethical Questions

    A new medical paper raises complex questions over ethics and human rights, as it reports on a study that took place in a religious camp where mentally ill patients were chained up for long periods. The paper's called Joining psychiatric care and faith healing in a prayer camp in Ghana and it's out now in the British Journal of Psychiatry. The authors are a Ghanaian-British-American team led by Dr Angela Ofori-Atta. In Ghana, the authors explain, there are just 25 psychiatrists to cater

    in Discovery magazine - Neuroskeptic on February 21, 2018 09:12 PM.

  • New fossils are redefining what makes a dinosaur

    While some researchers question what characteristics define the dinosaurs, others are uprooting the dino family tree altogether.

    in Science News on February 21, 2018 09:00 PM.

  • An amateur astronomer caught a supernova explosion on camera

    An amateur astronomer has caught a supernova explosion on camera.

    in Science News on February 21, 2018 06:00 PM.

  • BMC Ecology Highlights

    The Effects of Stress on Survival in Grey Mouse Lemurs

    By photographer: Gabriella Skollar; editor: Rebecca Lewis (Own work) [GFDL (http://www.gnu.org/copyleft/fdl.html) or CC-BY-SA-3.0 (http://creativecommons.org/licenses/by-sa/3.0/)], via Wikimedia Commons
    The relationship between glucocorticoid levels and survival in the grey mouse lemur was studied by Rakotoniaina et al. Using a capture-mark-recapture model, with hair cortisol concentration (cortisol is a type of glucocorticoid) as a stress indicator, the authors assessed the effects of stress on survival in a natural setting.
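
    To make the kind of analysis concrete, here is a minimal illustrative sketch in Python. It is not the authors' capture-mark-recapture model; it is a toy logistic regression on invented numbers, showing only how one might relate a hair cortisol measure to a binary survival outcome.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 200
    cortisol = rng.normal(loc=10.0, scale=3.0, size=n)      # hypothetical hair cortisol values
    p_survive = 1 / (1 + np.exp(-(2.0 - 0.25 * cortisol)))  # toy model: higher cortisol, lower survival
    survived = rng.binomial(1, p_survive)                   # simulated annual survival (1 = survived)

    model = LogisticRegression().fit(cortisol.reshape(-1, 1), survived)
    odds_ratio = float(np.exp(model.coef_[0, 0]))
    print(f"Estimated odds ratio per unit of cortisol: {odds_ratio:.2f}")  # expected to be below 1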

    High cortisol levels appeared to be associated with reduced survival. It was also found that very good body condition was associated with increased semi-annual survival, but not with survival during the reproductive season. Variation in parasitism did not predict survival.

    This study indicates that long-term increased glucocorticoid levels may be negatively associated with survival. These results highlight the need to consider environmental pressures that may affect glucocorticoid levels, thereby influencing survival.

    Population decline is often hard to measure, and this study demonstrates how an individual health indicator, such as long-term stress levels, may provide an alternate and easier method for detecting issues at a population level, thereby predicting wild populations’ responses to environmental challenges.

    Flycatchers in the Cold

    By Ivan Medenica (Own work) [CC BY-SA 4.0 (https://creativecommons.org/licenses/by-sa/4.0)], via Wikimedia Commons
    Briedis et al. investigated the effect of extreme weather changes on migrating semi-collared flycatchers. They found that, while enduring a severe cold spell en route, the flycatchers delayed the last phase of their spring migration and the population experienced low return rates to breeding sites.

    It was also found that the start of the spring migration in Africa and the timing of crossing the Sahara were consistent between late and early springs. However, arrival at the breeding sites depended on spring phenology at the stopover points in each year.

    This study advances our understanding of how a species’ biological rhythms interact with external environmental stimuli, which in turn helps us better understand how climate change may affect migratory animals.

    A Citizen Science Approach to Investigating Road-kill Hotspots

    © Kellergassen Niederösterreich 2016, via Wikimedia Commons

    Heigl et al. studied the spatial patterns of road-killed reptiles and amphibians on tertiary roads using a citizen science approach. Open-access remote sensing data was obtained through the use of a citizen science app.

    The authors monitored the reptile and amphibian road-kills along 97.5km of tertiary roads, stretching across interurban, municipal and agricultural roads as well as cycling paths. The data was collected in eastern Austria across two seasons.

    In total, 72 reptile and 180 amphibian road-kills, comprising eight species, were recorded, mainly along agricultural roads. The highest concentrations of road-kills were located next to vineyards, suburban areas and arable land. Green toads, common toads and grass snakes were the most frequently recorded road-kill species.

    A citizen science approach was found to be more cost-efficient than monitoring by professional researchers in cases where more than 400 km of road are studied. This information may help authorities aiming to reduce road-kills, and may help other researchers implement large-scale monitoring of road-kill hotspots in a cost-effective manner.
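
    As a rough illustration of how such a cost comparison works, the sketch below computes a break-even road length from a one-off app cost and per-kilometre monitoring costs. All figures are invented placeholders, not numbers from the Heigl et al. study.

    # Hypothetical break-even sketch: at what road length does a citizen-science
    # app become cheaper than professional surveys? All costs are placeholders.
    app_fixed_cost = 40_000.0        # one-off cost of building and maintaining the app (hypothetical)
    citizen_cost_per_km = 20.0       # coordination and data-validation cost per km (hypothetical)
    professional_cost_per_km = 120.0 # staff survey cost per km (hypothetical)

    break_even_km = app_fixed_cost / (professional_cost_per_km - citizen_cost_per_km)
    print(f"Citizen science becomes cheaper beyond ~{break_even_km:.0f} km of monitored road")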

    Which Frogs Will Reign in an Altered Tropical Forest

    Mareike Hirschfeld and Mark-Oliver Rödel investigated which traits make a species more tolerant to disturbances, thereby enabling their success and survival in altered tropical forests as well as pristine ones.

    Through analyzing published data on 243 frog species, they identified which traits enabled these amphibians to cope most effectively with modifications to their natural environment.

    The species that coped best with modifications to their natural habitat were indirect-developing species with a wide elevational distribution and a large range size, which were independent of streams and inhabited leaf litter.

    This study helps to predict future frog communities in altered tropical forests by identifying the traits of species that are likely to persist in these new environments. These traits are also linked with traits found in species with a low extinction risk.

    Habitat Destruction by Crayfish to Access Prey

    By Andrew C (Red Swamp Crayfish (Procambarus clarkii)) [CC BY 2.0 (http://creativecommons.org/licenses/by/2.0)], via Wikimedia Commons
    In their study, Nishijima et al. examined whether invasive crayfish deliberately destroy the submerged vegetation that houses their prey in order to increase their feeding efficiency and therefore their growth rate.

    The researchers found that increasing crayfish density decreased vegetation and animal prey density whilst also increasing the rate of growth of individual crayfish.

    When the vegetation was replaced with an increased amount of artificial vegetation, the crayfish growth rate decreased and the survival of animal prey increased.

    The experiments show that aquatic vegetation strengthens the bottom-up control of crayfish, but that this effect is relaxed when crayfish density increases.

    This study contributes towards ecosystem restoration knowledge, as the introduction of submerged plants which are tolerant to crayfish destruction may assist in the rebuilding of other aquatic plants and therefore invertebrate populations at small scales.

    The post BMC Ecology Highlights appeared first on BMC Series blog.

    in BMC Series blog on February 21, 2018 03:01 PM.

  • A new study eases fears of a link between autism and prenatal ultrasounds

    On almost every measure, prenatal ultrasounds don’t appear to be related to the risk of developing autism, a recent study finds.

    in Science News on February 21, 2018 12:00 PM.

  • Episode 11: How to Get a Good Night’s Sleep

    This is Episode 11 of PsychCrunch, the podcast from the British Psychological Society’s Research Digest, sponsored by Routledge Psychology. Download here.

     
    Can psychology help us get a better night’s sleep? Our presenter Ginny Smith hears how worry about sleep is sometimes more of a problem than lack of sleep itself. She gives us some evidence-backed sleep tips and finds out about “sleep engineering” – deliberately manipulating the sleep process to aid memory and enhance its health benefits.

    Our guests are Professor Kenneth Lichstein of the University of Alabama and Professor Penny Lewis of Cardiff University.

    Background reading for this episode:

    Also, find many more studies on sleep and dreaming in our archive.

    Episode credits: Presented and produced by Ginny Smith. Mixing by Jeff Knowler. PsychCrunch theme music by Catherine Loveday and Jeff Knowler. Artwork by Tim Grimshaw.

    Check out this episode!

    Subscribe and download via iTunes.
    Subscribe and download via Stitcher.
    Subscribe and listen on Spotify.

    Past episodes:

    Episode one: Dating and Attraction.
    Episode two: Breaking Bad Habits.
    Episode three: How to Win an Argument.
    Episode four: The Psychology of Gift Giving.
    Episode five: How To Learn a New Language.
    Episode six: How To Be Sarcastic 😉
    Episode seven: Use Psychology To Compete Like an Olympian.
    Episode eight: Can We Trust Psychological Studies?
    Episode nine: How To Get The Best From Your Team
    Episode ten: How To Stop Procrastinating

    PsychCrunch is sponsored by Routledge Psychology.

    Routledge interviewed PsychCrunch presenter Christian Jarrett about the aims of the podcast and engaging with the public about psychology research.

    in The British Psychological Society - Research Digest on February 21, 2018 09:46 AM.

  • How the world responded to Cell Press’s cloned monkey paper

    Highlights from social media and the press – and an overview of how media mentions are gathered and tracked

    in Elsevier Connect on February 21, 2018 06:33 AM.

  • A fake organ mimics what happens in the blink of an eye

    A newly crafted artificial eye could help researchers study treatments for dry eye disease and other ailments.

    in Science News on February 20, 2018 10:15 PM.

  • PNAS asks D.C. court to dismiss $10 million defamation lawsuit

    WASHINGTON, DC — Lawyers for the National Academy of Sciences have asked a District of Columbia court to dismiss a $10 million defamation suit brought by a Stanford University professor. Mark Jacobson, an engineering professor at Stanford who has published research about the future of renewable energy, alleged he was defamed in a June 2017 […]

    in Retraction watch on February 20, 2018 10:02 PM.

  • How to build a human brain

    Organoids, made from human stem cells, are growing into brains and other miniorgans to help researchers study development

    in Science News on February 20, 2018 08:30 PM.

  • Target of $2M recruitment grant falsified several images: ORI

    A former NIH postdoc recruited to a tenure-track position last year committed multiple acts of misconduct in two papers, according to the U.S. Office of Research Integrity. According to the new notice, issued by the ORI, Colleen Skau altered results and multiple figures across the papers, published in Cell and PNAS. The misconduct occurred while […]

    in Retraction watch on February 20, 2018 02:35 PM.

  • Are computers better than people at predicting who will commit another crime?

    If crime-predicting computer programs aren’t any more accurate than human guesswork, do they still have a place in the criminal justice system?

    in Science News on February 20, 2018 02:00 PM.

  • The flowers that give us chocolate are ridiculously hard to pollinate

    Cacao trees are really fussy about pollination.

    in Science News on February 20, 2018 12:00 PM.

  • Study conducted during war finds one symptom that is especially indicative of PTSD vulnerability

    By Emma Young

    After a traumatic event, some people develop post-traumatic stress disorder (PTSD) – generally within about a month – while others don’t. Identifying those most at risk could allow for targeted interventions, aimed at stopping the disorder developing. So how do you spot these people?

    One way of exploring this question involves viewing PTSD as a dynamic process in which symptoms interact over time to cause the disorder, and some symptoms likely play a bigger causal role than others. So if you can identify the most problematic symptoms, and the people displaying them, at an early stage, then you can work out not only who to target but which symptoms to focus on.

    In a new paper due for publication in Psychological Medicine and released as a pre-print at the Open Science Framework, a team of researchers from Israel and Amsterdam conducted just such an analysis on data collected from Israeli civilians during the 50-day Israeli-Gaza war of 2014. “It is important to note that collecting [this kind of] data regarding traumatic stress symptoms during a conflict situation is unparalleled in the literature,” the researchers write.

    Talya Greene at the University of Haifa led the study of 96 people who were living in communities that were exposed to rocket fire. The participants were recruited via adverts and entered the study between eight and 24 days into the conflict. Twice a day, for 30 days, they received a personalised email link to an online version of a 20-item diagnostic checklist for PTSD.

    This checklist asks people to self-report on four categories of symptoms: intrusions (such as nightmares and flashbacks); avoidance (of thoughts about and reminders of the  traumatic event); negative alterations in mood and thoughts (such as amnesia, blame of self and others, and reduced emotion); and alterations in arousal and reactivity (which measures, among other things, irritability and anger, sleep problems, hyper-vigilance, difficulties in concentrating, and the startle response – an involuntary reaction to a flash of light, a loud noise, or a sudden, potentially threatening movement).

    Every morning, and every evening, the participants recorded any symptoms that they had experienced since completing the previous checklist.

    The researchers found that showing the startle response was clearly the most important predictor of future PTSD symptoms. Reduced emotion, blame, negative emotions and an avoidance of thoughts relating to the conflict were also, in order of decreasing importance, predictive of future symptoms. Participants who showed any of these five symptoms earlier in the study (but the startle response, especially) were more likely to show even more PTSD-related symptoms later on – and those who had more of these predictive symptoms early on developed even more of the other symptoms of PTSD over time.
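
    For readers curious about the kind of analysis involved, here is a hedged sketch of a lagged regression on simulated data. It is not the authors' exact dynamic network model; it simply shows how symptom levels measured at one time point can be used to predict symptom load at the next, with the symptom names below used only as placeholders.

    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(1)
    symptoms = ["startle", "numbing", "blame", "neg_emotions", "avoidance"]
    n_obs = 500
    X_t = rng.normal(size=(n_obs, len(symptoms)))              # symptom levels at time t (simulated)
    true_w = np.array([0.6, 0.3, 0.25, 0.2, 0.15])             # toy weights: startle most predictive
    y_next = X_t @ true_w + rng.normal(scale=0.5, size=n_obs)  # overall symptom load at time t+1

    fit = LinearRegression().fit(X_t, y_next)
    for name, coef in sorted(zip(symptoms, fit.coef_), key=lambda pair: -abs(pair[1])):
        print(f"{name:12s} lagged weight = {coef:+.2f}")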

    The researchers believe this means these are the specific symptoms that could indicate PTSD vulnerability and that targeting them early on could have a preventative benefit.

    There were some limitations to the study. One is that it began during the conflict period, and in fact for many participants the conflict was still unfolding by the time they completed their final checklist. PTSD usually develops in the month after a trauma, but in some cases it takes much longer to appear. These findings, gathered while the trauma was still unfolding, may therefore not apply to symptoms that emerge after a trauma has ended. Also, as the researchers themselves point out, they didn’t take some potentially relevant variables – such as gender, trauma history and severity of exposure to the conflict – into account in their analysis. And it would have been interesting to know which – if any – of the participants would have met the criteria for a diagnosis of PTSD (a process that requires more than completion of a symptom checklist).

    Still, as the researchers conclude: “More studies in this vein can elucidate the complex processes by which PTSD symptoms crystallise into disorder and indicate possible causal mechanisms that drive the development of PTSD. This in turn could have clinical implications for identifying the relevant importance of symptoms as targets for interventions.”

    Dynamic networks of PTSD symptoms during conflict

    Image: A wounded Israeli soldier arrives at a hospital for treatment on July 20, 2014 in Ashkelon, Israel. (Photo by Andrew Burton/Getty Images)

    Emma Young (@EmmaELYoung) is Staff Writer at BPS Research Digest

    in The British Psychological Society - Research Digest on February 20, 2018 09:53 AM.

  • Cool the planet: drink tap water and eat less meat at dinner time

    The global food system, from fertilizer manufacture to food storage and packaging, is responsible for up to a staggering one-third of all human-caused greenhouse gas (GHG) emissions. Although there is a need to make food systems more efficient and sustainable, this in itself is not adequate to bring GHG emissions down to acceptable levels; changes also need to be made to our diets.

    But what changes are required? Elisabeth Temme and colleagues at The National Institute for Public Health and the Environment (RIVM) set out to answer this question by looking at food consumption in the Netherlands, where the intake of fruits and vegetables is very low and where the intake of saturated fatty acids and sodium is very high.

    Consumption data were obtained from the most recent food consumption survey in the Netherlands, the Dutch National Food Consumption Survey (DNFCS) 2007-2010. All consumptions were labelled as one of seven food consumption occasions, distinguishing between the three main meals (breakfast, lunch, dinner) and all moments in between meals (before breakfast, between breakfast and lunch, between lunch and dinner, after dinner). Socio-demographic and lifestyle data, such as physical activity and educational level, were also obtained.

    In order to estimate the GHG emissions associated with different foods, life cycle assessments were performed; these look at the entire life cycle of a product, from primary production, processing, use of packaging and transport to storage, as well as the preparation of the food. The GHG emissions associated with the preparation of foods in the consumer phase were based on the average cooking time for each product. Food waste was also included by using food-group-specific percentages for avoidable and unavoidable food losses throughout the food chain, including the consumer phase.

    The researchers found that, in the Dutch diet, GHG emissions vary per consumption occasion. In the highest tertile of dietary GHG emissions, dinner was associated with the highest emissions, and consumption between meals with the second-highest. Meat contributed most to the GHG emissions of dinner, while beverages contributed most to the GHG emissions of consumption between meals. Dairy (including cheese) was an important contributor to total daily dietary GHG emissions throughout the day.

    Figure: Mean daily GHG emission (kg CO2-eq) of food groups per consumption occasion in the reference scenario. Source: Temme, van de Kampe and Seves, 2018

    The authors also developed various scenarios to lower GHG emissions of people in the highest tertile of dietary GHG emission: 1) reduce red and processed meat consumed during dinner by 50% and 75%, 2) replace 50% and 100% of alcoholic and soft drinks (including fruit and vegetable juice and mineral water) by tap water, 3) replace cheese consumed in between meals by plant-based alternatives and 4) two combinations of these scenarios. Effects on GHG emission as well as nutrient content of the diet were assessed.
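
    The scenario logic itself is simple bookkeeping. The toy sketch below, using invented per-occasion emissions rather than the DNFCS data, shows how cutting red and processed meat at dinner by 75% translates into a percentage reduction in total daily dietary GHG emissions.

    # All numbers are invented for a hypothetical person in the highest tertile,
    # not values taken from the DNFCS data.
    daily_ghg = {                      # kg CO2-eq per day, by consumption occasion
        "breakfast": 0.8,
        "lunch": 1.2,
        "dinner_red_processed_meat": 2.0,
        "dinner_other": 1.0,
        "between_meals_drinks": 1.5,
        "between_meals_other": 0.5,
    }

    baseline = sum(daily_ghg.values())
    # Scenario 1: reduce red/processed meat at dinner by 75%
    scenario = dict(daily_ghg, dinner_red_processed_meat=daily_ghg["dinner_red_processed_meat"] * 0.25)
    reduction = 1 - sum(scenario.values()) / baseline
    print(f"Baseline: {baseline:.1f} kg CO2-eq/day; reduction from the 75% meat cut: {reduction:.0%}")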

    Flickr: Government of Prince Edward Island

    The most effective single dietary change was to reduce the consumption of meat, with a 75% reduction in red and/or processed meat during dinner resulting in a 24% reduction in GHG emissions for men and a 22% reduction for women. Furthermore, replacing the consumption of all alcoholic drinks and soft drinks (including mineral water and fruit and vegetable juices) with tap water was also successful in reducing dietary GHG emissions.

    The dietary changes are also beneficial for health as they are more in line with dietary guidelines and reduce saturated fatty acids and sugar intake. For those who may not meet energy or iron requirements with these changes in diet, low GHG emission and nutritious replacement foods might be needed.

    Although the scenarios in this study were focused on people in the highest tertile of GHG emissions, it is expected that the dietary changes proposed here will also lead to reductions in GHG emissions for people in the lowest or intermediate tertiles. However, further research is required to determine whether this is indeed the case.

    The post Cool the planet: drink tap water and eat less meat at dinner time appeared first on BMC Series blog.

    in BMC Series blog on February 20, 2018 09:30 AM.

  • When — and how — should journals flag papers that don’t quite meet retraction criteria?

    Readers of Retraction Watch will be no strangers to the practice of issuing Expressions of Concern — editorial notices from journals that indicate a paper’s results may not be valid. While a good idea in theory — so readers can be aware of potential issues while an investigation is underway — in practice, it’s a […]

    in Retraction watch on February 19, 2018 01:00 PM.

  • Modern tech unravels mysteries of Egyptian mummy portraits

    A museum exhibit showcases what modern analytical tools can reveal about ancient Egyptian funerary portraits and mummies.

    in Science News on February 19, 2018 01:00 PM.

  • Mix of metals in this Picasso sculpture provides clues to its mysterious origins

    The alloys used to cast Picasso’s bronze sculptures provide a valuable piece of the puzzle in reconstructing the histories of the works of art.

    in Science News on February 19, 2018 11:00 AM.

  • Nature vs Nurture: Mothers with multiple children have an intuitive grasp of behavioural genetics

     

    Lower scores equal more accurate estimates of genetic inheritance. From Willoughby et al 2018

    By Christian Jarrett

    Several prominent psychologists have recently raised concerns that the “radical left” has a stranglehold on free speech and thought in our universities. The psychologists argue this includes biological denialism: claims that differences between individuals and groups are entirely the result of biased systems or are mere social constructions. More generally, many commentators are horrified by the apparent resurgence of far-right ideologies and their twisted interpretation of genetic science.

    It’s timely, then, that a team of researchers, led by psychologist Emily Willoughby at the University of Minnesota Twin Cities, recently surveyed over 1000 US participants online, asking them about their personal circumstances, education and political orientation, and also asking them to estimate the relative contribution of genes and the environment to variation in 21 different human traits, from eye colour to intelligence. This is probably the most detailed study to date of people’s insights into behavioural genetics, and the findings have just been published as a pre-print at the Open Science Framework.

    Comparing participants’ judgments against the best estimates from the behavioural genetics literature, the researchers found a surprisingly high correlation of .77 (where 1 would be a perfect score). “People’s observations and intuitions about the genetic contributions to human traits are relatively informed,” the researchers said.
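
    The accuracy measure here is essentially a correlation between two vectors: lay estimates and published heritability values across traits. The sketch below illustrates the calculation with invented placeholder numbers, not the study's data.

    import numpy as np

    traits = ["eye colour", "height", "intelligence", "depression", "obesity"]
    literature_h2 = np.array([0.95, 0.80, 0.60, 0.40, 0.70])  # placeholder 'published' heritabilities
    lay_estimates = np.array([0.90, 0.75, 0.50, 0.30, 0.55])  # placeholder average survey estimates

    for trait, lit, lay in zip(traits, literature_h2, lay_estimates):
        print(f"{trait:12s} literature = {lit:.2f}, lay estimate = {lay:.2f}")

    r = np.corrcoef(literature_h2, lay_estimates)[0, 1]
    print(f"Correlation between lay estimates and the literature: r = {r:.2f}")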

    Participants’ estimates tended to group into four main categories: physical traits (height, eye colour etc); psychological (intelligence, personality); lifestyle (obesity, blood pressure); and psychiatric (depression, ADHD etc). One interpretation of this is that when making nature/nurture judgments, people tend to think in terms of these approximate categories and infer that traits within a category are subject to a similar level of genetic influence.

    General educational level and, more specifically, participants’ genetic knowledge were related to the accuracy of their estimates, but only to a surprisingly modest degree.

    What about political orientation? Left-leaning liberals estimated a greater genetic contribution to psychiatric disorders and sexual orientation compared with conservatives, while conservatives assumed a relatively greater contribution of genes to traits like intelligence and musical ability. This led to what the researchers called “a surprising sort of ‘balancing out’,” meaning that individuals’ accuracy did not differ by political persuasion.

    The researchers believe this pattern is consistent with the idea that moral judgments are central to the political split in the USA. Right-wing participants more strongly endorsed the idea that some people have more innate aptitude than others, while the left-wing participants more strongly endorsed the idea that many stigmatised traits are largely innate and should therefore be treated with fairness and compassion, not judgment.

    So there was no greater biological denialism or “blank slatism” by one political wing than the other, but rather a genetic cherry-picking to suit one’s own world view.

    Interestingly, sexual orientation was something of an outlier in the study and the trait for which participants’ intuitions diverged the most from the published literature. The researchers do not offer any speculation for why this might be. Most participants strongly overestimated the genetic contribution to this trait, but conservatives did so less than liberals.

    Perhaps surprisingly, gender and having children were more strongly related to overall accuracy than education, politics or genetic knowledge. Women were more accurate in their estimations than men, and participants with non-adopted children were more accurate than those without. The most accurate judges of all were women with multiple non-adopted children. “Mothers may be uniquely observant of their children’s abilities, needs, and attributes,” the researchers said.

    Willoughby and her colleagues took heart from these findings. “While it is clear that social and political biases do inform the magnitude of heritability judgments, the best predictors of these judgments are education and parenthood – an encouraging prospect indeed for the public understanding of findings from behaviour genetics.”

    On a more sceptical note, however, this survey was based entirely on myriad correlations and can’t tell us too much about what influences people’s beliefs about behavioural genetics, nor how these beliefs develop through life. For instance, it’s plausible that beliefs about genetic heritability shape people’s political views, as much as the other way around. There may even be genetic influences on people’s insights into genetic inheritance, perhaps manifested through personality, intelligence and political leanings (only the last was measured in this survey).

    In terms of better understanding the roots of some of the scientifically dubious beliefs on display on our college campuses and elsewhere, we probably need studies that follow people over time and that are focused at the more extreme ends of the political spectrum – only around 4 to 5 per cent of the current participants identified as very conservative and around 16 per cent as very liberal.

    These issues aside, this study makes a novel contribution to a little studied topic, and the idea that mothers have a unique insight into behavioural genetics gained through the experience of parenting will surely resonate with many readers!

    Free will, determinism, and intuitive judgments about the heritability of behavior [this paper is a pre-print that has not yet been subjected to peer review]

    Christian Jarrett (@Psych_Writer) is Editor of BPS Research Digest

    in The British Psychological Society - Research Digest on February 19, 2018 10:50 AM.

  • Babies can recover language skills after a left-side stroke

    Very young babies who have strokes in the language centers of their brain can recover normal language function — in the other side of their brain.

    in Science News on February 18, 2018 08:45 PM.

  • Disability Bias in Peer Review?

    Writing in the journal Medical Care, researcher Lisa I. Iezzoni says that a peer reviewer on a paper she previously submitted to that journal displayed "explicitly disparaging language and erroneous derogatory assumptions" about disabled people. Iezzoni's paper, which was eventually rejected, was about a survey of Massachusetts Medicaid recipients with either serious mental illness or significant physical disability. The survey involved a questionnaire asking about their experiences w

    in Discovery magazine - Neuroskeptic on February 18, 2018 03:05 PM.

  • This stick-on patch could keep tabs on stroke patients at home

    New wearable electronics that monitor swallowing and speech could aid rehabilitation therapy for stroke patients.

    in Science News on February 17, 2018 09:00 PM.

  • Weekend reads: We’re back! (We hope); the data thugs; heroes of retraction

    As many of our readers will know, we’ve been having serious technical issues with the site. We’re cautiously optimistic that they’ve been solved, so although we’re still working on fixes, we’re going to try posting again. Thanks for your ongoing patience. This week, we posted at our sister site, Embargo Watch. Here are those posts: […]

    in Retraction watch on February 17, 2018 02:00 PM.

  • Americans would welcome alien life rather than fear it

    Americans would probably take the discovery of extraterrestrial microbes pretty well.

    in Science News on February 16, 2018 10:00 PM.

  • Ants practice combat triage and nurse their injured

    Termite-hunting ants have their own version of combat medicine for injured nest mates.

    in Science News on February 16, 2018 07:14 PM.

  • James Webb Space Telescope challenges artists to see in infrared

    Astronomy artists face new challenges in translating James Webb’s invisible data into visuals.

    in Science News on February 16, 2018 05:58 PM.

  • To hear the beat, your brain may think about moving to it

    To keep time to a song, the brain relies on a region used to plan movement — even when you’re not tapping along.

    in Science News on February 16, 2018 03:49 PM.

  • BMC Veterinary Research: editor picks of 2017

    Editor, Dr. Hayley Henderson: “It was a successful year, and we would like to thank all of our authors, referees and our dedicated editorial board for their support and contribution. There is a greater push for more complete and transparent reporting in veterinary research, and for effective collaboration between human and veterinary healthcare professionals. Some of our 2017 publications highlight the importance of these endeavors, and we have used these papers to help create our 2018 objectives to improve veterinary and animal research reporting. We look forward to sharing our plans with our readers soon.”

    Environmental enrichment for captive cats

    Cats kept exclusively indoors may become bored or stressed, which can exacerbate negative behaviors such as being aggressive or destructive. Catnip (Nepeta cataria) is a well-known olfactory stimulant that causes a euphoric response in some domestic cats and is used by some cat owners to enrich their pet’s environment.

    Bol et al. aimed to test catnip and other plants with anecdotal evidence of causing euphoric responses in cats. The researchers tested catnip, silver vine, Tatarian honeysuckle and valerian root on 100 domestic cats.

    Their results showed that most of the sampled cats had a positive response to the olfactory stimulation. One in three cats did not respond to catnip. A much higher percentage of cats responded to the silver vine (80%) and approximately 50% of the cats responded to the honeysuckle and valerian root. Interestingly, 75% of the cats which did not respond to catnip did respond to silver vine, and a third to the honeysuckle.

    The authors concluded that silver vine and Tatarian honeysuckle may be good alternatives for cats which do not respond to catnip, and suggest that olfactory stimulation may be effective for improving quality of life for cats.

    Exploring the influence of funding source on the quality of animal randomized control trial reporting

    Randomized control trials are essential for veterinary evidence-based clinical research and it is important that trial reporting is transparent and complete. Animal trials are often poorly reported compared to human trials, mainly because human trial guidelines are considered more rigorous.

    Wareham et al. reviewed a sample of veterinary randomized control trials published in 2011. Their study looked at the number of outcomes measured, the number of animals enrolled, whether primary outcome measures were stated, and whether sample size calculations were performed before animals were enrolled. The authors also studied whether the trial funding source influenced trial design and delivery.

    Their results showed that, across 126 trials, the median number of outcome measures reported was 5 and the median number of animals enrolled in each trial was 30, neither of which significantly differed across funding groups. Primary outcomes were stated in 40.5% of the trials studied, and interestingly these were more likely to be stated if the trial was funded by a pharmaceutical company. Only 14.3% of the trials reported their sample size calculation. The authors concluded that veterinary clinical trial design and delivery could be substantially improved and suggested that low-quality reporting impedes veterinary evidence-based practice.

    Searching for the source of “phantom” scratching in canine syringomyelia

    Syringomyelia is a complex neurological condition that affects some dog and cat breeds. The condition is characterized by a malformation in the spine affecting the flow of cerebrospinal fluid, causing fluid-filled “pockets”, called syrinxes, to accumulate in the spinal cord. A common clinical sign in dogs is phantom scratching: a compulsive scratching action directed towards the shoulder or neck without ever making contact.

    Nalborczyk et al. performed a retrospective cohort study on the medical records of 37 Cavalier King Charles spaniels with a diagnosis of symptomatic syringomyelia. All of the dogs’ MRI files were reviewed to determine the maximum perpendicular dimensions of the syrinx in the spinal cord quadrants.

    Source: Pixabay

    Their results showed that phantom scratching in the dog may be associated with a syrinx located in the C2-C5 vertebrae, which corresponds to a section of the spinal cord called the superficial dorsal horn. The authors concluded that phantom scratching in dogs may happen after superficial dorsal horn cervical neurons are damaged. It may be possible that this neuronal damage influences the activity of scratching behavior.

    Outlining quality assurance and best research practices for non-regulated veterinary clinical studies

    Animal clinical trials can generate volumes of useful data which can translate knowledge from clinical research to veterinary and human clinical practice. However, the usefulness of these studies is dependent on the quality of their reporting. Davies et al. outlined quality control and assurance procedures which they believe should be built into all veterinary clinical studies.

    The authors referred to numerous resources for designing and conducting regulated clinical trials as well as suggesting the use of local guidelines to improve trial design and execution. The authors commented that even in non-regulated clinical trials, following trial guidelines would make them more useful for translation into the clinical setting.

    It is further discussed that funders, publishers and quality assurance organizations are increasingly exploring ways to improve the quality of reporting, including the use of standardized guidelines and policies. The authors concluded that research outcomes should be constantly scrutinized as new data emerge, in both regulated and non-regulated research settings.

    Stimulating collaboration between human and veterinary health care professionals for One Health

    The One Health approach is a concept that aims to improve the health and well-being of humans and animals by using an interdisciplinary approach to address the relationship between health care for humans, animals and the environment. The approach requires a collaborative effort between human and veterinary healthcare professionals; however, documented collaboration between these groups is limited.

    In this study, Eussen et al. explored the social dilemma experienced by health care professionals in order to understand how the collaborative effort could be improved. The authors developed a questionnaire to assess the relationship between collaboration and a common goal. The survey was completed by 368 respondents: 16% were human healthcare professionals and 84% were veterinary healthcare professionals.

    It was found that a common goal stimulates collaboration. The authors suggested that professional associations may develop the One Health approach by acting as a facilitator to increase collaboration between health care professionals.

    Exploring the possibilities of diagnosing adverse food reactions in dogs and cats with in vivo and in vitro tests

    Diagnosing adverse food reactions in dogs and cats usually involves eliminating certain foods from the diet and using provocation tests to identify intolerances or allergies, a method similar to that used in human medicine. However, pet owners can find this method laborious, and compliance rates are low.

    A laboratory test using samples of blood, saliva or hair could be more convenient for pet-owners who have animals showing signs of an adverse food reaction. In this study, Mueller and Olivry systematically reviewed 22 publications reporting data for adverse food reaction tests.

    Source: Pixabay

    The authors found articles describing several different tests, such as lymphocyte proliferation tests, serum tests for food-specific allergies and colonoscopy tests. Lymphocyte proliferation tests gave the most promising results, but are difficult to perform. Until a simpler protocol can be developed, this test can only be used as a research tool. The other tests reviewed were not considered currently useful, so the best diagnostic procedure remains the elimination diet with provocation trials.

    The post BMC Veterinary Research: editor picks of 2017 appeared first on BMC Series blog.

    in BMC Series blog on February 16, 2018 02:00 PM.

  • The neurology of the Winter Olympics

    The human brain is a wonder and a marvel. At the same time, it is enigma and frustration. Given all it has accomplished, it continues to perplex. This is why I became a neurologist. For me, combining the apex of all organic structures with the vast unknown of cerebral neuroscience produces a daily wonder that is worth dedicating a life’s work to. To that end, I find myself somewhere over the North Pole hurtling towards PyeongChang, South Korea.

    Four years ago, roughly, I was on a similar flight to Sochi, Russia for my first experience as a physician for the US Olympic Team. Then, despite all of the preparation, work, dedication, and sacrifice it took to be on that flight, I felt a degree of uneasy apprehension.

    “The Olympics!” people would say when they found out why I’d be away from work and family for three weeks, “So amazing… what an honor!”

    So, mid-flight jitters were created, mainly on the backs of others’ vicarious attestations. In the end, when faced with that same dichotomy of known vs. unknown and the responsibility of caring for the brain of another human, I fell back on what I first learned as a medical student and intern at Charity Hospital in New Orleans. Stay within yourself. Adjust to the unknown. First, do no harm. You may not have every answer in front of you or on the internet (which didn’t exist when I was at Tulane)… so solve the problem as best you can! These are valuable lessons for all physicians, to be sure, but even more so for neurologists.

    These lessons served me well in Sochi, as a foundation wrapped in the premise that it doesn’t matter to the person you’re caring for whether they were hurt in an Olympic halfpipe, a neighborhood skate park, or a French Quarter misstep. In the end, it comes down to the same thing: understanding the brain well enough to appreciate the scope of its complexities while simultaneously having the humility and perspective to admit what you don’t know, expect to be challenged, yet still find a way to provide medical care to the best of your abilities.

    Stephanie Beckert at the Women’s 5000m final by Robert Scoble. CC BY 2.0 via Wikimedia Commons.

    In the past week, you may have already had the chance to watch some of the world’s best athletes do some truly amazing things. In the days to come, hopefully you can watch some more. When you do, take a moment to appreciate what is allowing them to perform, at the brain level. Wonder at how many neurons it takes to twist and flip an athlete while first they go skyward, their skis launching them up, and then continue their contortions as they come down again when gravity wins the day. Consider the cerebral networks involved in an ice-hockey game: the visuospatial, motor, cognitive, proprioceptive, balance networks (to name a few!). Marvel at what it takes to land a quad jump, have the reaction time and attention to navigate a downhill course at 90 mph, produce the fluidity of movement to win a gold medal in ice dancing. Dancing… on millimeters-thick razor blades!

    Then stop and consider what happens when any one of those brains experiences a sudden stop, a deceleration caused by ice, or snow, or an opponent’s elbow. Normal physiological function is impaired if not temporarily halted. The eloquent firing of neurons, allowing each to participate in the formation of a purposeful network, is replaced by sluggish, inefficient recruitment. Inside the brain, the injury is obvious… if only one could be there or measure it! In the outside world, however, we rely on the clinical manifestations of the injury to guide diagnosis and management. Sometimes the clinical effect is obvious; other times it’s subtle. Still for others, although the injury is present, there is simply no clinical effect at all. The marveled married with the unknown. This is why I love what I do!

    Tonight, from the Olympics standpoint, I have the advantage of experience and perspective, so no in-flight uneasiness this time around! From the medical perspective, however, in the weeks to come I’ll really be where I always am, practicing medicine in the intersection of reverence for human accomplishment and respect for the unknown.

    Featured image credit: Ivica Kostelić at the 2010 Winter Olympic downhill by Jon Wick. CC BY 2.0 via Wikimedia Commons

    The post The neurology of the Winter Olympics appeared first on OUPblog.

    in OUPblog - Psychology and Neuroscience on February 16, 2018 11:30 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    What are the psychological dynamics when a couple tries to change a habit together?

    By Alex Fradera

    Changing an unhealthy habit depends a lot on your belief that you can do it, something psychologists call self-efficacy. Take smoking, for example. Your belief that you are capable of quitting will influence the likelihood you will decide to quit in the first place, the amount your smoking reduces, and your chances of staying smoke-free in the long-term.

    This self-belief doesn’t come out of nowhere. Besides seeing ourselves make progress (called “mastery”), health psychologists will tell you that one of the most important inspirations is seeing others successfully make the changes that you desire. To test how true this is, Lisa Warner from the Freie Universität Berlin and her colleagues looked at the impact on smokers of having a partner whose own attempt to quit is going well. Their findings, published in the British Journal of Health Psychology, didn’t fit the expected pattern – but there’s news that co-quitting couples can help each other make a difference.

    Warner’s team asked 85 couples, made up of partners who had chosen to quit together, to keep a diary of their progress. At the end of each day, every participant recorded whether they had smoked any cigarettes that day (to indicate their mastery) and also their feelings of self-efficacy regarding the challenge of quitting, rating their agreement with items such as “I am confident that I can refrain from smoking tomorrow even if it is difficult”. The researchers expected that when one partner improved their mastery, this should boost their other half’s self-efficacy the next day.

    This isn’t quite what the researchers found, but partners certainly mattered. The day-by-day analysis showed that a participant’s self-efficacy was more likely to go up when their partner had shown increases in their own self-efficacy the day before. So partner confidence was contagious. The same was true for mastery: one partner’s success predicted their other half’s next-day success (or another way to see this: when one person gave in, it was more likely that their partner would succumb on the next day). Intriguingly, however,  partner mastery didn’t seem to affect a participant’s next-day self-efficacy.
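    To make the day-by-day, partner-lagged logic concrete, here is a minimal sketch of that kind of analysis. It is not the authors' code: the column names, the toy data and the use of plain OLS are all assumptions (the original study used dyadic multilevel models on 85 couples' diaries).

```python
# Hedged sketch: does a partner's previous-day self-efficacy (and smoking)
# predict one's own self-efficacy today? All names and data are illustrative.
import pandas as pd
import statsmodels.formula.api as smf

# Toy diary data: one row per person per day.
diary = pd.DataFrame({
    "couple_id":     [1, 1, 1, 1, 2, 2, 2, 2],
    "person":        ["a", "b", "a", "b", "a", "b", "a", "b"],
    "day":           [1, 1, 2, 2, 1, 1, 2, 2],
    "self_efficacy": [3.0, 4.0, 3.5, 4.2, 2.0, 2.5, 2.2, 3.0],
    "smoked_today":  [1, 0, 0, 0, 1, 1, 1, 0],
})

# Build each person's partner's previous-day predictors by shifting the day index.
partner = diary.rename(columns={
    "person": "partner",
    "self_efficacy": "partner_self_efficacy_lag",
    "smoked_today": "partner_smoked_lag",
})
partner["day"] = partner["day"] + 1  # partner's day t-1 lines up with own day t

merged = diary.merge(partner, on=["couple_id", "day"])
merged = merged[merged["person"] != merged["partner"]]  # keep cross-partner pairs only

# Plain OLS for illustration; the real analysis would nest days within people and couples.
model = smf.ols("self_efficacy ~ partner_self_efficacy_lag + partner_smoked_lag",
                data=merged).fit()
print(model.params)
```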

    The fact that witnessing success in a close other wasn’t a driver of self-efficacy is a puzzle for the researchers, but overall this is still important news for couples trying to make healthy changes together – one way or another, a determined partner can be a source of support for finding your way out of smoking – a habit that kills around six million people a year. So put your mind to it, lean into that success cycle, and know that your efforts are feeding those of the person you love.

    Day-to-day mastery and self-efficacy changes during a smoking quit attempt: Two studies

    Alex Fradera (@alexfradera) is Staff Writer at BPS Research Digest

    in The British Psychological Society - Research Digest on February 16, 2018 09:05 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Strong winds send migrating seal pups on lengthier trips

    Prevailing winds can send northern fur seal pups on an epic journey.

    in Science News on February 15, 2018 09:32 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Household products make surprisingly large contributions to air pollution

    A study of smog in the Los Angeles valley finds that paints, fragrances and other everyday items are a growing component of the problem.

    in Science News on February 15, 2018 07:00 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Fossil footprints may put lizards on two feet 110 million years ago

    Fossilized footprints found in South Korea could be the earliest evidence of two-legged running in lizards.

    in Science News on February 15, 2018 06:19 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    In Borneo, hunting emerges as a key threat to endangered orangutans

    Only small numbers of Bornean orangutans will survive coming decades, researchers say.

    in Science News on February 15, 2018 05:00 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Higher cigarette taxes may increase use of chewing tobacco and cigars in adolescents

    While the continued decline of adolescent cigarette use has been celebrated as a public health success, we should put away our party hats when we consider that overall tobacco use has remained relatively stable in recent years. In 2016, 20% of adolescents used tobacco products while 8% used cigarettes. Is that how we want to define success?

    Increasing cigarette taxes has been a major policy driver to decrease smoking, including adolescent smoking, while taxes on other tobacco products have received less attention. Taxes on cigarettes, smokeless tobacco, and cigars are all fiscal policies, but they are not all equal. While state taxes on cigarettes have increased substantially over the past decade, there has been little change in policies governing alternative tobacco products.

    Examining the impacts of tobacco taxes

    The aim of our study published in BMC Public Health was to evaluate the impact of chewing tobacco and cigar taxes, cigarette taxes, and smoke-free legislation on adolescent use of smokeless tobacco and cigars.

    Changes in tobacco control policies across and within US states created a natural experiment, which we were able to examine using data on 499,381 adolescents from 36 US states. In the Youth Risk Behavior Survey, adolescents self-report how many days they used smokeless tobacco, cigars, and cigarettes over the past month. We linked information on state chewing tobacco taxes, cigar taxes, cigarette taxes and smoke-free restaurant legislation to each adolescent based on the year the survey was completed.

    We found that chewing tobacco taxes had no effect on smokeless tobacco use and cigar taxes had no effect on cigar use. In contrast, a 10% increase in cigarette taxes was associated with a 1.0 percentage point increase in smokeless tobacco use among adolescent males. A 10% increase in cigarette taxes was also associated with a 1.5 percentage point increase in cigar use among adolescent males and a 0.7 percentage point increase in cigar use among adolescent females.

    We also found some evidence that smoke-free legislation, such as a restaurant smoking ban, was associated with a 1.1 percentage point increase in smokeless tobacco use among adolescents, but only in males. There was no effect of smoke-free legislation on cigar use for males or females.

    Taxes may be driving adolescents to cheaper alternatives

    When developing tobacco control policies, we need to think beyond the conventional cigarette.

    Our results suggest that higher state cigarette taxes are associated with adolescents’ use of cheaper, alternative tobacco products such as smokeless tobacco and cigars. In our data, from 1999 through 2013, 30/33 states increased their cigarette taxes while 16/33 states increased their taxes on chewing tobacco or cigars. As a percent of the retail price, this translates to a 73% and 58% average increase in chewing tobacco and cigar taxes, respectively. In contrast, combined with the federal cigarette tax increase of $1.01 in 2009, cigarette taxes ($/pack) increased by 141% in real terms over the study period.

    Unless taxes on tobacco products are inflation-adjusted, their impact diminishes over time. If taxes on alternative tobacco products are not increased in line with cigarette taxes, some of the decrease in cigarette use reflects adolescents switching to other products. This would not be considered a public health success, as the health risks of alternative products are the same as or greater than those of cigarettes.

    When developing tobacco control policies, we need to think beyond the conventional cigarette. Adolescents are price sensitive and look to experiment with tobacco. Unfortunately the Youth Risk Behavior Survey did not ask about e-cigarettes or hookahs—two products that continue to gain in popularity.

    Reducing adolescent tobacco use will require comprehensive tobacco control policies that are applied equally to and inclusive of all tobacco products, including chewing tobacco and cigars. If we can align our policies so populations that are more price-sensitive cannot easily substitute products, we may begin to see overall reductions in tobacco use. If that’s the case, then we’re ready for a celebration.

    The post Higher cigarette taxes may increase use of chewing tobacco and cigars in adolescents appeared first on BMC Series blog.

    in BMC Series blog on February 15, 2018 09:30 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    How a technical university is using research analytics to plan for international success

    The National Taipei University of Technology is using Scopus and SciVal to benchmark its research strengths and boost international collaboration

    in Elsevier Connect on February 15, 2018 08:57 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Psychologists clash over how easy it is to implant false memories of committing a crime

    By guest blogger Simon Oxenham

    Historically, the kinds of false memories induced in volunteers by psychologists have been relatively mundane. For example, a seminal study used leading questions and encouragement to confabulate to apparently implant in participants the memory of getting lost in a shopping mall as a child. This reliance on mundane false memories has been problematic for experts who believe that false memories have critical real-world consequences, from criminal trials involving false murder confessions, to memories of child abuse “recovered” during therapy using controversial techniques.

    The discrepancy between psychologists’ lab results and their real world claims vanished abruptly in 2015 when Julia Shaw (based then at the University of Bedfordshire) and Stephen Porter (University of British Columbia) shocked the memory research community with their staggering finding that, over several interview sessions, and by using false accounts purportedly from the participants’ own caregivers, they had successfully implanted false memories of having committed a crime as a teenager in 70 per cent of their participants, ranging from theft to assault with a weapon. But now other experts have raised doubts about these claims.

    When first released, Shaw’s and Porter’s findings, published in Psychological Science, received widespread coverage, perhaps most prominently in the US Public Broadcasting Service (PBS) documentary Memory Hackers (see clip above). I also contributed to the media reaction, covering the study in detail on my blog at Big Think, writing that “the risk of forming shocking false memories … may be greater than previously thought”.

    However, the famous 2015 findings have now been called into question following a reanalysis of the data, published (also in Psychological Science) by Kimberley Wade at Warwick University and her colleagues. According to them, most of the false memories Shaw and Porter produced weren’t really false memories at all. This was because rather than using trained judges to determine, following established criteria, whether participants had truly recalled having committed a crime, Shaw and Porter developed new criteria for what constitutes a false memory. These included:

    • answering “yes” to the question “Did you believe that you had forgotten the event and that it actually happened?”
    • citing 10 critical false details presented by the researchers
    • providing a basic account of the false event in response to the instruction “tell me everything you remember from start to finish”

    “The problem with the [false details] criterion” Wade explains, “is that subjects in false memory studies frequently speculate and imagine, and they talk out loud about what the suggested experience could have been like. They may say ‘I think my classmate was wearing a red t-shirt, he wore that a lot, and it must have happened on my street, I guess, near my house’.  But in the next sentence, they might say ‘Well, I can imagine it happening but I just don’t remember it at all’. In a case like this, Shaw and Porter would say that the subject recalled 6 details in this first sentence alone (i.e. classmate + red + t-shirt + wore a lot + my street + near house).” According to Wade et al. this fails to distinguish between people who merely speculated about the possibility that they committed a crime and people who appeared to genuinely remember it. This may have led to a wildly inflated false memory statistic, they argue.

    When Wade’s team recoded Shaw and Porter’s data using the latter’s own scheme, they replicated the startling 70 per cent result. However, when they used a more traditional coding scheme, they found only 30 per cent of subjects met the criteria for false memories, with a further 43 per cent showing evidence of holding false beliefs (believing they’d committed the made-up crime, but not remembering it). This still makes Shaw and Porter’s 2015 paper groundbreaking, given the extreme nature of the false events. However, the re-calculated stats are more in keeping with previous false-memory findings.

    To test if their doubt in Shaw and Porter’s interpretation would be supported by laypeople, Wade’s team presented transcripts of Shaw and Porter’s data to 300 participants recruited using Amazon’s Mechanical Turk survey website. In cases that Shaw and Porter had classified as a false memory but that Wade’s team reclassified as a false belief, the participants expressed low confidence that the reports showed evidence of remembering.

    Shaw, who is now affiliated with UCL, has issued a rebuttal to Wade et al.’s critique, arguing that existing coding strategies are inadequate. “My argument is that I coded my data (intentionally) differently” says Shaw. “One could argue that across all currently accepted types of coding it is agreed that a significant number of individuals in this study came to believe they committed a crime that never happened.” Shaw also questions the use of laymen, saying “There is no reason to assume that laypeople would be any good at identifying or defining false memories”.

    The debate raises an interesting question: if someone can be convinced that a false event is real, does it matter whether they remember it or not? Shaw says that “such a distinction was not seen as particularly relevant for the false confessions literature to which the study was intended to contribute”. Regardless of the true nature of the memories in her study, I believe the coding method she produced undoubtedly led to a truly groundbreaking discovery about false beliefs, and that in this respect her approach has been further validated by Wade et al.’s reanalysis.

    Further muddying the waters, in an entirely separate development, Nicholas Brown (a PhD candidate in health psychology at UMC Groningen) and James Heathers (an applied physiologist at Northeastern University in Boston) recently found a series of statistical errors in the Shaw and Porter paper. However, the errors do not alter the headline findings. Shaw has issued a correction that’s due for publication and she says the errors were simply caused by rounding values imprecisely using Microsoft Excel and missing cells in the spreadsheet when conducting calculations. Brown and Heathers discovered the errors while developing their GRIM technique for detecting statistical anomalies in psychology research. When they applied their technique to 71 recent studies, including the Shaw and Porter paper, they found half appeared to contain at least one inconsistent result.
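    For readers curious how a check like GRIM can work, here is a minimal sketch of the general idea, not Brown and Heathers' own code: when the underlying data are integers (such as Likert ratings), a reported mean is only possible if some integer total divided by the sample size rounds to it.

```python
# Hedged sketch of the GRIM idea: test whether a reported mean is arithmetically
# possible given the sample size, assuming integer-valued raw data.
def grim_consistent(reported_mean: float, n: int, decimals: int = 2) -> bool:
    """Return True if a mean reported to `decimals` places could arise from n integers."""
    target = round(reported_mean, decimals)
    # Try every integer total in a window around n * mean and see if any rounds back to it.
    candidates = range(int(n * (target - 1)), int(n * (target + 1)) + 1)
    return any(round(total / n, decimals) == target for total in candidates)

print(grim_consistent(5.19, 28))  # False: no sum of 28 integers gives a mean of 5.19
print(grim_consistent(5.18, 28))  # True: 145 / 28 = 5.1786, which rounds to 5.18
```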

    Deconstructing Rich False Memories of Committing Crime: Commentary on Shaw and Porter (2015)

    Constructing Rich False Memories of Committing Crime

    Post written by Simon Oxenham for the BPS Research Digest. Simon covers psychology and neuroscience critically in his Brain Scanner column at New Scientist. Follow @simoxenham on Twitter, Facebook, Google+, RSS or via his mailing list.

    in The British Psychological Society - Research Digest on February 15, 2018 07:45 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Elsevier at #AAASmtg: live updates with Women in Science winners

    5 researchers from developing countries are preparing to accept OWSD-Elsevier Foundation Awards for their work in the physical sciences

    in Elsevier Connect on February 15, 2018 04:18 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    The neuroscience of the gut-brain connection (Part 1)

    Coloured scanning electron micrograph of Escherichia coli, grown in culture (CC BY 2.0)

    This is the first post in a series exploring the neurobiology of the gut-brain connection published in partnership with neuroscientist Dr Amy Reichelt.


    Our brain and gut are intrinsically connected. We have ‘gut feelings’ about a person or event, and feel ‘butterflies’ in our stomach when something exciting happens. Neuroscientists have become increasingly aware that our gut may hold key insights into brain function.

    The gut itself has what is often likened to a ‘mini-brain’ — an extensive network of neurons called the enteric nervous system. The brain and enteric nervous system talk, and the shared communication line between the two is often called the gut-brain axis.

    The gut-brain superhighway

    The gut communicates with the brain via hormones released into the bloodstream that cross the blood-brain barrier, controlling our desire for food. For example, the gut hormone ghrelin tells us when we’re hungry, and other hormones such as glucagon-like peptide 1 (GLP-1) influence satiety or tell us when we’re full. Such hormones act on the reward-signalling neural circuits in the brain, explaining why food tastes better when we are hungry.

    The gut also makes neurotransmitters, which are the molecules neurons use to communicate at synapses. One such neurotransmitter is serotonin (aka 5-hydroxytryptamine, 5-HT). About 90% of the body’s total serotonin is made in the gut; the rest is made in the nervous system. In the gut, specialised enterochromaffin cells lining the gut wall make and secrete serotonin, where it plays an important role in controlling peristalsis — the wave-like contractions of the gut walls that squeeze food along the digestive tract. Some neurons in the enteric nervous system also make and use serotonin as a neurotransmitter.

    Note! The gut is not the brain’s ‘serotonin factory’. Neurons in the brain make their own neurotransmitters. Also, gut-secreted serotonin (and other neurotransmitters) cannot cross the blood-brain barrier, so it’s improbable that gut serotonin directly influences brain function via the bloodstream.

    The brain-gut-microbiome axis

    There is another important player in the gut-brain conversation. Our intestines are home to an entire ecosystem of microbes that control digestion, fight pathogens, and modulate hormone and neurotransmitter production. The collective of these microbes is termed the ‘gut microbiome’. An average person has about 1.5 kg of bacteria in their gut – similar to the weight of the brain! Gut bacteria also make neurotransmitters, including γ-aminobutyric acid (GABA), serotonin, dopamine and acetylcholine.

    Gut bacteria talk to the enteric nervous system and the brain, but the precise method of communication is unknown — researchers call it ‘black box connectivity’. Possible lines of communication include via hormones, immune signalling molecules, metabolic pathways, and via the vagus nerve.

    Squeaky clean mice

    Much of the microbiome research to date has used germ-free mice raised in highly sterile conditions. Germ-free mice are born by Caesarean section, thereby preventing the transfer of microbes from their mother. They grow up inside sterile isolators, are fed purified food and water, and breathe filtered air. Germ-free mice offer a blank canvas when it comes to their gut microbiome.

    Fecal microbiome transplants (FMTs aka poo transplants) into germ-free mice allow researchers to explore how a specific gut microbiome profile can change brain function. For example, after receiving a poo transplant, germ-free mice adopt the behavioural traits of the donor mouse. One study showed germ-free mice that received FMTs from stress-prone mice became more nervous in a new environment — essentially adopting nervousness from their poo-donor. In comparison, germ-free mice receiving FMTs from more curious and inquisitive mice became less anxious in strange situations.

    These experiments suggest the presence and composition of gut microbiota provides a signal to the brain via the gut-brain axis that in turn modifies rodent emotions.

    However, it’s very important to understand that germ-free mice are not typical mice. Germ-free mice are reared in artificial ‘bubble-like’ environments, without the micro-organisms that would normally shape their immune systems. Their behaviour differs fundamentally from that of standard lab mice, particularly their stress reactions and social responses.

    Can we ‘balance’ the brain with bacteria?

    Altering the microbiome through FMTs has potential to change brain chemistry. Case reports in humans suggest that FMTs may have therapeutic potential for disorders including autism, chronic fatigue syndrome and multiple sclerosis, but much more research is needed. There is a well-recognised ‘gap’ between rodent neuroscience research and application in humans.

    In my next blog, I will discuss how scientists are manipulating the microbiome through the use of ‘psychobiotics’ to treat mood disorders and the use of probiotics and prebiotics as brain health supplements.

    Dr Amy Reichelt is a neuroscientist with training in behaviour, psychology and molecular biology. She completed her PhD at Cardiff University in 2011 and went on to postdoctoral research fellowships at the University of Birmingham, UK and as an Australian Research Council Early Career Fellow at UNSW Sydney. She is currently a lecturer at RMIT University, Melbourne.

    Her research examines what happens to the brain when it is exposed to environmental changes – such as junk food diets, exercise, stress, and drugs.

    Amy is passionate about science communication and her career-defining 2016 TEDxSydney talk can be found here: 

    Amy is an avid Twitterer (find her @TheAmyR) and has written for The Conversation, The Guardian, and other news websites.

    Her personal website is http://amyreichelt.com.

    The post The neuroscience of the gut-brain connection (Part 1) appeared first on Your Brain Health.

    in Yourbrainhealth on February 15, 2018 12:42 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Facts dispel false price point reported by Science Magazine

    Elsevier corrects misleading reporting

    in Elsevier Connect on February 14, 2018 11:20 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Look to penguins to track Antarctic changes

    Scientists say carbon and nitrogen isotopes found in penguin tissues can indicate shifts in the Antarctic environment.

    in Science News on February 14, 2018 10:16 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Study debunks fishy tale of how rabbits were first tamed

    A popular tale about rabbit domestication turns out to be fiction.

    in Science News on February 14, 2018 06:30 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Cutting off a brain enzyme reversed Alzheimer’s plaques in mice

    Inhibiting an enzyme involved in the production of Alzheimer’s protein globs also made old globs, or plaques, disappear in mouse brains.

    in Science News on February 14, 2018 06:12 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Author retracts Nature paper on Asia’s glaciers flagged for data error

    A glacier researcher has retracted a Nature paper after mistakenly underestimating glacial melt by as much as a factor of ten. In September, the journal tagged “Asia’s glaciers are a regionally important buffer against drought,” originally published in May 2017 by Hamish Pritchard, a glaciologist at the British Antarctic Survey, with an expression of concern, notifying readers of the mistake. […]

    in Retraction watch on February 14, 2018 06:00 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Quantum computers go silicon

    Scientists performed the first quantum algorithms in silicon, and probed quantum bits with light.

    in Science News on February 14, 2018 06:00 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    BMC Research Notes stops considering case reports

    Medical case reports are an essential part of scholarly communication. They are a proven article format in which authors describe patient cases of educational value. Over the years, the medical literature has built up a vast collection of case reports that serve medical practitioners as a library of useful resources in clinical practice and research.

    Medical case reports at BMC

    At BMC we have always welcomed case reports and will continue to do so. Many of our journals consider case reports, in particular all medical BMC series journals including BMC Veterinary Research, and the Journal of Medical Case Reports.

    At BMC we have always welcomed case reports and will continue to do so

    In 2017 we re-structured BMC Research Notes to fill a niche in the publication landscape. Not only did we expand the scope to include the physical, engineering, earth and environmental sciences, but most importantly we re-focused the journal on short notes, with the aim of giving researchers a forum to publish observations that would otherwise remain unpublished and therefore unknown.

    BMC Research Notes and the other BMC series journals share the same editorial threshold. We offer BMC series journals for a variety of medical specialities that are the ideal home for the publication of medical case reports. Therefore, BMC Research Notes will no longer consider medical case reports and will instead focus solely on the publication of short note articles. Authors who are unsure about which BMC series journal is best suited for their particular manuscript are invited to send us a pre-submission enquiry.

    Case reports criteria at the BMC series

    Medical case reports submitted to a BMC series journal should make a contribution to medical knowledge and have educational and/or scientific merit, i.e.

    • Unreported or unusual side effects or adverse interactions involving medications.
    • Unexpected or unusual presentations of a disease.
    • New associations or variations in disease processes.
    • Presentations, diagnoses and/or management of new and emerging diseases.
    • An unexpected association between diseases or symptoms.
    • An unexpected event in the course of observing or treating a patient.
    • Findings that shed new light on the possible pathogenesis of a disease or an adverse effect.

    We do not consider case reports describing preventive or therapeutic interventions, as these generally require stronger evidence. Authors should follow the CARE guidelines.

    Case reports don’t need to be novel but must have educational value. As an example, while there might be little value in describing another simple case of the flu, it is worth publishing a case report of a condition of which only a handful of reports exist. This might not be novel but adding further experience to the literature on sparsely reported conditions is of educational value.

    A clean look for BMC Research Notes

    One year after BMC Research Notes was re-structured, we are excited to present a new and clear-cut offer to our authors: two short note article types, the research note and the data note. The former is the ideal forum for single observations, null results, replication studies, updates and extensions. Data notes are data descriptors that help researchers increase the visibility and discoverability of their data and adhere to funder mandates. In short, one note type is for the presentation, analysis and concise discussion of results, and the other focuses purely on the description of research data.

    We are confident that authors who have published their case reports in BMC Research Notes before will find a suitable home for future manuscripts at one of the medical BMC series journals or the Journal of Medical Case Reports. We are excited to announce this next step in the shaping of BMC Research Notes into an innovative open access outlet for short peer reviewed notes.

    The post BMC Research Notes stops considering case reports appeared first on BMC Series blog.

    in BMC Series blog on February 14, 2018 02:53 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    BMC Musculoskeletal Disorders: highlights of 2017

    Distribution and symmetrical patellofemoral pain patterns as revealed by high-resolution 3D body mapping: a cross-sectional study

    Image from Boudreau et al.

    Describing pain symptoms can be a difficult task. Patellofemoral pain (PFP) is a broad term used to describe pain and stiffness in the knee around the patella, or kneecap, making it difficult to perform everyday activities. It affects a large number of adolescents and young adults. PFP can originate from several components of the knee, but patients are often unaware of the cause of their knee pain, and therefore communicating PFP can be as general as pointing to the area around the knee.

    In an attempt to overcome this issue, Shellie A. Boudreau developed a new system that allows patients to communicate pain through visual representation, using high-resolution 3D body mapping in conjunction with 2D line drawings. Together with the PFP expert Micheal Rathleff and colleagues, she used this technique to acquire 2D drawings of pain patterns, including symmetrical pain patterns, in adolescents and young adults with PFP.

    The results showed that the majority of pain complaints were located in the lower peripatellar region, and that the vast majority of subjects presenting with bilateral knee pain demonstrated symmetrical knee pain patterns. This study provides a window into the possible pain patterns that may exist within PFP and a rationale to further explore the significance of these patterns in larger cohorts.

    Clinical classification in low back pain: best-evidence diagnostic rules based on systematic reviews

    Low back pain is a common problem that affects most people at some point in their life and can stem from several parts of the lumbar spine, including the intervertebral discs, sacroiliac joints, facet joints, bone, muscles, nerve roots, peripheral nerve tissue, and central nervous system sensitization. So how do we classify diagnosis and treatment strategies for the different sub-groups of low back pain?

    With better evidence to identify patho-anatomy, Tom Petersen, Mark Laslett and Carsten Juhl published a systematic review of diagnostic accuracy studies in an attempt to develop evidence-based Clinical Diagnostic Rules (CDRs) for identifying common sources and causes of low back pain. To identify the most common patho-anatomical disorders in the lumbar spine for use in clinical practice, Petersen’s group identified 64 studies to successfully construct promising CDRs for symptomatic intervertebral disc, sacroiliac joint, spondylolisthesis, disc herniation with nerve root involvement, and spinal stenosis.

    In some of these categories sufficient evidence was found to recommend a CDR, whilst other categories had preliminary evidence and will require further studies. Therefore the findings of this study suggest that evidence-based clinical diagnosis can potentially reduce the need for invasive or expensive diagnostic methods; however, further research is necessary to evaluate these findings in a primary care setting.

    Good Life with osteoArthritis in Denmark (GLA:D™): evidence-based education and supervised neuromuscular exercise delivered by certified physiotherapists nationwide

    Osteoarthritis is considered one of the leading causes of disability, and substantial evidence increasingly supports exercise as a first-line treatment for the disease. In this study, Soren Skou and Ewa Roos from the University of Southern Denmark evaluated Good Life with osteoArthritis in Denmark (GLA:D), a training scheme established in 2013 with the aims of reducing hip/knee pain caused by osteoarthritis, reducing the number of people taking painkillers, improving physical function and quality of life, and reducing the number of patients on sick leave.

    A total of 9,825 participants from the GLA:D registry were included and the findings show that education and supervised exercise improved pain intensity and quality of life. Moreover, the results show that fewer patients took painkillers and were on sick leave following the scheme. Given its positive impact, GLA:D has now been established nationwide and emphasizes the importance of exercise as a lifestyle change to improve general health and quality of life.

    The transverse abdominal muscle is excessively active during active straight leg raising in pregnancy-related posterior pelvic girdle pain: an observational study

    Pelvic girdle pain (PGP) is a condition that commonly affects the pelvic joints during pregnancy, and causes pain and discomfort. To investigate the contraction patterns of muscles in the pelvis, Jan Mens and Annelies Pool-Goudzwaard carried out a cross-sectional observational study to examine the thickness of the transverse abdominal muscle (TrA) using ultrasound imaging in women with PGP during active straight leg raising (ASLR). In their cohort, PGP started during pregnancy in 84% of the participants and 3 weeks post-delivery in 16% of the participants.

    The authors observed no significant differences in the mean TrA thickness at rest between women with and without PGP; however, interestingly, a significant excessive contraction of the TrA was observed during ASLR in patients with PGP. Therefore these findings appear to suggest that exercise may not be the solution for patients with long-lasting pregnancy-related PGP.

    Individuals’ explanations for their persistent or recurrent low back pain: a cross-sectional survey

    It has been estimated that globally approximately 40% of people experience low back pain. Although most research investigates the physical causes of low back pain, Setchell and colleagues from the School of Health and Rehabilitation Sciences, Australia, took a novel approach to explore individual belief systems regarding what causes low back pain to persist.

    The study indicated that persistent low back pain is perceived as structural with “no mechanical fix”, permanent and complex, and often described in very negative terms. Interestingly, these beliefs stemmed overwhelmingly from health professionals and the internet. The findings of this study highlight the importance of understanding the perceived causes of low back pain, perhaps shifting the thinking within healthcare and looking beyond singular causes.

    Specific or general exercise strategy for subacromial impingement syndrome–does it matter? A systematic literature review and meta analysis

    Subacromial impingement syndrome (SIS) accounts for up to 50-70% of shoulder pain and occurs when the tendons of the rotator cuff become trapped and compressed during shoulder movement. Using a systematic review and meta-analysis approach, Alison Shire and colleagues identified six randomized controlled trials (a total of 231 patients) that used specific exercise strategies to treat patients experiencing symptoms of SIS. They evaluated whether implementing specific exercises resulted in a superior effect, as determined by improvements in pain, function and quality of life (QoL), compared with general exercises in a resistance training program.

    Interestingly, the results showed no significant difference in these outcome measures to support the use of specific exercises over general exercises in patients experiencing SIS. Low-quality and inconsistent data were cited as among the main reasons for not being able to detect differences, and the authors provide recommendations for the design of future trials to address this issue.

    The post BMC Musculoskeletal Disorders: highlights of 2017 appeared first on BMC Series blog.

    in BMC Series blog on February 14, 2018 02:40 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Cardiometabolic disorder poses greater mortality risk than obesity

    It is well known that body mass index (BMI) is a major determinant of mortality and research suggests that the mortality risk of a high BMI is also attenuated by age. However, it is less clear how cardio-metabolic factors (CMFs) such as hyperglycemia and dyslipidemia affect the relationship between BMI and mortality.

    Dr Mark Wahlqvist and colleagues at the National Health Research Institutes in Taiwan set out to compare the mortality risks of CMFs with those of BMI in a large population cohort of Taiwanese adults over a mean follow-up of 8.1 years. The researchers followed 377,929 participants, aged 20 years or older, registered with the MJ Health Screening Center in Taiwan, and were able to consider the role played by traditional cardiovascular risk factors in the health effects of BMI across the entire adult life course, something missing from most studies of BMI and mortality.

    Our findings shed new light on how the known association between BMI and mortality may be altered by cardio-metabolic factors, and thus could help improve strategies for a longer, healthier life

    They found that high blood pressure and blood glucose were associated with an increased risk of mortality in both younger and older individuals, irrespective of gender or body composition. Compared to a 1.04% increased risk in obese individuals, high blood pressure was associated with an 8.57% risk of mortality and high blood sugar was associated with a 6.49% risk of mortality.

    The authors were surprised to find that underweight was a more important predictor of reduced survival than overweight or obesity, except in young adults. Furthermore, older age was protective against the risk posed by higher BMIs, suggesting that weight-management programs may need to be tailored according to age.

    Mark Wahlqvist et al., 2018
    In people with metabolic syndrome, the mortality risk of being underweight was much greater than that of obesity, especially for younger people

    Dr Wahlqvist and colleagues also found that taking metabolic syndrome into account meant that the mortality risk of being underweight was greater than that of obesity for all age groups, but especially for people aged 20-39 years. This may be due to the incidence of high blood pressure and high blood sugar in this group.

    Although the observational nature of the study does not allow for conclusions about cause and effect, it provides strong support for the notion that, in addition to weight management, life-extension policies should pay attention, across all age groups, to cardio-metabolic disorders such as hypertension, pre-diabetes, and diabetes.

    The post Cardiometabolic disorder poses greater mortality risk than obesity appeared first on BMC Series blog.

    in BMC Series blog on February 14, 2018 11:00 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Flowers, apologies, food or sex? Men’s and women’s views on the most effective ways to make up

    By Christian Jarrett

    You and your partner have had a tiff. Of all the things they could do to try to make up with you, what would be the most effective? A group of evolutionary psychologists recently put this question to 164 young adults. They presented them with 21 categories of reconciliatory behaviour, including giving a gift, cooking a meal and communicating better (derived from an earlier survey of 74 other young adults about ways to make up).

    Men and women agreed that the most effective reconciliatory behaviour of all is communicating (for instance, by talking or texting). To varying degrees, both sexes also rated apologising, forgiving, spending time together and compromising as among the things their partner could do that would most likely heal wounds.

    But men rated some behaviours as more effective for making up than women did: their partner performing nice gestures (such as chores, favours and compliments), and offering sex or sexual favours. Women, on the other hand, rated their partner apologising or crying as a more effective way to make up than men did.

    Joel Wade at Bucknell University and his colleagues said these differences are in line with the predictions of evolutionary psychology, namely that thanks to sex differences in mating strategies shaped through our deep ancestral past, men are generally more concerned about opportunities for sex whereas women are more concerned about emotional commitment. The findings also complement past research, in the same vein, that’s found men are more likely to end a relationship if their partner is sexually unavailable, while women are more likely to end the relationship if their partner is emotionally distant.

    “Evolutionary theory predicts a number of sex differences in mate selection, mate retention, and mate expulsion,” the researchers wrote in Evolutionary Psychological Science. “The present research expands this literature by documenting systematic differences in which actions men and women perceive as most effective in promoting conflict reconciliation within romantic relationships.”

    Sex Differences in Reconciliation Behavior After Romantic Conflict

    Image via giphy.com

    Christian Jarrett (@Psych_Writer) is Editor of BPS Research Digest

    in The British Psychological Society - Research Digest on February 14, 2018 09:32 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    A psychologist noticed this cool chair illusion in his office

    [Figure 1 from the paper: the stacked-chairs image]
    By Alex Fradera

    A short paper in the journal i-Perception presents a disconcerting visual illusion spotted “in the wild”: how stackable chairs, viewed from a certain angle, mess with your head. This is an unedited image, but your mind resists accepting it could be real. The illusion was first noticed in the office of lead author Nick Scott-Samuel at the University of Bristol, who notes in the paper that “it obtains in real life as well as in images, even when sober”.

    The cause of the trick appears to be the two “edges” seen coming up from the near-base of the stack – marked AD and BC in the annotated version of the image below – which “suggests a change in depth along those lines which does not actually exist” and makes it seem as if the bars that run from one “edge” to another compose an outward face.

    [Figure 2 from the paper: the annotated chair-stack image, with edges AD and BC marked]

    It misleads us a little like the Penrose triangle (see right), but whereas the triangle is actually impossible, the chairs only appear to be so – as demonstrated in this video where you can see the effect dip in and out. Scott-Samuel and his colleagues experimented with the chair stack and it appears that you need at least four chairs to create the effect.

    You know the angles, you know the number: turn up to the meeting room early next time and use that recipe to astound even the most jaded eyes.

    Stacking Chairs: Local Sense and Global Nonsense

    Alex Fradera (@alexfradera) is Staff Writer at BPS Research Digest

    in The British Psychological Society - Research Digest on February 14, 2018 09:11 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Shipping noise can disturb porpoises and disrupt their mealtime

    Noise from ships may disturb harbor porpoises enough to keep them from getting the food they need.

    in Science News on February 14, 2018 12:05 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Genes could record forensic clues to time of death

    Scientists have found predictable patterns in the way our genetic machinery winds down after death.

    in Science News on February 13, 2018 10:12 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Six days after publication, paper is flagged. By day 11, it’s retracted.

    Authors of a 2018 paper have retracted it after discovering “the conclusions in the article cannot be relied upon.” The journal, PeerJ, wasted no time. Less than a week after the paper was published, the journal issued an expression of concern to alert readers to the issue and to the forthcoming retraction notice, which appeared five […]

    in Retraction watch on February 13, 2018 08:14 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    New technique shows how 2-D thin films take the heat

    A new method exposes how 2-D materials react when heated, which could help engineers build sturdy next-gen electronics.

    in Science News on February 13, 2018 04:26 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    The Neural Basis of Watching "Memento"

    Memento (2000) is a complex psychological thriller about a man unable to form long-term memories. The movie is popular among neuroscientists for its accurate depiction of amnesia. Now, in a wonderfully "meta" paper, a group of neuroscientists report that they scanned the brains of people watching Memento in order to study memory processes. The paper's called Brain mechanisms underlying cue-based memorizing during free viewing of movie Memento, and it's published in Neuroimage, from Finnis

    in Discovery magazine - Neuroskeptic on February 13, 2018 02:25 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Elongated heads were a mark of elite status in an ancient Peruvian society

    Elites in ancient Peruvian society developed a signature, stretched-out head shape over several centuries.

    in Science News on February 13, 2018 02:00 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    What will it take to go to Venus?

    Undeterred by funding woes, scientists are scraping together ideas to tackle heat, pressure and acidity challenges of landing on Venus.

    in Science News on February 13, 2018 12:00 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Your childhood best friend’s intelligence probably rubbed off on you

    Just by hanging out with a brainy buddy, you will likely have absorbed some of their knowledge and skills

    By Christian Jarrett

    Picture yourself aged 11: who was your best friend and how smart were they? The answer may have shaped your life more than you think. A new study published as a pre-print at PsyArXiv reports that participants’ IQ at age 15 was correlated with the IQ of whoever was their best friend years earlier, when that friend was aged 11, even after factoring out the participants’ own earlier intelligence, as well as a host of other potentially confounding variables.

    We already know, thanks to previous research, that our school-age peers shape our personalities, our powers of self-control, and the chances that we’ll get into trouble, so it’s to be expected that they also affect our intelligence (and we theirs). Surprisingly, however, this possibility had not been studied before now. “Our findings add … another layer of evidence for the important and pervasive influence of peers on a host of traits during adolescence,” the researchers said.

    Ryan Charles Meldrum at Florida International University and his colleagues used data collected from hundreds of families resident in 10 US cities between 1991 and 2007 as part of a study run by the National Institute of Child Health and Human Development. This included intelligence test results from 715 participating kids, completed when they were aged 10 and then again when they were aged 15. These “target” kids’ individual best friends had also completed an intelligence test when they were aged 11-12 (most best friends were the same sex as the target kids and they were no more than two years older or younger).

    The IQ of the target kids at age 15 was strongly correlated with the IQ of their best friend when their best friend was aged 11-12. This could be due largely to the fact that children like to be friends with other kids who are similar to them. Critically, however, the target kids’ IQ at age 15, based on a composite of three tests, was still associated (β = 0.08 in a multiple regression) with their best friends’ IQ at age 11-12 even after factoring out their own IQ score when they were aged 10-11, as well as a range of more than nine other confounding variables, such as their mothers’ IQ and education level, and the “enrichment” opportunities in their home. This provides tentative evidence that the target kids hadn’t just chosen childhood friends with an IQ similar to their own, but that their IQ in mid-adolescence had actually been shaped by the intellect of their best friend years earlier.
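    For readers who want to see the shape of such an analysis, here is a minimal sketch of a multiple regression of this kind. It is not the authors' analysis: the variable names and the synthetic data are illustrative assumptions, and the real study adjusted for more than nine covariates.

```python
# Hedged sketch: regress age-15 IQ on best-friend IQ at 11-12 plus covariates,
# so the friend coefficient reflects association over and above earlier own IQ.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 715  # matching the sample size mentioned above
df = pd.DataFrame({
    "own_iq_age10":    rng.normal(100, 15, n),
    "friend_iq_age11": rng.normal(100, 15, n),
    "mother_iq":       rng.normal(100, 15, n),
    "home_enrichment": rng.normal(0, 1, n),
})
# Synthetic outcome with a small friend effect, loosely in the spirit of beta ~ 0.08.
df["iq_age15"] = (0.6 * df["own_iq_age10"] + 0.08 * df["friend_iq_age11"]
                  + 0.1 * df["mother_iq"] + 2.0 * df["home_enrichment"]
                  + rng.normal(0, 10, n))

model = smf.ols(
    "iq_age15 ~ friend_iq_age11 + own_iq_age10 + mother_iq + home_enrichment",
    data=df,
).fit()
print(model.params["friend_iq_age11"], model.pvalues["friend_iq_age11"])
```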

    The new study can’t speak to how a childhood best friend affects our future intelligence, but one can imagine that if your best friend was motivated to study hard, this may have boosted your own academic motivation, with lasting positive consequences for your IQ. Just by hanging out with a brainy buddy, you will also likely have absorbed some of their knowledge and skills. Conversely, if your best friend was not such an intellectual, you will not have enjoyed these cerebral benefits.

    One caveat to bear in mind is that the IQ tests used in this study, such as the Peabody Picture Vocabulary Test, tap what’s known as “crystallised intelligence”, which refers to our knowledge, rather than “fluid intelligence”, which is more about mental agility. There was also no measure of fathers’ IQ which may have confounded the results, potentially influencing the IQ of the target kids and who they chose to have as a best friend. Future research also needs to look at the effect of peer group IQ in aggregate, not just an individual best friend, although presumably their influence will be the strongest.

    While acknowledging these and other limitations of their research, Meldrum and his colleagues said “overall, our results provide support for the hypothesis that having more intelligent friends is associated with higher future levels of intelligence.”

    On the Longitudinal Association between Peer and Adolescent Intelligence: Can Our Friends Make Us Smarter? [This article is a pre-print and has not yet been subjected to peer-review]

    Christian Jarrett (@Psych_Writer) is Editor of BPS Research Digest

    in The British Psychological Society - Research Digest on February 13, 2018 09:45 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Facial attraction: red-fronted lemurs recognize photos of their own species

    Species recognition is the ability of animals to discriminate between their own and other species. Evolutionary theory suggests that this ability is particularly important for females, as they normally invest much more in reproduction than males. Natural selection should therefore favor the evolution of animal signals that can be used to avoid costly interbreeding, i.e. mating with another species (heterospecific mating).

    Species recognition is the ability of animals to discriminate between their own and other species.

    Animals use a variety of different signals for species recognition. Olfactory signals are known to be used for species recognition in some species of bats and fish, whereas acoustic signals are used in many species of frogs and birds, and also in some mammals. Facial cues as a form of visual signals are considered to be particularly relevant for species recognition in non-human primates.

    Our experiments

    In our study we investigated the importance of facial cues for species recognition in the red-fronted lemur from Madagascar. We wanted to “ask” the animals themselves whether facial differences among species have real relevance for them in terms of reproductive isolation, and whether they can discriminate between pictures of their own species and those of other closely related species. In an experiment we presented pictures of the faces of different species of the genus Eulemur to male and female free-ranging red-fronted lemurs in Kirindy Forest, Western Madagascar, and measured the time the lemurs spent looking at the pictures.

    Female red-fronted lemur. Image: Pixabay

    Red-fronted lemurs looked longer at pictures of their own species than at pictures of others. Females also showed a more pronounced response than males. This may indicate that female red-fronted lemurs perceive and respond to differences in fur patterns and coloration to recognize viable mates from their own species, enabling them to avoid costly interbreeding.

    In a previous study, we found no such ability for discriminating among acoustic signals of different species. Interestingly, the animals also showed increased sniffing behavior towards pictures of their own species. This might indicate that they use two different modalities at the same time, but this (cross-modal recognition) needs to be tested in the future.

    Interesting new questions

    Our findings are particularly interesting because these animals mate with individuals from other species in nature – they hybridize even though they are able to recognize individuals of their own species. It might therefore be interesting to conduct future studies in hybrid zones, where two or more species occur together in order to examine whether experience with closely related species affects their ability to discriminate between species.

    Our results also suggest an effect of sexual variation in color vision. In fact, male red-fronted lemurs are always dichromatic, whereas females can be dichromatic or trichromatic, allowing some of them to see more colors than males. Future studies will require genetic testing to assess the significance of color vision polymorphism and its role in species recognition in red-fronted lemurs.

    The post Facial attraction: red-fronted lemurs recognize photos of their own species appeared first on BMC Series blog.

    in BMC Series blog on February 13, 2018 09:30 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Even after bedbugs are eradicated, their waste lingers

    Bedbug waste contains high levels of the allergy-triggering chemical histamine, which stays behind even after the insects are eradicated.

    in Science News on February 12, 2018 11:30 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    14 cattle eyeworms removed from Oregon woman’s eye

    Oregon woman has the first ever eye infection with the cattle eyeworm Thelazia gulosa.

    in Science News on February 12, 2018 10:34 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Your Cortex Contains 17 Billion Computers

    Neural networks of neural networks

    Credit: brentsview under CC BY-NC 2.0

    Brains receive input from the outside world, their neurons do something to that input, and create an output. That output may be a thought (I want curry for dinner); it may be an action (make curry); it may be a change in mood (yay curry!). Whatever the output, that “something” is a transformation of some form of input (a menu) to output (“chicken dansak, please”). And if we think of a brain as a device that transforms inputs to outputs then, inexorably, the computer becomes our analogy of choice.

    For some this analogy is merely a useful rhetorical device; for others it is a serious idea. But the brain isn’t a computer. Each neuron is a computer. Your cortex contains 17 billion computers.

    OK, what? Look at this:

    A pyramidal cell — squashed into two dimensions. The black blob in the middle is the neuron’s body; the rest of the wires are its dendrites. Credit: Alain Destexhe / http://cns.iaf.cnrs-gif.fr/alain_geometries.html

    This is a picture of a pyramidal cell, the neuron that makes up most of your cortex. The blob in the centre is the neuron’s body; the wires stretching and branching above and below are the dendrites, the twisting cables that gather the inputs from other neurons near and far. Those inputs fall all across the dendrites, some right up close to the body, some far out on the tips. Where they fall matters.

    But you wouldn’t think it. When talking about how neurons work, we usually end up with the sum-up-inputs-and-spit-out-spike idea. In this idea, the dendrites are just a device to collect inputs. Activating each input alone makes a small change to the neuron’s voltage. Sum up enough of these small changes, from all across the dendrites, and the neuron will spit out a spike from its body, down its axon, to go be an input to other neurons.

    The sum-up-and-spit-out-spike model of a neuron. If enough inputs arrive at the same time — enough to cross a threshold (grey circle) — the neuron spits out a spike.

    It’s a handy mental model for thinking about neurons. It forms the basis for all artificial neural networks. It’s wrong.
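    To make that textbook picture concrete, here is a minimal sketch of the sum-up-inputs-and-spit-out-spike unit in Python — a hypothetical toy with made-up weights and threshold, not anyone’s fitted model:

    ```python
    import numpy as np

    def point_neuron(inputs, weights, threshold=1.0):
        """Textbook 'point neuron': weight the inputs, sum them all,
        and emit a spike (1) if the total crosses the threshold."""
        total = np.dot(inputs, weights)
        return 1 if total >= threshold else 0

    # Four weak inputs: any one alone stays under threshold, three together cross it.
    weights = np.array([0.4, 0.4, 0.4, 0.4])
    print(point_neuron(np.array([1, 0, 0, 0]), weights))  # 0 -> no spike
    print(point_neuron(np.array([1, 1, 1, 0]), weights))  # 1 -> spike
    ```

    This is exactly the unit that artificial neural networks are built from — which is why what comes next matters.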

    Those dendrites are not just bits of wire: they also have their own apparatus for making spikes. If enough inputs are activated in the same small bit of dendrite then the sum of those simultaneous inputs will be bigger than the sum of each input acting alone:

    The two coloured blobs are two inputs to a single bit of dendrite. When they are activated on their own, they each create the responses shown, where the grey arrow indicates the activation of that input (response here means “change in voltage”). When activated together, the response is larger (solid line) than the sum of their individual responses (dotted line).

    The relationship between the number of active inputs and the size of the response in a little bit of dendrite looks like this:

    Size of the response in a single branch of a dendrite to increasing numbers of active inputs. The local “spike” is the jump from almost no response to a large response.

    There’s the local spike: the sudden jump from almost no response to a few inputs, to a big response with just one more input. A bit of dendrite is “supralinear”: within a dendrite, 2+2=6.
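    To see the “2+2=6” arithmetic in code, here is a minimal sketch with made-up numbers — a sigmoid standing in for the branch’s local nonlinearity, not a fit to real dendritic recordings:

    ```python
    import numpy as np

    def branch_response(n_active, gain=4.0, midpoint=1.5):
        """Supralinear response of one dendritic branch: a sigmoid of the
        number of co-active inputs, so the response jumps to a 'local spike'
        once enough inputs arrive together."""
        return 1.0 / (1.0 + np.exp(-gain * (n_active - midpoint)))

    alone = branch_response(1)        # one input on its own: ~0.12
    together = branch_response(2)     # two inputs at the same time: ~0.88
    print(together > 2 * alone)       # True -> bigger than the sum of the parts
    ```

    With these toy numbers, two inputs together give a response more than three times larger than the two individual responses added up — the supralinearity in the graph above.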

    We’ve known about these local spikes in bits of dendrite for many years. We’ve seen these local spikes in neurons within slices of brain. We’ve seen them in the brains of anaesthetised animals having their paws tickled (yes, unconscious brains still feel stuff; they just don’t bother to tell anyone). We’ve very recently seen them in the dendrites of neurons in animals that were moving about (yeah, Moore and friends recorded the activity in something a few micrometres across from the brain of a mouse that was moving about; crazy, huh?). A pyramidal neuron’s dendrites can make “spikes”.

    So they exist: but why does this local spike change the way we think about the brain as a computer? Because the dendrites of a pyramidal neuron contain many separate branches. And each can sum-up-and-spit-out-a-spike. Which means that each branch of a dendrite acts like a little nonlinear output device, summing up and outputting a local spike if that branch gets enough inputs at roughly the same time:

    Deja vu. A single dendritic branch acts as a little device for summing up inputs and giving an output if enough inputs were active at the same time. And the transformation from input to output (the grey circle) is just the graph we’ve already seen above, which gives the size of the response from the number of inputs.

    Wait. Wasn’t that our model of a neuron? Yes it was. Now if we replace each little branch of dendrite with one of our little “neuron” devices, then a pyramidal neuron looks something like this:

    Left: A single neuron has many dendritic branches (above and below its body). Right: so it is a collection of non-linear summation devices (yellow boxes, and nonlinear outputs), that all output to the body of the neuron (grey box), where they are summed together. Look familiar?

    Yes, each pyramidal neuron is a two layer neural network. All by itself.

    Beautiful work by Poirazi and Mel back in 2003 showed this explicitly. They built a complex computer model of a single neuron, simulating each little bit of dendrite, the local spikes within them, and how they sweep down to the body. They then directly compared the output of the neuron to the output of a two-layer neural network: and they were the same.

    The extraordinary implication of these local spikes is that each neuron is a computer. By itself the neuron can compute a huge range of so-called nonlinear functions. Functions that a neuron which just sums-up-and-spits-out-a-spike cannot ever compute. For example, with four inputs (Blue, Sea, Yellow, and Sun) and two branches acting as little non-linear devices, we can set up a pyramidal neuron to compute the “feature-binding” function: we can ask it to respond to Blue and Sea together, or respond to Yellow and Sun together, but not to respond otherwise — not even to Blue and Sun together or Yellow and Sea together. Of course, neurons receive many more than four inputs, and have many more than two branches: so the range of logical functions they could compute is astronomical.
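    To make the feature-binding example concrete, here is a minimal sketch of that two-branch arrangement (hypothetical thresholds of my own choosing, not parameters from any paper): one branch sees the Blue and Sea inputs, the other sees Yellow and Sun, each branch makes a local spike only when both of its inputs are active, and the body fires if either branch spikes.

    ```python
    def branch(inputs, threshold=2):
        """One dendritic branch: a local spike (1) only if enough of its
        inputs are active at the same time; otherwise it contributes nothing."""
        return 1 if sum(inputs) >= threshold else 0

    def feature_binding_neuron(blue, sea, yellow, sun):
        """Toy two-layer neuron: branch 1 sees Blue+Sea, branch 2 sees
        Yellow+Sun; the body fires if either branch produces a local spike."""
        return 1 if branch([blue, sea]) + branch([yellow, sun]) >= 1 else 0

    print(feature_binding_neuron(1, 1, 0, 0))  # Blue + Sea    -> 1, fires
    print(feature_binding_neuron(0, 0, 1, 1))  # Yellow + Sun  -> 1, fires
    print(feature_binding_neuron(1, 0, 0, 1))  # Blue + Sun    -> 0, silent
    print(feature_binding_neuron(0, 1, 1, 0))  # Yellow + Sea  -> 0, silent
    ```

    A single sum-and-threshold unit with equal weights cannot separate these cases, because every one of them has exactly two active inputs; it takes the branch layer to bind the right pairs together.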

    More recently, Romain Caze and friends (I am one of those friends) have shown that a single neuron can compute an amazing range of functions even if it cannot make a local, dendritic spike. Because dendrites are naturally not linear: in their normal state they actually sum up inputs to total less than the sum of the individual values. They are sub-linear. For them 2+2 = 3.5. And having many dendritic branches with sub-linear summation also lets the neuron act as a two-layer neural network. A two-layer neural network that can compute a different set of non-linear functions to those computed by neurons with supra-linear dendrites. And pretty much every neuron in the brain has dendrites. So almost all neurons could, in principle, be a two-layer neural network.
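    The sub-linear regime is just as easy to sketch — here with a saturating toy function (again, illustrative numbers, not data): each extra input adds a little less, so two inputs together give less than the sum of their individual responses.

    ```python
    import numpy as np

    def sublinear_branch(n_active, saturation=4.0):
        """Saturating (sub-linear) branch: each additional input adds a bit
        less than the one before, as passive dendrites do in their normal state."""
        return saturation * (1.0 - np.exp(-n_active / saturation))

    alone = sublinear_branch(1)        # ~0.88
    together = sublinear_branch(2)     # ~1.57
    print(together < 2 * alone)        # True -> less than the sum of the parts
    ```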

    The other amazing implication of the local spike is that neurons know a hell of a lot more about the world than they tell us — or other neurons, for that matter.

    Not long ago, I asked a simple question: How does the brain compartmentalise information? When we look at the wiring between neurons in the brain, we can trace a path from any neuron to any other. How then does information apparently available in one part of the brain (say, the smell of curry) not appear in all other parts of the brain (like the visual cortex)?

    There are two opposing answers to that. The first is, in some cases, the brain is not compartmentalised: information does pop up in weird places, like sound in brain regions dealing with place. But the other answer is: the brain is compartmentalised — by dendrites.

    As we just saw, the local spike is a non-linear event: it is bigger than the sum of its inputs. And the neuron’s body basically can’t detect anything that is not a local spike. Which means that it ignores most of its individual inputs: the bit which spits out the spike to the rest of the brain is isolated from much of the information the neuron receives. The neuron only responds when a lot of the inputs are active together in time and in space (on the same bit of dendrite).

    If this was true, then we should see that dendrites respond to things that the neuron does not respond to. We see exactly this. In visual cortex, we know that many neurons respond only to things in the world moving at a certain angle (like most, but by no means all of us, they have a preferred orientation). Some neurons fire their spikes to things at 60 degrees; some at 90 degrees; some at 120 degrees. But when we record what their dendrites respond to, we see responses to every angle. The dendrites know a hell of a lot more about how objects in the world are arranged than the neuron’s body does.

    They also look at a hell of a lot more of the world. Neurons in visual cortex only respond to things in a particular position in the world — one neuron may respond to things in the top left of your vision; another to things in the bottom right. Very recently Sonja Hofer and her team showed that while the spikes from neurons only happen in response to objects appearing in one particular position, their dendrites respond to many different positions in the world, often far from the neuron’s apparent preferred position. So the neurons respond only to a small fraction of the information they receive, with the rest tucked away in their dendrites.

    Why does all this matter? It means that each neuron could radically change its function by changes to just a few of its inputs. A few get weaker, and suddenly a whole branch of dendrite goes silent: the neuron that was previously happy to see cats, for that branch liked cats, no longer responds when your cat walks over your bloody keyboard as you are working — and you are a much calmer, more together person as a result. A few inputs get stronger, and suddenly a whole branch starts responding: a neuron that previously did not care for the taste of olives now responds joyously to a mouthful of ripe green olive — in my experience, this neuron only comes online in your early twenties. If all inputs were summed together, then changing a neuron’s function would mean having the new inputs laboriously fight each and every other input for attention; but have each bit of dendrite act independently, and new computations become a doddle.

    It means the brain can do many computations beyond treating each neuron as a machine for summing up inputs and spitting out a spike. Yet that’s the basis for all the units that make up an artificial neural network. It suggests that deep learning and its AI brethren have but glimpsed the computational power of an actual brain.

    Your cortex contains 17 billion neurons. To understand what they do, we often make analogies with computers. Some use these analogies as cornerstones of their arguments. Some consider them to be deeply misguided. Our analogies often look to artificial neural networks: for neural networks compute, and they are made up of neuron-like things; and so, therefore, should brains compute. But if we think the brain is a computer, because it is like a neural network, then now we must admit that individual neurons are computers too. All 17 billion of them in your cortex; perhaps all 86 billion in your brain.

    And so it means your cortex is not a neural network. Your cortex is a neural network of neural networks.

    With thanks to Romain Caze for suggestions


    Your Cortex Contains 17 Billion Computers was originally published in The Spike on Medium, where people are continuing the conversation by highlighting and responding to this story.

    in The Spike on February 12, 2018 05:56 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Olfactory coding in the turbulent realm

    Long-distance olfactory search behaviors depend on odor detection dynamics. Due to turbulence, olfactory signals travel as bursts of variable concentration and spacing and are characterized by long-tail distributions of odor/no-odor events, challenging the computing capacities of olfactory systems. How animals encode complex olfactory scenes to track the plume far from the source remains unclear. Here we focus on the coding of the plume temporal dynamics in moths. We compare responses of olfactory receptor neurons (ORNs) and antennal lobe projection neurons (PNs) to sequences of pheromone stimuli either with white-noise patterns or with realistic turbulent temporal structures simulating a large range of distances (8 to 64 m) from the odor source. For the first time, we analyze what information is extracted by the olfactory system at large distances from the source. Neuronal responses are analyzed using linear–nonlinear models fitted with white-noise stimuli and used for predicting responses to turbulent stimuli.
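    For readers unfamiliar with the linear–nonlinear (LN) models mentioned above, they have a standard two-stage form; here is a minimal generic sketch with a made-up filter and parameters (not the models fitted in this work): convolve the pheromone time course with a linear temporal filter, then pass the result through a static nonlinearity to predict a firing rate.

    ```python
    import numpy as np

    def ln_model_rate(stimulus, linear_filter, threshold=0.0, gain=1.0):
        """Generic linear-nonlinear (LN) model: linear temporal filtering of
        the stimulus, followed by a static rectifying nonlinearity."""
        drive = np.convolve(stimulus, linear_filter, mode="full")[:len(stimulus)]
        return gain * np.maximum(drive - threshold, 0.0)   # predicted firing rate

    # Illustrative only: a sparse pulse train standing in for turbulent odour
    # encounters, filtered by a biphasic kernel (both made up).
    rng = np.random.default_rng(0)
    stimulus = (rng.random(1000) < 0.05).astype(float)            # 10 s at 100 Hz
    t = np.arange(0.0, 0.3, 0.01)
    linear_filter = np.exp(-t / 0.05) - 0.5 * np.exp(-t / 0.15)   # biphasic kernel
    rate = ln_model_rate(stimulus, linear_filter, threshold=0.1, gain=50.0)
    ```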

    Date: 16/02/2018
    Time: 16:00
    Location: LB252

    in UH Biocomputation group on February 12, 2018 02:36 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    5 ways the heaviest element on the periodic table is really bizarre

    Called oganesson, element 118 has some very strange properties, according to theoretical calculations by physicists.

    in Science News on February 12, 2018 02:00 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Ancient ozone holes may have sterilized forests 252 million years ago

    Swaths of barren forest may have led to Earth’s greatest mass extinction.

    in Science News on February 12, 2018 12:00 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Preliminary evidence for the benefits of “Jymmin” – creating or influencing music while exercising

    The researchers suspect that being able to influence music causes a bigger release of endogenous opioids

    By Emma Young

    Listening to music while exercising can make a work-out feel more pleasant. But might having some control over the sound of that music have an even stronger effect? A new study, published in Frontiers in Psychology, suggests that it does. In theory, this approach (known as “Jymmin” – gym plus jammin’…) might help injured athletes and other rehab patients to complete beneficial, but painful, exercise programmes. As the researchers, led by Thomas Fritz at the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig, Germany, note, “Physical pain can present a significant obstacle to the success of physical exercise rehabilitation, increasing negative affect and decreasing patient motivation.”

    Ten men and nine women, all in their twenties, worked out for ten minutes, in pairs, facing each other. One member of the pair used a cable lat pulldown machine (which exercises the latissimus dorsi muscle in the back, and also the biceps), while the other used an abdominal muscle trainer.

    For one session, the exercise machines were hooked up to computer software that modified the music being played depending on how hard the participants worked. The fast-tempo (130 beats per minute) soundtrack featured a drum beat, bass guitar line and a melody played on a synthesiser. A faster rate of pulldowns on the lat/bicep machine altered the sound, so that the higher-frequency notes were more audible. For the other machine, more frequent sit-ups increased the pitch of the melody line.

    For the other session, the participants heard the soundtracks created by other pairs – their own efforts had no effect on the music, although it remained upbeat. Some participants completed this session first, others did this one second.

    Immediately after each exercise session, the participants took a pain tolerance test: they had to plunge their non-dominant hand and arm into painfully cold water and keep it there for as long as possible. After the exercise session in which they’d influenced the music, the participants kept their hands in the cold water for, on average, five seconds longer – an average of 50 seconds, compared with 45 seconds when they’d exercised without affecting the music.

    Ideally, the researchers would have measured pain tolerance during the exercise. Practically, this would have been hard, though, which is why they did it afterwards. But they argue that the results indicate differences in pain tolerance during the two different exercise sessions.

    The researchers suspect that being able to influence the music through exercise caused a bigger release of endogenous opioids, and that this explains the results. There are, though, other potential interpretations. What we pay attention to can have a huge influence on our perception of pain: perhaps when the participants were altering the melody, they paid more attention to the music, and less to their own physical state – and so felt less pain, and could tolerate more afterwards.

    There are many other questions still to be answered: whether the same effects would be observed in people exercising alone, rather than in pairs; whether exercising while influencing music would actually be as beneficial for rehabilitation as exercising while passively listening to music, even if the latter hurts more (in fact, the researchers found that when the participants had influence over the music, they actually worked less hard on the lat machine); and whether the approach could reduce required painkiller dosages. It’s worth investigating further.

    Emma Young (@EmmaELYoung) is Staff Writer at BPS Research Digest

    Musical Agency during Physical Exercise Decreases Pain

    in The British Psychological Society - Research Digest on February 12, 2018 09:52 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    3 tips on managing change in a hospital’s digital transformation

    Elsevier’s VP of Health Informatics for Clinical Solutions shares advice – and pitfalls to avoid – while moving towards patient-centric care

    in Elsevier Connect on February 12, 2018 09:16 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Preventing unwarranted variability of care for cancer patients

    Via Oncology, now with Elsevier, helps oncologists give their patients the best evidence-based treatment

    in Elsevier Connect on February 12, 2018 09:15 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Policy Insights from The Neurocritic: Alarm Over Acetaminophen, Ibuprofen Blocking Emotion Is Overblown


    Just in time for Valentine's Day, in floats a raft of misleading headlines:

    Scientists have found the cure for a broken heart

    Painkillers may also mend a broken heart

    Taking painkillers could ease heartaches - as well as headaches

    Paracetamol and ibuprofen could ease heartaches - as well as headaches


    If Tylenol and Advil were so effective in “mending broken hearts”, “easing heartaches”, and providing a “cure for a broken heart”, we would be a society of perpetually happy automatons, wiping away the suffering of breakup and divorce with a mere dose of acetaminophen. We'd have Tylenol epidemics and Advil epidemics to rival the scourge of the present Opioid Epidemic.

    Really, people,1 words have meanings. If you exaggerate, readers will believe statements that are blown way out of proportion. And they may even start taking doses of drugs that can harm their kidneys and livers.


    These media pieces also have distressing subtitles:

    Common painkillers that kill empathy
    ... some popular painkillers like ibuprofen and acetaminophen have been found to reduce people’s empathy, dull their emotions and change how people process information.

    A new scientific review of studies suggests over-the-counter pain medication could be having all sorts of psychological effects that consumers do not expect.

    Not only do they block people’s physical pain, they also block emotions.

    The authors of the study, published in the journal Policy Insights from the Behavioral and Brain Sciences, write: “In many ways, the reviewed findings are alarming. Consumers assume that when they take an over-the-counter pain medication, it will relieve their physical symptoms, but they do not anticipate broader psychological effects.”

    Cheap painkillers affect how people respond to hurt feelings, 'alarming' review reveals
    Taking painkillers could ease the pain of hurt feelings as well as headaches, new research has discovered.

    The review of studies by the University of California found that women taking drugs such as ibuprofen and paracetamol reported less heartache from emotionally painful experiences, compared with those taking a placebo.

    However, the same could not be said for men as the study found their emotions appeared to be heightened by taking the pills.

    Researchers said the findings of the review were 'in many ways...alarming'.

    I'm here to tell you these worries are greatly exaggerated. Just like there's a Trump tweet for every occasion, there's a Neurocritic post for most of these studies (see below).

    A new review in Policy Insights from the Behavioral and Brain Sciences has prompted the recent flurry of headlines. Ratner et al. (2018) reviewed the literature on OTC pain medications.
    . . . This work suggests that drugs like acetaminophen and ibuprofen might influence how people experience emotional distress, process cognitive discrepancies, and evaluate stimuli in their environment. These studies have the potential to change our understanding of how popular pain medications influence the millions of people who take them. However, this research is still in its infancy. Further studies are necessary to address the robustness of reported findings and fully characterize the psychological effects of these drugs.

    The studies are potentially transformative, yet the research is still in its infancy. The press didn't read the “further studies are necessary” caveat. But I did find one article that took a more modest stance:

    Do OTC Pain Relievers Have Psychological Effects?
    Ratner wrote that the findings are “in many ways alarming,” but he told MD Magazine that his goal is not so much to raise alarm as it is to prompt additional research. “Something that I want to strongly emphasize is that there are really only a handful of studies that have looked at the psychological effects of these drugs,” he said.

    Ratner said a number of questions still need to be answered. For one, there is not enough evidence out there to know to what extent these psychological effects are merely the result of people being in better moods once their pain is gone.

    . . .

    Ratner also noted that the participants in the studies were not taking the medications because of physical pain, and so the psychological effects might be a difference in cases where the person experienced physical pain and then relief.

    For now, Ratner is urging caution and nuanced interpretation of the data. He said stoking fears of these drugs could have negative consequences, as could a full embrace of the pills as mood-altering therapies.

    Ha! Not so alarming after all, we see on a blog with 5,732 Twitter followers (as opposed to 2.4 million and 2.9 million for the most popular news pieces). I took 800 mg of ibuprofen before writing this post, and I do not feel any less anxious or disturbed about events in my life. Or even about feeling the need to write this post, with my newly “out” status and all.


    There's a Neurocritic post for every occasion...

    As a preface to my blog oeuvre, these are topics I care about deeply. I'm someone who has suffered heartache and emotional pain (as most of us have), as well as chronic pain conditions, four invasive surgeries, tremendous loss, depression, anxiety, insomnia, etc.... My criticism does not come lightly.

    I'm not entirely on board with treating studies showing that one dose (or 3 weeks) of Tylenol MAY {or may not} modestly reduce social pain or “existential distress” or empathy as sufficient models of human suffering and its alleviation by OTC drugs. In fact, I have questions about all of these studies.

    Suffering from the pain of social rejection? Feel better with TYLENOL® – My first question has always been, why acetaminophen and not aspirin or Advil? Was there a specific mechanism in mind?

    Existential Dread of Absurd Social Psychology Studies – Does a short clip of Rabbits (by David Lynch) really produce existential angst and thoughts of death? [DISCLAIMER: I'm a David Lynch fan.]

    Tylenol Doesn't Really Blunt Your Emotions – Why did ratings of neutral stimuli differ as a function of treatment (in one condition)?

    Does Tylenol Exert its Analgesic Effects via the Spinal Cord? – and perhaps brainstem

    Acetaminophen Probably Isn't an "Empathy Killer" – How do very slight variations in personal distress ratings translate to real world empathy?

    Advil Increases Social Pain (if you're male) – Reduced hurt from Cyberball exclusion in women, but a disinhibition effect in men (blunting their tendency to suppress their emotional pain)?

    ...and just for fun:

    Vicodin for Social Exclusion – not really – but social pain and physical pain are not interchangeable

    Use of Anti-Inflammatories Associated with Threefold Increase in Homicides – cause/effect issue, of course



    Scene from Rabbits by David Lynch


    Footnote

    1 And by “people” I mean scientists and journalists alike. Read this tweetstorm from Chris Chambers, including:





    Reference

    Ratner KG, Kaczmarek AR, Hong Y. (2018). Can Over-the-Counter Pain Medications Influence Our Thoughts and Emotions? Policy Insights from the Behavioral and Brain Sciences. Feb 6:2372732217748965.

    in The Neurocritic on February 12, 2018 02:57 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    4 questions about the new U.S. budget deal and science

    A new spending package could lead to U.S. science agencies getting a bump in funding.

    in Science News on February 09, 2018 11:16 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    The small intestine, not the liver, is the first stop for processing fructose

    In mice, fructose gets processed in the small intestine before getting to the liver.

    in Science News on February 09, 2018 05:15 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Could virtual reality become a treatment reality for lazy eye patients?

    Although 2018 is in full swing, there is still time to take a look back at one of 2017’s most talked-about publications in BMC Ophthalmology. Video games have always been a popular talking point, be it in social circles, the media or even scientific research. Researchers have been weighing up the pros and cons of video games for many years, and that work has contributed to the idea that games could be used as treatments.

    In recent years, researchers have started to explore the use of video games in the treatment of amblyopia, more commonly known as lazy eye. Amblyopia is often described as a reduction of vision in one eye due to the brain and the eye failing to co-operate. To compensate, the brain will favour the other eye. This can be caused by visual deprivation due to eye conditions that limit the sight from one eye.

    Treatment for amblyopia often includes the use of an eye patch or eye drops that ‘blur’ the vision in one eye to encourage the use of the less favoured (amblyopic) eye. However, studies have suggested that this type of treatment is mostly effective in children up to 7 years of age.

    Video games as treatment

    More recently, alternative approaches, such as dichoptic visual training and video games, have been considered. Researchers have studied the effect of engaging with video games on both adult and child amblyopia patients. Studies such as these were conducted using games consoles, PCs or iPads, with some promising results.

    In their clinical trial, Žiak et al. combined the popular Oculus Rift headset with dichoptic visual training, which involves presenting high contrast stimuli to the amblyopic eye and low contrast stimuli to the favoured eye in an attempt to balance visual input.
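    The core of that dichoptic idea is easy to write down; here is a minimal sketch with hypothetical contrast values (my own illustration, not the settings used by Žiak et al. or the Vivid Vision software) of how stimulus contrast might be split between the eyes and rebalanced as the amblyopic eye improves.

    ```python
    def dichoptic_contrasts(balance):
        """Return (amblyopic_eye_contrast, fellow_eye_contrast) for a dichoptic
        display. balance runs from 0.0 (strong suppression: fellow eye heavily
        attenuated) to 1.0 (balanced vision: equal contrast to both eyes)."""
        amblyopic_eye = 1.0                    # always shown at full contrast
        fellow_eye = 0.2 + 0.8 * balance       # low contrast early in training
        return amblyopic_eye, fellow_eye

    print(dichoptic_contrasts(0.0))  # start of training: (1.0, 0.2)
    print(dichoptic_contrasts(1.0))  # once vision is balanced: (1.0, 1.0)
    ```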

    A figure from Žiak et al.’s study showing the Oculus Rift headset and an LCD screen displaying what the participant is seeing.

    In the study, 17 adult participants with amblyopia were given dichoptic visual training via Oculus Rift headsets and the Diplopia Game, which was developed by Vivid Vision. The patients had a choice of a space game, in which the user had to guide a spaceship through rings in an obstacle course, and a block breaker game.

    Each game was designed to include a “dichoptic setting” which combined the concept of dichoptic visual training with the objectives of the game. In the space game the colourful intergalactic obstacles were only presented in the amblyopic eye, whilst the spaceship was only presented in the favoured eye, which stopped the patients “cheating” by closing one eye.

    A figure from Žiak et al.’s study, demonstrating the “dichoptic setting” in the space game that was played by the participants.

    The preliminary results from this study showed improvement in the amblyopic eyes of some of the participants. Žiak et al. demonstrated that dichoptic visual training delivered through a virtual reality headset such as the Oculus Rift is a potentially useful treatment for amblyopia. Furthermore, these preliminary results provide a promising basis for future research into alternative treatments for amblyopia and the clinical use of virtual reality.

    The post Could virtual reality become a treatment reality for lazy eye patients? appeared first on BMC Series blog.

    in BMC Series blog on February 09, 2018 04:00 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Astrocytes in Synaptic Control Symposium

    Recorded as a panel discussion following the UTSA Neurosciences Institute’s 2018 research symposium. The group discusses the tripartite synapse concept, coined by two of our panelists, Phil Haydon and Alfonso Araque, in the late 1990s; the diverse mechanisms of astrocyte-neuron communication; and the extent to which this new framework is beginning to redefine the neural circuits of behavior and disease. Hosted by Salma Quraishi. Duration: 49 minutes.

    Panelists (in alphabetical order): Alfonso Araque, Robert & Elaine Larson Neuroscience Research Chair, University of Minnesota Medical School; Philip Haydon, Annetta and Gustav Grisard Professor of Neuroscience, Sackler School of Medicine, Tufts University; Erik Herzog, Professor of Biology, Washington University in St. Louis; Carlos Paladini, Professor of Biology, UTSA.

    Joining the discussion: James Lechleiter, Professor of Cell Systems & Anatomy, UT Health San Antonio; Salma Quraishi, Assistant Professor of Research, UTSA; Matt Wanat, Assistant Professor, UTSA; Charles Wilson, Ewing Halsell Chair and Director of the UTSA Neurosciences Institute.

    Acknowledgement: JM Tepper for original music.

    in Neuroscientists talk shop on February 09, 2018 03:57 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Hostile Questions at Scientific Meetings

    A brief letter in Nature got me thinking this week: Don’t belittle junior researchers in meetings. Anand Kumar Sharma writes to urge scientists not to grill their junior colleagues at conferences: The most interesting part of a scientific seminar, colloquium or conference for me is the question and answer session. However, I find it upsetting to witness the unnecessarily hard time that is increasingly given to junior presenters at such meetings. As inquisitive scientists, we do not …

    in Discovery magazine - Neuroskeptic on February 09, 2018 02:45 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Let your kids help you, and other parenting tips from traditional societies

    Hunter-gatherers and villagers have some parenting tips for modern moms and dads.

    in Science News on February 09, 2018 01:00 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    A new approach to experimentally lengthening sleep in short-sleeping teens

    Short sleep is common in teens, particularly on school nights, with a majority obtaining less than the eight to ten hours recommended. Many factors contribute to insufficient sleep during adolescence including increased social and academic demands, bedtime autonomy, the use of electronics, and early school start times coupled with a biological and behavioral tendency to stay up later. Although common, short sleep can be detrimental to healthy functioning in teens and has been associated with worse school performance, negative mood, difficulty with attention, and increases in risky behaviors and accidental injuries. Although there is a good understanding of the negative consequences of short sleep, more work is needed to determine whether extending sleep in teens improves these outcomes.

    To learn more about the effects of sleep patterns on teens, we spoke to Tori Van Dyk, PhD, author of “Feasibility and Emotional Impact of Experimentally Extending Sleep in Short-Sleeping Adolescents”; the conversation is reflected in the interview below.

    What was new in your approach to evaluating the effects of extending sleep in adolescents?

    Experiments are important, because they can really show cause-effect relationships. But often they can stray far from real-world situations. In the case of teen sleep, prior experimental approaches have been limited to the summer months or have been conducted in laboratory settings. With unique schedules, social interactions, and stressors, the life of a teen may be very different outside of the lab or during the school year. We didn’t want to restrict sleep during the school year, because that could do harm (for example, poor classroom learning or poor test scores). Instead, we developed a home-based experimental protocol that could be ethically implemented on school nights, when short sleep is most common for teens, by helping teens who were already getting too little sleep get a bit more. We were able to design a protocol which significantly lengthened habitually short-sleeping teens’ sleep during school nights for a two week period. We could then compare functioning when sleep was improved to functioning when the teens experienced their typical, short sleep duration.

    What strategies did you use to help adolescents extend their sleep during the school week and how much did sleep times increase?

    Our protocol had two experimental sleep conditions: during the habitual sleep condition we asked the short-sleeping teens to go to bed and wake up at their normal times and during the extended sleep condition we worked to increase their time in bed by 1.5 hours per night. On average during the habitual sleep condition, teens slept about six hours per night. To strategize ways to lengthen sleep for a two week period during the sleep extension condition, we included both the parent and teen in a 25-minute problem-solving session. Most teens in our study were already sleeping in as late as possible in the morning, often because of early school start times. We could not control school start times, so we primarily used behavioral strategies to move bedtimes earlier. These strategies included securing buy-in from both the parent and teen, setting specific goals for bedtime and wake time, identifying and problem-solving barriers to moving bedtime earlier (e.g., completing homework earlier in the day), encouraging positive routines leading up to bed, promoting healthy sleep hygiene (e.g., eliminating digital screens at bedtime), and having teens self-monitor sleep via daily diaries. We also positively reinforced teens for adhering to the sleep schedule and encouraged parents to do the same. These strategies were successful; the teens who finished the study averaged over 70 extra minutes of sleep per night during the sleep extension condition.

    What benefits of sleep extension were you able to document?

    Despite the notion that perhaps teens who don’t sleep a lot simply do not need as much sleep, we found that by lengthening sleep on school nights, these teens experienced improvements in their emotional functioning. When they slept more, teens weren’t as sleepy, fatigued, or angry but did report feeling more energetic and better able to concentrate.

    Did participating adolescents and their parents recognize the benefits of increasing sleep time on school nights?

    The fact that parents and teens responded to questions more positively during the extended sleep condition is an indication that there was a perceived benefit to sleeping more. However, because we did not ask for a direct comparison of sleepiness or mood between the two conditions, we can’t be sure that parents and teens were fully aware of these benefits or whether perceived benefits led to any long-term behavior change. What we do know is that, despite teens reporting that lengthening their sleep for a two-week period was challenging, a vast majority of parents and teens were satisfied with the protocol and would recommend the study to other families.

    Featured image credit: Photo by Krista McPhee. CC0 public domain via Unsplash.

    The post A new approach to experimentally lengthening sleep in short-sleeping teens appeared first on OUPblog.

    in OUPblog - Psychology and Neuroscience on February 09, 2018 11:30 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Thinking in a second language drains the imagination of vividness

    It is fascinating to wonder how these effects might play out in the real world, particularly in international politics (Image via Getty/Thierry Monasse)

    By Christian Jarrett

    Mental imagery helps us anticipate the future, and vivid mental pictures inject emotion into our thought processes. If operating in a foreign language diminishes our imagination – as reported by a pair of psychologists at the University of Chicago in the journal Cognition – this could affect the emotionality of our thoughts, and our ability to visualise future scenarios, thus helping to explain previous findings showing that bilinguals using their second language make more utilitarian moral judgments, are less prone to cognitive bias and superstition, and are less concerned by risks.

    Sayuri Hayakawa and Boaz Keysar began by instructing 359 online participants, all native English speakers who were also fluent in Spanish, to mentally simulate 35 different sensory experiences, such as imagining the feeling of sand, the taste of salt, or the sight of the sun sinking below the horizon. After imagining each experience, the participants rated how vivid it was. Critically, half the participants performed the challenge in English, while for the others, all the instructions were in Spanish.

    Overall, the participants who completed the mental simulations in their second language, Spanish, reported that their mental images were less vivid. This was true for six out of the eight sensory categories tested. The exceptions were for gustatory and olfactory images (in fact, the latter were more vivid in Spanish than English, perhaps, the researchers speculated, because of the rich connotations of Spanish cuisine).

    You are probably thinking that there are many possible explanations for why participants might self-report their mental images as being less vivid in a second tongue and that this isn’t a convincing finding. To obtain a more objective measure, Hayakawa and Keysar conducted a second experiment with 307 more participants, all native Mandarin speakers who were also fluent in English, that involved them completing a word test that depended on visualisation for accurate answering. For example, a typical item asked the participants to look at the words “carrot”, “mushroom” and “pen” and say which shape is the most different. Half the participants completed this test in their native Mandarin, the others in English.

    The participants who completed this task in English, their second language, performed less accurately, suggesting they had found it more difficult to visualise the different shapes. You might wonder if this was just an effect of vocabulary competence rather than mental imagery, but the researchers checked for this possibility in two ways.

    They had the participants repeat the test with pictures rather than words (so therefore not requiring visualisation). They also asked the participants to complete a similar test that required them to identify the odd one out based on word meaning, with no need for visualisation (for example, identifying which of “carrot”, “mushroom” and “pen” belongs to the most different category). The results were clear: the participants’ accuracy was poorest of all when using their second language to perform the test that required visualisation.

    These new findings don’t tell us why mental imagery is diminished in a foreign language, although the researchers’ favoured explanation is that when we use our imagination we often draw on past experiences that likely occurred in our native tongue. Imagining in a foreign language may therefore impede the ability to use these memories effectively (consistent with this, it’s known that imagination uses brain areas that overlap with autobiographical memory).

    The researchers used a final experiment to test the idea that diminished mental imagery might explain, at least partly, why we make more utilitarian judgments in a second language. For this they asked 800 native German speakers, also fluent in English, to complete the famous Trolley thought experiment, either in their native tongue or in English. This moral dilemma asks whether you would sacrifice one large man – by pushing him off a bridge to stop a speeding trolley – to save the lives of five people who would otherwise be crushed to death by the trolley.

    Once again, half the participants read and answered the Trolley dilemma in their native German, the others in English. Additionally, all the participants stated how vivid different aspects of the dilemma were in their minds, including the large man, the would-be victims and the scene overall. Replicating earlier research, the participants who took part in English, their second language, made more utilitarian judgments, in that more of them were willing to sacrifice the large man for the five would-be victims. Importantly, this difference was explained partly by the fact that those taking part in English said that their mental picture of the large man was less vivid than those participating in German. Presumably, when your image of the man is less real, it is easier to contemplate shoving him off the bridge.

    “Over the last few years, there has been growing evidence that the use of a foreign language affects many aspects of our experiences ranging from emotional responding to decision-making,” Hayakawa and Keysar concluded. “Here we provide some evidence that such phenomena may occur because the world imagined through a foreign language is less vivid than through a native tongue.”

    It is fascinating to wonder how these effects might play out in the real world, particularly in international politics. Just this week it was reported that Sir Nick Clegg, former deputy Prime Minister of the UK and a leading Remain advocate, said he had encountered a “barely concealed, almost sneering disregard for the politics of identity and the politics of patriotism” among EU officials. English is the main language used by EU civil servants, meaning that many EU officials are operating in a second language, potentially influencing their thoughts to be less vivid, less emotional and more utilitarian. “The genesis of European integration was to go above and beyond the trap of patriotic politics,” Clegg continued. “But it was a terrible misreading of what makes people tick. We are all tribal people.” It is speculative to say, of course, but perhaps it is easier for European bureaucrats to reject such tribal instincts when they are thinking and interacting in their non-native tongue.

    Using a foreign language reduces mental imagery

    Christian Jarrett (@Psych_Writer) is Editor of BPS Research Digest

    in The British Psychological Society - Research Digest on February 09, 2018 08:48 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Trove of hummingbird flight data reveals secrets of nimble flying

    Tweaks in muscle and wing form give different hummingbird species varying levels of agility.

    in Science News on February 08, 2018 11:37 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    The wiring for walking developed long before fish left the sea

    These strange walking fish might teach us about the evolutionary origins of our own ability to walk.

    in Science News on February 08, 2018 10:40 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Watch nerve cells being born in the brains of living mice

    For the first time, scientists have seen nerve cells being born in the brains of adult mice.

    in Science News on February 08, 2018 07:00 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Top 10 papers from Physical Review’s first 125 years

    The most prestigious journal in physics celebrates its 125th anniversary, highlighting dozens of its most famous papers.

    in Science News on February 08, 2018 04:00 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Smart windows could block brightness and harness light

    A new type of material pulls double-duty as window shade and solar cell.

    in Science News on February 08, 2018 02:00 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    50 years on, nuclear fusion still hasn’t delivered clean energy

    In 1968, scientists predicted that the world would soon use nuclear fusion as an energy source.

    in Science News on February 08, 2018 12:00 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    How are low/er strength wines and beers marketed online?

    Alcohol is the fifth leading cause of death and disability globally. Increased availability of low/er strength alcohols has the potential to reduce alcohol consumption if they are consumed as substitutes for – rather than in addition to – higher strength products. For consumers to treat these products as substitutes, the marketing messages used will likely play an important role.

    In our study published today in BMC Public Health, we report an analysis of marketing messages on producers’ and retailers’ websites for low/er and regular strength wines and beers sold online by the four main supermarkets in the UK (Tesco, ASDA, Sainsbury’s, Morrisons).

    We compared messages marketing low/er strength wines and beers across 86 web pages with messages marketing comparable regular strength wines and beers across another 86 web pages. Low strength alcohol products were defined as containing less than 1.2% alcohol and lower strength alcohol products as containing less than 8.5% for wine and 2.8% for beer.
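    In code, the product classification used here amounts to a couple of threshold checks; below is a minimal sketch of those definitions (the function name and labels are mine).

    ```python
    def strength_category(product_type, abv_percent):
        """Classify a drink by alcohol by volume (ABV) using the study's
        definitions: 'low' is below 1.2% for any product; 'lower' is below
        8.5% for wine and below 2.8% for beer; anything else is 'regular'."""
        if abv_percent < 1.2:
            return "low"
        lower_cutoff = {"wine": 8.5, "beer": 2.8}[product_type]
        return "lower" if abv_percent < lower_cutoff else "regular"

    print(strength_category("wine", 5.5))   # lower
    print(strength_category("beer", 0.5))   # low
    print(strength_category("beer", 4.0))   # regular
    ```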

    We identified four main themes in the marketing messages: occasions, health-related, alcohol content and taste. Of these themes, the different occasion, health-related and alcohol content messages were more often present for the low/er strength alcohol products.

    Low/er strength wines were more likely to be marketed as suitable for consumption on any occasion or every day, for example as “lunchtime treats” or “perfect for all occasions”. Low/er strength beer was described as suitable for consumption on additional occasions to regular strength products including sports events such as “to refresh thirsty sportsmen and women”.

    Marketing for low/er strength wines and beers was also more likely to include text or images associated with health, and information about calorie and carbohydrate content. We didn’t find any messages about drinking less or alcohol associated harms.

    Taken together, the pattern of these findings suggests that low/er strength alcohol products may not contribute to a public health strategy to reduce alcohol consumption and related harms. These findings also add to an existing literature that highlights how measures intended to benefit public health – in this case wider availability of low/er strength alcohols – may benefit industry to the detriment of the health of the public.

    Because we only looked at low/er strength wines and beers found on the websites of the four main supermarkets in the UK (and the respective producers’ webpages), future studies could extend the present findings by including other marketing platforms and other countries.

    The post How are low/er strength wines and beers marketed online? appeared first on BMC Series blog.

    in BMC Series blog on February 08, 2018 09:30 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Cognitive approach to lie detection rendered useless by made-up alibi

    By Alex Fradera

    The desire to catch people in a lie has led to the development of techniques that are meant to detect the physical markers of dishonesty – from the polygraph to brain scans. However, these methods are often found wanting. The insights of cognitive psychologists have arguably fared better, based on the idea that lying is more mentally demanding than telling the truth – real knowledge is automatically called to mind when we are questioned, and this needs to be inhibited before we answer, leading to slower responses. Unfortunately, new research in the Journal of Experimental Psychology: Applied seems to pour cold water on the idea of using these subtle reaction-time differences to develop objective (and cheap) measures to get at the truth. The findings suggest that all it takes to render this cognitive approach ineffective is a prepared false alibi.

    The University of Würzburg team led by Anna Foerster set up a quasi-realistic situation, sending each of their 36 participants into a room – containing various items like a computer, pen and paper and USB stick – to read instructions in a sealed envelope that sent them on a mission. The mission involved drawing shapes on a piece of paper, tearing it and hiding the pieces in different parts of the room. Once they’d finished, the participants discovered they were going to be questioned about their actions, and that they must keep the mission secret by giving a false account of what they’d done. Their alibi was that they’d used the USB stick to email a file out of the room.

    After familiarising themselves with their alibi, the participants went to another room where they answered questions on computer about the mission, mixed in with “routine” questions with predictable answers (e.g. Did you cross the street today? Did you win the lottery?).

    The participants’ task was to use Yes/No responses to answer the routine questions honestly, while responding to the mission questions according to their false alibi. At other times (denoted with an onscreen cue), they had to do the reverse, lying in response to the true routine questions while giving a true account of their mission.

    The participants answered more slowly and less accurately when lying in response to the routine questions than when telling the truth. In other words, it was easier to tell the truth than deny it, consistent with the research literature on the cognitive demands of lying. However, for the mission questions, the opposite was true. The participants’ response times and accuracy suggested they found it easier to answer in line with their false alibi, than tell the truth about what they’d actually done on the mission.

    The next experiment looked at another cognitive sign of lying based on pointing movements. The mission and alibi were the same as before, but this time the participants answered questions on an iPad, tracing their finger towards onscreen Yes or No options which popped up randomly at different sides of the screen. Normally when people lie in this kind of situation, they hesitate before starting their response, they move more slowly, and instead of making a beeline for the lie response, their finger’s trajectory is indirect, curving towards the honest response. All these effects were seen in the current experiment when participants denied the truth of routine questions. But for the mission questions, the reverse was the case: it was the confessional true responses that were slower and curvier.
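    For readers curious how “curvier” is typically quantified in this kind of finger- or mouse-tracking study, a common measure is the maximum absolute deviation of the path from the straight line joining its start and end points. Here is a minimal generic sketch (not the authors’ exact analysis), with made-up coordinates:

    ```python
    import numpy as np

    def max_absolute_deviation(xs, ys):
        """Largest perpendicular distance of a finger/mouse trajectory from the
        straight line between its first and last sample. Larger values mean a
        more curved path, e.g. an answer that bends towards the other option."""
        points = np.column_stack([xs, ys])
        start, end = points[0], points[-1]
        dx, dy = end - start
        deviations = np.abs((points[:, 0] - start[0]) * dy
                            - (points[:, 1] - start[1]) * dx) / np.hypot(dx, dy)
        return deviations.max()

    # A fairly straight path versus one that bows towards the other response.
    print(max_absolute_deviation([0, 1, 2, 3], [0, 0.1, 0.1, 0]))  # ~0.1
    print(max_absolute_deviation([0, 1, 2, 3], [0, 1.0, 1.2, 0]))  # ~1.2
    ```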

    There are a number of competing theories for what’s behind these findings. An alibi might weaken the presence of the true information, or become such a strong mental model that it is now the first port of call. But regardless, the new results don’t look good for what was considered a solid lie detection application – and existing attempts to apply this cognitive approach to real-life court cases (such as this investigation in 2008 that studied the speed of a woman’s answers alongside brain scan evidence) may need to be looked at with renewed skepticism.

    Lying upside-down: Alibis reverse cognitive burdens of dishonesty

    Alex Fradera (@alexfradera) is Staff Writer at BPS Research Digest

    in The British Psychological Society - Research Digest on February 08, 2018 09:29 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Making charts accessible for people with visual impairments

    A collaboration between Elsevier and Highcharts sets a new standard for chart accessibility

    in Elsevier Connect on February 08, 2018 09:00 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Do Brain-Wiring Differences Make Women More Vulnerable to Concussions?

    Scientific American Mind 29, 8 (2018). doi:10.1038/scientificamericanmind0318-8

    Author: Simon Makin

    Female axons—brain cells' output cables—are shown to have a thinner structure

    in Scientific American Mind on February 08, 2018 12:00 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    The Neuroscience of Changing Your Mind

    Scientific American Mind 29, 5 (2018). doi:10.1038/scientificamericanmind0318-5

    Author: Bret Stetka

    New findings suggest it is more complicated than scientists thought

    in Scientific American Mind on February 08, 2018 12:00 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Is Addiction a Disease?

    Scientific American Mind 29, 47 (2018). doi:10.1038/scientificamericanmind0318-47

    Author: Elly Vintiadis

    The current medical consensus about addiction may very well be wrong

    in Scientific American Mind on February 08, 2018 12:00 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    How to Avoid “Purchase Regret”

    Scientific American Mind 29, 44 (2018). doi:10.1038/scientificamericanmind0318-44

    Authors: Conor Artman, Joseph Sherlock & Dan Ariely

    Follow the millennial generation's four golden rules of personal finance

    in Scientific American Mind on February 08, 2018 12:00 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Do Sexual Harassment Prevention Trainings Really Work?

    Scientific American Mind 29, 40 (2018). doi:10.1038/scientificamericanmind0318-40

    Authors: Vicki J. Magley & Joanna L. Grossman

    Remarkably little research has been performed on the effectiveness of employers' efforts to raise awareness

    in Scientific American Mind on February 08, 2018 12:00 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    The Right Stuff in Space

    Scientific American Mind 29, 34 (2018). doi:10.1038/scientificamericanmind0318-34

    Author: Corinna Hartmann

    A successful Mars mission requires that the crew can collaborate under severe stress. Psychologist Dietrich Manzey knows how to avoid strife in outer space

    in Scientific American Mind on February 08, 2018 12:00 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    My Year on “Mars”

    Scientific American Mind 29, 31 (2018). doi:10.1038/scientificamericanmind0318-31

    Author: Christiane Heinicke

    Physicist Christiane Heinicke spent 365 days sequestered with five others in a geodesic dome on the side of a Hawaiian volcano to test what isolation might do to the psyches of the crew on a Mars mission

    in Scientific American Mind on February 08, 2018 12:00 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    The Importance of Fostering Emotional Diversity in Boys

    Scientific American Mind 29, 28 (2018). doi:10.1038/scientificamericanmind0318-28

    Authors: June Gruber & Jessica L. Borelli

    Boys grow up in a world inhabited by a narrower range of emotions

    in Scientific American Mind on February 08, 2018 12:00 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    The Secret to a Better Night's Sleep: A Sense of Purpose?

    Scientific American Mind 29, 25 (2018). doi:10.1038/scientificamericanmind0318-25

    Author: Daisy Grewal

    Intriguing new research suggests a positive sleep role for a meaningful life

    in Scientific American Mind on February 08, 2018 12:00 AM.