Yoga and Mindfulness Improve Emotional Health of Third-Graders

Psychology

Education

Health and Medicine

Date: April 2018

Source: Science Daily, Psychology Research and Behavior Management

A study by Tulane University found that practicing yoga and mindfulness activities improved the well-being and emotional health of third graders who showed symptoms of anxiety.

A group of 20 students received a special yoga intervention, while a control group of 32 students received the school’s usual care, including counseling. The yoga practice included breathing exercises, guided relaxation, and yoga postures. The researchers used two instruments to assess changes in the students: the Brief Multidimensional Students’ Life Satisfaction Scale (Peabody Treatment Progress Battery version) and the Pediatric Quality of Life Inventory.

“The intervention improved psychosocial and emotional quality of life scores for students, as compared to their peers who received standard care,” said principal author Alessandra Bazzano, associate professor of Global Community Health and Behavioral Sciences at Tulane University School of Public Health. “We also heard from teachers about the benefits of using yoga in the classroom, and they reported using yoga more often each week, and throughout each day in class, following the professional development component of intervention.”

Sources:

  • www.sciencedaily.com/releases/2018/04/180410100919.htm
  • Alessandra N. Bazzano, Christopher E. Anderson, Chelsea Hylton, Jeanette Gustat. Effect of mindfulness and yoga on quality of life for elementary school students and teachers: results of a randomized controlled school-based study. Psychology Research and Behavior Management, 2018; Volume 11: 81. DOI: 10.2147/PRBM.S157503

Google with your Mind

Psychology

Technology

Date: April 2018

Source: New Scientist

Two scientists at the Massachusetts Institute of Technology have developed a device that can detect what a person is thinking, send queries to Google, and receive replies from the Internet.

Arnav Kapur and Pattie Maes, the developers, call it AlterEgo. The device is a headset that places sensors on seven areas of the cheeks, jaw, and chin, where it detects signals from these speech-related muscles. In a demonstration with New Scientist writer Chelsea Whyte, Kapur was asked several questions, such as the population of Santiago, Chile, the square root of 360,005, and the product of two large numbers. Kapur simply repeated the questions in his mind, and the computer responded with the correct answers. In a test with eight people, AlterEgo recognized words and numbers with 90% accuracy.
AlterEgo is just one of several artificial intelligence systems being developed to read thoughts through brain waves or nerve signals. The implications can be alarming. The technology is still in its infancy, but it seems only a matter of time before such devices can read our thoughts with high accuracy.


Source: New Scientist, April 7-13, 2018

Creativity Linked to Alpha Brain Waves

Psychology
Biology
Humanities
Education
Date: March 2018
Source: New Scientist

If you need to produce your best creative work, try boosting your alpha brainwaves.
Joel Lopata at the University of Western Ontario, Canada, and his colleagues have found that people with more synchronised alpha waves are more creative and produce work of higher quality.

Source: https://www.newscientist.com/article/2162646-very-creative-people-have-a-special-kind-of-brain-activity/

Creative People’s Brain Activity

Artificial Intelligence Can Interpret Brain Scans to Know What a Person Sees

Psychology
Technology
Date: March 2018
Source: New Scientist

An artificial intelligence developed in Japan can now guess what a person sees by analyzing brain scans.

An article in New Scientist states that “the AI is given an image of a person’s brain, taken with an fMRI scanner. The fMRI scanner shows the surges in blood flow that correspond with activity, so the different parts of the brain involved in processing the image light up on the scan. From this, the AI then produces a caption based on what it thinks the person was viewing. For example, one caption it generated was ‘A dog is sitting on the floor in front of an open door,’ which correctly described the scene.”

Source: https://arxiv.org/pdf/1802.02210.pdf

New Scientist, March 2018