Tag: mental health help

  • What Is Mental Health First Aid?


These classes offer participants knowledge they can use in the event of a mental health emergency.

Mental Health First Aid is a term that has gained a lot of momentum lately, especially through Lady Gaga’s Born This Way Foundation, which encourages people to learn about mental health.

A report on CNN explains the importance of taking mental health first aid classes and how such training can be beneficial.

    These classes began 12 years ago, and close to two million people have attended them since. In 2015, the government allotted $20 million for this program, and in most areas of the U.S. you can take this class for free.

    Inside The Program

    While an eight-hour seminar can’t take the place of seeing a therapist or mental health professional, the program has been likened to learning CPR to equip yourself with lifesaving skills.
     
    Betsy Schwartz, an executive at the Mental Health First Aid program, says, “We’re not training anyone to be a professional. We’re only teaching people how to be an empathetic friend, family member or coworker.”
     
CNN attended a Mental Health First Aid seminar in Ohio, a state that has grappled with alarming rates of addiction and suicide.

    As one social worker explained, “Ohio, since 1999, has had a 30% increase in suicide deaths and is above the national average for suicide rates. So it’s really important that we’re getting information in people’s hands. They’re not easy conversations to have and oftentimes people shy away from that.”

    At this seminar, instructors explained the signs to look out for with depression and anxiety, and how to help calm a person in the midst of a panic attack.

    The acronym ALGEE was introduced and explained.

    A – Assess for risk of harm or suicide 

    L – Listen non-judgmentally 

    G – Give information and reassurance 

    E – Encourage professional help, if needed 

E – Encourage self-help

Diving deeper into the final step, encouraging self-help, one instructor explained, “It’s going to be very important to have some buy-in into [someone’s] own recovery. We all like to be able to say ‘I did this.’ Get them involved in those decision-making skills.”

One person who attended the class had lost a brother to suicide and had mental illness in her family. “Every day in life you forget to listen and be aware,” she said. “If you’re uncomfortable, taking this class will help you become more confident in reaching out to somebody.”

    View the original article at thefix.com

  • Prince William Helps Promote UK’s Crisis Text Line "Shout"


    Crisis Text Line has already been hugely successful in the US and Canada.

Crisis Text Line, the non-profit service that lets people in the midst of a mental health crisis, or who simply need to talk, reach out by text rather than by phone, is taking its first step abroad by expanding to the UK as “Shout.”

    This move was made possible with a generous grant from the Duke and Duchess of Cambridge and the Duke and Duchess of Sussex. The Duke of Cambridge, also known as Prince William, made a three-minute video promoting the service and encouraged UK residents to sign up as volunteers.

    “As texting is private and silent, it opens up a whole new way to find help. It provides instant support. You can have a conversation anywhere, at any time ― at school, at home, on the bus, anywhere,” he says in the video. “I am incredibly excited to be launching this service knowing it has the potential to reach thousands of vulnerable people every day.”

    According to a statement made by Crisis Text Line CEO Nancy Lublin, all four members of the royal family visited the Shout offices multiple times, “meeting with staff and volunteers to see firsthand how the service and platform function.”

    The service came to Canada in 2018 and has been a success there as well as in the U.S., where they recently reached their 100 millionth text milestone.

To celebrate, Crisis Text Line created Crisis Trends, a data visualization tool that lets anyone explore the vast amount of data collected from those conversations. Users can see which subjects are most common in each U.S. state (plus Puerto Rico), what time of day and day of the week people text in about each issue, and how trends have changed over time.

After the UK, Crisis Text Line plans to expand to Ireland, Australia and South Africa this year, followed by Latin America in 2020. The service grew out of Lublin’s own habit of offering support and advice to people who needed it, and her goal is to expand it across the globe and improve mental health for all people.

    “We are proud of the work Crisis Text Line and our crisis counselors have done in the United States to ease the pain of Americans,” she said in a statement. “If other world leaders will follow the lead of Prince William, together, we can end this epidemic of emotional crisis.”

    View the original article at thefix.com

  • Walgreens To Train Staff In Mental Health First Aid


    The 8-hour course will teach pharmacists mental health “literacy” and “how to help someone in crisis and non-crisis situations.”

    Walgreens’ latest public health initiative aims to teach pharmacists and staff how to identify and respond to signs of mental health or substance use issues.

    Through a partnership with the National Council for Behavioral Health and the American Pharmacists Association, the national drug store chain is training staff in mental health first aid—an 8-hour course on “mental health literacy, understanding risk factors and warning signs for mental health and addiction concerns, and strategies for how to help someone in both crisis and non-crisis situations,” the company stated.

    “With the growing need for services and resources to help those living with mental health conditions, as well as substance use and addiction, we can play an important role by giving our pharmacists and certain team members the training to help those in crisis,” said Alex Gourlay, chief operating officer of Walgreens Boots Alliance.

    More than 1.5 million people in the U.S. have completed the course.

    “One in five people experiences a mental health or substance use issue in a given year and it’s likely that most of those individuals use a pharmacy’s services during that year,” said Linda Rosenberg, CEO of the National Council for Behavioral Health.

This year, Walgreens plans to finish installing safe medication disposal kiosks at all of its locations. It also offers naloxone without a prescription.

    In 2016, the company launched Walgreens.com/MentalHealth in collaboration with Mental Health America to provide a resource that connects people with treatment options, free screening tools and information such as “How to Manage Anxiety Medications” and “Helping a Family Member Who Has PTSD.”

    Another major retailer, Walmart, is supporting community mental health by establishing a mental health clinic in a store in Texas.

Last year Walmart opened its first clinic in its Carrollton, Texas store, with plans to open more nationwide. The clinic is staffed by a licensed social worker and offers treatment for anxiety, depression, grief, relationship issues and more.

    “People don’t know how to find a behavioral health or mental health professional. People don’t know where to go and what to do,” said Dr. Russell Petrella, president and CEO of Beacon Health Options, the company that collaborated with Walmart to open the clinic. “We’re trying to mainstream behavioral health services.”

    View the original article at thefix.com

  • Mental Health Apps Could Be Sharing Your Private Data


    A new study found that dozens of mental health apps shared user data with various advertisers, including big names like Facebook and Google.

Despite expectations of confidentiality, individuals who use mental health apps may have their private information shared with advertisers.

    According to a new study published in JAMA Network Open, some mental health apps are sharing private data without the app user’s knowledge. 

    Tech the Lead reports that researchers looked into 36 different mental health-related apps. Of those 36, they discovered that 33 shared user data with various advertisers, including big names like Facebook and Google as well as smaller organizations. 

    Overall, 92% of the apps studied were determined to have shared information with a third party and about 50% of those did not notify users of doing so.

    Of the apps studied, three even explicitly stated they would not share data and nine others completely lacked a privacy policy of any sort. 

While not all of the shared data was related to medical conditions or personally identifiable, the fact that any information was shared at all is a red flag, says John Torous, co-author of the study.

    “It’s really hard to make an informed decision about using an app if you don’t even know who’s going to get access to some information about you,” Torous said, according to Tech the Lead. 

    Researchers did find, however, that some of the information shared was sensitive, such as journal entries or information about substance use. 

    Steven Chan, a physician at Veterans Affairs Palo Alto Health Care System who was not involved in the study but has worked with Torous before, tells The Verge that advertisers could use this information to manipulate audiences. 

    “Potentially advertisers could use this to compromise someone’s privacy and sway their treatment decisions,” he said. 

    Chan cited one example in which someone who is trying to quit smoking may be marketed cigarette alternatives. 

    “Maybe if someone is interested in smoking, would they be interested in electronic cigarettes?” he said. “Or could they potentially introduce them to other similar products, like alcohol?”

Researchers concluded that mental health app users likely lack both information about and choice over such sharing practices.

    “Data sharing with third parties that includes linkable identifiers is prevalent and focused on services provided by Google and Facebook,” the researchers wrote. “Despite this, most apps offer users no way to anticipate that data will be shared in this way. As a result, users are denied an informed choice about whether such sharing is acceptable to them.”

    View the original article at thefix.com

  • Algorithm Can Identify Depression In Speech, Text


    The technology could potentially be used to help more people get treatment for depression.

    Researchers at MIT have developed an artificial intelligence system that can identify depression simply from listening to people talk or by monitoring their texts. 

The technology, which uses a neural-network model, can listen to or read natural conversations and identify speech and communication patterns that indicate depression.

“The first hints we have that a person is happy, excited, sad, or has some serious cognitive condition, such as depression, is through their speech,” Tuka Alhanai, first author of the paper outlining the technology, told MIT News.

Doctors diagnose depression by asking patients questions and listening to their responses. In recent years, machine learning has been hailed as a way to improve such diagnostics.

    However, many of the existing systems require a person to answer specific questions and then make a diagnosis based on the answers that a person provides. “But that’s not how natural conversations work,” said Alhanai, a researcher at the Computer Science and Artificial Intelligence Laboratory (CSAIL).  

    The new system can be used in more situations because it monitors natural conversations. 

    “We call it ‘context-free’ because you’re not putting any constraints into the types of questions you’re looking for and the type of responses to those questions,” Alhanai says. “If you want to deploy [depression-detection] models in a scalable way… you want to minimize the amount of constraints you have on the data you’re using. You want to deploy it in any regular conversation and have the model pick up, from the natural interaction, the state of the individual.”

The model was developed by analyzing speech and text from people who were depressed and people who were not, then identifying patterns in each group. For example, people with depression might speak more slowly or take longer pauses between words. In text messages they might use words like “low,” “sad” or “down” more often.

    “The model sees sequences of words or speaking style, and determines that these patterns are more likely to be seen in people who are depressed or not depressed,” Alhanai said. “Then, if it sees the same sequences in new subjects, it can predict if they’re depressed too.”
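To get a feel for the word-pattern idea described above, here is a deliberately simplified sketch, not the MIT neural-network model, that scores a message by how often it uses depression-associated words. The word list and function name are invented for illustration; the real system learns its patterns from data rather than using a fixed list.

```python
# Toy illustration of word-pattern scoring (NOT the MIT model).
# The real system is a neural network trained on audio and text;
# the hypothetical word list below stands in for learned patterns.
from collections import Counter

# Words the article says people with depression may use more often.
PATTERN_WORDS = {"low", "sad", "down"}

def pattern_score(message: str) -> float:
    """Return the fraction of words in `message` that match the list."""
    words = message.lower().split()
    if not words:
        return 0.0
    counts = Counter(words)
    hits = sum(counts[w] for w in PATTERN_WORDS)
    return hits / len(words)

print(pattern_score("feeling low and sad today"))  # 0.4
```

A real detector would combine many such signals (word sequences, pacing, pauses) and learn their weights from labeled examples rather than hard-coding them.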

The technology could potentially be used to help more people get treatment for depression. Although the condition is very common, 37% of people with depression receive no treatment at all.

    Alhanai’s team said their technology could be used to develop apps that monitor a person’s conversations and send alerts when their mental health might be deteriorating. It could also be used in a traditional counseling or medical setting to assist medical professionals. 

    View the original article at thefix.com