Last September, just a few weeks into the school year, Sabine Polak got a call from the guidance counselor. Her 14-year-old daughter was struggling with depression and had contemplated suicide.
"I was completely floored," said Polak, 45, who lives in Valley Forge, Pennsylvania. "I had no clue she was even feeling remotely down at all. When I asked her about it, she just kept saying she wanted to get away from it all ... but I didn't know what that meant."
After taking her to a crisis center, which banned phone use for anyone checking in, Polak learned from her daughter that the pressures of social media were driving her increased anxiety. The main source of stress: waiting for her friends to open and respond to messages and photos on Snapchat.
"It became really addictive [for her] -- the sense that you always have to be on, and always have to be responding to someone in order to be seen or to exist," she said. "She would look at her phone and go from calm to storming out of the car, and the rest of the night, just curled up in her bed."
Polak turned on some of the phone's parental controls, but they were easy for her daughter to circumvent. She took the phone away but worried this move would only drive her daughter to think about taking her own life again. She gave the phone back only to find her daughter "self-soothing" on another social app, TikTok -- so much, in fact, that "she literally believes that she can't fall asleep without it." As Polak put it, her daughter "feels lost, like, 'I have no idea what to do with myself if I'm not on social media.'"
Polak is among a generation of parents who did not spend their childhoods with social media apps and are now struggling to understand and navigate the potential harms that social media can have on their kids' mental health as they grow up. In interviews over the last month, nearly a dozen parents spoke with CNN Business about grappling with how to deal with teens who experience online harms such as bullying, body image issues and pressures to always be Liked. Most of the parents said these issues either began or were exacerbated by the pandemic, a time when their children were isolated from friends, social media became a lifeline and the amount of screen time increased.
The issue of social media's impact on teens gained renewed attention this fall after Facebook whistleblower Frances Haugen leaked hundreds of internal documents, some of which showed the company knew of Instagram's potential to negatively impact users' mental health and body image, especially among teenage girls. But Haugen also touched on the impact on parents. During her testimony before Congress in October, Haugen cited Facebook research that revealed kids believe they are struggling with issues like body image and bullying alone because their parents can't guide them.
"I'm saddest when I look on Twitter and people blame the parents for these problems with Facebook. They say, 'Just take your kid's phone away.' But the reality is that it's a lot more complicated than that," she said in her testimony.
"Very rarely do you have one of these generational shifts where the generation that leads, like parents who guide their children, have such a different set of experiences that they don't have the context to support their children in a safe way," she added. "We need to support parents. If Facebook won't protect the kids, we at least need to help the parents support the kids."
Facebook, which rebranded as Meta in October, has repeatedly tried to discredit Haugen and said her testimony and reports on the documents mischaracterize its actions and efforts. But the outcry from Haugen's disclosures pressured Facebook to rethink the launch of an Instagram app for children under 13. (Children under the age of 13 are not currently permitted to create accounts on any Meta platforms.)
It also helped spur a series of congressional hearings about how tech products impact kids, featuring executives from Facebook, TikTok and Snapchat's parent company, Snap. This week, the head of Meta-owned Instagram is set to appear before Congress as lawmakers question the app's impact on young users.
In their testimonies, the TikTok and Snap executives showed humility and acknowledged the need to do more to protect young users on their platforms. Jennifer Stout, Snap VP of global public policy, said the company is developing new tools for parents to better oversee how their children are using the app. Instagram previously said it's "increasingly focused on addressing negative social comparison and negative body image."
Ahead of the congressional appearance this week, Instagram introduced a "Take a Break" feature, which encourages users to spend some time away from the platform. The company also said it plans to take a "stricter approach" to the content it recommends to teenagers and actively nudge them toward different topics if they've been dwelling on any type of content for too long. Starting next year, it also plans to introduce its first tools for parents, including an educational hub and parental monitoring tools that allow them to see how much time their kids spend on Instagram and to set time limits.
"You can offer tools to parents and you can offer them insights into their teen's activity, but that's not as helpful if they don't really know how to have a conversation with their teen about it, or how to start a dialogue that can help them get the most out of their time online," Vaishnavi J, Instagram's head of safety and well-being, told CNN Business this week.