There have been fresh calls for the links between social media usage and adolescents' mental health to be examined in more detail. These links are not one-dimensional, and most research points to the variables of content and time spent. However, one issue that is consistent across all social media platforms is content that is not age appropriate. One does not have to look far to find adult-themed accounts on Instagram, Facebook and Twitter.
All platforms have a reporting mechanism, but the process can take up to 72 hours before the content is even reviewed. Algorithms are not sophisticated enough to detect content that has been manipulated to avoid detection, and even when such content is reported it is often not removed. SEN World has long petitioned the platforms with evidence of reported accounts that are unmistakably adult, yet the content remains up. There has been no response from Twitter, Facebook or Instagram despite the clear evidence presented. We believe more needs to be done by the providers: deflecting concerns with the line "we are developing our detection" is not sufficient, and current detection mechanisms lack efficacy.
Professionals who work with children and adults with special needs know that these groups are particularly vulnerable to visual content and suggestion.