Facebook has denied it is targeting insecure young people in order to push advertising, amid a row over a leaked document.
A research paper, reported on but not published by The Australian newspaper, was said to go into detail about how teenage users post about self-image, weight loss and other issues.
Facebook confirmed the research was shared with advertisers, but said the article was “misleading”.
“Facebook does not offer tools to target people based on their emotional state,” the network said.
“The analysis done by an Australian researcher was intended to help marketers understand how people express themselves on Facebook.
“It was never used to target ads and was based on data that was anonymous and aggregated.
“Facebook has an established process to review the research we perform. This research did not follow that process, and we are reviewing the details to correct the oversight.”
‘Stressed’ and ‘Stupid’
According to The Australian, the report was seen by marketers working for several major Australian banks, and was written by Facebook executives David Fernandez and Andy Sinn.
The document said Facebook had the ability to monitor photos and other posts for users who may be feeling “stressed”, “defeated”, “anxious”, “nervous”, “stupid”, “overwhelmed”, “silly”, “useless” or a “failure”.
The research only covered Facebook users in Australia and New Zealand.
The statement on Monday appeared to soften an earlier comment, which had mooted the possibility of disciplinary action over the document. However, the BBC understands such action could still be taken, pending an investigation into how and why the research was carried out.
The company has guidelines that take into account any possible “adverse effects” on users, and whether people would reasonably expect the network to conduct such analysis. It said the research appeared to have gone against some of these policies.
Facebook has faced criticism in the past over manipulating users’ feeds for the purpose of research. In 2014 it emerged that the firm had intentionally shown certain types of content to 700,000 users to see whether their emotions could be manipulated.