Hong Kong internet users and psychologists have accused Facebook of acting unethically after learning that it secretly manipulated the emotions of its members as part of a psychological experiment. They also called for greater transparency from the social media giant.

"It's ethically wrong," social worker Daniel Kwan Ka-king, 28, said yesterday. "I'm concerned about my privacy."

Researchers altered the news feeds of 689,000 users for one week in January 2012, making them display more positive or more negative posts to see whether this would prompt more positive or negative posts from those users.

Facebook conducted the research with Cornell University and the University of California, San Francisco, to test whether "emotional contagion" spreads across a social network. Critics of the study, published by the US National Academy of Sciences last month, likened affected users to "lab rats".

"There was no informed consent," said Michael Eason, a professional counsellor with private practice Psychology Resources. "The information … is important, but they didn't go about it in an ethical way."

He said manipulating the world view of those who were clinically depressed or had social anxiety issues was particularly risky. "For someone who's suffering from mental health issues, [seeing too many negative things] could lead to self-harm."

Facebook said users agreed to take part in such experiments when signing up for the service. But Eason said assenting to the user agreement should not count as informed consent, particularly when there was a risk of harm.

Co-author Adam Kramer took to the social network to defend the paper. "We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out," he wrote.

"We found the exact opposite to what was then the conventional wisdom: seeing a certain kind of emotion encourages it rather than suppresses it."