But it's harder to find statistics about how often a normal user posts, because normal is unique to the individual. Pew statistics on this are varied: a 2011 report found 15 percent of users update their status once a day; 16 percent have never updated at all.
And then there are those who post upwards of ten times a day, from daybreak into the small hours, churning out words in their thousands.
Recently a friend of mine began to post excessively on Facebook every day, usually between 5 AM and 8 AM. One day I used wordcounter.net to tally his output: 9,734 words in total; another day, 6,288. His page became an archive of racing thoughts, manifested in aggressive, disturbing updates that friends found difficult to read.
Readers may remember a similar, very public series of tweets by actor Amanda Bynes in the run-up to her eventual hospitalization, along with countless other examples, both public and private.
One in four of us suffer from mental health problems every year, and around one third of the world's population uses Facebook. "FOMO" and "online addiction" might be thrown around as buzzwords, but the overlap between mental health and social media use is a burgeoning field of study.
The Facebook use in Affective Disorders (FAD) Study is currently underway in Melbourne, Australia; it's a collaboration between researchers from the Monash Alfred Psychiatry Centre and computer scientists from the School of Intelligent Systems at RMIT University. It tracks the Facebook activity and mood of participants with bipolar disorder to work out what "normal" is for them, with the hope of alerting subjects when they begin to stray from their usual patterns towards a potential relapse.
The study launched in June, and is open to anyone in Australia who has been diagnosed with bipolar disorder and uses Facebook.
I spoke to psychiatrist Rowan Miller, who was inspired to create the FAD study after meeting a patient similar to my friend when he was still a med student.
"We were talking about what it was like to have a mental illness as a student," he told me, "and I asked if there was any way we could have foreseen the relapse coming. And he said, 'Yes, my Facebook. For weeks, friends have been texting me asking if I'm ok because I've been on Facebook so much.'"
For people with bipolar disorder, a manic episode can involve symptoms like impaired judgement, aggression, euphoria, rapid speech, racing thoughts, and, at worst, breaks from reality. Coupled with a decreased need for sleep, for some people the lure of venting their thoughts on Facebook proves overwhelming. Miller recalled asking why the patient hadn't gone out and socialized offline: "He replied that if you're awake at three in the morning, and you have the computer in front of you, it's just the lowest barrier to socializing with other people."
In the study, volunteers grant access to their Facebook metadata, allowing Miller and his team to analyze the frequency of their Facebook posts but not the content. The team then develops behavioral algorithms that track the rhythm of each user's regular Facebook posts and alert them when their posting frequency breaks from its usual pattern.
"More than half of our participants have had relapses on Facebook before," said Miller, "So we're able to train the machines to understand how their relapses occur. The thing about mental illness is that it's not like a blood test—you can't say, 'Oh, their Facebook level is 50, they must be becoming manic…' There's absolutely no standard. But we tailor the study to them."
The team is building an app as part of the study, which will be free in Australia. When the algorithms predict a manic episode, it will give users the option of built-in alerts first sent to them, then to a nominated friend, family member, or healthcare professional.
"Someone they'd be comfortable with us contacting," said Miller. "The absolute extent of our message is going to be, 'Hey, look, some of our algorithms have made us concerned about this person. Would you mind just checking up on them? Make sure that they're sleeping correctly, and that they're taking their medications.'"
The FAD study raises ethical questions about our relationship to social media, and how much influence over our minds and bodies we are willing to grant it. Studies conducted by Facebook and other researchers have confirmed the service's potential to affect mood patterns, so does the site bear a responsibility to look after its users' mental health?
But previous attempts to monitor mental health through social media have been dismissed as heavy-handed and intrusive. The Samaritans Radar app, a UK-based service aimed at preventing suicide by tracking keywords on Twitter, came under fire for monitoring words like "die," "help," and "alone" without picking up on sarcasm, and for focusing attention on those who might already feel vulnerable under scrutiny. It was suspended shortly after launch over privacy concerns.
Miller is confident that his study stands apart in its ability to integrate healthcare with the services we use every day, rather than making patients go out of their way to self-monitor.
Miller also discussed the idea of building self-censorship into the app—a kind of mental health equivalent to Mail Goggles, the Gmail feature designed to make users think twice before sending drunk emails composed late on Friday nights. Long Reddit threads detail the remorse people can feel when, once stabilized, they read back over old Facebook posts.
"It's not just that these people become depressed," Miller said. "It's that they then reflect on the things that they did when they were manic, which a lot of them regret. On Facebook it's there, like a mural, literally a timeline."
I asked Miller whether social media should, and could, look after its own users.
"I think society has a duty to look after people with mental illnesses," he said. "So many people I know who come into the emergency department are brought in because someone standing in the street alerted the authorities because they noticed something wasn't quite right. And I hope the online community behaves no differently."