Tag Archives: measurement

Online Community Management – Art or Science?


A month ago, an acquaintance from NUS’s Centre for Development of Teaching and Learning (CDTL) asked me if I would be interested in doing a presentation at a conference on teaching. As someone teaching social media engagement, I said I wasn’t exactly sure I could contribute – my approach isn’t as “academic” or pedagogical as a conference of educators might expect. Although, the fact is, I’ve been trying to address the issue of teaching something inherently unteachable: community engagement.

That sounds interesting, he said. So I continued. The problem with trying to teach human engagement is that it is full of “soft”, indefinable things like creating trust through sustained exchange and engaging a person’s interest through content. I’ve tried to define them in terms of frameworks, but that in itself always feels tinged with futility. My acquaintance nodded knowingly and said that this is precisely what needs to be done – “framing” something that resists definition. Even in the business of education, the same problem applies – how do you scientifically define the process of “teaching”? And in doing so, do you lose its essence?

The simple answer is that online community management is both an art and a science. An art because it has to deal with human behaviour – particularly the irrational kind (even employees can be positively irrational); because it has to deal with nuances of language, emotion and atmosphere; and because it takes leaps of faith and educated guesses – stuff that a pure scientist would either balk at or be confused by.

But we cannot ignore the science part, because at the end of the day, online community management makes use of technology. This means two things:

1) It, as well as social media in general, is a product of computing – specifically social computing. Meaning, if computing (and computers) didn’t exist, social media wouldn’t exist either. We do not know of a better tool that powers online virality or ambient awareness or asynchronous communication.

2) It is measurable, because it is computerized – it is a creature of numbers. And the measurement is becoming increasingly sophisticated. While it can still be argued that web technology measures the quantitative better than the qualitative, the fact is there are many computer scientists out there trying to create better ways to measure social media – be it in terms of sentiment analysis or social engagement.

Coming back to online community management, it may be some time (if ever) before one can truly measure qualities like trust, or the real sentiment behind a negative comment. But when it comes to defining and teaching the subject, the need to present both the “art” and the “science” of the field is immediate. For example, the topic of developing a community from start to sustainability needs a framework to define it in a “step-by-step” manner.

As my acquaintance puts it: it may be a difficult and perhaps futile thing to do, but someone’s gotta do it. I can only say I will try.


Measuring community for the non-techie

If, in this (still) number-obsessed – bo(b)ssessed – world, someone asks me how I would measure a community’s success in non-technical ways, what would I say?

Proxy Measurement

Not a new idea, but still worth looking at. Partly because sometimes we have no choice.

Proxy measurement means measuring some other indicator to reflect the performance of the social media campaign under scrutiny. Say, for example, you have sales figure X before you start a new social media campaign. Note that down. After the campaign, check the figure again. Is it X+Y now? Then Y is the ap-proxy-mate gain the campaign caused. May. Have caused.

Look at it with common sense. During the social media campaign, did you post something of note that caused a big change/spike in sales? On that day, did sales/pageviews go up? Quite often we say things like: “The photo post on Tuesday attracted a lot of comments and likes, and that appeared to have boosted sales. We made 15 more sales that day than usual.” Common sense says the stars aligned and something nice happened. So, might as well make it a little bit more powerpoint-presentable and state that:

12 comments and 80 likes resulted in 15 more sales.

Do this multiple times and you’ll have more and more data, and begin to see patterns.

Not precise enough? Speculative? Rubbish? Hey, remember, there was a time when computers and Google Analytics and even Excel did not exist. How did businesses measure sales performance then?

“I had my guy stand in the street giving out flyers at 6pm and I made about 10 more sales than usual on a Monday evening.”

“Our newspaper ad went up on Thursday and traffic in the store increased by 100-150 people, resulting in 15 more sales than usual.”

“Over 2012 our fan page increased its fan base by 10,000 likes. 8,000 of those were amassed during our social media campaign. The average number of likes and comments on our posts has gone up by about 25% compared to one year ago.”

It’s common-sense logic. It looks speculative at first, when you have just started and your sample size is very small. Do this over the years and you’ll have 100, 500 samples forming clear patterns. It’s worth a shot.
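To make the idea concrete, here is a minimal sketch of proxy measurement in Python. Every figure and field name below is made up for illustration – each sample pairs one post’s engagement with the extra sales logged that day over the usual baseline:

```python
# Proxy measurement sketch: log engagement next to the sales delta for
# each notable post, then look for a pattern as samples accumulate.
from statistics import mean

# Each sample: a post's engagement vs. extra sales that day over baseline.
samples = [
    {"comments": 12, "likes": 80,  "extra_sales": 15},
    {"comments": 3,  "likes": 25,  "extra_sales": 4},
    {"comments": 20, "likes": 150, "extra_sales": 22},
]

def lift_per_like(samples):
    """Rough average of extra sales per like – a proxy, not a proof."""
    return mean(s["extra_sales"] / s["likes"] for s in samples)

print(f"On average, ~{lift_per_like(samples):.2f} extra sales per like")
```

Three samples prove nothing, but as the log grows over months the average lift (or the lack of one) becomes a pattern you can actually put on a slide.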

Anecdotal Evidence

You’re in a meeting and your boss asks you how the website/Facebook page is doing. You can report the numbers, month after month. But you know there’s one thing far more powerful, effective and memorable – a customer’s compliment.

No one really remembers actual numbers (only that they are going up. Or worse, down). But if you show customers being happy with the company’s services, expressing positive sentiment, or recommending your product – that sticks in the minds of people (like da boss), because it is a source of great satisfaction and pride. Every business wants the comfort of knowing its customers are happy.

So, collect these like the rare artifacts they are: customers’ positive feedback. Embed their quotes in your presentation slides, tag them with real names. Even if your numbers are not impressive, a single compliment from a customer can work wonders. Even a complaint from a customer can be used as an opportunity to show how you handle a crisis.

Businesses regularly run surveys to collect both numbers and “Any Other Comments” – you’re doing the same thing, without incurring the cost of running a physical survey. So collect these comments, categorize them (positive, negative, suggestions, opportunities) and report them. You might even be surprised how many approving nods you can get from a screenshot of a nice comment on a Facebook page.
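Even the categorizing can start crude. Below is a toy Python sketch of sorting collected comments into report buckets – the example comments and keyword lists are pure placeholders, and in practice you’d apply human judgement (or proper sentiment analysis) rather than keyword matching:

```python
# Toy comment categorizer: first matching keyword bucket wins.
# Keywords and comments are illustrative placeholders only.
comments = [
    "Love the new service, thank you!",
    "Delivery was late and nobody replied.",
    "Maybe add a weekend opening hour?",
]

KEYWORDS = {
    "positive":   ["love", "great", "thank"],
    "negative":   ["late", "rude", "nobody replied"],
    "suggestion": ["maybe", "could you", "add"],
}

def categorize(comment):
    text = comment.lower()
    for bucket, words in KEYWORDS.items():
        if any(w in text for w in words):
            return bucket
    return "other"

report = {}
for c in comments:
    report.setdefault(categorize(c), []).append(c)

for bucket, items in sorted(report.items()):
    print(f"{bucket}: {len(items)}")
```

The output is exactly the kind of breakdown (positive: 1, negative: 1, suggestion: 1) that slots straight into a monthly report – with the best quotes pasted underneath.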

Online metrics is not a decimal system

There was a time when the measurement of a community, or rather its website, was counted in hits. Remember? If you don’t, you were probably born in the 1990s or later. Back then – and it is actually the 90s I’m talking about – every HTML-dabbler and web marketer was happily installing grab-off-the-internet-shelf hit counters and putting them on their websites. Every day we would log in, and a little cheer would go up in our toddling Web 1.4 hearts when we saw that the hit counter had gone up another couple of tens or hundreds. Nothing was more satisfying than the solidity of seeing numbers go up. It was so, er, countable.

Hits are so old, this is a gif. On Netscape.

Even on the days when the server screwed up and overwrote the hit counter data file with, well, nothing, and caused the hit counter to reset to zero – even on those days, we’d go “what the hell” and just FTP up a “corrected” file (it was just a text file with a number) based on best memory. Yesterday was like 623,XXX so I guess it oughta be about … 647,231 this morning? Yes, I did do it. I didn’t feel I was cheating, mind you – it was the server or the hit counter CGI (Common Gateway Interface, not computer-generated imagery) – the little executable programme on the server – that screwed up and overwrote the number with a blank file. I was just restoring the hit counter to what it ought to be, based on best memory. Sometimes I even gave discounts.

… Arrrgh, the fact is the whole thing is a farce.

Alright, give us a break, we were just trying. Trying to measure the success of a website, using counters that accurately count an inaccurate attribute – one that gives only a rough idea of the approximate number of people visiting. Or perhaps just passing through.

Numbers give us comfort. They are so, um, defined. Even if they accurately measure the inaccurate, it doesn’t matter. Even if I had to approximate today’s hits because the server screwed up the count last night, it really didn’t matter. I mean, it’s not meant to be accurate, right? After all, 4,012,599 hits versus 4,076,221 hits makes no difference to the people visiting the site. They just figure it’s an impressive number, and that it has grown a bit since the last time they were here. Wow. Must be a cool website.

And that’s all that really, really mattered.


It didn’t even matter that for years and years (and for some, even today), many people did not realize that the hit counter itself was a lie. What is a hit, I ask you? For many people, especially web marketers and their bosses, it meant customers. Visitors. People who were looking at your website on their browser!

NOT EXACTLY – a hit is any request to the server for a file. Not just the webpage, but the 5 images, 12 design elements, the CSS file, the little stars that make up the ratings, the 8 avatar pictures, as well as the cat picture.

Land on one page and that act could generate, like, 32 hits. And you CAN hit the refresh button if you want more.

Thus, at best, the hit counter only gave a highly inflated gauge of your website’s visitorship. Not that it isn’t useful – you just need to be aware of exactly what you’re measuring. The same goes for pageviews.

Your community/website metric stands for another number; it is not an exact count of it. 30 hits does not equal 30 visitors – 30 hits represents (maybe) 1 visitor. The number is not a decimal-system number; it’s more like binary – the way a computer counts. It’s not that the number is wrong. It is correct – but you need to know what it truly represents.
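As a back-of-envelope sketch in Python – note that the conversion factors (32 files per page load, 3 pages per visit) are assumptions you would have to estimate for your own site, not universal constants:

```python
# Deflating raw hits into an estimated visitor count, given rough
# conversion factors for this particular (hypothetical) site.
def estimate_visitors(hits, files_per_pageview=32, pageviews_per_visit=3):
    """Convert a hit count into rough pageviews and visitors."""
    pageviews = hits / files_per_pageview
    visitors = pageviews / pageviews_per_visit
    return round(pageviews), round(visitors)

pageviews, visitors = estimate_visitors(96_000)
print(f"96,000 hits is roughly {pageviews} pageviews, or {visitors} visitors")
```

The point isn’t the arithmetic – it’s that 96,000 hits and 1,000-odd visitors are both “correct” numbers; they just represent different things.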

Once you understand this, the sometimes meaningless jumble of numbers in monthly reports – made meaningless by bo(b)ssession – shows a silver lining where (some manner of) truth can be gleaned, making your measurement that little bit more meaningful.

(And that’s my first post, which was originally titled “Measuring Online Community Success – for the non-measurer”. I plan to write more about the topic of measuring online communities, specifically in a non-technical way, to help the many of us in the field better measure them and report to our bosses.)