Peeling a KM Metric Onion (may make you cry!)

Groan. Here we go again. I stumbled upon a report that provides guidance on what are supposedly the “top” knowledge management metrics. Uh huh. The metrics themselves are suspect, how they were determined to be the “top” is suspect, and heck, I’ll go so far as to say that the top five aren’t even “good” KM metrics, much less “top” metrics.

The report, though, indicates that KM metrics such as these are used to monitor and document the “value added” of a KM initiative. So here they are – what is offered up as the “top” KM metrics (don’t even bother with the drumroll, if you please):

  1. Average age of documents in knowledge repository
  2. Percentage of employees that contribute to the knowledge base
  3. Percentage of documents in non-enterprise repositories
  4. Percentage of documents rarely accessed
  5. Number of internal knowledge sharing platforms

Let’s begin peeling back the onion a bit.

First, let me say that the above isn’t original. Sure, the report was just released, but pretty much the same “top” metrics have been offered up before. For example, I recall a report issued by the Open Legal Standards Initiative back in 2006, and sure enough nearly all of these were cited then (some were even word for word identical). I won’t waste our time by citing each of the other instances that I found, but again, none of the above appear to be unique or new.

Let’s next discuss just how these metrics were identified (this time, at least) as the “top” metrics to use for KM. According to the report, they were identified by a “research team” that researched KM metrics in use “around the world.” Further on, the report indicates that the metrics were then reviewed by “2 Subject Matter Experts.” What research team? Whom did they research? Around the whole world? (It’s a big world!) And who exactly were those SMEs?

They don’t really provide a whole lot more to go on, but what that suggests to me is this: they asked a few folks (who may not have known what they were doing with regard to KM metrics), looked around the world (as they know it), and then had a couple of folks who were supposedly experts (at least in their own minds’ eyes) review the whole mess and sign off on it. I’m really not seeing much in the way of “good research” here – nothing that indicates how what may well be a rather limited and prescriptive viewpoint (which we’ll peel apart below) could produce a defensible list of “top” metrics.

Let’s peel back a bit more and poke at these “top” metrics themselves.

#1 Average age of documents in knowledge repository: Okay, first a general sanity check. Would this be the “average age” of documents counting, or not counting, revisions? I have to ask because in today’s fast-moving organizations, where repository contents get updated regularly, it’s rather suspect to believe that simply because a document is older it is better. Oh wait, perhaps they were seeking to determine how frequently documents were updated? Hmmm… well, then I’d instead be curious as to what it is about a document simply being “newer” (less age) than another that would indicate the newer document was “better.” Whew, a lot to get lost on with this particular metric. But at the end of the day, even IF this metric were somehow validated, it speaks to content management, document management, revision control and such, and not to the “value added” of knowledge management.
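To make that ambiguity concrete, here’s a minimal sketch (the repository, document names, and dates are all invented for illustration) showing how the very same “average age” metric swings wildly depending on whether you date a document by its original creation or by its latest revision:

```python
from datetime import date

# Hypothetical repository: each document has a creation date and revision dates.
docs = [
    {"name": "policy.docx", "created": date(2018, 1, 10), "revised": [date(2023, 6, 1)]},
    {"name": "handbook.pdf", "created": date(2015, 3, 5), "revised": []},
    {"name": "faq.md", "created": date(2021, 9, 20), "revised": [date(2023, 5, 15)]},
]

TODAY = date(2023, 7, 1)  # fixed "as of" date so the numbers are reproducible

def age_days(doc, count_revisions):
    # Age from the latest revision if asked (and one exists), else from creation.
    basis = max(doc["revised"]) if (count_revisions and doc["revised"]) else doc["created"]
    return (TODAY - basis).days

for flag in (False, True):
    avg = sum(age_days(d, flag) for d in docs) / len(docs)
    print(f"counting revisions={flag}: average age = {avg:.0f} days")
```

Same repository, same documents – yet the metric drops by more than half once revisions count, without a single word of the underlying knowledge being any better or worse.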

#2 Percentage of employees that contribute to the knowledge base: This one assumes, firstly, that contributing to a knowledge base is what has been determined to be of value, and secondly, that putting anything into a knowledge base is somehow value added. Wow. Okay, let’s just say it and get it out of the way — this is clearly driven by some sort of assumption that having a knowledge base is required, and that putting things in it is somehow value adding.

This clearly ignores the whole concept of socialized knowledge sharing (yup, there it is again!). It also ignores the fact that in many organizations some of the “best” knowledge contributions may never make it to a knowledge base — I’m talking about discussion threads, VTCs and such, or the wealth of other documents that just never get shared because nobody asks about or for them. And then there is the whole issue of search-related problems — the vast majority of organizational knowledge bases have no taxonomy, and documents are routinely created and added without so much as a thought given to metadata. We’re talking search failure rates in the very high double digits.

And how about the whole issue of “what” documents go into a knowledge base? All documents? Really? Just how large is this knowledge base, and who are the misfortunates wading daily through the alligators to measure the depth of the document swamp? Not all organizational knowledge needs to be in a knowledge base — some knowledge becomes stale or obsolete before you can even get it into document form. Not all knowledge that is critical can be reduced to document form. And so on. At the end of the day, it doesn’t matter how many employees contribute to a knowledge base as long as the right employees are contributing the right knowledge the right way to the right knowledge base. There is no definitive way to link employee contributions to a knowledge base to the value-added benefit of KM.
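A toy illustration (the names and contribution counts are made up) shows why the raw “percentage of contributors” number says nothing about value – it counts output, not outcome:

```python
# Hypothetical contribution log: employee -> number of documents added this quarter.
contributions = {"alice": 12, "bob": 1, "carol": 0, "dave": 0, "erin": 40}
headcount = len(contributions)

# The metric as the report would have it: anyone with > 0 contributions counts.
pct_contributing = 100 * sum(1 for n in contributions.values() if n > 0) / headcount
print(f"{pct_contributing:.0f}% of employees contributed")

# The metric is identical whether erin's 40 documents are critical knowledge
# or 40 copies of the cafeteria menu -- nothing here measures value added.
```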

#3 Percentage of documents in non-enterprise repositories: Uh oh… is this to determine whether all the documents are located in a central repository (available through an enterprise-wide knowledge base)? Or is this checking to see if folks are safely storing their “own nuts for the winter?” Let’s assume that they meant to examine whether or not a large percentage of documents were not available within an enterprise-wide knowledge base. It just doesn’t make much sense to look at it the other way, so we’ll give them that. Okay, now then — so what? This (once again) assumes that documents need to be in an enterprise-wide knowledge base. The same arguments made above for #1 and #2 apply, and we’re again dealing with issues related to content management, document management, revision control and such, and not the “value added” of knowledge management. There is no definitive way to link the percentage of documents located in a knowledge base to the value-added benefit of KM. At the end of the day it doesn’t matter what percentage of documents are in an enterprise repository as long as the right documents are there.

#4 Percentage of documents rarely accessed: This one would almost be too funny if I didn’t have this mental image of folks combing through knowledge bases, setting up searches to identify how many documents aren’t accessed, who doesn’t access them (sarcasm helps with this one), what exactly “rarely” is from a quantifiable and measurable standpoint, and what exactly “accessed” is.
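And the “what exactly is rarely” question is not a nitpick – the metric’s value is entirely an artifact of whatever threshold someone picks. A quick sketch with invented access counts:

```python
# Hypothetical access counts per document over the last quarter.
access_counts = [0, 1, 2, 5, 9, 30, 120]

def pct_rarely_accessed(counts, threshold):
    # A document is "rarely accessed" if viewed fewer than `threshold` times.
    rare = sum(1 for c in counts if c < threshold)
    return 100 * rare / len(counts)

for t in (1, 3, 10):
    print(f"'rarely' = fewer than {t} views: "
          f"{pct_rarely_accessed(access_counts, t):.0f}% of documents")
```

Same repository, same usage – and depending on where you draw the “rarely” line, the metric reports anywhere from a small fraction to a large majority of the documents as a problem.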

But I’m sure that you do recognize that if they are pushing above #2 and #3 (and probably even #1) that they’re really “messing” with this #4 metric.  

But for the sake of discussion — let’s say there is this guy named “Bob” in your organization who creates the single most critical document ever to exist for your organization, needed by everyone, everywhere, all the time. And everyone knows that Bob contributed the document to the knowledge base. And everyone must at any given time be absolutely certain that “Bob” hasn’t been working on a more current version than is in the knowledge base. Perhaps “Bob” has one that was updated this morning. Or yesterday. Or weeks ago, and it’s just slowly floating its way through a complex maze of an approval process.

So do you now have this mental image of folks glued to their monitors, waiting to see screen refresh flicker to indicate that “Bob’s” new revision has hit the system?  Or is it much more likely that folks, knowing him to be the expert, are then instead seeking out “Bob” to find out what is current and most important?  

If the latter, then you’re now talking about the organization utilizing expertise locators or trying to understand the social networking that takes place, etc., instead of caring whether or not a document is accessed in a knowledge base.

A really swell “litmus” test of this kind of situation is to toss an “oh no grenade” into the organization…the kind that makes everyone say “oh no” as they run for cover.  And then watch.  See what they do.  Do they all rush to their desktops to login to the knowledge base to look for the critical “Bob” document?  Or do they go look for “Bob?”  At the end of the day you’re going to find it pretty difficult to link frequency of document access to the value added of knowledge management.

#5 Number of internal knowledge sharing platforms:  You know, I’m right now having a mental flashback to the movie “Office Space” when Peter Gibbons tells the “Bobs” that he has eight different bosses, each checking up on him all day long.  

And I’d be remiss if I didn’t mention a recent blog post (KM Bamboo Tools! Plant Yours Today!) about a similar situation in which an organization of some 1,800 folks had set up (at last count) some 484 individual SharePoint sites. The same thing applies here.

It simply isn’t relevant in any way how many internal knowledge sharing platforms might exist. What is the supposed linkage between those platforms and the value added of knowledge management? What, simply because a knowledge sharing platform exists, there is a presumption that knowledge sharing is taking place? And if so, that it is “good knowledge” being shared? And that is measured how — by someone trying to quantify each and every platform’s individual contribution to closing some knowledge gap that has been targeted as critical within the organizational knowledge management strategy?

Give me a break. Come on, this sounds so very, very much like a typical sales pitch for site licenses. How many employees do you have? Oh, okay, then you need that many site licenses for individual knowledge sharing platforms, because that’s what we (the IT vendor) think you should measure. But wait, perhaps we need to be utilizing different types of knowledge sharing platforms… you know, blogs, wikis, discussion threads, chats, etc. And the IT vendor can also provide those. So take the total number of those potential platform types and multiply it by the total number of employees… and well, you get the point.

And all of that doesn’t even take into consideration that a whole bunch of critical knowledge is likely to never be shared via some sort of a “knowledge platform.” At the end of the day there is absolutely no correlation between the number of knowledge sharing platforms and the value added of organizational knowledge.

Now for the other-shoe-drop moment — did you also notice that each of these “top” metrics is neatly wrapped around IT, and not really about the value-added benefit of utilizing organizational knowledge? Knowledge repository, knowledge base, enterprise-level repositories, document access, platforms. All IT-centric metrics, and none actually measuring knowledge sharing. None measuring the value added of the documents or the systems (if you could even do that).

Okay, let’s wrap this onion exercise up.

These supposed “top” metrics more likely reflect the fact that either a small group of “experts” decided they neatly fit an IT-driven agenda, or, in the alternative, they are just metrics these “experts” frequently saw utilized by organizations that clearly don’t “get” knowledge management and the whole concept of the value-adding impact of organizational knowledge. They were available metrics, things that could be counted. Output-driven, and none even remotely resembling an outcome-based performance metric. Same old, same old. But it IS a NEW report ($$) that presents that same old, same old.

Dr. Dan's Daily Dose:
When seeking to determine the “value added” benefit of organizational knowledge and/or knowledge management, it is probably kind of important to attempt to actually measure that value. Measuring output is not the same as measuring outcome. The value-added benefit of knowledge management and of utilizing organizational knowledge is all about achieving desired outcomes – which speaks to the need for an effective KM strategy that identifies knowledge gaps and explains how knowledge will close those gaps.