
Bad Compensation Data Leads to Bad Decisions


There are two competing problems in compensation management, each vying to be the factor that creates the most bad decisions: a misguided over-reliance on labor market data, and what can best be called a veritable cornucopia of sources of "bad data."  While the advent of internet data collection and reporting has certainly provided more access to information, it has also invited those with little understanding of compensation data to try to make a few bucks (or save a few bucks) by collecting and reporting on just about everything.

Today I was asked to review information from a survey… more specifically, I was asked, "Did you consider this source in setting our CEO's pay range?"  The question was valid, although the answer could have been found in our report, where we listed the surveys that were used.  The document I was given was a few pages from a more recent edition of a survey we had used in the analysis, one we had not had access to at the time of the report.  For some reason known only to those who created it, the publishers had taken a relatively decent data summary and turned it into the kind of thing that businesses seem to be hung up on: the "one number that summarizes the whole survey."  Unfortunately, the one-number approach rendered the data useless.

The report is a survey of about 50 state trade associations, which range in size from about $200,000 in revenue to more than $50 million.  Prior editions had broken the information out by geography, budget size and number of members.  This year, they put the entire sample in one line of results, and did something that I am sure was innocent, but turns out to be really deceptive.  Information on compensation and budget size was put in the same table, but each was clearly analyzed separately.  That is, they put the median, average and percentiles for salary, and the median, average and percentiles for budget size, right next to each other.  This may not seem very important to the vast majority of folks who aren't data geeks, but for those of us whose clients rely on us for advice… it's crucial.  In short, a user might think that the median-paid employee was running the median-sized association.  Nothing could be further from the truth, something that would have been clear when looking at the prior version of the report.  It's bad enough when you're using the median… but those who choose to look away from the middle, at the 25th or 75th percentile, or the 90th where things frankly go more than a little wonky, are going to make bad decisions.
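To make the problem concrete, here is a quick sketch with made-up numbers (mine, not the survey's) showing how a one-line table of independently ranked columns describes an organization that doesn't exist:

```python
from statistics import median

# Hypothetical figures for illustration only; not from the survey in question.
# Each tuple is (association budget, CEO salary) for one organization.
orgs = [
    (200_000,    140_000),   # small association, long-tenured CEO
    (750_000,     95_000),
    (2_000_000,  300_000),
    (9_000_000,  160_000),
    (50_000_000, 480_000),
]

median_budget = median(b for b, _ in orgs)   # 2,000,000
median_salary = median(s for _, s in orgs)   # 160,000

# A one-line table prints these side by side: a $2M budget next to a
# $160,000 salary. But each column was ranked independently; in this data
# the $2M association actually pays $300,000, and the $160,000 salary
# belongs to the $9M association. Pairing the two medians describes an
# organization that does not exist, and the mismatch only gets worse out
# at the 75th or 90th percentile.
print(f"median budget: ${median_budget:,.0f}")
print(f"median salary: ${median_salary:,.0f}")
```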

To complicate things and further encourage bad decision making, the survey contains a large table of BLS data for "county quotients."  The survey user is encouraged to multiply the (median? average? some other unexplained number?) by the BLS figure for the county that contains the capital of their state, which will supposedly tell them what they should be paying, despite the fact that this is an industry that hires, particularly for executives, on a national market, and despite the fact that geographic differentials account for only a fraction of the differences in pay.
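The arithmetic being encouraged looks roughly like the sketch below. The quotient and salary figures are assumptions of mine, not the survey's, but they show the scale of the problem: a geographic multiplier nudges the number by a few percent, while the spread of pay across a sample like this one is several times larger and driven mostly by budget size.

```python
# Hypothetical numbers to illustrate the scale mismatch; the actual quotient
# values and survey figures will differ.
national_median_salary = 160_000
county_quotient = 0.95          # assumed geographic index for a state capital's county

adjusted = national_median_salary * county_quotient   # 152,000 -- about a 5% shift

# Meanwhile, in the same (hypothetical) sample, pay ranges with budget size,
# not geography:
low, high = 95_000, 480_000     # smallest vs. largest associations

print(f"geographic adjustment: {adjusted - national_median_salary:+,.0f}")
print(f"budget-driven spread:  {high - low:+,.0f}")
```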

I happen to know this industry, and there probably aren't five associations among the fifty with the type of staff member who can see that the data they're being given is bad and understand what it means.  That's not a reflection on them; organizations rarely have in-house compensation expertise until they are many, many times larger than the largest of these organizations.  What it does mean, however, is that Boards end up making bad decisions.

The collection and analysis of compensation data, and the recommendations made based on that data, are not an arena for amateurs.  Decisions made on bad data can lead, at the least, to embarrassment and unsavory appearances in the local press… not to mention calling into question the judgment of those who made them.