Many leadership and executive development learning managers find themselves at a loss when asked to quantify the return on investment of development programs and initiatives. Often, L&D leaders default to qualitative results over quantitative results, which is understandable, since quantitative results require considerably more digging. Organizations without data scientists, or without employees trained to turn data into knowledge and insights, often struggle to tell a story with data. Below are some tips to make this work easier for leadership and executive development professionals.
Step 1: Define your purpose. I recommend a purpose statement like “the purpose of this research is to determine the influence of leadership development programs and executive coaching on these variables: performance and potential, retention, engagement and job movement.” It is imperative to have your senior leaders’ buy-in before beginning the process. This is also a place to demonstrate leadership: make your recommendations, but be sure to incorporate the factors that are important to those leaders.
Step 2: Narrow your focus. Are you measuring ROI on executive coaching,
leadership programs or mentoring? Each
discipline will require a different standard for measurement. For example,
programs may last a week, while leadership development experiences may last much longer, and most executive coaching engagements run six to twelve months. It is also critical to clearly define your audience. At this point, many leaders realize that their efforts at developing leaders haven’t been as focused as they should have been. Often, emerging leaders are placed in signature programs for which they were not properly vetted and are not ready. This practice compromises the validity of the executive development experience. Getting the right people into the right programs for the right reasons is key; remember: right person, right program, right time.
Step 3: Decide on the three to five key questions you
want to answer. For example, is there a
common profile for each participant?
Obviously, if you’re measuring the impact of a women’s leadership
program, being female is a common characteristic of all participants. Apply that line of reasoning to each program or initiative, considering attributes such as male/female, job grade, age, tenure, ethnicity, veteran status, gender identity, etc. All of this data becomes extremely relevant when slicing, dicing and targeting your development initiatives.
Is retention important? If so, compare the retention rates of program participants against those of employees who haven’t participated in a program. Is there a difference?
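For teams with access to a basic analysis tool, this comparison takes only a few lines. The sketch below uses Python and pandas with made-up records and illustrative field names (participant, retained_12mo); it is a sketch of the technique, not a prescribed schema, so substitute whatever fields your HRIS actually exports.

```python
import pandas as pd

# Hypothetical roster: one row per employee, flagging program participation
# and whether the employee was still with the company 12 months later.
# Field names and values are illustrative.
roster = pd.DataFrame({
    "employee_id":   [101, 102, 103, 104, 105, 106],
    "participant":   [True, True, True, False, False, False],
    "retained_12mo": [True, True, False, True, False, False],
})

# Retention rate by group: participants vs. non-participants.
retention = roster.groupby("participant")["retained_12mo"].mean()
print(retention)

# The same comparison works for any yes/no outcome, such as a
# hypothetical "promoted_24mo" flag when you examine promotion rates.
```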
How does participation in a program affect performance
and potential? Programs that hit their target increase not only the participant’s performance but also the performance of their team. Look at
the performance of the leader and their team one year prior to the
program/event, the year of the program/event and up to three years after the
program/event. There should be a
positive correlation between performance/potential and the program. If there isn’t, then it’s time to reexamine
the content of the program. A redesign
may be in order.
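If the performance ratings can be pulled into a table, the trajectory check is straightforward. The sketch below (Python/pandas, with made-up ratings and illustrative field names such as rel_year and performance, which are assumptions rather than a required format) averages ratings by year relative to the program and computes a simple pre/post delta per leader.

```python
import pandas as pd

# Hypothetical ratings table in long format: one row per leader per year,
# with the year expressed relative to the program (-1 = year before,
# 0 = program year, 1 = first year after, and so on).
ratings = pd.DataFrame({
    "leader_id":   [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "rel_year":    [-1, 0, 1, -1, 0, 1, -1, 0, 1],
    "performance": [3.0, 3.2, 3.8, 2.8, 3.0, 3.4, 3.5, 3.4, 3.6],
})

# Average performance trajectory relative to the program year.
trend = ratings.groupby("rel_year")["performance"].mean()
print(trend)

# Simple pre/post delta per leader; a consistently positive delta is the
# positive relationship between the program and performance described above.
pivot = ratings.pivot(index="leader_id", columns="rel_year", values="performance")
delta = pivot[1] - pivot[-1]
print(delta.mean())
```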
Are program participants being promoted at a higher rate
than those who haven’t participated in a program? Again, if the right person was in the right
program for the right reason (growth and development), then the participant
should be promoted at a faster rate than non-program participants.
Step 4: Determine and vet your process with learning
leaders. In most cases, these six steps will produce consistent results:
1. Determine the specific research questions.
2. Determine the data requirements.
3. Gather the necessary data.
4. Conduct the data analysis.
5. Summarize the findings.
6. Report the findings to key stakeholders.
A few finer points to consider:
If you are measuring several programs or initiatives,
call out and report on each one individually.
As you compare results, you may see misalignments that can be
corrected. The same holds true for
measuring executive coaching engagements: look at the results from each coach and compare them side by side. Are you
getting the results you expect? After a
deep analysis, you’ll have the quantitative data to make that determination.
If your development programs are “staged,” meaning there is a program for front-line leaders, emerging leaders, high-potential leaders and so on, is there course overlap? Is there a deliberate sequence for the programs? This may or may not be important in your organization, but if it is, you’ll now have the data to support your decision to continue or course correct.
To better protect your investment, it is important to look at separations from the company, both voluntary and involuntary. If there is a disproportionate number of voluntary separations, you may be training people to leave your organization for other jobs. If there is a disproportionate number of involuntary separations, are you properly identifying the right people during your talent management and succession planning discussions?
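One way to see this at a glance is a simple cross-tabulation of separation type by program participation. The sketch below (again Python/pandas, with made-up records and illustrative field names such as separation_type) shows the shape of that check under those assumptions.

```python
import pandas as pd

# Hypothetical separations log; "participant" and "separation_type" are
# illustrative field names, not a prescribed schema.
separations = pd.DataFrame({
    "employee_id":     [201, 202, 203, 204, 205, 206],
    "participant":     [True, True, False, False, True, False],
    "separation_type": ["voluntary", "involuntary", "voluntary",
                        "involuntary", "voluntary", "voluntary"],
})

# Share of voluntary vs. involuntary separations within each group.
breakdown = pd.crosstab(separations["participant"],
                        separations["separation_type"],
                        normalize="index")
print(breakdown)
```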
Does the investment drive employee engagement? If so, is it appropriate to broaden the
offerings to drive higher engagement? Is feedback provided at the end of a
program? If so, how is that data used to
enrich the program?
Finally, has the concept of “differentiated investment” become a common expression in talent management and succession planning discussions? If so, are the identified leaders able to articulate what it means and the expectations that come with such investments?