Thursday, August 11

How effective is your learning evaluation?

The evaluation of training is too important to be left to trainers.

At the individual intervention level, at the strategic enterprise level, and at all points in between, the quality assurance processes applied to formal learning initiatives in most organizations are, in my experience, rudimentary at best.

Training departments are usually stretched thin, and don’t have the time or resources to do a “proper” quality assurance job at either the course level or the aggregate departmental level. Implementing a regimen that elevates the strategic importance of evaluation (across all levels) and places it on a more professional footing will do two vital things: it will significantly improve the effectiveness and efficiency of all learning activities, and it will save a tremendous amount of unnecessary, unproductive, or redundant work.

My fear is that with the advent of LMS-based evaluation and record-keeping, the information we have about the quality of our learning activities is becoming more narrowly focused, and its usefulness is becoming further diluted. Just as LMS functionality tends to constrain the nature of our design of instruction, it constrains the nature of our inquiry into its impact.

I’d like to see more training departments creating evaluation units and staffing them with a trained expert or two who can help get past the simplistic "smile-sheet & ROI" approach and start building systems that put the important issues on the dashboards of individual trainers, instructional designers, and senior learning managers.

4 comments:

Anonymous said...

It is interesting to me that you post on this topic today. I just transitioned from a decidedly techy job in a large school district in one state to an Application Support & Training position in a large school district in another state. I will be training office staff as well as teachers.

One of my first charges was to revamp the workshop training "feedback" structure, which is currently a piece of paper with a 1994 datestamp that asks lots of "touchy-feely" questions and is used for absolutely nothing (except as a place for people to vent or praise trainers).

I spent a chunk of time last week grazing the internet for ideas. I came away with three conclusions, and I would love to hear people's feedback and/or ideas/suggestions on them:

- Expectations should be made perfectly clear up front that training is not an event, that the skills covered in the workshop must be practiced and followed up on to be effective, and that there is an organizational expectation that this will happen

- Feedback should have three phases: a "Reaction" phase that day, a "Support" phase 30 days out, and a "Results" phase 90 days out.

- All feedback should be gathered electronically, online, and should be TRANSPARENT to anyone who wants to view it.

- Oh, and number 4: keep surveys short, and only ask a question if you plan to act on the data you get from the answer.

My current roadblock is a technical one: how to gather "reaction" feedback at the time of the workshop and auto-generate the "Support" and "Results" surveys 30 and 90 days out.
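A minimal sketch of one way the scheduling half could work, assuming the day-of "reaction" survey logs each attendee's email address and workshop date to a CSV file, and a daily cron job then mails out the "Support" or "Results" link when the 30- or 90-day mark arrives. The file layout, survey URLs, sender address, and mail relay below are all made up for illustration (Python here, though the same logic would port to ColdFusion or PHP easily enough):

    import csv
    import smtplib
    from datetime import date, timedelta
    from email.message import EmailMessage

    # Hypothetical links to the two follow-up surveys
    SURVEYS = {30: "http://surveys.example.org/support",
               90: "http://surveys.example.org/results"}

    def due_followups(roster="attendees.csv", today=None):
        # attendees.csv is assumed to hold two columns:
        # email,workshop_date (ISO dates, e.g. 2005-08-11)
        today = today or date.today()
        with open(roster, newline="") as f:
            for row in csv.DictReader(f):
                taught = date.fromisoformat(row["workshop_date"])
                for days in SURVEYS:
                    if taught + timedelta(days=days) == today:
                        yield row["email"], days

    def send_survey(address, days):
        msg = EmailMessage()
        msg["From"] = "training@district.example"  # hypothetical sender
        msg["To"] = address
        msg["Subject"] = "Workshop follow-up survey (day %d)" % days
        msg.set_content("Please take a few minutes: " + SURVEYS[days])
        with smtplib.SMTP("localhost") as smtp:    # assumes a local mail relay
            smtp.send_message(msg)

    # Run once a day from cron. Nothing is stored, so a second run on
    # the same day would re-send; a sent-log would fix that in practice.
    if __name__ == "__main__":
        for address, days in due_followups():
            send_survey(address, days)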

Again - I would LOVE any feedback others are willing to share.

I hope I haven't threadjacked this post; my intentions are good, I swear!

Godfrey Parkin said...

Threadjacking is what we are looking for on this blog - otherwise we are talking to ourselves :-)

This is one of my hobbyhorses, so I will respond, probably later in the day. I hope a few others will chime in in the meantime.

Anonymous said...

Chris, you might want to check out a couple of really good open source survey tools to help with your training evaluation project.

Nsurvey is a really good Windows-based tool that I think can accomplish the bulk of what you are trying to do. You can build the appropriate surveys, schedule opening and closing dates, and then assign them to people using their email addresses. It requires a .NET server as well as a SQL backend (although you can use MSDE).

If you want to go the full open source route, you could use phpSurveyor. It’s similar to Nsurvey, although I have to admit that my experience with it is limited. Both tools will allow you to email the participants a link to the survey (making it pretty transparent for them), as well as export the findings as XML or CSV files that can be opened and analyzed with Excel or any other spreadsheet program.
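For what it's worth, once you have the CSV export you aren't limited to a spreadsheet; a few lines of Python can roll up the ratings. A sketch, assuming each row is one response and the rating columns are named q1, q2, ... with numeric values (the file name and column naming are invented):

    import csv
    from collections import defaultdict

    def question_averages(path="survey_export.csv"):
        # Assumes columns named q1, q2, ... hold numeric ratings;
        # blank cells (skipped questions) are ignored.
        totals = defaultdict(float)
        counts = defaultdict(int)
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                for col, value in row.items():
                    if col.startswith("q") and value:
                        totals[col] += float(value)
                        counts[col] += 1
        return {col: totals[col] / counts[col] for col in sorted(totals)}

    print(question_averages())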

Anyway, the point of this ramble is… there is technology that will allow you to accomplish what you have proposed. Good luck with designing the evaluations!

Anonymous said...

Thanks for the tips. I have used phpSurveyor quite a bit in a past job. While it is a great fit in many ways, and does a great job overall, it does not address the follow-up piece of my puzzle.

I'll look into Nsurvey, although I'm in a Mac environment so I'm not sure where I can go with it. It might spawn ideas though.

I spoke to a genius ColdFusion developer on staff here who said he thinks he could knock out an app that does what I am looking for with the auto-generated responses, but it would likely not have a visual engine for the data, and would not be transparent. While I think that is important, I might let it go, as I am not sure there is any demand for users to see the data anyway. And I can take flat text files and work with those for my needs, I'd think.
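If flat text files are what the app spits out, merging them into a single CSV for a spreadsheet (or for the averaging trick above) is only a few lines of Python. A sketch, assuming one pipe-delimited line per response with a shared header row; the file pattern and columns are hypothetical:

    import csv
    import glob

    def merge_responses(pattern="responses_*.txt", out="all_responses.csv"):
        # Assumes every dump shares the same pipe-delimited header,
        # e.g. email|workshop|phase|q1|q2|comments
        writer = None
        with open(out, "w", newline="") as dest:
            for path in sorted(glob.glob(pattern)):
                with open(path, newline="") as src:
                    reader = csv.reader(src, delimiter="|")
                    header = next(reader)
                    if writer is None:          # write the header only once
                        writer = csv.writer(dest)
                        writer.writerow(header)
                    writer.writerows(reader)

    merge_responses()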