
Posts Tagged ‘Evaluation’

Article Title: Give Tests a Test Run

Publication: Integrated Learnings: eLearning

Summary: When a training initiative includes an assessment to evaluate learners’ knowledge or skill, it makes sense to ensure that the assessment is accurate and reliable prior to using it for training. While some may view this as a cumbersome step, this article suggests simple ways to test an assessment.

After putting the knowledge assessment through your review process, consider administering it to a group of subject matter experts to see how they do. This might help reveal any shortcomings prior to using the assessment with actual learners. Watch for items with low success rates and any distracters that were frequently selected.

Click here to read the full article.


Article Title: Writing Distracters for Multiple Choice Questions

Publication: Integrated Learnings: eLearning

Summary: Training frequently includes knowledge assessments with multiple choice questions to evaluate learners’ potential performance. Although multiple choice questions are not ideal for testing every performance objective, well-written questions can offer a more robust assessment than many may realize. This article describes some best practices for writing high-quality multiple choice questions.

Another common offense is offering distracters that are obviously flawed. A good option surrounded by three really bad ones is often easily recognized, even by someone who may not understand why the correct answer is the best option. Simply writing “bad” statements as distracters misses an opportunity to show learners valid examples and non-examples of applying a skill.

Click here to read the full article.


Article Title: Isolating the Results of eLearning Impact

Publication: Integrated Learnings: eLearning

Summary: To show clients how their organizations benefit from training, it helps to measure improvement in key business metrics. Though it sounds simple, other factors can complicate the ability to do this accurately, such as related marketing campaigns and other organizational initiatives that occur within a similar time frame as the training effort. This article summarizes approaches for isolating the results of training.

Thankfully, the book acknowledges the challenges many organizations face with using a control group approach, such as the difficulty in forming two equal yet randomly selected groups and the eagerness of clients to apply a training solution broadly in the organization. With that in mind, the authors not only describe the ideal approach to using a control group, including what to keep in mind when selecting individuals for those groups, but they also describe alternative control group approaches. Even if you’re already familiar with the basic concept of a control group, you might pick up some new ideas from this book.

Click here to read the full article.


Article Title: Specifying a Criterion in Performance Objectives

Publication: Integrated Learnings: eLearning

Summary: Writing precise and effective performance objectives for training includes specifying criteria for successful performance. Most resources suggest that a criterion must be measurable – ideally, associated with an objective number. Since this is not feasible for all performance objectives, this article suggests descriptive alternatives to use as criteria.

Many of the resources that explain how to write objectives suggest that a criterion should be specific and objectively measurable. Ideal candidates include just about any measure you can associate with a number – defect levels, speed/time, quantity quotas, etc. – as in the example above. This makes perfect sense. But most of my projects include several objectives with behaviors that aren’t directly countable.

Click here to read the full article.


Article Title: Manager Engagement in eLearning Transfer to the Job

Publication: Integrated Learnings: eLearning

Summary: For a training initiative to succeed, learners’ managers must reinforce new skills and behaviors on the job. However, busy schedules or a lack of coaching skills often cause reinforcement to slip through the cracks, weakening the benefit of training to the organization. This article describes how a meeting-in-a-box approach can increase the likelihood of management follow-up with employees after training.

If the purpose of an eLearning course was to meet a specific business need – something that would increase revenue, save money, or protect the organization from risk – it makes sense that a learner’s manager would be accountable for goal-setting and coaching for new skills after training is complete.

The logic makes sense, but it doesn’t always work that way.

Click here to read the full article.


Article Title: In Defense of the Four Levels

Publication: Integrated Learnings: eLearning

Summary: Many in the training industry posit that Kirkpatrick’s four levels of evaluation are outdated, and they challenge the field to propose a more relevant model. Although the model dates back to the 1950s, this article argues that it remains comprehensive enough to address today’s training evaluation needs. The article reviews the model, the arguments against it, and its shortcomings, and invites readers to weigh in on the debate.

We talk a lot about the need for improved diligence in the field when it comes to measuring job performance and business results. I agree that we should do this consistently. And so does the model (levels 3-4).

Click here to read the full article.


Article Title: Evaluating eLearning in a Crunch

Publication: Integrated Learnings: eLearning

Summary: Though much of what is published about evaluating training effectiveness makes it seem like a large undertaking, it doesn’t have to be. A small-scale, simple evaluation can be better than no evaluation effort at all. The findings revealed by a limited evaluation can help boost a training team’s credibility and may even persuade stakeholders to delve deeper with additional evaluation efforts. This article offers advice for evaluating training by using limited resources.

Need to get a sense for on-the-job performance? While there are many elaborate ways to do this, you have simple options too. If you’re dealing with performance measures that are already tracked, it may be as simple as requesting the appropriate reports and asking someone to spend an hour teaching you how to interpret them. If the performance measures aren’t so clear, you might send a quick email to learners’ managers asking for their impressions.

Click here to read the full article.

