If you work in a regulated environment, you already know the pattern.
New online course goes live. Everyone clicks through. Slides, multiple-choice, a certificate at the end. Compliance box ticked. Six weeks later, someone breaks a rule that was clearly “covered” in the training.
So the real question is not: “How do I get people to complete the course?”
The real question is: “How do I design an online course that actually changes behaviour under pressure – and keeps us compliant when it matters?”
Where most compliance courses go wrong
Let’s start on the ground, not in the LMS.
Think about your last health and safety, environmental, or workplace compliance course. If you tested your staff 48 hours later and watched what they did on the job, what would you see?
Typical problems:
- Too passive: 30–40 minutes of clicking “Next”, zero decisions, zero consequences.
- Too generic: “Work safely”, “Respect the environment”, but no link to your site, your risks, your equipment.
- Too easy to pass: You can pass while half-distracted, phone in one hand, coffee in the other.
- No follow-through: No measurement on the floor. No link between training data and incident data.
In sport, that’s like giving players a 40-slide presentation on “How to score goals” then expecting them to perform under match pressure. Makes no sense.
So let’s treat your online courses like a proper training programme: specific, measurable, and built for performance under stress.
Start from real incidents, not from the regulation text
Regulations matter. But if you start your design from the law or the ISO standard, you’ll end up with a legal summary, not a learning experience.
Instead, start from what actually goes wrong.
Before you design anything, pull three sources:
- Incident reports from the last 12–24 months
- Near-miss logs (often more revealing than accidents)
- Audit non-conformities linked to behaviour
For each recurring issue, answer three questions:
- What did the person actually do (behaviour, not attitude)?
- In what context (time pressure, lack of tools, confusion, poor signage)?
- What should they have done instead, step by step?
Example: Health and Safety (manual handling)
- Observed behaviour: Staff twisting and lifting boxes alone instead of using trolleys.
- Context: Short staffing on late shift, trolleys stored far from loading bay.
- Required behaviour: Use a trolley for any load > 15 kg; if none available, call supervisor and delay lift.
Now your course has a target: “In this training, we will reduce unsafe solo lifts on late shifts by 50% within 3 months.” That’s specific. That’s measurable.
From there, you can map the regulation to the real behaviour instead of flooding learners with legal text they will never apply under pressure.
Train for behaviour, not just knowledge
In a match, nobody cares if a player can recite the offside rule word-for-word. What matters is whether they behave correctly at full speed.
Same for compliance. Knowing the rule is not the same as applying it at 06:00, end of shift, with production behind schedule.
When you design a module, define behavioural objectives, not just knowledge objectives. Move from:
- “Learners will know the company’s fire procedure.”
to:
- “In less than 60 seconds, learners can choose the correct action in a fire scenario based on their location, role, and equipment available.”
That change of wording forces you to design differently.
Instead of 10 slides describing the procedure, you might create:
- 3–5 short scenario videos (30–60 seconds)
- Decision points: “What do you do now?”
- Timed responses (limit: 15–20 seconds to choose)
- Immediate feedback linked to real consequences
Ask yourself for each key regulation:
- What does “doing it right” look like on the floor?
- Under what pressure or constraint does it usually fail?
- How can I simulate that pressure in the course (time limit, incomplete info, conflicting priorities)?
If your assessment can be passed purely by guessing or re-reading the previous slide, you’re measuring memory, not behaviour.
Structure your course like a training week, not a lecture
On the field I never run a 90-minute monologue. I break sessions into blocks: warm-up, technical, tactical, game. Each block has a purpose.
Do the same with your online course.
For a 30-minute compliance module, a simple structure that works well:
- Block 1 – “Why this matters here” (3–5 minutes)
Real incidents from your site, not generic horror stories. Data from your own operation. One clear metric (e.g. “We had 7 near misses with forklifts last year. Our target is 0.”)
- Block 2 – “What right looks like” (8–10 minutes)
Chunked into 3–5 key behaviours. Show, don’t just tell: photos, short clips, diagrams of your actual workplace.
- Block 3 – “Decide under pressure” (10–12 minutes)
Scenario questions, branching paths, time limits. Learners must choose between realistic options, not obvious “good vs evil” ones.
- Block 4 – “Transfer to your job” (5 minutes)
Two or three short actions they will take in the next week. Optional: downloadable checklist or a one-page protocol.
Notice what’s missing: 15 slides of “History of the regulation” and “Definitions” that nobody needs to see to act correctly on Monday morning.
Use engagement metrics like you’d use training data
Coaches don’t guess if training works. We measure: time, heart rate, bar speed, distance covered.
You have similar tools in your LMS. The problem is most people only look at course completion rates. That’s like only checking if players showed up to training, not what they did there.
At minimum, track:
- Average completion time vs planned time
- Drop-off points (where people exit or go idle)
- Question difficulty (which items have high error rates)
- Number of attempts per assessment
Then use that data to adjust:
- If average completion time is less than 60–70% of the planned time, learners are probably skimming, or the course is too easy.
- If many learners drop at the same point, something there is unclear, too long, or too heavy cognitively.
- If a key safety question has a failure rate above 30%, either the content is unclear or the question does not reflect what actually happens on the floor.
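Those rules of thumb are easy to automate. Here is a minimal sketch that applies them to a hypothetical LMS export; the field names (`actual_minutes`, `answers`) and the sample data are invented, so map them to whatever your LMS actually produces:

```python
# Sketch: flag course problems using the rule-of-thumb thresholds above.
# Data shape is a hypothetical LMS export, not any specific vendor's format.

PLANNED_MINUTES = 30

attempts = [
    {"user": "a", "actual_minutes": 14, "answers": {"q1": True, "q2": False}},
    {"user": "b", "actual_minutes": 20, "answers": {"q1": True, "q2": False}},
    {"user": "c", "actual_minutes": 12, "answers": {"q1": False, "q2": False}},
]

def flag_issues(attempts, planned_minutes, time_ratio=0.6, max_error_rate=0.30):
    """Return warnings: rushed completion times and high-error questions."""
    issues = []
    avg_time = sum(a["actual_minutes"] for a in attempts) / len(attempts)
    if avg_time < time_ratio * planned_minutes:
        issues.append(
            f"avg completion {avg_time:.0f} min is under {time_ratio:.0%} "
            "of planned time: too easy, or learners are rushing"
        )
    # Per-question error rates: above 30% on a key item needs review.
    for q in attempts[0]["answers"]:
        errors = sum(1 for a in attempts if not a["answers"][q])
        rate = errors / len(attempts)
        if rate > max_error_rate:
            issues.append(f"{q}: {rate:.0%} error rate, review content or question")
    return issues
```

Run it after each pilot cycle and you have the same test → adjust → retest loop as on the field, with no manual spreadsheet work.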
Practical approach:
- Release your new course to a pilot group of 10–20 users from different roles.
- Measure behaviours for 2–4 weeks after completion (incidents, near misses, supervisor feedback).
- Adjust the course once, then roll out to everyone.
This is the same loop as in training: test → adjust load → retest.
Design interactions that matter (not just clicking for the sake of clicking)
Interactive ≠ engaging. Making people drag-and-drop icons for no reason is just annoying.
Every interaction in your course should pass a simple test: “Does this help them perform the right behaviour faster or more reliably?”
Useful interaction types for regulated environments:
- Decision trees: “You see X. Do you A, B, or C?” → each branch shows a different consequence.
- Spot the risk: Photo of your actual workplace. Ask: “Select all non-compliant elements.” Immediate feedback with explanations.
- Order the steps: Especially useful for procedures (lockout/tagout, spill response, emergency shutdown).
- Simulated forms: Let learners fill in a near-miss report or inspection checklist inside the course, then show a model answer.
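A decision tree does not need heavy authoring-tool magic. One way to keep branching maintainable is to store the scenario as plain data, separate from the player logic, so a regulation change means editing one node instead of rewiring the course. A minimal sketch, with entirely invented node names and texts:

```python
# Sketch: a branching scenario as plain data. Each node has a prompt and
# options mapping a choice to (consequence shown, next node). All invented.

scenario = {
    "start": {
        "prompt": "You see a pallet blocking the fire exit. What do you do?",
        "options": {
            "A": ("You move it alone and strain your back.", "debrief_unsafe"),
            "B": ("You report it and tag the area. Exit cleared in 10 min.", "debrief_safe"),
            "C": ("You walk past. The next audit flags a non-conformity.", "debrief_unsafe"),
        },
    },
    "debrief_safe": {"prompt": "Correct: report and tag, never lift alone.", "options": {}},
    "debrief_unsafe": {"prompt": "Review the blocked-exit procedure with your supervisor.", "options": {}},
}

def play(scenario, choices):
    """Walk the tree for a sequence of choices; return consequences and end node."""
    node, shown = "start", []
    for choice in choices:
        consequence, node = scenario[node]["options"][choice]
        shown.append(consequence)
    return shown, node
```

Because each branch carries its own consequence text, the learner sees what happens next, which is exactly the feedback a multiple-choice quiz never gives.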
What to avoid:
- Decorative games that have nothing to do with the job.
- Overcomplicated branching that is impossible to maintain when regulations change.
- Interactions that require high tech skills unrelated to the work (tiny drag targets, complex gestures on mobile, etc.).
If you’re not sure, ask a simple question: “Will this help someone handle a real incident better next month?” If the answer is no, cut it.
Balance regulatory requirements with real-world constraints
Let’s be honest: sometimes you’re asked to create training mainly “for the audit”. And yes, auditors need to see coverage of specific clauses, records of completion, assessment results.
You can respect that without turning your course into a copy-paste of the regulation text.
Three practical strategies:
- Separate “learner view” and “auditor view”
Keep the course lean for learners. Put detailed references, legal texts, and mapping tables in:
  - a downloadable PDF for auditors, or
  - a back-end document that maps each screen/question to each regulatory clause.
- Use layered content
On screen: the essential behaviour and key rule in plain language.
In optional pop-ups or “Learn more” links: full regulatory reference, definitions, legal wording.
Fast learners can move on; auditors still see the legal content is there.
- Document your design choices
Keep a short design log:
  - Objective (behaviours targeted)
  - Regulations covered (with article numbers)
  - Assessment strategy (types of questions, pass score, retake rules)
  - Review cycle (how often you update content vs regulation changes)
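The back-end mapping can be a simple table you regenerate whenever the course changes. A sketch of that “auditor view”, where the screen IDs and clause references are placeholders to replace with your own course structure and jurisdiction:

```python
# Sketch: a screen-to-clause mapping rendered as CSV for the audit file.
# Screen IDs, behaviours, and clause references below are illustrative only.

import csv
import io

mapping = [
    {"screen": "B2-03", "behaviour": "Use a trolley for loads > 15 kg",
     "clause": "Manual handling regulation, art. X (placeholder)"},
    {"screen": "Q-07", "behaviour": "Choose correct fire action in < 60 s",
     "clause": "Fire safety regulation, art. Y (placeholder)"},
]

def auditor_csv(mapping):
    """Render the mapping as CSV an auditor can file with completion records."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["screen", "behaviour", "clause"])
    writer.writeheader()
    writer.writerows(mapping)
    return buf.getvalue()
```

One table like this answers the auditor's “where is clause X covered?” question in seconds, without a single extra slide for learners.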
Auditors love structure. Give them that without punishing your learners.
Make it role-specific and context-specific
One-size-fits-all is efficient for admin, terrible for learning.
The forklift driver, the lab technician, and the office manager all need to be compliant – but not in the same way.
You don’t always have time to build three separate courses. But you can still tailor with branching and role selection:
- At the start of the course, ask: “What is your role?” with 3–5 options.
- Based on their choice, show:
- Scenarios relevant to their tasks
- Procedures they actually use
- Assessments that test their real decisions
Example – Environmental Management course:
- Role A: Production operator → focus on waste segregation, spill response, chemical storage.
- Role B: Maintenance → focus on oil handling, leak prevention, lockout/tagout with environmental impact.
- Role C: Office staff → energy use, paper waste, remote work impact, basic reporting.
Even if 60–70% of the content is shared, that final 30–40% tailored to the role is where behaviour changes.
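The shared-core-plus-role-branch structure is straightforward to express as data. A minimal sketch, with module and role names invented to mirror the environmental example above:

```python
# Sketch: role-based branching as shared core modules plus role-specific
# blocks. All role and module names are invented for illustration.

CORE = ["why_it_matters", "what_right_looks_like"]

ROLE_MODULES = {
    "production_operator": ["waste_segregation", "spill_response", "chemical_storage"],
    "maintenance": ["oil_handling", "leak_prevention", "lockout_tagout_env"],
    "office_staff": ["energy_use", "paper_waste", "remote_work_impact"],
}

def build_path(role):
    """Return the ordered module list for a learner's declared role."""
    if role not in ROLE_MODULES:
        raise ValueError(f"unknown role: {role}")
    return CORE + ROLE_MODULES[role] + ["transfer_to_job"]
```

Keeping the role lists in one place also makes maintenance honest: when a procedure changes, you update one module list instead of hunting through three near-identical courses.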
Reinforce learning after the course (without spamming)
A single long course per year is like doing one big workout in January and then sitting on the sofa until December. You tick a box, but you don’t build capacity.
Better approach: one main course + short, targeted refreshers.
Practical system:
- Core course once per year (20–30 minutes)
- Micro-reminders every 4–8 weeks (2–3 minutes):
- 1 scenario question via email or LMS
- 1 micro-video from a supervisor on a common mistake
- 1 updated case from your incident data
- Trigger-based refreshers:
- After an incident in a specific area
- When new equipment or a new procedure is deployed
- Before high-risk seasons (e.g. heat, storms, peak production times)
The goal is not more content. The goal is the right content at the right time, connected to what people are actually facing that week.
A simple checklist to design your next regulated online course
If you’re about to build or redesign a course, use this as your working template. Print it, tick as you go, adjust to your context.
- 1. Define the real-world target
- What 2–3 specific behaviours must change?
- What metric will you use? (incidents, near misses, audit scores, observation checklists)
- What’s the time window to see change? (4–12 weeks)
- 2. Map incidents and regulations
- Collect last year’s incidents and non-conformities related to this topic.
- Identify the regulations that apply to those incidents, not in general.
- Create a simple table: Incident → Wrong behaviour → Right behaviour → Regulation reference.
- 3. Choose a tight structure
- Target total duration: 20–30 minutes for a core module.
- Blocks: Why it matters → What right looks like → Decide under pressure → Transfer to job.
- Maximum slide count: only what’s needed to support those four blocks.
- 4. Design meaningful interactions
- Minimum of 1 realistic scenario per key behaviour.
- At least 50% of questions scenario-based, not pure recall.
- Clear feedback on each choice: what happens next, why it matters.
- 5. Set assessment rules that mean something
- Pass score aligned with risk (e.g. 80–90% for critical safety topics).
- Limit retries or require a short review segment between attempts.
- Flag learners who fail more than X times for targeted support on the floor.
- 6. Plan follow-up and measurement
- Which behaviour will supervisors observe after the course?
- What will you track in the LMS besides completion? (time, errors, drop-offs)
- When will you review and adjust the course content? (set a date now, not “sometime next year”)
Online compliance training doesn’t have to be a checkbox exercise. If you design it like a serious training programme – clear objective, realistic practice, measurable outcomes – you’ll get fewer incidents, fewer audit headaches, and a workforce that actually knows what to do when the pressure hits.
And just like on the field, the difference is never in the fancy graphics or the buzzwords. It’s in the basics, applied consistently, and tested against reality.
