Instructional Designer • Learning Experience Designer
Most training is designed to deliver information. I design systems where behavior actually changes — and you can measure the difference.
About Me
I'm Hannah Shambley — an Instructional Designer and Learning Experience Designer focused on the gap between what people learn in training and what they actually do on the job. That gap is a design problem. I treat it like one.
My work spans curriculum architecture, LMS-based modules, assessment strategy, and AI-augmented learning systems. I approach every project from the same starting point: what does the learner need to be able to do, and what's currently getting in the way?
Whether I'm designing a branching scenario, auditing legacy content, or building a certification module from scratch, the measure of success is the same — observable behavior change, not completion rates.
Tools & Technologies
A needs analysis isn't a formality — it's how I find out whether training is actually the right solution. Objectives come from what the job demands, not what the SME wants to cover.
Recognition isn't recall. Every activity I build is designed to make the learner do something with the knowledge — because retrieval practice is what moves information into long-term memory.
Working memory is limited. I sequence new concepts by managing extraneous load first, so learners have the cognitive bandwidth to build meaningful schema — not just absorb a content dump.
Training that doesn't transfer is expensive decoration. I design for near and far transfer explicitly — using context-rich scenarios, spaced practice, and feedback loops tied to real performance conditions.
Design Philosophy
Most organizations don't have a training problem. They have a performance problem that training gets assigned to solve — often without the analysis to confirm that training is even the right intervention.
I came to instructional design from an educator's background, which means I've seen firsthand what happens when content is delivered without a clear theory of how people learn. Learners leave with information but not capability. That gap is preventable.
The work I find most interesting lives at the intersection of learning science and design craft — building systems where every decision is intentional, every activity has a cognitive purpose, and the measure of success is what the learner can do differently afterward.
I won't build a course until I understand the performance context. Sometimes the solution is training. Sometimes it's a job aid, a process change, or a conversation someone needs to have. I'd rather tell a stakeholder that upfront than build something that won't move the needle.
I take cognitive load theory, retrieval practice, and spaced repetition seriously as design constraints — not as buzzwords. If I'm making a decision about sequencing, activity type, or feedback design, I can usually trace it back to a principle from the research.
I use AI as a design accelerator — for rapid prototyping, scenario generation, and content structuring — while keeping instructional judgment where it belongs: with the designer. The model doesn't know what good learning looks like. I do.
Case Study 01 • Content Redesign
Same objectives. Stripped of what doesn't serve the learner. Resequenced for the frontline workflow.
The Solara Yoga All-Access Membership provides members with access to a comprehensive range of classes and studio amenities. Membership benefits include: unlimited access to all class formats including vinyasa, yin, restorative, hot yoga, and aerial yoga, access to all Solara studio locations, on-demand video library through the Solara mobile app, live stream class access, monthly workshop priority registration with 20% discount, complimentary mat and towel service, locker room access with showers, 15% retail discount on all studio merchandise, two guest passes per month, membership freeze options for up to 60 days per year, and a complimentary new member orientation session. Studio associates should familiarize themselves with all features in order to assist prospective and current members.
Know these three. The rest unfolds.
From catalog label → learner question. Anchors to the real studio moment.
Kept what matters in the first conversation. Removed cognitive overload.
Scannable structure reduces load and supports in-the-moment recall.
Rewrote in second person. Speaks to the associate, not the archive.
Outdated rev. date undermines trust. Replaced with clean attribution.
Case Study 02 • Module Design
The performance gap was specific: new associates were completing existing onboarding but still requiring significant floor coaching during their first two weeks. Stakeholder interviews revealed the root cause — the original content listed menu items but didn't build the associative knowledge needed for real conversations with guests.
Objectives were written around observable behaviors: describe, match, and apply — mapped to Bloom's levels 1–3. The certification threshold (75%) was set to reflect minimum job-readiness, not arbitrary difficulty. Every activity was selected for a specific cognitive function before any content was written.
Built in custom HTML/CSS/JS to allow interaction types not available in standard authoring tools. Progress gates prevent learners from skipping to assessment without completing formative practice — preserving the instructional sequence intentionally.
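The gating logic behind that sequence can be sketched in a few lines. This is a minimal illustration, not the actual module code: the activity IDs and function names here are hypothetical placeholders, and the DOM wiring that would enable or disable the assessment link is omitted.

```javascript
// Sketch of a progress gate: the summative assessment stays locked
// until every required formative activity reports completion.
// Activity IDs below are hypothetical placeholders.
const formativeActivities = ["menu-matching", "branching-scenario", "retrieval-check"];

function createGate(requiredIds) {
  const completed = new Set();
  return {
    // Record one formative activity as finished.
    markComplete(id) {
      if (requiredIds.includes(id)) completed.add(id);
    },
    // Unlock only when every required activity is done,
    // preserving the instructional sequence.
    assessmentUnlocked() {
      return requiredIds.every((id) => completed.has(id));
    },
  };
}

const gate = createGate(formativeActivities);
gate.markComplete("menu-matching");
// gate.assessmentUnlocked() → false until all three are complete
```

The design choice worth noting: the gate checks a required list rather than counting clicks, so a learner who revisits an activity twice still unlocks exactly once, in the intended order of practice before assessment.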
Post-launch, floor coaching requests from new associates during their first two weeks dropped. The branching scenario's feedback was later adapted into a quick-reference card used by shift supervisors during on-the-job coaching conversations.
Portfolio
How I Work
I start with the performance gap, not the content list. Stakeholder interviews, task analysis, and learner context research determine whether training is the right intervention — and what it actually needs to accomplish.
Objectives are written in behavioral terms. Assessments are designed before activities. Storyboards and interaction blueprints map every decision before development begins — because it's cheaper to fix a storyboard than a built module.
Activities are built with a specific cognitive purpose — not just variety. Each interaction type is chosen because it serves retrieval, schema building, or transfer. Formative feedback explains the reasoning, not just the right answer.
Content is deployed with SCORM or xAPI tracking configured to capture meaningful learner data — not just completion. Facilitator guides are built into the rollout plan when the performance context requires it.
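"More than completion" in xAPI terms means sending statements about specific learner decisions, not just a passed/completed flag. The sketch below shows the shape of one such statement under the spec's actor–verb–object structure; the learner, activity IDs, and result values are invented for illustration, and the POST to the LRS's statements endpoint (with its auth headers) is left out.

```javascript
// Sketch of an xAPI statement capturing a meaningful interaction,
// not just completion. Actor, activity IDs, and result values are
// hypothetical; a real deployment would POST this to the LRS.
const statement = {
  actor: { mbox: "mailto:learner@example.com", name: "Sample Learner" },
  verb: {
    id: "http://adlnet.gov/expapi/verbs/answered",
    display: { "en-US": "answered" },
  },
  object: {
    id: "https://example.com/activities/branching-scenario/step-3",
    definition: { name: { "en-US": "Guest allergy question" } },
  },
  result: {
    success: false,
    response: "recommended-item-without-checking",
  },
};
```

Because the `result` block records what the learner actually chose, statements like this can surface which scenario branches fail most often, which is exactly the data that feeds iteration at Levels 2–3.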
Evaluation is scoped at Levels 1–4 based on what's measurable and meaningful for the client. Learner data informs iteration. The goal is always the same: did behavior actually change?
Expertise
Let's Connect
I'm open to new projects, collaborations, and conversations about learning design — especially work where the goal is durable capability, not just a course that gets checked off.
Tell me about the performance gap you're trying to close. I'll tell you whether — and how — a learning design solution could actually address it.
Send a Message →