Learning Technology 2016

This is Rob's review of the conference; Fraser and Karen have each written their own.

One thing was apparent at the Learning Technologies show: there were a LOT of companies wanting to show how well they manage learning in their organisations (and even more trying to sell you something on the back of it).

Fast-forwarding, for a moment, to my thoughts at the end of the day: the consensus was that great success comes from “bigger picture” thinking, i.e. tying your learning and development into the overall company performance-review structure, and having learning professionals who oversee, guide and recommend staff through a development journey that is more relevant to them and their needs.

This was central to a seminar I saw from Haringey Council, who over the last 12 months have deployed a portal bringing all “learning and performance in to one place”, to push the organisation “towards a learning and performance culture”, built solidly around the ethos of “It’s all about the conversation”.

From this general message, two seminars stood out for me:

  • Can e-learning do everything? – Graeme Youngs, Head of Learning at Omniplex
  • 70:20:10. – Charles Jennings

70:20:10 - It's about results not activity

Charles Jennings, Director of the Internet Time Alliance, a co-founder of the 70:20:10 Institute, and a director of Duntroon Consultants
Martin Baker, Managing Director of the Corporate eLearning Consortium

Let’s look at 70:20:10, first. A key quote from Charles was:

“If you ask a Training Consultant for training, you will get training”

and I think, over the years at Sky, this has proved true. We don’t explore deeply enough whether structured training is actually needed, nor have we been influential enough to explore other ways the need could be met. He argued we should “think more like an architect, than like Bob the Builder”.

Charles was passionate that certain roles and responsibilities were key in creating the perfect learning culture in an organisation; one that moved “away from courses, and towards campaigns”, where “learning is embedded within performance”.

The video above, by Charles Jennings, explains the basics of the 70:20:10 model.

The key roles Charles defined for 70:20:10 to truly succeed were:

  • Performance Detective:
    An all-round performance (learning) consultant who can field requests and scope tactical interventions alike, identifying what learning is actually required and guiding it down the appropriate 70:20:10 route… (and/or blended approach).
  • Performance Architect:
    These would validate and agree the design of learning, and design for the 100 (70+20+10), not just the 10; making sure that we use what we learn, and particularly focusing on making sure that, during the 70, the learning is available at the “point of need”.
  • Performance Master Builder:
    This person will start with the critical tasks and co-create effective solutions, always reviewing against the 70:20:10.
  • Program Game Changer:
    This is the role that really changes “courses to campaigns”. It’s no longer a course we attend, but a way of life. A project manager of sorts, who owns the whole process (internal comms, marketing, etc.). They will manage the full comms of the project, ensuring that the embedding, programme plans and line-management support all happen and work together.
  • Performance Tracker:
    This role is as we’ve come to think of evaluation, but bigger. They will obviously focus on measurement, results and the plan, but will really drill down into stakeholder metrics to fully monitor impact and return on investment. From this, they will produce a comprehensive measurement plan and report focused on performance improvement throughout the 70:20:10 rollout, not just the NPS of training.

If you would like to read more about the roles, take a look at this link

It’s worth noting that Charles did say it’s not imperative to have these exact job titles, as long as the responsibilities all exist, are clearly defined, and work together within the wider structure of roles.

Through all of this, Charles further mentioned that it’s important these roles begin to think in reverse from what they are used to; particularly the Detectives, Architects and Trackers, who are involved in the original identification. The two main ways he described this were in terms of evaluation and classroom learning.

  • a) He argued that we should look to place classroom training at the end of the learning journey, as an embedding and consolidation tool led by a facilitator… rather than at the beginning, as a knowledge dump, leaving people trying to remember the key learnings when embedding.
  • b) To support this, the evaluation needs to concentrate on the performance of the role first, before the effectiveness of the training… What is the point of asking someone if they are comfortable doing their job after a training course, when they haven’t really tried the role on for size yet? Charles argued that while the widely-used Kirkpatrick model of training evaluation is good, companies rarely get to Level 4 of it, making it ineffective.

All really interesting stuff from Charles.

"Would you trust a brain surgeon, who had only learnt the process via eLearning?"


Converting classroom materials for online delivery: is eLearning the only answer?

Graeme Youngs, Head of Learning, Cursim/Omniplex

So, onto Graeme Youngs, from Omniplex.

Omniplex are a provider of eLearning within the learning community, and Graeme asked one simple question…

“Can eLearning do everything?”

His (very honest) assessment was, “no”. His extreme example: “would you trust a brain surgeon, who had only learnt the process via eLearning?” (Thinking of 70:20:10 here, would you trust a brain surgeon who had only learnt the process in a classroom?)

Rather, the “overused term” of blended learning was the key here, which fits neatly into the 70:20:10 model. Yes, eLearning would be great for learning to identify the parts of the brain, but it would need to be blended with many other approaches to be truly successful.

Thinking of the model again, Graeme emphasised that “the method of delivery has to be decided after we have defined our learning objectives and outcomes”… Yes, there is no getting away from Learning Objectives & Outcomes…

He used the diagram below to illustrate that, in order to “break a classroom course down into what it could be, before deciding to replace it”, you need to understand the options available and what their strengths are, as well as what needs to be “learnt”.


Graeme argued that electronic means are more suited to information-based learning, while physical learning is planned more around activity; the sloping line between them dictates the level of “blending”.

He focused on exploring many options and picking the right one(s), concentrating on these five:

  • eLearning (Captivate, Storyline…etc)
  • Virtual classrooms (GoToTraining, Adobe Connect…etc)
  • Chat/Conferencing (Microsoft Lync, Skype for Business…etc)
  • Forums (Yammer, Chatter…etc)
  • Face-to-face (Classroom).

His key presentation point was then to consider “how we decide” what to use from all of these. To do this, he broke his “scoping” down into five areas:

  • Information
  • Individual Activity
  • Assessment
  • Group Activity
  • What else?

Below is a recreation of the slides Graeme used to illustrate the ways in which to explore the best method to truly blend your learning, and get the most from it.

Slide 1
Slide 2
Slide 3
Slide 4
Slide 5

In short, this guidance offers a lot to our world of design and consultancy when we consider the best solution, and the most effective learning method, for our stakeholders and customers.

So, in conclusion, it was a great day down at Olympia, which really energised me to think about the bigger picture, 70:20:10 and blended learning.