- Investing the effort in defining your learning outcomes up front, and ensuring that people buy in to them, is hugely beneficial. Even though the way the learning outcomes are achieved will be different in this environment, we were still able to use them as the driver for the solution we came up with.
- Getting the right local people involved is critical. More by luck than judgement we had some very influential local people involved in the process. This really opened doors and smoothed the way so that we were able to agree on a solution. Next time I don't want to rely simply on luck…
- Taking the time to really understand the local environment, and not making assumptions, is important. I tried really hard for the first couple of days to put aside my preconceptions about what the 'answer' should be and just absorb as much of the local culture as I could.
- Getting the local team to come up with the solution worked really well. I had prepared my thoughts on some potential options and also the criteria by which they should be assessed. This actually facilitated some really valuable discussion that led to them coming up with their own (and better) solution - great result!
Saturday, 30 May 2009
Friday, 22 May 2009
OK, I have to admit that I’m a little excited today! I am lucky to be in a job that I enjoy – it gives me the challenges I need at work but also enables me to balance it with other things that I enjoy in my life, with my family being the priority. However, right now I am sitting on a plane bound for the Middle East thinking about the week ahead and the challenges it will bring.
I have formally worked in learning and development for about five years now, but I have been involved in L&D for much longer. In that time I have been engaged in many aspects of L&D, ranging from the design and implementation of learning programmes through to talent management and coaching. However, this week I have a new challenge: adapting a learning programme so that it can be deployed in a completely different culture.
Working for my current client I have been heavily involved in the creation of a learning programme which is being deployed through a mixture of elearning and ‘classroom’ based sessions. It’s more about raising awareness and building a common level of understanding rather than ‘training’ but it’s really been designed with a ‘Western’ audience in mind (USA, UK, Australia and some parts of Europe).
Deploying this learning programme to a Middle Eastern country just won't work in its current form. Why not?
- The obvious one is language. Whilst we have translated the course into a number of different languages already, Arabic isn’t one of them. The specifics of the Arabic language bring their own challenges, especially with an elearning course.
- The main one, though, is culture. The 'classroom' sessions involve exploring and discussing some case scenarios that are designed to challenge people's thinking and help them understand various perspectives. The simple assumption that this will also work in a Middle Eastern culture is just not true. Whereas in my culture people are used to discussing an issue and then listening to others' points of view, this isn't necessarily the case in a Middle Eastern culture. Here people are brought up to have a strong point of view and it's a very proud culture – showing that your view isn't the 'best' may not be easy.
So for me there are a couple of important principles that I intend to follow:
- We spent a long time working through and agreeing the required learning outcomes (knowledge, skills and attitudes). These were the driving force in designing the programme and measuring its success. Whilst the way the outcomes are achieved may be different, I will be looking to ensure that they are still central to all of our thinking.
- I need to make sure I really understand the culture and not make any assumptions about what I think will work. Pay attention to the small things as well as the big items!
- Understand the audience: I intend to spend significant time understanding the various audiences that will be taking part in the learning programme. What's different about each of them, and what do they have in common? How will I know that what we are suggesting will actually work?
- Help the local team develop something rather than 'do it for them'. I think it's really important that the local team truly own the piece of work – after all, they are the people who really understand their environment and who will have to deploy it. Picking up someone else's work is a recipe for failure…
Well, those are my initial thoughts – I wonder what other factors will come into play… I'll keep you posted! If you have any advice, it would definitely be welcome!
Saturday, 16 May 2009
- Create a clear and integrated cross-government strategy for economic transformation and renewal.
- Develop a simpler, more agile and demand-led skills and employment system, capable of anticipating and addressing both existing skills needs and emerging industrial opportunities and challenges.
- Transform individuals' aspirations, maximising motivation and opportunity for everyone to develop their talents.
- Build employer ambition and capacity to be world-class, capable of competing globally as high skill, high value added organisations.
- Support better integration of skills into economic development activity in cities and local economic communities.
Saturday, 9 May 2009
Someone said that “feedback is the breakfast of Champions” – it’s a bit of a cliché I suppose but my recent series of posts about feedback hopefully shows that the way it is delivered can have quite an impact – be it positive or negative.
In my work at the moment I have been evaluating some initial feedback from training that is currently being rolled out on a global basis. The audience for the training is large (100,000 people) and diverse so designing an intervention for such an audience has certainly been a challenge. Results so far are encouraging and we are now looking at what we do with the information. The process has made me think about some key learnings that I wanted to share with you:
- What are you trying to evaluate? For me there are broadly two kinds of things that are useful to evaluate:
- Did the intervention meet your desired learning outcomes? Assuming you defined some learning outcomes (knowledge, skills and attitude), how will you know if people have achieved them? Without wishing to delve into a debate about evaluation and Kirkpatrick, it is often valuable to ask delegates how well they have met outcomes. For example asking them if they feel confident in their ability to do something new, whether they know where to go for help etc (assuming these are defined learning outcomes).
- Was the method of delivery appropriate? You will probably be able to tell this from the answers to your first set of questions about learning outcomes. However, it is useful to know why: was the course too fast or too slow; was the content pitched at the right level (e.g. was it patronising); would they have preferred classroom or elearning; was it relevant; and so on.
- How are you trying to collect evaluation data? There are many ways to obtain feedback from your audience and you need to select the most appropriate medium. This could be via a survey (preferably online) but could also include a more qualitative follow-up via focus groups or one-to-one interviews (a recommended approach). It is important to aim for a representative view: gather the quantitative data first, then analyse it.
- Look for themes. A mountain of statistics from your survey results can be quite daunting, so the first activity is to pull out some themes. Start at a high level – did we broadly meet our learning objectives? Then drill down further – did we meet all the objectives fully, were some met better than others? Once you have some themes, the next step is to get some qualitative information to validate the 'why' – use focus groups, workshops and interviews to explore the themes further.
- Don't just focus on the negative. I always find it tempting to look first at the negative comments – what didn't people like, what didn't work? However, it is just as important to understand what worked well, and why – you can only replicate successes in future courses if you understand them.
- Make it easy for people to give feedback, or provide an incentive. Most people aren't too keen on providing feedback, and often it is only the people with strong feelings (positive or negative) who will provide it. So it's important to make it as easy as possible for people to contribute their views.
- Try to collect the feedback as soon after the course/event as possible. The longer you wait, the less likely people are to provide it.
- Give people an incentive for sharing their views. This doesn't have to be a prize – it can be as simple as letting people know how important their views are and how they will help shape future courses/events.
- Follow up with delegates. They took some of their valuable time to share their views with you so the least they deserve is to know what the information will be used for. Provide a summary of the key themes (what worked, what needs some attention) and what actions will specifically be taken as a result. Oh and one other thing – remember to say thanks.
- Do something with it. There is no point in collating the feedback if you are not going to do anything with it. Sounds obvious – but I have seen feedback ignored on a number of occasions. Even if the message can be hard to swallow you need to take some action. Also remember to pass on any learnings to other parts of your organisation that may find it useful. If people didn’t like an approach to online training in a particular part of the business then this could be valuable information.
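To make the theme-spotting step a little more concrete, here is a minimal sketch in Python. The outcome names, the 1–5 agreement scale and the threshold are entirely made up for illustration – they are not from any real survey – but the idea of averaging scores per learning outcome and flagging the weak ones for qualitative follow-up is the same:

```python
# Hypothetical sketch: summarising survey scores per learning outcome.
# Outcome names and the 1-5 scale are illustrative only.
from statistics import mean

# Each response maps a learning outcome to a 1-5 agreement score.
responses = [
    {"confidence": 4, "know_where_to_get_help": 5, "relevance": 2},
    {"confidence": 5, "know_where_to_get_help": 4, "relevance": 3},
    {"confidence": 3, "know_where_to_get_help": 5, "relevance": 2},
]

def summarise(responses, threshold=3.5):
    """Average each outcome's scores and flag those below the threshold
    as themes worth exploring in focus groups or interviews."""
    outcomes = responses[0].keys()
    averages = {o: mean(r[o] for r in responses) for o in outcomes}
    flagged = [o for o, avg in averages.items() if avg < threshold]
    return averages, flagged

averages, follow_up = summarise(responses)
print(averages)   # mean score per outcome
print(follow_up)  # outcomes to probe qualitatively
```

The flagged outcomes are a starting agenda for the focus groups and interviews, not conclusions in themselves – the qualitative work is what explains the 'why'.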
I would be very interested to hear your learnings too – if you can find a couple of minutes then please do let me know.