Sunday, September 25, 2022

Week Three Discussion Posts

Q1. This is the... uh... third question I ask every client when kicking off a project: "What does success look like for you?" Oftentimes, it's the same ol' "We want them to like it" or "People took the training" (whatever the hell that means), but every once in a while I'll get an honest answer. It's a deceptively simple question when you really try to answer it honestly. Look deep into your own environment's experience design and ask yourself: What does success look like? What does success look like for your (various) stakeholders? How will you know this learning intervention succeeded?

Success looks different for each project and is completely dependent on its purpose. In the past year, I have been responsible for creating two online modules that will be mandatory for all new permanent full-time faculty. Here, success would be engagement, motivation, and the learners' ability to apply what they have learned. In other words, have they achieved the learning outcomes? The online modules include a number of assessments, discussions, and a final project; the module is pass/fail based on the completion of all of these items. There is a final evaluation of the course as well.

I am also responsible for producing one-off professional development sessions. Here, success would be attendance: were people interested enough in the topic and description to attend the session? Then I would ask whether the material was valuable, applicable, and useful. We send out evaluations here as well, which ask for feedback on the session along with ideas for further offerings.

Q2 a) What's a data point that you currently work with and what can it tell you (and not tell you) about your learners or the experience you have designed?

This is an interesting question, as I am not directly involved with any data points. All of this work is handled behind the scenes by an administrator; I am responsible neither for creating the surveys nor for what happens with the information the data gives us.

I believe we measure attendance, engagement in Zoom sessions (whole session, half session, etc.), and then the evaluation survey (rating out of 5, comments, key takeaway, going further [learning more about the topic]).

Q2 b) What's a data point that you would like to collect (assuming unlimited resources/permissions) and what could it tell you about your learners or the experience you have designed?

It would be very interesting to do an environmental scan of polytechnics in Canada and then compare those results to our institution. What are we doing well? What are the areas we could improve upon?

Q3. This week, we took a look at the LX Canvas. Between Plaut's model, the CUBI model and the LX Canvas, which one do you prefer and why? 

The LX Canvas is intuitive, user-friendly, and can be immediately applied to my work. It is set up much the same as the lesson plan template we use at my institution. It poses the important questions that need to be pondered in order to create a thoughtful, organized, and meaningful learning experience.
