Well, here we are. The Course 5 journey has been long, exhausting, and informative.
It started with me designing a unit plan that would add some technology to a unit focused on how the world works in terms of simple machines.
In this introductory post about my project, I outlined how I planned to navigate a few heady concepts with my Grade 2, mostly English Language Learner, students.
Here is a quick summary of my learning objectives (which were actually scaled back considerably from some previously planned coding objectives):
- Students would gain more insight into how to create, cultivate, and add to a personal learning network;
- Students would make individual practical choices about which types of technology/apps to use for various purposes;
- Students would understand, and make use of the fact, that computers and iPad apps help us engage in valuable simulations of real-world forces and how these relate to simple and complex machines;
- Students, in discussing simulations, would also build up their use of academic vocabulary. I hoped that simulations would help students discuss variables using explicit vocabulary. For example, I wanted students to use a simulation of a catapult and describe how moving the <fulcrum> <further / nearer> to the <load> would affect the <distance> the load would move.
How it all went…
What I quickly noticed while trying to teach this unit was that I was simply not going to have as much time and access to my students (almost a third of the Grade 2 year group, who had placed into our school’s EAL program) as I needed to meet all my objectives.
Unfortunately, this meant I was not able to offer students many opportunities to choose appropriate tech tools for a variety of tasks (beyond an initial decision on whether to use Google Web Search, Google Translate, or Google Images search when trying to understand unit vocabulary). As it turned out, I had to teach students some aspects of Google Slides, Google Forms, Tinybop Simple Machines, and Padlet, as they were using these apps for the first time. Therefore, students did not make choices about tech use beyond whether to create videos or type responses on Padlets. In end-of-unit interviews we did discuss which apps they felt helped them learn more and why, but I compelled them to use all of the tools I had set out to introduce.
I also did not have much of a chance to help students cultivate a personal learning network beyond searching my blog for curated research links and class products. I tried to set up links with other schools that might be studying simple machines, but to no avail.
Students did, on the other hand, get much more experience contributing information online and collaborating with others on digital learning products. They all added to class Padlets about simple machines. They filled out Google Forms and then reviewed data contributed by students in other classes. They also created Google Slides presentations, some with videos, that showed the process of their scientific explorations of simple machines.
On the whole, however, students did meet the other objectives. They realised the utility of computer simulations, as can be seen in some of the end-of-unit interviews in my video. They understood how computer simulations allowed us to use materials that might be impossible to use in real life, allowed us to be more creative, and allowed us to perform a greater number of tests.
In addition, the use of computer simulations very much prompted students to push their use of language, especially when simulations used labels that reinforced the vocabulary I had taught them as an EAL / Language Acquisition teacher.
The Final Project Video
The main challenge of this course was the creation of my video. I wrote to Ben Sheridan, our cohort facilitator, earlier this month that I was obsessed with this movie. I have to say that the process of documenting, conceiving, scripting, storyboarding, filming, editing, recording a soundtrack, and compressing a movie, not to mention learning about the various technologies you need to do all of this, was engrossing to say the least.
The first step of making my video involved checking out a tripod and, from my school’s library, what I thought would be a great camera to use, because various teachers had mentioned how good its microphone was. (The camera later turned out to be a bit obsolete compared with the HD video recording capability of my iPad and iPhone, but it had much better optical zoom, which, I have to add, I did not really use, as I did not have a camera operator.) I then began to blanket-record every teaching session that had a clear connection to the unit. Most of this footage went unused in my film, but it was still valuable to watch in order to process how the teaching and learning went.
After the unit finished in early April, I began to review the footage to help myself think of a clear narrative for the film. I decided to go with a sequential narrative based on the timeline of planning and teaching the unit. I then drafted a script by hand in my blue project notebook while sitting in many coffee shops. One morning I used the dictation feature of my Mac to turn this into text, finding that the script ran to almost eight pages. I copied it into an invaluable app called Teleprompter Lite and read it out as it scrolled. Realising after nearly twenty minutes of reading that I had far too much detail in the script, I set about making drastic cuts, consciously thinking about the difference between what sounds good when read as text and what sounds good when spoken aloud. What I found is that I write in a very different way from how I speak. The script sounded far too technical and expository, so I began to think hard about how I could represent many of my ideas in visual form.
Here the fun began. I started thinking about how to turn my rather dry script into a video that was at least somewhat visually interesting, and drafted my storyboard with ideas for shots at a local coffee shop. (Big shout-out to the friendly folk at Deja Cafe, Phnom Penh.)
After conceiving about 50 shots and shot sequences, I began the long work of filming, creating animations, recording an audio soundtrack, and editing my existing footage using a Samson C01U Pro mic, GarageBand, and iMovie. I did all of this in my music studio, from which I had removed a keyboard. I knew early on that in order to create the animations I wanted, I would need a green screen of some sort. I walked around the garment/textile district that is Orrussey Market in Phnom Penh, settling on a metal stand-up clothes hanger, some green plastic sheeting, and green felt, hoping these materials would allow me to achieve decent green-screen effects. (In the end, they were not really enough; I also need to purchase real lighting at some point to counteract the darkness of my long apartment.) Finally realising that the best green screen I could create was a one-meter-wide strip of green felt hanging from the curtain rod in my studio, I set about arranging the camera, tripod, teleprompter, and microphone alongside my computer.
After filming and recording myself going through the script a few times, each time willing myself to be more of an actor, I had acceptable footage and a soundtrack with minimal errors in reading from the teleprompter.
I then began the three-week-long process of editing the video and photographic footage I had shot in my classes, finding images and video that were free to reuse online, and creating animated titles and clips that I could use for green-screen sequences. I spent days shaving seconds off clips, making sure that audio tracks were free from pops, and creating amateurish but workable animations using Explain Everything and iMovie.
The result is below. Please enjoy and thank you for any feedback you can offer.