I am a purist when it comes to music education. Let me explain. There are numerous studies on the benefits of music education – how music lessons make children smarter, how playing an instrument benefits the brain, and so on. So, through my Orff and Kodály training, that is the approach I take in my lessons. We create music through our voices, our bodies and our instruments. How, then, does technology play a role?
Hi, my name is Kehri, and I need help authentically integrating technology in my elementary music classes.
I prefaced my approach to teaching music – it is created through our bodies, our voices and our instruments. I believe that developmentally it starts there. Bruner thinks so too. Since this is not a psychology blog, you can read more about him here. The Orff approach (which is the basis of my teaching practice) allows students to create, experience, and improvise music in, again, the most organic way: through our bodies, voices and instruments. Where is the research study suggesting that music making VIA TECHNOLOGY also has its benefits? I suppose one could argue that a pencil and paper is technology; but, you all know what I mean. Perhaps I just haven't found THE THING that combines the best of both worlds.
When I taught MYP, I gave myself a pat on the back for "integrating" technology in music class by using Google Classroom. We would access our classroom with 1:1 Chromebooks, and students uploaded video recordings of themselves performing on instruments. Google Classroom gave us a platform for submitting assignments and providing feedback and reflection. But after discovering Dr. Ruben Puentedura's SAMR technology integration framework, I realized that much of what we were doing was only enhancing the lesson, not transforming it. Ugh. I really thought we were making strides. Back to square one.
What am I missing? Perhaps I haven't scoured the latest apps or programs. As far as I know, there are apps that offer fun games with a "learning element". Rhythm Cat is a game that reviews and challenges your ability to read music notation. GarageBand is an app that allows students to create an excerpt – a short music sample – by combining loops or prerecorded sequences. It has many more uses, but for now I'll stick with creating excerpts. But shouldn't the students KNOW how to create those loops or sequences acoustically? And for those who say that students can compose digitally, you're correct – there is a browser-based app called Noteflight that does just that. But then again, insert pencil-paper here. Maybe I'm just old school and need to get with the program. Program. See what I did there?
I feel like I’m stuck in the enhancement phase. I’d like to get to the transformation phase. But I just don’t know HOW.
I’m just a girl, sitting in front of a computer, asking all of you, to help her with technology integration in an elementary music setting.
Open to ALL suggestions.