Constructing my hackable classroom – Part 1 #ThisIsMyClassroom


In late 2015 I had the opportunity to sit down and design a classroom environment suited to Computer Science students. After surveying my students and gathering lots of ideas for what they considered their optimal learning environment (including one which resembled a living room with a C-shaped sofa – tempting, I’ll admit), I let the ideas stew while I was over in London in January for the Apple Leadership event and BETT 2016.

My interest in flexible learning environments rolls way back to my time at Inverurie Academy, when I worked in an open-plan floor of six classrooms. I posted my first #ThisIsMyClassroom blog in May 2011 as a way of recording the changes to my classroom environment. Even then I was asking students how they would like their learning environments to be arranged, and I remember the 3D walkthrough videos created by a great S3 class. It was truly excellent work that culminated in a video conference with Anna Rossvoll, who was at that time creating her own flexible learning environment at Hill of Banchory school in Aberdeenshire. I’ll try to find these videos and upload some of them.

Since the launch of the Raspberry Pi, Computer Science teachers have had an ever-increasing opportunity to embed low-cost working models in their classrooms. While at Robert Gordon’s College I set up a separate Raspberry Pi lab (imaginatively titled PiLab), but when we moved to new classrooms in 2015 I integrated the Raspberry Pis into my Computing classroom and made them part of the curriculum rather than an extra-curricular club.

I also used my experience from attending the PiCademy in Cambridge to investigate how Raspberry Pi might be used to allow students to access previously static areas of the classroom environment and bring them to life.

Perhaps the final piece of the inspiration puzzle came when I visited OnHouse Milano during last session. While it is primarily a showcase design home, I had a great discussion with their programmers about how they use themes and scenarios to integrate a number of systems. This gave me the idea of creating Python API wrappers that let students easily access a number of hackable devices from the same program. These libraries could then be imported into a student’s programming environment and let them, for example, take the colour sensed by a Raspberry Pi camera and mimic it in the Philips Hue lighting system.
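
To make the idea concrete, here is a minimal sketch of the kind of wrapper I have in mind. It assumes the picamera, Pillow and phue packages are installed, that the Hue bridge is reachable at a placeholder address with its link button pressed, and that light 1 exists; the function names are illustrative rather than a finished student-facing API.

```python
# A minimal sketch (not the finished classroom library): read the average
# colour seen by a Raspberry Pi camera and mimic it on a Philips Hue lamp.
# Assumes the picamera, Pillow and phue packages are installed, the Hue
# bridge is at BRIDGE_IP (placeholder) with its link button pressed, and
# that light 1 exists. Function names here are illustrative only.

import colorsys
from picamera import PiCamera
from PIL import Image, ImageStat
from phue import Bridge

BRIDGE_IP = "192.168.1.10"  # placeholder address for the Hue bridge


def classroom_colour(path="snapshot.jpg"):
    """Capture a photo and return its average (r, g, b) colour, 0-255."""
    with PiCamera() as camera:
        camera.capture(path)
    image = Image.open(path).convert("RGB")
    r, g, b = ImageStat.Stat(image).mean  # mean value of each channel
    return int(r), int(g), int(b)


def mimic_on_hue(rgb, light_id=1):
    """Set a Hue light to roughly the same colour as the captured image."""
    h, s, v = colorsys.rgb_to_hsv(*(c / 255 for c in rgb))
    bridge = Bridge(BRIDGE_IP)
    bridge.connect()  # press the bridge's link button before the first run
    # The Hue bridge expects hue in 0-65535 and sat/bri in roughly 0-254.
    bridge.set_light(light_id, {
        "on": True,
        "hue": int(h * 65535),
        "sat": int(s * 254),
        "bri": max(1, int(v * 254)),
    })


if __name__ == "__main__":
    mimic_on_hue(classroom_colour())
```

In a lesson the two functions could sit in a single module that students simply import, so the plumbing (camera capture, bridge connection, colour conversion) stays hidden behind one or two friendly calls.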

I still want to keep the same classroom environment ethos as I introduce more (relatively low-cost) interactive technology to the classroom – the students connect more when they display their work. So areas of the room are set aside for student posters which can then be augmented using Aurasma, CodeBug projects can be displayed in a gallery area around the LAUNCH posters, and the robotics created by students in extra-curricular clubs are always on display. It does sound like I’m looking forward to the room becoming a slightly updated version of Eduardo Paolozzi’s studio.

At this point the desks are in, the screen is in a more suitable position so that all students can view it, the double whiteboards are up and the power provision in the classroom has been enhanced. There are also elements of the hackable classroom in place, and the students will begin to use these as part of their lessons in the coming weeks and months.

Augmented Reality using a webcam and laptop

I’ve long been jealous of those mobile smartphone types with their fancy embedded cameras and their Junaio, Layar and Aurasma apps. With my iPod Touch 2G I can get the apps but not the content and the closest I can get to mobile augmented reality is to watch cool videos on YouTube as I walk, or stick post-its to my headphones and play pretend…

I set myself a summer holiday target to find out more about Augmented Reality and to try to get it working on a laptop. I knew it was possible, but Google searches tended to get bogged down with iOS or Android apps. This evening, however, I stumbled upon a web-based service called EZFlar. The site allows you to link an image, 3D model, movie, text or hyperlink to one of five fixed marker images extremely quickly (I’m not too sure how it handles recalling the generated AR projects though – here’s a link to what should be an image of Bloom’s Taxonomy…). This blog also briefly discusses how to extend this by downloading the EZFlar program to your own machine and indulging in a bit of Flash ActionScript coding. Definitely something I’m going to try out!

I also put a tweet out asking for help in finding laptop-friendly AR applications. I had two responses, both suggesting http://www.arsights.com, which uses Google Earth 3D models and a fixed marker image. It was really quick and easy to get started, and I can see this being used with classes for quick demonstrations of Augmented Reality. There’s a suggestion that you can use Google SketchUp to create your own 3D models and then submit links to the ARSights warehouse, but I haven’t investigated it as I haven’t used SketchUp before.

So what could these applications be used for in my classroom?

  • a multimedia treasure hunt using EZFlar to store videos / clues to keep the game going
  • a fun way to display pupil work by pinning printed AR markers on the walls rather than a black and white print out of their graphic work / animation / movie
  • a method of allowing pupils to explore digital representations of computer hardware which is too expensive to buy or too fragile to hand out
  • a fantastic way of starting group tasks by using embedded audio / video on an AR-ready placemat in the middle of the group. Scanned by a webcam or mobile device, this could engage all types of learners as well as offering differentiation in the ability to replay these movies on demand (or offer extra AR markers if required)

I want to finish this blog post with a few videos I saw on YouTube. 110 Stories is an augmented reality app proposal currently attempting to get Kickstarter funding. I thought it was a great use of AR – I hope you do too.