Evaluating Coursera for Blended and Online Learning (Part 1)

Adjusting back to the heat of Milan in August is taking the family a little time, which means broken sleep all round. Add a fantastic thunderstorm right overhead at 3am and you have the perfect recipe for an early start to your day!

Inspired by some of my notes from reading Jay Ashcroft’s The Tablet Revolution (see review below) I decided to investigate alternative MOOC platforms to iTunesU. I love iTunesU, but the iOS app is a far superior experience to the one a student gets using iTunes on a MacBook. I flitted between Coursera and Udemy for a while without really finding a suitable course for comparison before stumbling upon an old article related to e-Learning: Most Popular Online Courses for eLearning Professionals. It seemed familiar, and my Evernote concurred that I’d been here before.

While many of the courses listed are now long gone, I found that Georgia Tech had just begun running a course titled K12 Blended and Online Learning. I decided enrolling would be useful on two fronts:

  1. Allow me to evaluate the Coursera platform
  2. Further my own professional development in the area of blended learning

Week one concentrates on the standards and documents from iNACOL. For anyone interested in deepening their understanding of blended learning I highly recommend visiting their site. Of course, Georgia Tech have linked all the required reading into their MOOC for you.


The instructional videos were clear, each less than eight minutes long, and punctuated with short multiple-choice quizzes. I do wonder, however, whether Coursera allows different types of questioning similar to EdPuzzle (which I love). The iOS app reminded me of iTunesU a little, especially the ability to download videos for offline viewing.


I do wonder what the extra space is for in the video player… Might have been nice to have a transcript here.


I realise that students might be accessing Coursera on their MacBook so their experience will probably be different.

I completed the pre-assessment you can see linked in the screenshot earlier and found it very useful in helping to focus my targets. I’m comfortable with policy, online tools and classroom teaching, but want to delve deeper into intervention strategies that will enhance the learning of my online students. As the pre-assessment was a spreadsheet (with a PDF option too) I would have liked the ability to upload an image or type some notes to myself and link them to my current stage in the course. I couldn’t find a way of doing this outside of the discussion forums so will have to rely on Evernote instead. On reflection it is probably good to have my notes outside of my MOOC, just in case.

Supplemental information appears to be text-only with hyperlinks and this is fine. One of the pages had embedded PDF and XLSX files that opened in Coursera’s own browser. Clicking on a world icon then opened the file in Safari, from where documents could be opened in other applications or saved. It would have been nice to be able to select between Safari and Chrome as the default.

The discussion forums were basic but easy to navigate. Nothing I’d want to add there: there’s a reply and an upvote button for each post. I’m not sure how you are notified of new posts, but I hope to find out soon!

In summary this is a good start. I’ve not tried creating a unit in Coursera yet but as a user I’m finding it easy to learn and navigate. I’m also enjoying the course and picking up new tips along the way which I intend to share with you, dear reader, in another instalment.

Book review: The Tablet Revolution by Jay Ashcroft @LearnMakeruk

The Tablet Revolution by Jay Ashcroft (Amazon paperback & Kindle)

The Tablet Revolution Cover

As someone involved in the planning, design and implementation of a large-scale iPad rollout project a few years ago I found this book a fascinating read that echoed much of my own school’s experience. Each of the chapters covers a particular area that a school or education authority must consider before putting devices into the hands of teachers and students, with nuggets of wisdom from the other side of the fence: that of the sales or support team wanting to part you from your funds!

The case studies peppered throughout the book highlight the pitfalls of poor planning or vague vision as well as the better-known success stories. I found this refreshing as many articles and books tend to make it seem like every project has been a success whereas anyone involved in an IT project in education or commerce will know this is just not the case!

Screen Shot 2016-08-08 at 18.12.35

I highly recommend this book to those who are planning a rollout of devices to students, those who are in the midst of a rollout, and definitely to those who have been through the highs and lows already. I took the opportunity to reflect on my own approach, assumptions and practice as I read this book and think that Jay has given me insights into how to improve the learning and teaching outcomes of those using devices in the future.

Thank you for sharing this Jay!

Rating: 5 stars

As posted on Amazon.co.uk

You can read an interview with Jay Ashcroft here: http://www.ipadeducators.com/single-post/2015/04/03/INTERVIEW-Jay-Ashcroft

Cryptography #iGCSE #ComputerScience

One of the nice (and not-so-nice) things about vague arrangements documents is that you have to work out how best to fill the gap.

Encryption methods are part of the Computer Security section of the course – symmetric and asymmetric encryption to be exact – and, while I could simply put two slides up on the screen and move on, I saw this as an opportunity to break away from exam style questions and have a little bit of practical fun with the students.

I introduced my Year 10 students to the Caesar cipher this week, surprised that very few of them had even heard of it. They caught on quickly though and were soon manually translating coded messages back into plaintext.

But as the lesson progressed the messages got longer and the students began to find translation very time-consuming. So, after a bit of discussion, we decided it might be best to ask a computer to do the translation for us. We would still provide the ciphertext and shift key, and it would do the rest.

This allowed us (imagine that) to recap the relevant pre-defined functions for translating characters into ASCII codes and then build an algorithm to apply the shift key to those codes. Once the codes were converted back into characters they were ready to display on the screen! Most students managed to get to this point, with some even working on alternatives (more on that later).
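For anyone who fancies trying this with their own classes, here is a minimal sketch of the kind of translation program we ended up with, using ord and chr to move between characters and ASCII codes (the message and key are just examples, not from the lesson itself):

def caesar_decrypt(ciphertext, shift):
    # Shift each letter back by the key, wrapping around within the alphabet
    plaintext = ""
    for character in ciphertext:
        if character.isalpha():
            base = ord("A") if character.isupper() else ord("a")
            position = ord(character) - base        # letter position 0-25
            position = (position - shift) % 26      # undo the shift, wrapping round
            plaintext += chr(position + base)       # back to a character
        else:
            plaintext += character                  # leave spaces and punctuation alone
    return plaintext

print(caesar_decrypt("Phhw dw gdzq", 3))            # prints "Meet at dawn"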

The next day the students returned and were given a tougher challenge: decryption without the shift key! We discussed the letter frequency graph shown above and tried to create an algorithm to accurately calculate the shift key.

The open-endedness of this task challenged everyone and the variety of solutions suggested touched on some of the actual decryption methods utilised.

A few students suggested a brute-force decryption of a digest of the entire message, looking for a small number of short English words before comparing the calculated shift key against the frequency analysis graph. When challenged further they explained that this would decrease the processing time of the decryption algorithm, as it wouldn’t have to translate as many characters. Some students had even researched further to find out which are the most used English words in written text!
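As a rough illustration of the frequency-analysis idea (not the exact solutions the students produced), the sketch below guesses the shift key by assuming the most common letter in the ciphertext is really an ‘e’, then reuses the caesar_decrypt function from the earlier sketch; long_coded_message is just a placeholder for whatever the class is decoding:

from collections import Counter

def guess_shift(ciphertext):
    # Assume the most frequent ciphertext letter corresponds to 'e',
    # the most common letter in written English
    letters = [c.lower() for c in ciphertext if c.isalpha()]
    most_common = Counter(letters).most_common(1)[0][0]
    return (ord(most_common) - ord("e")) % 26

shift = guess_shift(long_coded_message)              # long_coded_message is a placeholder
print(caesar_decrypt(long_coded_message, shift))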

The students rounded off the lesson by creating a list of advantages and disadvantages of symmetric encryption. While the students took longer over the concepts this way, the approach gave them the opportunity to understand some of the ideas and issues with this type of encryption.

Next week we tackle asymmetric encryption. I can’t wait!

Social media generated art in Python #ThisIsMyClassroom #Programming #STEAM

For the third blog post on this topic I wanted to use Python to generate different pieces of art without relying entirely on the random function. I decided to use the tweepy library, mainly because I had already used it to post content to Twitter but had never investigated how it could be used to read information back from Twitter.

It didn’t take long to find out how to read the latest 10 tweets from my own timeline using Python. I split the individual words into a list and sorted them into alphabetical order (for no real reason at the moment, but frequency analysis will follow!), then used the write method from the Turtle graphics library to place each word at a random location on the screen. This was my first attempt:

Screen Shot 2016-04-03 at 23.50.08

A bit tricky to read the words, I thought. I’d also forgotten to penup before moving the turtle. However, this accidental vector spider web became part of the artwork (because when I removed it, the result looked quite boring).

A little while later I was able to change the font size at random (I settled on Palatino after experimenting with a few other fonts), and changing the pencolor in the same way as in my previous Python art programs changed the text colour too.

Screen Shot 2016-04-03 at 23.53.35

I had a lot of text to display, even just from 10 tweets, so I thought of ways to reduce the amount. I wrote a little Python subroutine that removed hashtags, mentions and URLs (as well as any other non-ASCII text) and that was enough!
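The subroutine was nothing fancy – something along these lines (a sketch rather than the exact code, which lives in the repo):

import re

def clean_words(words):
    kept = []
    for word in words:
        if word.startswith("#") or word.startswith("@"):
            continue                                 # drop hashtags and mentions
        if word.startswith("http"):
            continue                                 # drop URLs
        word = re.sub(r"[^\x00-\x7F]", "", word)     # strip any non-ASCII characters
        if word:
            kept.append(word)
    return kept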

The video below shows the program in action. I decided to make a video this time because you can make out the individual words much more clearly at the beginning of the drawing than at the end!

As before, the code is now on GitHub (with my tweepy details removed for security). I’ve left in a commented-out section of code that lets you run a search for a keyword, hashtag or phrase instead of taking the latest timeline, so you can experiment.
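If you just want the shape of the main program without opening the repo, here is a rough sketch (the keys are placeholders, the exact counts, fonts and coordinates are illustrative, and clean_words is the subroutine sketched above):

import random
import turtle
import tweepy

# Authenticate with Twitter - replace the placeholder keys with your own
auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_SECRET")
api = tweepy.API(auth)

# Grab the latest 10 tweets from my timeline and break them into sorted words
words = []
for tweet in api.home_timeline(count=10):
    words.extend(tweet.text.split())
words = sorted(clean_words(words))

# Write each word at a random position, size and colour
t = turtle.Turtle()
t.hideturtle()
for word in words:
    t.penup()                                        # the step I forgot the first time!
    t.goto(random.randrange(-300, 300), random.randrange(-250, 250))
    t.pencolor(random.random(), random.random(), random.random())
    t.write(word, font=("Palatino", random.randrange(8, 36), "normal"))

turtle.done()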

Any comments or improvements would be much appreciated!

Sound generated art in Python #ThisIsMyClassroom #Programming #STEAM

I had a lot of fun experimenting with the subroutines and Python Turtle methods yesterday but wanted to push it a little further and find out if I could make use of a new Python library to help create automated art.

Somehow I’d never built a program that utilises and analyses audio before, so I challenged myself to find out more about libraries such as PyAudio and Wave this afternoon. My daughter was practising piano in the other room, which gave me a push to integrate live audio into my solution rather than rely on pre-recorded WAV files.

I learned about numpy a little this afternoon too. I hadn’t realised it had functions to extract the frequency from an audio block (FFT). The more I explore Python, the more I fall in love with it as a language!

Once I’d successfully extracted numeric frequencies from the 5-second WAV file into a list, I looped through them and attempted to place shapes on the Python Turtle screen to correspond with the current frequency. I decided on a simple X-axis plot to begin with, but as the range between minimum and maximum frequencies usually exceeded 8000 I introduced a scale factor so they could all be seen on the screen together, and adjusted the Y axis so that each frequency appeared bottom to top in the order of analysis.
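For anyone curious, the frequency extraction and plotting boils down to something like the sketch below (the filename, block size and scale factor are illustrative, and it assumes 16-bit mono audio):

import wave
import numpy as np
import turtle

def dominant_frequency(samples, sample_rate):
    # Find the frequency bin with the most energy in this block
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

# Pull the recorded audio back out of the 5-second WAV file, block by block
wav = wave.open("recording.wav", "rb")               # filename is illustrative
sample_rate = wav.getframerate()
block_size = 2048
frequencies = []
while True:
    raw = wav.readframes(block_size)
    if len(raw) < block_size * wav.getsampwidth():
        break
    samples = np.frombuffer(raw, dtype=np.int16)     # assumes 16-bit mono audio
    frequencies.append(dominant_frequency(samples, sample_rate))

# Scale the frequencies so the whole range fits across the Turtle screen,
# climbing the Y axis in the order the blocks were analysed
scale = 8000 / 400
t = turtle.Turtle()
t.penup()
for i, frequency in enumerate(frequencies):
    t.goto(frequency / scale - 200, -250 + i * 5)
    t.dot(10)

turtle.done()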

Screen Shot 2016-03-31 at 18.18.40

Quite nice, but there’s a lot of white space where the unused frequency range lies. Instead of removing this range from the visualisation (which, in retrospect, might have been a good idea) I decided to attempt to create ghosts of the circles fading out as they get further from the original position. This led me into colorsys and all sorts of bother, reminding me (eventually) not to mess with anything that returns a tuple until I’ve converted it to a list first. Anyway, I removed that part of the code and put my arty effects on the back burner. You can see one example of the mess below. Ugh.

Screen Shot 2016-03-31 at 18.19.00
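For the record, the ghost effect I was aiming for looked roughly like this before I shelved it (a sketch, not the abandoned code itself) – the trick being to convert colorsys’s tuples back into lists before fiddling with them:

import colorsys

def faded_colours(rgb, steps=4):
    # rgb is an (r, g, b) tuple of 0-1 floats; return progressively lighter copies
    h, l, s = colorsys.rgb_to_hls(*rgb)
    ghosts = []
    for i in range(1, steps + 1):
        lighter = min(1.0, l + i * (1.0 - l) / (steps + 1))
        ghost = colorsys.hls_to_rgb(h, lighter, s)   # returns a tuple...
        ghosts.append(list(ghost))                   # ...so convert it to a list first
    return ghosts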

I decided to alter the colour of the background this time too. I think I’d like to use some audio analysis to decide on the colour range in a future version so that low audio frequencies create darker images and high frequencies create bright, bubblegum pop images.

Screen Shot 2016-03-31 at 18.06.42

The last thing I added to the program was the option to use pre-recorded audio WAV files instead of always recording 5 seconds of audio. This was very easy to add as I’d modularised the code as I went, so all that was needed was a few extra lines in the main program:

Screen Shot 2016-03-31 at 19.08.33
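In case the screenshot is hard to read, the extra lines amounted to something like this (record_audio and analyse_and_draw stand in for the real subroutine names in my program):

import sys

# record_audio() and analyse_and_draw() are stand-ins for my own subroutines
if len(sys.argv) > 1:
    # A WAV filename was passed on the command line, so skip recording
    wav_filename = sys.argv[1]
else:
    # No filename given: record 5 seconds of live audio as before
    wav_filename = record_audio(seconds=5)

analyse_and_draw(wav_filename)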

Trying out the program with a few WAV files from www.findsounds.com or playing a YouTube video in the background resulted in the following images:

chimpanzee.wav
uptown funk

Python files can be found on GitHub – https://github.com/familysimpson/PythonArt/. Feel free to fork the code, leave comments below or just enjoy the images it generates!

Computer Generated Art #thisismyclassroom #programming #steam

Screen Shot 2016-03-31 at 02.22.22

I wanted to create a task in which students write a computer program in Python that automatically generates its own artwork, but that is customisable so that each student could experiment and personalise their program to their own tastes.

Screen Shot 2016-03-31 at 02.15.10

It’s a rough Python 3 program using the Turtle library and a list of Turtles, but so far it has produced some really nice work. In the images shown below the program uses a user-defined function that draws a randomly sized square; I thought this would be easy for the students to understand and hack into something new!

Screen Shot 2016-03-31 at 02.15.24

Of course, art can be created as a response to an external stimulus, so a possible extension of this program would be to get input from the user (colours, mood, age) or calculate a range of colours from an input sensor or device (temperature, time, image).

Screen Shot 2016-03-31 at 02.15.38

The code is below! Any suggestions or improvements would be appreciated!

import turtle
import random

wn = turtle.Screen()
w = wn.window_width()
h = wn.window_height()

# Six turtles, each drawing its own trail of squares
t1 = turtle.Turtle()
t2 = turtle.Turtle()
t3 = turtle.Turtle()
t4 = turtle.Turtle()
t5 = turtle.Turtle()
t6 = turtle.Turtle()

turtles = [t1, t2, t3, t4, t5, t6]

def square(item, size):
    # Draw a square of the given size...
    for x in range(4):
        item.forward(size)
        item.right(90)
    # ...then step away and turn by a random angle, ready for the next square
    item.forward(size)
    item.left(random.randrange(-180, 180))

wn.tracer(False)
for iteration in range(3):
    # Scatter the turtles to random positions with random pen colours
    for item in turtles:
        item.penup()
        item.goto(random.randrange(-w, w), random.randrange(-h, h))
        item.color(random.randrange(0, 255) / 255.,
                   random.randrange(0, 255) / 255.,
                   random.randrange(0, 255) / 255.)
        item.pendown()
    wn.tracer(False)
    # Each turtle draws 2500 randomly sized squares before the screen updates
    for move in range(2500):
        for item in turtles:
            item.speed(0)
            square(item, random.randrange(5, 25))
    wn.tracer(True)

wn.exitonclick()

Screen Shot 2016-03-31 at 02.51.34

Screen Shot 2016-03-31 at 02.54.57

Using CodeBug tethered via USB on a MacBook

It has been a few weeks since our CodeBugs arrived here in Milan, and after playing around with some of the sample programs and thinking about their features I have decided to use them with next session’s Year 10 students as an introduction to the iGCSE Computer Science course in September.

While they worked really well with the Raspberry Pi, I struggled to get the CodeBugs working with IDLE on the MacBook. Installing packages via Terminal updated the Python 2.7 install that comes with the OS and – for me anyway – Homebrew complicated what should have been a very easy process. In Visual Studio, if you wanted to use a module library you simply added it to the project; IDLE does not have this feature.

I found PyCharm today – an IDE for Python that allows me to add the codebug_tether module (and any others I need) with the minimum of fuss. Now my CodeBug can be programmed while connected via USB to my MacBook! As an added bonus I learned more about Virtual Environments.
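Once the module is installed into the project’s virtual environment, tethered control is only a few lines – something like this sketch (the serial device path is a guess: check what appears under /dev/tty.* when the CodeBug is plugged into the MacBook):

import codebug_tether

# Connect to the CodeBug over USB serial (the device path below is an assumption)
codebug = codebug_tether.CodeBug('/dev/tty.usbmodem1411')

codebug.clear()             # blank the 5x5 LED grid
codebug.set_pixel(2, 2, 1)  # light the centre LED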

IMAG1237

To make it easier for my students to get going with their CodeBugs in September I created a 20-step guide linked here. It’s CC0 so please feel free to use and adapt as required. If you find any mistakes or it just doesn’t work for you in the same way please let me know.

Adding some WSQ to my #flipclass

I’m nearly a month into my flipped classroom approach and I’m already seeing the benefits (some of which I’m sharing as part of a whole-school INSET on Wednesday):

  1. Students are – in the main – responding well to the video introductions or lessons
  2. My tasks are becoming more diverse to cater for students who need additional challenges in the extended time we have in class
  3. My department website is the central focus of most of my lessons, where students can find or create sections on concepts
  4. EdPuzzle has been great at tracking video views and the embedded questions have helped me group students together where possible for remediation or further challenges
  5. Students are learning to make best use of the time in my class to move forward at a pace that suits them and to engage in deeper learning tasks

I’ve included a little screenshot of one of the pages of my department website to show you how I am beginning to embed deeper learning tasks into each concept.

Screen Shot 2016-01-25 at 23.37.00

While the layout isn’t pretty, it is consistent, and students are becoming used to completing the Task link (usually a Google Doc with some questions or challenges) before moving on to the Deeper Learning Tasks link.

I used an idea I picked up whilst completing my Google Certified Educator exams late last year: the Multi Media Text Set. This is where the student is given a number of different options – links to webpages, articles, videos, etc. – so that they have an element of choice in each lesson. Here’s a screengrab of some of the deeper learning tasks for the Machine Instruction Cycle topic:

Screen Shot 2016-01-25 at 23.42.44

I have to thank the great Voxer group I’m part of for keeping me motivated and focussed, and for sharing their own practices and challenges. One teacher (Shai McGowan) told the group about WSQ (pronounced ‘whisk’ – Watch, Summarise, Question) as a way of collating feedback from students on the flipped approach. I’m currently using a mixture of EdPuzzle, Kahoot quizzes and 1:1 conversations with students (now I have the time!!) to gauge their progress, but I’m interested in reading further. I did a little searching and found the following comprehensive guide to WSQing your flipped lessons:

http://flippingwithkirch.blogspot.co.uk/p/wsqing.html

The next step is to try the approach with a few classes. While my target class for the flipped approach has been my Year 10s, I have been teaching younger students the art of note-taking (Cornell style), so they should by now be more than capable of completing the Summary part of a WSQ. Come to think of it, I’d be very interested to see who does better – those who have been explicitly taught to take notes in a certain way or those who haven’t.

Inspiration from “The ON House” Milano #thisismyclassroom

I was lucky enough to find out about this house through a parent at the school and visited it today. The ON House has been created by Simontech to demonstrate the various home and office automation products that they sell, and how these can be integrated via an overarching web app.

As you might remember from previous blog posts, I am gathering inspiration and student wish lists, and researching classroom design, in order to develop a classroom with a clear Computer Science identity and purpose. In short, I want a classroom that can be customised to suit particular learning and teaching tasks but also become integrated into the lessons I teach.

I plan to take some of my students there so they can also gain an insight into what is possible with current technology. What I particularly liked about The ON House was that the technology was not obvious or overwhelming, yet the integration of that technology made the house more accessible and customisable.

During the visit I thought about how some of the technology could be integrated into a Computer Science classroom. For example, the ambient lighting presets could be used to indicate and suit different learning activities. Programmable colour-changing LEDs in the ceiling or floor could also be accessed directly by students in their programming lessons. There would have to be a way of restricting access to these lights to lesson time, though, so that my classroom did not become a disco when a student got home!

The ability to change, via an app, which machine is displayed on the short-throw projector would also be very useful in class. At my previous school a custom SMART panel was used to switch sources, which meant that any changes required software updates from the company. It also meant that if the panel broke, it cost a lot to fix and rendered the AV unit useless. Multi-platform applications that perform the same function as a custom panel would, I think, allow future changes to be made much more easily. Replacing an Android tablet or iPad mini would also be much cheaper than replacing a custom SMART panel.

Simontech also explained that access levels can be set within their system, and I think this could be fantastic for lesson preparation. If you have the lighting, AV and so on set appropriately for a particular task or topic, you can quickly save this as a preset at the end of a lesson and recall it the next time the class comes in, leaving you free to start the lesson without fiddling with the technology.

There were other interesting components built into The ON House including electric privacy glass. On returning to school I trawled YouTube for a while and found a great short video showing this in action.

In the video you can see that when the glass is set to white you can project onto the surface and, if you think it’s useful, add a touch screen too. This could allow you to open out a classroom with few windows, letting in more natural light while the glass also functions as a display space.

Another aspect they discussed was security. In their kitchen demo they showed that their app could prevent doors into other rooms, as well as individual cupboards and drawers, from being opened, depending on the preset.

I have linked a few videos and articles about The ON House below so you can find out more about it. If you are in the Milan area and would like to visit, you can send an email to theonhouse@simontech.it or phone +39 02 40043548.

A slightly late #teacher5aday 2016 pledge

Although I spent most of the Christmas holidays under the weather (a double dose of the Milan flu, I’ve been told!), I really enjoyed following the #teacher5adayslowchat hashtag on Twitter (started and perpetuated by Martyn Reah). I peeked over the parapet to post my thoughts and had resolved to post a #teacher5aday pledge before the end of the week. However – I forgot!

To keep my own wellbeing in sharp focus, as well as the development of my students, I pledge to:

#connect – although I already connect a lot digitally, I want to voice-chat (already getting into @voxer) and video-chat with friends and colleagues more in 2016. I also want to take advantage of short visits home to catch up with as many technophobe friends as possible!

#exercise – take advantage of Milan’s BikeMi service, where you can borrow bikes from areas across the city by swiping your metro card; swim in the lakes as soon as it gets warm enough!

#notice – Explore Milan, find secret places, drink in the artwork and try to slow down in my spare time.

#learn – I was lucky enough to get an electric guitar and iRig for Christmas, so I’m currently using Yousician to brush up on some rusty techniques; my Italian could (and will) be better; I also want to continue to learn from my colleagues – both real and virtual!

#volunteer – As well as planning for TeachMeet Milan at the end of the school year I’d like to volunteer outside of the educational sphere. I’ve heard that Italians are some of the most generous and helpful people in the world. There must be a way to lend a hand…
