Social media generated art in Python #ThisIsMyClassroom #Programming #STEAM

For the third blog post on this topic, I wanted to use Python to generate different pieces of art without relying entirely on the random function. I decided to use the tweepy library, mainly because I had already used it to post content to Twitter but had never investigated how it could be used to read information back.

It didn’t take long to find out how to read the latest 10 tweets from my own timeline using Python. I split the tweets into individual words, stored them in a list and sorted them into alphabetical order (for no real reason at the moment, but frequency analysis will follow!). I then used the write method from the Turtle graphics library to place each word at a random location on the screen. This was my first attempt:

Screen Shot 2016-04-03 at 23.50.08

A bit tricky to read the words, I thought. I’d also accidentally forgotten to penup before moving the turtle. However, this accidental vector spider web became part of the artwork (because when I removed it, the result looked quite boring).

A little while later I was able to vary the font size at random (I settled on Palatino after experimenting with a few other fonts), and changing the pencolor in the same way as in my previous Python art programs varied the text colour too.
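
Pieced together, the approach up to this point looks roughly like the sketch below. This isn’t the exact code from the post (that’s on GitHub); the credentials are placeholders and the coordinate ranges and font sizes are illustrative.

import random
import turtle
import tweepy

# Placeholder credentials: replace with your own Twitter API keys
auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
api = tweepy.API(auth)

# Read the latest 10 tweets from my own timeline and split them into words
words = []
for tweet in api.home_timeline(count=10):
    words.extend(tweet.text.split())
words.sort()  # alphabetical order, for now

wn = turtle.Screen()
t = turtle.Turtle()
t.hideturtle()
t.penup()  # forget this and you get the spider web from the first attempt
for word in words:
    t.goto(random.randrange(-300, 300), random.randrange(-250, 250))
    t.pencolor(random.random(), random.random(), random.random())
    t.write(word, font=("Palatino", random.randrange(10, 40), "normal"))
wn.exitonclick()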

Screen Shot 2016-04-03 at 23.53.35

I had a lot of text to display, even just from 10 tweets, so I looked for ways to reduce the amount. I wrote a little Python subroutine that removed hashtags, mentions and URLs (as well as any other non-ASCII text) and that was enough!
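
The post doesn’t show the subroutine itself, but a minimal sketch of that kind of filter, with the function name and exact rules being my own, might look like this:

def clean(words):
    # Drop hashtags, mentions, URLs and anything containing non-ASCII characters
    kept = []
    for word in words:
        if word.startswith(("#", "@", "http")):
            continue
        if any(ord(ch) > 127 for ch in word):
            continue
        kept.append(word)
    return kept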

The video below shows the program in action. I decided to make a video this time because you can make out the individual words much more clearly at the beginning of the drawing than at the end!

As before, the code is now on GitHub (with my tweepy credentials removed for security). I’ve left in a commented-out section of code that lets you run a search for a keyword, hashtag or phrase instead of reading the latest timeline, so you can experiment.
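
For reference, that commented-out search alternative amounts to something like the line below (in tweepy 3.x the method is api.search; newer versions rename it to search_tweets):

# Search for a keyword, hashtag or phrase instead of reading the home timeline
tweets = api.search(q="#ThisIsMyClassroom", count=10)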

Any comments or improvements would be much appreciated!

Sound generated art in Python #ThisIsMyClassroom #Programming #STEAM

I had a lot of fun experimenting with the subroutines and Python Turtle methods yesterday but wanted to push it a little further and find out if I could make use of a new Python library to help create automated art.

Somehow I’d never built a program that captures and analyses audio before, so I challenged myself to find out more about libraries such as PyAudio and wave this afternoon. My daughter was practising piano in the other room, which gave me a push to integrate live audio into my solution rather than rely on pre-recorded WAV files.
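
The recording step follows the standard PyAudio pattern; a minimal sketch (my parameter values here, not necessarily the ones in the final program) looks like this:

import pyaudio
import wave

CHUNK = 1024
RATE = 44100
SECONDS = 5

# Record 5 seconds of live audio from the default input device
p = pyaudio.PyAudio()
stream = p.open(format=pyaudio.paInt16, channels=1, rate=RATE,
                input=True, frames_per_buffer=CHUNK)
frames = [stream.read(CHUNK) for _ in range(int(RATE / CHUNK * SECONDS))]
stream.stop_stream()
stream.close()
p.terminate()

# Save the recording to a WAV file so it can be analysed later
wf = wave.open("recording.wav", "wb")
wf.setnchannels(1)
wf.setsampwidth(p.get_sample_size(pyaudio.paInt16))
wf.setframerate(RATE)
wf.writeframes(b"".join(frames))
wf.close()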

I also learned a little about numpy this afternoon. I hadn’t realised it had functions to extract the frequencies from an audio block (via the FFT). The more I explore Python, the more I fall in love with it as a language!
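
The post doesn’t show the analysis code, but one common numpy approach to pulling a dominant frequency out of a block of samples is:

import numpy as np

def block_frequency(block, rate):
    # Return the strongest frequency (in Hz) in a block of 16-bit samples
    samples = np.frombuffer(block, dtype=np.int16)
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
    return freqs[np.argmax(spectrum)]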

Once I’d successfully extracted numeric frequencies from the 5-second WAV file into a list, I looped through them and attempted to place shapes on the Python Turtle screen to correspond to each frequency. I decided on a simple X-axis plot to begin with, but when I realised the range between the minimum and maximum frequencies usually exceeded 8000 Hz, I introduced a scale factor so they could all be seen on the screen together and adjusted the Y axis so that each frequency appeared bottom to top in the order of analysis.
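
A rough sketch of that plotting step, assuming a list of frequencies in analysis order (the scale values here are illustrative, chosen so everything fits on a default-sized screen):

import turtle

def plot_frequencies(freqs):
    wn = turtle.Screen()
    t = turtle.Turtle()
    t.hideturtle()
    t.penup()
    x_scale = 600.0 / max(freqs)   # the frequency range often exceeds 8000 Hz
    y_step = 500.0 / len(freqs)    # bottom to top, in the order of analysis
    for i, f in enumerate(freqs):
        t.goto(f * x_scale - 300, i * y_step - 250)
        t.dot(15)
    wn.exitonclick()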

Screen Shot 2016-03-31 at 18.18.40

Quite nice, but there’s a lot of white space where the unused frequency range lies. Instead of removing this range from the visualisation (which, in retrospect, might have been a good idea) I decided to attempt to create ghosts of the circles fading out as they get further from the original position. This led me into colorsys and all sorts of bother, reminding me (eventually) not to mess with anything that returns a tuple until I’ve converted it to a list first. Anyway, I removed that part of the code and put my arty effects on the back burner. You can see one example of the mess below. Ugh.
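
For the record, the tuple-versus-list lesson boils down to something like this sketch of the ghost fade I was attempting (a reconstruction, not the code I removed):

import colorsys

def ghost(rgb, step):
    # colorsys returns a tuple, so convert it to a list before changing values
    hls = list(colorsys.rgb_to_hls(*rgb))
    hls[1] = min(1.0, hls[1] + step)   # push the lightness towards white
    return colorsys.hls_to_rgb(*hls)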

Screen Shot 2016-03-31 at 18.19.00

I decided to alter the colour of the background this time too. I think I’d like to use some audio analysis to decide on the colour range in a future version so that low audio frequencies create darker images and high frequencies create bright, bubblegum pop images.
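
Nothing in the current program does this yet; one possible mapping for that future version might be:

import turtle

def set_background(freqs):
    # Low average frequency gives a dark background, high gives a bright one
    brightness = min(1.0, (sum(freqs) / len(freqs)) / 8000.0)
    turtle.Screen().bgcolor(brightness, brightness, brightness)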

Screen Shot 2016-03-31 at 18.06.42

The last thing I added to the program was the option to use pre-recorded WAV files instead of always recording 5 seconds of audio. This was very easy to add because I’d modularised the code as I went, so all that was needed was a few extra lines in the main program:

Screen Shot 2016-03-31 at 19.08.33
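
Roughly speaking, the pre-recorded option only needs the wave module to hand back raw frames and a sample rate, ready for the same frequency analysis as a live recording (a sketch, with my own function name):

import wave

def load_wav(filename):
    # Read a pre-recorded WAV file into raw frames plus its sample rate
    wf = wave.open(filename, "rb")
    rate = wf.getframerate()
    frames = wf.readframes(wf.getnframes())
    wf.close()
    return frames, rate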

Trying out the program with a few WAV files from www.findsounds.com or playing a YouTube video in the background resulted in the following images:

chimpanzee.wav
uptown funk

Python files can be found on GitHub at https://github.com/familysimpson/PythonArt/. Feel free to fork the code, leave comments below or just enjoy the images it generates!

Computer Generated Art #thisismyclassroom #programming #steam

Screen Shot 2016-03-31 at 02.22.22

I wanted to set a task in which students create a Python program that automatically generates its own artwork, but which is customisable so that each student can experiment and personalise their program to their own tastes.

Screen Shot 2016-03-31 at 02.15.10

It’s a rough Python 3 program using the Turtle library and an array of Turtles, but so far it has produced some really nice work. In the images shown below, the program uses a user-defined function that draws a randomly sized square. I thought this would be easy for the students to understand and hack into something new!

Screen Shot 2016-03-31 at 02.15.24

Of course, art can be created as a response to an external stimulus, so a possible extension of this program would be to get input from the user (colours, mood, age) or to calculate a range of colours from an input sensor or device (temperature, time, image).
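
As a sketch of that extension (a hypothetical helper, not part of the program below), a user-supplied mood could pick the palette:

import random

def mood_palette(mood):
    # Map a mood word to a range of colour values and pick a random colour
    ranges = {"calm": (0.0, 0.5), "happy": (0.5, 1.0)}
    low, high = ranges.get(mood, (0.0, 1.0))
    return tuple(random.uniform(low, high) for _ in range(3))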

Screen Shot 2016-03-31 at 02.15.38

The code is below! Any suggestions or improvements would be appreciated!

import turtle
import random

wn = turtle.Screen()
w = wn.window_width()
h = wn.window_height()

# Six turtles drawing at the same time
t1 = turtle.Turtle()
t2 = turtle.Turtle()
t3 = turtle.Turtle()
t4 = turtle.Turtle()
t5 = turtle.Turtle()
t6 = turtle.Turtle()

turtles = [t1, t2, t3, t4, t5, t6]

def square(item, size):
    # Draw a square of the given size, then wander off in a random direction
    for x in range(4):
        item.forward(size)
        item.right(90)
    item.forward(size)
    item.left(random.randrange(-180, 180))

wn.tracer(False)
for iteration in range(3):
    # Scatter the turtles and give each one a random pen colour
    for item in turtles:
        item.penup()
        item.goto(random.randrange(-w, w), random.randrange(-h, h))
        item.color(random.randrange(0, 255) / 255.,
                   random.randrange(0, 255) / 255.,
                   random.randrange(0, 255) / 255.)
        item.pendown()
    wn.tracer(False)
    # Each turtle draws 2500 randomly sized squares with screen updates
    # switched off, then the finished layer is revealed
    for move in range(2500):
        for item in turtles:
            item.speed(0)
            square(item, random.randrange(5, 25))
    wn.tracer(True)

wn.exitonclick()

Screen Shot 2016-03-31 at 02.51.34

Screen Shot 2016-03-31 at 02.54.57

Hour of Code Around the World (Event) #edchat #ukedchat #aussieED


After some discussion with a friend and former colleague (and some thinking over a few coffees) I was inspired to post a short tweet yesterday:

Screen Shot 2015-11-07 at 10.02.02

My aim is to get a small number of schools involved in a Google Hangout on Friday 11th December, code together and learn a little about how Computer Science is taught in schools in different parts of the world.

Even if time zones prevent a school from joining the Google Hangout, there is still a chance to take part.

Interested in finding out more? Send me a tweet @familysimpson.

Programming on an iPad #compSci #RGCdevicetrial

At the start of the #RGCdevicetrial I was very cynical about the effectiveness of iPads in education. I did not think they were suitable for use in secondary school classrooms. I saw them as content consumption devices, tailored for personal use only, and an expensive gimmick destined to gather dust in a department store cupboard (much like the iPod touch devices bought en masse a few years ago).

I’m happy to state that I was wrong. For me, the iPad is a very strong contender for not only becoming the device of choice at our school but for eventually replacing desktop PCs in the Computing classroom too.

Like many others, I thought it wasn’t possible to program on the iPad. I’d heard about Scratch being removed from the App Store and, whilst working on a successful Internet Safety project at Inverurie Academy in 2011, had fought a battle of wits with Xcode to create and install a series of simple apps on the aforementioned iPod touch devices. I didn’t want to rely on having a spare MacBook sitting around for pupils to code on, in a language that was fairly impenetrable, just to be able to use the iPad in a Computing Science classroom.

However, after speaking to Fraser Speirs at a SCIS event in Edinburgh a few weeks ago, I realised that it was possible. He told me about Pythonista, which allows you to create command-line or graphical programs straight on the iPad. Fraser also told me that he pays for processing time on Amazon servers and gets students to upload code from their iPads and execute it remotely. The extra benefit of this, he says, is that his pupils have access to the same programming environment regardless of their location. It allows them to continue coding at home on a task they may have started in school.

For early-stage programmers, one app that helps build coding foundations through sequential instructions is A.L.E.X. I downloaded it whilst setting up the iPad for the #RGCdevicetrial and accidentally synchronised it with the iPad Mini being used by the ICT specialist in our primary school. She loved the app, so I gave it a go last week while learning more about how an iPad mirrors to a data projector using Apple TV. There were young pupils in the playground outside with their noses against the window as they watched the robot move through the levels.

This morning I spotted a retweet by Dawn Halybone and had to investigate further:

20130313-231307.jpg

Snap! is a web-based drag-and-drop programming language developed at Berkeley. Very similar to Scratch, it lets you create programs by associating scripts with sprites on a stage. It runs through a browser, so you have to be online to use it; however, it looks very stable on the iPad. Even though the most recent Scratch beta is also web-based, it does not work on the iPad because it needs Adobe Flash to play content. I wasn’t even able to access the code screen on the site so, for the moment anyway, Snap seems to be the only option.

Do you know of any other apps or websites that allow programming on the iPad? Please share!