Tuesday 7 February 2012

Storifying Media

People 'ify'-ing stuff.

It's been happening a lot, 'gamifying' being a big thing over the last few years.

This week's sandbox focuses on Storifying...well...stories.

We were telling the story of the Goldsmiths election day through live, multimedia uploads curated through Storify.

By curating media and content from across YouTube, Twitter, Instagram, Facebook, Foursquare and Audioboo, amongst others, we were able to build an ongoing story.

It was great to have the inspirational Christian Payne (@Documentally) show us the ropes and guide us through the best ways to use the tools available.

Take a look at the story below.

Friday 3 February 2012

Tweeting Terry's Armband.

I wanted to explore using Twitter to create real-time analysis and reaction, and to find stories as they break, for my latest article for matchchatter.

Not many stories this year have been as big as Terry losing the England captaincy, which broke around 11am this morning.

The first step was using @mhawksey's Twitter archiver script in Google Spreadsheets, along with his Twitter visualiser, to follow and map conversations across Twitter.

Originally the visualisation was messy, with a huge volume of individual tweets, and with such a large data set it was fairly unresponsive. I planned to work out where to add a loop that keeps only tweets RT'd at least once, or tweets which have replies - to catch real conversations. However, the script seemed to slow down after two runs, not allowing me to download the second scrape (this is where running the scraper independently of Google Docs, and therefore its servers, would have been much better). The RTs, however, were an interesting field in themselves - what are the most popular things people are saying, and who are the most influential? The latter, at least, we could tell from the TAGSExplorer visualisation.
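The filter I had in mind would look something like this - a rough sketch in Python, where each scraped tweet is assumed to be a dict with hypothetical 'retweet_count' and 'in_reply_to' fields (not the archiver script's real column names):

```python
def real_conversations(tweets):
    """Keep only tweets retweeted at least once, or that are replies -
    i.e. tweets that are part of an actual conversation."""
    return [t for t in tweets
            if t.get('retweet_count', 0) >= 1
            or t.get('in_reply_to') is not None]

# made-up sample data for illustration
sample = [
    {'text': 'Terry stripped of the captaincy', 'retweet_count': 12, 'in_reply_to': None},
    {'text': 'having lunch', 'retweet_count': 0, 'in_reply_to': None},
    {'text': '@GaryLineker agreed', 'retweet_count': 0, 'in_reply_to': 'GaryLineker'},
]
print(len(real_conversations(sample)))  # 2 - the middle tweet is dropped
```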

TAGSExplorer visualisation.

Gary Lineker's England captaincy tweets.


You can see where the conversations are spawning from - the following, for example, are conversations with Gary Lineker. (I use the word 'conversation' loosely; really they are tweets directed at him after his original tweets on the subject.) It was interesting to see what people were saying in reply to an influential Twitter user engaged in the conversation.



The first scrape collected 4,110 tweets using the search terms "terry", "captain", "#eng" and "armband". The scrape will therefore have included tweets completely unrelated to the subject. That's why statistical analysis of the words collected is important.



In the first scrape we can see that there are 309 mentions of "Parker", 267 of "Gerrard" and 161 of "Ferdinand" - this was worked out by Wordle's analysis when creating the word clouds. Bear in mind a second analysis is yet to be run for terms such as "@rioferdy5" - Ferdinand's Twitter account - and "Stevie G", etc.
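That second pass would essentially merge aliases before counting. A minimal sketch of the idea in Python (the alias table and tweets below are hypothetical examples, not the real scrape data or Wordle's method):

```python
from collections import Counter

# hypothetical alias table - handles and nicknames map to one player
ALIASES = {
    'parker': 'Parker',
    'gerrard': 'Gerrard', 'stevie': 'Gerrard',
    'ferdinand': 'Ferdinand', '@rioferdy5': 'Ferdinand',
}

def player_mentions(tweets):
    """Count player mentions, folding aliases into a single name."""
    counts = Counter()
    for tweet in tweets:
        for word in tweet.lower().split():
            word = word.strip('.,!?')  # drop trailing punctuation
            if word in ALIASES:
                counts[ALIASES[word]] += 1
    return counts

tweets = ['Parker for captain', '@rioferdy5 should get it back',
          'Gerrard or Parker?']
print(player_mentions(tweets).most_common(1))  # [('Parker', 2)]
```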



We can see Parker also features heaviest in Google Docs' word cloud feature - a second, independent algorithm also puts Parker as the most commonly mentioned player in relation to the conversation.

The word cloud was a good way to see what the broad conversation was saying most frequently, so I used Wordle to create the John Terry word cloud which is the featured image in the article.


----------------------

Tuesday 31 January 2012

Pipes

In our sandbox this week we had Tony Hirst from OUseful.info come in and deliver a great session on mapping data using Google Docs and Yahoo Pipes, as well as using Gephi to map social networks. This is how we created a map of the UK's largest towns and cities by population.

We created a location-based spreadsheet by scraping a Wikipedia table, then published it as a CSV using Google Docs.

Piped the CSV file through Yahoo Pipes, using the Regex and Location Builder modules:
http://pipes.yahoo.com/leonpuplett/uktownpop

Run the pipe - this creates a Yahoo map.
Copy the link from "Get as KML" and paste it into maps.google.com.
The map can then be embedded:


[Embedded Google Map - View Larger Map]
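Yahoo Pipes handles the geocoding and KML generation for you, but the final step can be sketched in plain Python: read the scraped CSV and emit a minimal KML placemark per town. The CSV below is a tiny stand-in for the published Google Docs sheet, and the lat/lon columns are placeholders for what the Location Builder module produces:

```python
import csv
import io

# stand-in for the published CSV; lat/lon are illustrative values
CSV_DATA = """town,population,lat,lon
London,8173941,51.5074,-0.1278
Birmingham,1085810,52.4862,-1.8904
"""

def csv_to_kml(csv_text):
    """Turn rows of (town, population, lat, lon) into minimal KML."""
    placemarks = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        placemarks.append(
            '<Placemark><name>{town} ({population})</name>'
            '<Point><coordinates>{lon},{lat}</coordinates></Point>'
            '</Placemark>'.format(**row))
    return '<kml><Document>' + ''.join(placemarks) + '</Document></kml>'

kml = csv_to_kml(CSV_DATA)
print('London (8173941)' in kml)  # True
```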

Friday 27 January 2012

Olympic Stadium Timeline

Olympic Stadium Timeline test for ELO.
I've been working on an interactive timeline for East London Olympics about the history of the Olympic Stadium so far.
I created two versions: the first, made with Dipity, is ad-supported ($99 p/m to get rid of them!); the second, made with Timetoast, is ad-free and loads quicker, but isn't as visually interesting.

Tuesday 11 October 2011

An Original Idea

Coming up with an original idea. Some say it is impossible.


My latest project work says it's necessary.


I am working on two different projects.


My Digital News Writing class has me researching the Olympic borough of Waltham Forest, in search of new news relating to the noisy sports event moving into its back garden.
I expect I'll be hounding local shop owners and kids playing football, asking what they think and whether they've heard anything newsworthy for our upcoming website, which will sit alongside the existing East London Lines site run by the MA journalists.


The Digital Sandbox class project has troubled me.


I'd say I am a decent 'ideas man'. Granted, these ideas are normally hare-brained, inherently stupid and contextually redundant, but I can come up with a few (anyone remember my lunchtime stuffing-muffins?!).


We have been set the challenge of coming up with a [news] theme, then using Google Keywords to find our niche and home in on an idea for our ongoing digital escapades over the next year.


Where do I start?


There must be a website for literally everything.


This is the whole point of a 'googlewhack'.


I was stumped.






Not even a nibble of an idea-lett.


"I forgot better shit than you ever thought up", I can't help but envy Kanye West's mind sometimes (!).


Maybe I was going at this the wrong way. Instead of looking for an original idea, I should instead be looking for a new way of delivering.


We had a really interesting guest lecture from Neil McIntosh, Deputy Editor of The Wall Street Journal Europe, last Monday, who spoke about invention and innovation. The WSJ's new 'Live' app was particularly impressive, as was its popularity with readers.



By Friday the world had lost one of the greatest ever innovators.


I think Stephen Fry perfectly describes how he was so good at what he did: 'Steve Jobs didn't invent computers and he didn't invent packet switching or the mouse. But he saw that there were no limits to the power that creative combinations of technology and design could accomplish'.


He spoke about the Knight-Batten Award and what it looks for in a journalist/medium. The words included:
'engagement', 'sharing', 'interactions', 'conversations', 'meet information needs', 'actively involve'.


I looked at what I was interested in and mind-mapped from there.


Although I'd love to cover the latest in tech - I'd be fascinated by, and could geek out over, NASA/ISS/ESA press releases - I'm not directly involved and would just be reproducing their content with my comments pasted over the top.


Then, in the lecture preceding Ed Roussel's talk on innovation in the newsroom, The Swiss Ramble's blog was mentioned. This economist by day, football analyst by night tore apart Arsenal's financial situation and looked at football from an economist's point of view (it's long, but take a deep look when you get a moment - it's engrossing for football and economics fans alike!).


From there I have been playing with the idea of creating a case-study-based website, looking either at the up-and-coming young players based at the London clubs, or at the U18 set-up as a whole, using the London clubs as examples.






The majority of the work will hopefully be going up onto Kicked in the Alberts, which is my pet project. I'm looking for football contributors to pen articles for it, so if you're interested, get in touch.


One thing I've learnt from both McIntosh and Roussel is that media organisations don't get much from, or see much value in, their comments and message board sections. Roussel admitted much is 'outsourced' via integration with Twitter and Facebook. I find it quite amazing that only a very small percentage of readers actively invest time in commenting, but we only need look to YouTube to see why media companies may not take much inspiration for future debates, content and delivery from these posts.


Sports Photography

Tuesday 27 September 2011

Creating my own Markup Language

Inspired by my lifelong addiction to Football Manager, I have been thinking about creating a markup language purpose-built for football. It could be used with a real-time, location-based infographic app or data viewer. This came out of a few hours' work and is very much an exercise in understanding how markup works and what its possibilities are. Given more time, this example could become a lot more complex.


The <matchinfo> block is fairly self-explanatory. The <id> section refers to the associated info that could be assigned to that specific stadium - i.e. travel info, average attendance, people currently checked in there (read: a Facebook/Foursquare tie-in).

<match2>
  <matchinfo>
    <date>2011-10-08</date>
    <stadium>Reebok
      <id>16</id>
    </stadium>
    <city>Bolton</city>
    <kick-off>15:00</kick-off>
    <time>15:23</time>
  </matchinfo>
</match2>



This section describes a specific event.
The fictional event describes Rooney equalising for Man Utd at the Reebok, with Cahill at fault for the goal.

The <match2> tag refers to which game it is in the league fixtures for that day.

Now we get onto the information of the specific event.
We can see the event which occurred was a shot (no. 4 of the game), it was on target and resulted in a goal (the second of the game). The <placement>301</placement> refers to a grid reference over the goal itself.
Imagine the goal split into 3 horizontal sections, each split into 4 boxes.

  <Event>
    <action>Shot
      <shotno>04</shotno>
    </action>
    <type>On Target</type>
    <outcome>Goal
      <goalno>2</goalno>
    </outcome>
    <placement>301</placement>
  </Event>
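A consumer of the markup would need to decode that grid reference. Here is a minimal sketch in Python, assuming (my reading of the scheme) that the first digit is the horizontal section (1-3) and the remaining digits the box within it (01-04):

```python
def decode_placement(code):
    """Decode a <placement> code like '301' into (section, box)."""
    section, box = int(code[0]), int(code[1:])
    if not (1 <= section <= 3 and 1 <= box <= 4):
        raise ValueError('outside the 3x4 goal grid: ' + code)
    return section, box

print(decode_placement('301'))  # (3, 1)
```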


The idea of context is to give a more human approach to the data and event.
A rating between 1 and 10 records how far against the run of play the goal is - this one has been given an 8, with 10 being completely against the run of play and 1 being very, very likely. The timing refers to how this goal affects the scoreline - this being an equaliser.
The idea of blame is perhaps an unnecessary one, but it gives an indicator as to who was at fault. In this respect Cahill can be seen as being at fault. This could always be set to none if the event was a moment of brilliance rather than poor defending or goalkeeping.

  <context>
    <run-of-play>Against</run-of-play>
    <rating>8</rating>
    <controversial>no</controversial>
    <timing>equaliser</timing>
    <goals>1:1</goals>
    <blame>Cahill</blame>
  </context>
The ball data would be the most important when thinking of a live feed graphic. The <assistfrom> and <endpoint> tags directly refer to the start and end points of the pass that led to the goal. Again, like the goal, the pitch would be split into a grid reference system - in this grid, the pitch runs from a1 to e5, with c straddling the halfway line and a or e being either goal area or corner. There are also a few extra data entries connected to the pressure and assistfrom fields. The extra data assigned to assistfrom is the play, which relates to either open play or a dead-ball situation. A dead-ball situation could then be further extended to a corner, free-kick, goal kick, etc.

  <ball>
    <assistfrom>d2
      <play>open</play>
    </assistfrom>
    <pressure>none
      <rating>2</rating>
    </pressure>
    <endpoint>b3
      <height>feet</height>
    </endpoint>
  </ball>

The next section describes the players involved: their locations, affiliated IDs, image database numbers, etc.
From these IDs you could incorporate player profiles, stats from the match and the rest of the season, and so on.

<team>Man Utd
  <player>
    <position>attacker</position>
    <name>Rooney</name>
    <playerid>0028910</playerid>
    <location>b3</location>
    <colour>Red</colour>
    <number>10</number>
    <img>289</img>
  </player>

  <player>
    <position>midfielder</position>
    <name>Young</name>
    <playerid>0031518</playerid>
    <location>d2</location>
    <colour>Red</colour>
    <number>18</number>
    <img>315</img>
  </player>
</team>

    
<team>Bolton
  <player>
    <position>defender</position>
    <name>Cahill</name>
    <playerid>0003206</playerid>
    <location>a4</location>
    <colour>White and Blue</colour>
    <number>06</number>
    <img>32</img>
  </player>

  <player>
    <position>goalkeeper</position>
    <name>Jaaskelainen</name>
    <playerid>0011222</playerid>
    <location>x2</location>
    <colour>Sky Blue</colour>
    <number>22</number>
    <img>112</img>
  </player>
</team>

There is a lot which could be edited and streamlined into a more cohesive script and language. I could make use of putting each team into a separate <div>-like block:

<team id="Bolton">

  <player id="Cahill">
    <position>defender</position>
    ...

This would make the script easier to understand and manipulate, by making separate sections for each team and then each player (a div within a div).
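One advantage of the attribute-based layout is that standard XML tools can walk it directly. A sketch with Python's xml.etree - the snippet below is hypothetical sample data in that restructured form, not the full markup above:

```python
import xml.etree.ElementTree as ET

# hypothetical sample in the restructured, attribute-based form
MARKUP = """
<match>
  <team id="Man Utd">
    <player id="Rooney"><position>attacker</position><location>b3</location></player>
  </team>
  <team id="Bolton">
    <player id="Cahill"><position>defender</position><location>a4</location></player>
  </team>
</match>
"""

root = ET.fromstring(MARKUP)
for team in root.findall('team'):
    for player in team.findall('player'):
        # e.g. "Man Utd Rooney attacker b3"
        print(team.get('id'), player.get('id'),
              player.findtext('position'), player.findtext('location'))
```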

Monday 19 September 2011

Creating your own server


Today we've looked at creating our own servers.


We began by pinging the servers of some household names (Google, Facebook, Twitter - the slowest, who'da thunk?!), then followed the route of a packet between here and the US servers, and then had a look at the new IPv6 addresses, including Facebook's '2620:0:1cfe:face:b00c::3'.


Using Python once again, we created a pair of scripts akin to Tim Henman's famous serve, *run to the net*, and volley approach.


By running our two scripts, each in its own terminal window, we could try to connect to a newly created socket, printing on the server side the sent and received data, and on the client-facing side a personalised message (in red here).


Here is "Tiger Tim's" serve[r].


Socket script


import socket

# create a TCP socket listening on localhost, port 4900
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.bind(('127.0.0.1', 4900))
s.listen(1)

client_socket, address = s.accept()
print 'received connection from ', address

while True:
    data = client_socket.recv(4096)
    if not data:
        client_socket.close()
        break

    print 'received data', data

    # echo the received data straight back to the client
    bytes_sent = client_socket.send(data)
    print 'sent:', bytes_sent

s.close()  # note the brackets - a bare s.close does nothing


And here is his volley (in a very abstract and perhaps incorrect analogy).


Connection script



import socket

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.settimeout(5)                    # give up if the server takes > 5s
s.connect(('127.0.0.1', 4900))
s.send('Bugger off!')
print s.recv(4096)                 # the server echoes it back
s.send('Oh, okay. Come on in...')
print s.recv(4096)
s.close()
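If you only have one terminal to hand, the same serve-and-volley can be played out in a single script by putting an echo server in a background thread. A rough sketch in Python 3 syntax (so the strings become bytes) - not part of the original exercise, and using port 0 so the OS picks any free port rather than the hard-coded 4900:

```python
import socket
import threading

# bind and listen in the main thread first, so the client can't try
# to connect before the server socket exists
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(('127.0.0.1', 0))  # port 0 = any free port
srv.listen(1)
port = srv.getsockname()[1]

def echo_one_client():
    conn, _ = srv.accept()
    while True:
        data = conn.recv(4096)
        if not data:
            break
        conn.sendall(data)  # volley the data straight back
    conn.close()

t = threading.Thread(target=echo_one_client)
t.start()

c = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
c.connect(('127.0.0.1', port))
c.sendall(b'Bugger off!')
reply = c.recv(4096)
print(reply)  # b'Bugger off!'
c.close()
t.join()
srv.close()
```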

In a typically British semi-final kind of way, my first attempt crashed out, causing an outcry of (albeit expected) anguish and demoralising anti-nationalism.


Why, I hear you cry?


Because I spelt 'True' with a non-capitalised 't'.


Once again, Python had spanked me with its racket of discipline and grammar.


*spank* An artist's impression of Python giving me a once-over, using my Henman metaphor.


I will learn soon enough to avoid the wrath of Python. And people shall flock to a hill named after my coding...

...only to leave with a bitter resentment of losing days to rained-off exasperation, filled with the false hopes of being part of a moment in history.

At least there will be strawberries and cream.

Cheering on Java vs Snake.

I also managed to sort out my public folder on the University server.

It can be accessed here.