Technology isn’t about to kill map-reading

The Royal Institute of Navigation have been in the news today with a press release titled “Society ‘Sedated By Software’, Needs Nav Skills Taught At School”.

It’s about the demise of traditional map-reading and navigation skills. They say, with the growth of digital mapping and GPS-enabled devices:

“generations are now growing up utterly dependent on signals and software to find their way around.”

Now, I’m a bit disappointed they’ve gone with the hackneyed technology-as-sedative line and the inevitable claim that the solution lies in schools, but I do have a lot of sympathy with their main point.

I love maps. No, wait. I bloody love maps! They’re how we represent place to ourselves and, more importantly, how we relate to that place. And they’re beautiful and functional at the same time.

So, being able to read a map, place yourself in relation to it or even carry around a reliable mental map are, I think, an essential part of learning about the world. I hope I can help my kids become good map readers.

But I also think this isn’t the maps-good/technology-bad debate that some people might see it as. Technology has a crucial role to play in engagement with mapping and the interpretation of place.

The UK’s Ordnance Survey maps are the gold standard for mapping. Although tools like Google Earth and Street View don’t come close, they still offer a massive amount in the way of data layering, scaling, interactivity and social media. It can only be a good thing that children (and grown-ups!) spend ages flying through Google Earth to find their house as well as exploring far-off destinations, enjoying the experience of maps.

And we shouldn’t see GPS-enabled mobile devices as necessarily meaning the death of navigation skills either. I use apps like Strava and Google Maps routinely on my phone when I’m cycling round Northumberland. Combining the use of the apps with the actual experience of being in the landscape has really helped me develop my geographical understanding of a part of the country I love.

If our aim is better spatial awareness and understanding of place then we need to combine the best aspects of traditional mapping, digital technology and physically being out in the field.

Image from Pixabay – Public Domain

Video editing on the go with Splice #BlappSnapp

I’m writing this as a contribution to BlappSnapp, a series of posts on mobile apps in the classroom for Julian Wood (@ideas_factory). You can see the previous post in the chain here.

If you’ve done any work with digital video you’ll know that the results learners achieve can be amazing. You’ll also know what a pain in the backside managing a video editing project can be! It’s a potential mess of incompatible devices, cables that don’t fit, unintuitive software that won’t recognise the file types you’ve recorded, and so on ad infinitum.

Which is why I think Splice is such a fantastic app for getting students straight into producing their own digital video without many of the usual barriers. Mobile video editing software has come on in leaps and bounds in the last 2 years and the gold standard is probably the iOS version of iMovie. iMovie comes with a (small) price tag but Splice is a free alternative you should investigate.

Screenshot of Splice

Splice allows you to take videos and photos you’ve shot on an iPad or iPod Touch, edit them together, add basic effects, record narration and overlay music. The results can be published as a single MP4 that goes back into the device’s camera roll.

Because of its stripped-down interface, experienced video editors might find it lacking in advanced features. For the rest of us, it means the app is simple and fairly intuitive to use. It would certainly pose few problems to learners at KS2 and above. My then five-year-old was getting the hang of it for editing together the stop-motion Lego animations he’d created.

Screenshot of Splice

Potential uses

  • Digital storytelling – mobile devices are ideal for capturing personal reflection through video or audio. With Splice these can be edited together into rich digital narratives.
  • Fieldwork and placements – I’ve been talking to a lot of people in HE about the use of apps like Splice as part of fieldwork. The key benefit is that images and video can be captured and edited in the field without the complication of having to download the footage onto a desktop. You don’t even need to be near a wi-fi signal.
  • Capturing labwork.
  • Working with stop motion animation – there are a few really good apps for capturing stop motion and time-lapse sequences on iOS devices. Splice is a great tool for turning these into something more coherent.
  • Previsualisation – I used to do a lot of work with GCSE media students on their production project work. Splice would have been a fantastic tool to help in the planning stages as a sort of video storyboard before the expensive cameras came out.
  • Creating learning materials – why should students get all the fun? If you’re into flipping your classroom, why not think about using Splice as a way of creating videos that can sit on the learning platform and prepare your students for classroom activities?

Restrictions

OK, Splice isn’t perfect. There are a few considerations to bear in mind:

  • Its simple interface means it lacks the advanced features you’d find in desktop software.
  • It imposes a particular workflow which might differ from the order of operations you’re used to in other software.
  • Mobile devices aren’t the best for getting great footage – you’d still need to work with your classes on what makes a good-quality image, the importance of framing and using a tripod.
  • The sound capture on mobile devices can still be problematic, especially outdoors, although there are peripheral mics that can help with this.
  • It’s only available on iOS. Android and Windows devices aren’t well served for reliable video editing apps. Andromedia was the best that I could find in the Play Store and I think Samsung Galaxy tablets come with their own serviceable video software.

To put the quibbles into perspective, a friend of mine once compared picking holes in apps like Splice to criticising a talking dog for its accent. You’re shooting, editing and producing a movie on what is basically a glorified telephone. Would you have imagined that six years ago?

My advice would be to just try it out and learn its features and quirks yourself. Having said that, there’s a lot to be said for giving learners the tools and seeing where their creativity takes them. In my experience they quickly learn to deal with the constraints and produce some surprising results.

Footloose Digital Storytelling at the EFL Showcase 2012 (#efl2012)

Enhancing Fieldwork Learning 2012 Showcase

I had obviously behaved well enough at last year’s showcase in Wales to be invited back for another event from the HEA-funded Enhancing Fieldwork Learning team. This year’s was at the rather lovely Preston Montford FSC just outside Shrewsbury.

I had mixed feelings about going. Last year’s event had been brilliant but last time I was in Shrewsbury there was an earthquake. Thankfully, this year was fascinating educationally and boring seismically.

The event brings together geographers, geologists, biologists, environmental scientists and the like to share their experiences of enhancing fieldwork through the use of technology.

Now, I don’t do fieldwork (although I do like to get out and about) so my session was mostly about seeding ideas and experimenting. 

This year I chose to focus on “Footloose Digital Storytelling”. I had a morning session to talk about what digital storytelling was and then embarked on a rash plan to get the entire group to film, edit and publish their own digital story using iPads and iPhones (one person used an iPod Touch).

We used the free version of Splice which, although it doesn’t have the most features of the mobile editing apps, is one of the simplest and suited our purposes really well.

It’s not without its bugs and quirks, but in the end the group had created 16 movies and, given that they’d only really had an hour to make them, I was pretty bowled over.

For the record, I’d never suggest squeezing an actual storytelling session into one hour. It needs time to do it right. This was a bit hit-and-run and the attendees did really well to cope. One said it had simultaneously been a good experience and hell on earth, which sounds about right.

Originally I’d thought of setting them the task of creating a movie with a specified title but in the end I thought I’d surrender that side of things to them and just asked them to tell their own story of the event. Let many flowers bloom. You can see most of them here, but here are a couple showing the varied approaches.

 

Mobile devices are not ideal tools for this sort of thing, but it’s still pretty amazing that you can shoot, edit and share a movie using your phone or tablet. YOUR PHONE!

FOR FREE!!

Pip Hardy likened this to criticising a talking dog. Do you quibble over its accent and vocabulary?

As part of the experiment we discovered that Splice works quite happily without any wifi or 3G connectivity. A couple of the attendees are now considering getting their students to do digital storytelling whilst on field work abroad – I’m eagerly awaiting the results of those.

Coming shortly – my reflections on the rest of the event…

Immersive video on the iP*d – more than just interesting?

Just saw this on the Nieman Journalism Lab blog…

A collaboration between Condition One and The Guardian, it’s a slightly more interactive approach to video where you have an element of control over where the camera is looking.

Is it more than just interesting? It could make for some intriguing digital stories where the viewer more actively selects which elements of the story to view.

Is it a flavour of AR where the “reality” can be displayed independent of location (picture it with points of interest embedded into the video)? 

Could this be what Google Streetview looks like in a few years’ time? 

Event report: Enhancing Fieldwork Learning

The Digested Digest

Get out more!

The Digest

Fieldwork is an essential part of the learning experience for the geo, geosocial and life sciences that needs protecting given current squeezes on finances. We saw how mobile learning is enabling a blurring of the boundaries between field, lab, library etc. People are more interested in QR codes than I expected. We got wet.

The Detail

We had 2 days based at the Margam Discovery Centre, near Port Talbot, where a wide range of people presented what they had been doing in relation to technology-enhanced fieldwork. It was an entirely appropriate venue with fantastic facilities and a feel that you were in the field even when you were in the building. It was run by a project team funded by the Higher Education Academy.

I was there mainly out of my interest in digital mapping and geolocative stuff, and also because I’m a geography graduate so it feels like home turf. I was also giving a short presentation on how QR codes can be used to enhance fieldwork by creating easy access to extra layers of information.

I Audioboo’d my reactions to the main activities here…

Enhancing Fieldwork day 1 (mp3)

…and made a short video of some of the stuff we did here…

Enhancing Fieldwork Learning from Chris Thomson on Vimeo.

Incidentally, both Audioboo and the Vimeo app are great ways of easily creating digital media content in the field. The video was filmed and edited on my iPhone without any need for connectivity.

This was the session I delivered (without the running around in the fresh air bit). 

For the practical activity I placed 10 laminated sheets, each with 2 QR codes, around the site. One QR code linked to a Google Map of a significant UK location where the participants had to guess the connection. The other linked to one of the Top Tips from slides 11 and 12. I figured it was more fun to make them hunt for the advice than just spoon-feed it to them!
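If you fancy making a similar hunt, the prep is mostly just turning coordinates into Google Maps links and feeding them to any QR generator. Here’s a minimal Python sketch of that step – the place names and coordinates are hypothetical examples, not the locations I actually used:

```python
# Sketch: building Google Maps links for a set of hunt locations,
# ready to feed into any QR code generator.
# The locations below are illustrative examples only.

def maps_url(lat, lng):
    """Return a Google Maps link centred on the given coordinates."""
    return f"https://maps.google.com/?q={lat},{lng}"

locations = {
    "Location 1": (51.4816, -3.1791),   # hypothetical example
    "Location 2": (54.9783, -1.6178),   # hypothetical example
}

for name, (lat, lng) in locations.items():
    url = maps_url(lat, lng)
    print(f"{name}: {url}")
    # Each URL would then go into a QR generator, e.g. the
    # third-party qrcode library: qrcode.make(url).save(...)
```

From there it’s just a case of printing and laminating the codes.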

I was a bit worried I’d picked a topic that people would already be familiar with but in the end it was new ground for a good number of attendees and a few people were fomenting their own fieldwork plans involving QR codes by the end of the weekend.

Main learning points from the weekend:

  • Everyone there was a passionate advocate for fieldwork as an effective learning experience and there was plenty of discussion about the uncertainty facing departments with squeezed budgets. How assured is the future of fieldwork in institutions? 
  • There was a determination that technology should be there to enhance the field experience and make it accessible to all students. It shouldn’t be there to replace it.
  • Fieldwork should be fully embedded in curriculum planning with a clear sense of progression of skills development from one year to the next, gradually building students up into independent field researchers.
  • Technology allows us to do the current things better (thinking about the collaborative spreadsheets mentioned in the Audioboo) as well as creating new opportunities.
  • Access to devices is complicated and relying on students to use their own isn’t the easy answer. What if they object to effectively subsidising someone else’s fieldwork because they have a device and the other doesn’t? Poorer students who can’t afford a device are at a disadvantage.
  • Best tech experiences
    • Seeing iPads used in conjunction with Twitter and Flipboard for students to co-create field guides for New Zealand prior to visiting. Carina Fearnley (Aberystwyth).
    • Using iPads to layer GIS, satellite and map imagery while in the field, then using GPS to help students contextualise that information with their current location. Apparently it’s difficult for many students to do this. Peter Bunting (Aberystwyth).
    • Wifi. In a field! Trevor Collins and John Lea demonstrated their portable wifi network that can extend over kilometres thanks to a series of relays.
    • Gigapan, which you can see demonstrated by Ian Stimpson (Keele) in the last few clips of the video, takes multiple zoomed-in, high-resolution images of a location that can be stitched together to create interactive panoramas with gigapixel levels of detail. Here’s an example. Great for pre-field-trip familiarisation, giving students with mobility issues a chance to see inaccessible locations, or simply having an “if wet” option.
    • Students in Singapore conducting surveys of a mangrove swamp, entering their data onto a shared spreadsheet on a tablet over wifi, enabling the group to analyse their data and discuss it all within the context of the site (Julian Cremona, Field Studies Council)

Reflections

Thinking back to the fieldwork I undertook as a student, those trips were transformative experiences, not just because of what I learnt but also because they were important milestones on the way to becoming a geographer; they helped dismantle the boundaries between staff and students. Fieldwork was an essential part of my student experience and I think it would be a shame if it was squeezed out for some courses.

Also, fieldwork is a multi-sensory experience. I’m keen to discover more about using things like AR, QR codes and mobile digital media, but if it means less being part of the landscape and more staring into an LCD screen we’ll have lost something.

Lastly, I’m sorely tempted to investigate whether I could make the Netskills geolocation workshop a residential event and run it at a field centre like Margam Park. It would certainly be more challenging but a much more fulfilling experience to mix computer lab activity with field work [ponders].

 

Making location-based activities with ARIS

Aris

I’d stumbled upon ARIS about a year ago but never had a chance to try it out until recently. Given that I’m now doing workshops on geolocation stuff and I’m presenting at an Enhancing Fieldwork showcase, it seemed the right moment to try. 

ARIS is a free tool for creating location based “games” that can be accessed and played via an iPhone app. It’s similar in many ways to HP’s Mediascapes that I did some stuff with back in Sheffield a few years ago. It’s been designed by a team based at the University of Wisconsin-Madison.

Using a Flash-based editor on your desktop, you add “objects” to a Google Map and then set the behaviours for each of them. Objects can be:

  • Items – virtual objects that can be picked up and stored in the iPhone app’s inventory, or carried and dropped by players
  • Plaques – information and media (audio and video) pop-ups, perhaps telling you about a location or giving you new instructions
  • Characters – essentially virtual people that you can program with branching conversations for a player to interact with.

Each of these objects can include “requirements” that govern how it behaves. For example, a character won’t appear on the map for a player to talk to until they have picked up a certain item.

Objects and the tasks and behaviours surrounding them can then be grouped into “quests”. So, to complete a quest a player has to find a certain number of items, talk to various characters or visit certain locations; there might then be a follow-up quest to complete, and so on.
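If you think in code, the requirements idea boils down to something like the following sketch. This is just my illustrative model of the logic – ARIS’s editor does all of this visually and its actual implementation will differ; the object names are made up:

```python
# Illustrative model of ARIS-style "requirements": an object only
# becomes visible once the player's inventory satisfies them.
# Not ARIS's actual implementation - just the logic as I understand it.

class GameObject:
    def __init__(self, name, requires=None):
        self.name = name
        self.requires = set(requires or [])   # items the player must hold

    def visible_to(self, inventory):
        """The object appears only when all required items are held."""
        return self.requires <= set(inventory)

# A character that stays hidden until the player picks up the map
guide = GameObject("Guide character", requires=["old map"])

print(guide.visible_to([]))            # nothing collected yet
print(guide.visible_to(["old map"]))   # requirement met
```

Quests would then just be collections of these objects with a completion condition over them.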

When the player is out in the field using the GPS-enabled app, they access the game on the server (you need internet connectivity) and away they go. As they get within a defined proximity of an object, the phone will vibrate and give off a (pretty loud) tone. They then view the plaque, pick up the item or interact with the character. The player can see a map of their location, and the game can be played blind or the designer can choose to reveal the location of some or all objects.
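The proximity trigger itself is conceptually simple: fire when the player’s GPS fix comes within a radius of an object. Here’s a rough sketch using the standard haversine formula – the radius and function names are my own, and ARIS will no doubt do this differently under the hood:

```python
# Sketch of a proximity trigger: fire when the player's GPS position
# comes within a set radius of an object. Illustrative only.
import math

def distance_m(lat1, lng1, lat2, lng2):
    """Great-circle distance in metres (haversine formula)."""
    r = 6371000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lng2 - lng1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def in_range(player, obj, radius_m=35):
    """True when the player is close enough to trigger the object."""
    return distance_m(*player, *obj) <= radius_m
```

A generous radius matters in practice because consumer GPS is only accurate to tens of metres, especially in built-up areas.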

Another neat touch is that the app allows the player to take pictures or record audio within the game. In the mock-up I did I had a “Will Allen” avatar pop up while I was outside Haymarket Metro instructing me to take a picture of a particular statue. Once I’d done that, the avatar popped up again to confirm I’d done what he asked. Neat.

You can find much more detailed info on their site and get started here.

Some random initial thoughts:

  • There are obvious applications for enhancing fieldwork, either from the point of view of giving students added information about the locations they are visiting, or going to the other extreme of highly interactive storytelling activities like the Dow Day “participative documentary” example on the ARIS website.
  • The ability to easily create dynamic objects is a step forward from the clunky (but pioneering) Mediascape.
  • I designed my game to have a mix of objects that were viewable on my iPhone’s map all the time along with others that were hidden or appeared after completing a task. Having a completely invisible game is likely to lead to confusion. At least consider having a starting location marked.
  • It’s just about simple enough for more able school students (and I guess most FE and HE students) to be able to create their own content. You don’t need any web development skills.
  • I think I prefer this sort of approach to other AR apps like Layar or Junaio. They’re difficult to create stuff for and cost money to publish.
  • The iPhone is a much more reliable and enjoyable device for this than those nasty PDA things we used to use for Mediascape!
  • Having said that, the GPS isn’t pinpoint accurate, so I had to include a fairly large margin of error (30–40m) in the placing of the objects. Having the game in a built-up environment also created problems for GPS accuracy. Not insurmountable, but I think these games are likely to work better where they range over a wide area.
  • Having the iPhone as the only device that can run it is a real barrier. They suggest a possible solution for mifi and GPS-enabling iPod Touches but I’ve no idea how well these things work. An Android option would also be good but a web app would be even better.
  • I encountered a few server errors and app crashes when testing mine. I was still able to complete my “quest” but it was enough to make me feel jumpy about running this sort of activity with large numbers of students.
  • Planning and testing are really important to ensure functionality but also comprehensibility. Do players understand what they have to achieve and how to go about doing it? It could easily get frustrating for players without clear directions.

For all its problems, it’s still really good. There’s lots more stuff I’d like to try with ARIS but for reasons of time I’ll have to leave it there for now.

Leave a comment if you’ve had any experience with it.