Tag Archives: AR

MERGE Cube – bringing the virtual into the palm of the hand⤴

from @ ICT for Learning & Teaching in Falkirk Schools

The MERGE cube lets a teacher or learner handle a 3D object as if it’s right there in their hand! Paired with a 3D viewer app (for Apple or Android devices), the cube lets you turn the object round and over and interact with it, all while it’s viewed through a tablet device or projected onto a classroom screen. It might be an inanimate historical object which, viewed in 3D, you can turn round, look underneath and zoom in on to examine details, or a simulation or game which lets you interact with the scene, changing what happens as you make choices.

You can use the MERGE cube with a wide range of resources created by others (found on the MERGE Miniverse or shared in CoSpaces), or a teacher or their learners can create their own virtual 3D objects or environments using Paint 3D or TinkerCAD. Once you’ve created a 3D object or scene in TinkerCAD, either use the save-as option to save it as an .stl file and upload that to your Miniverse account, from where you can share the code to view it in the MERGE Object Viewer app, or open your Tinkercad design, click “Send To” and choose “Object Viewer for MERGE Cube”. You can even use the Qlone app to scan a real object and convert it into a virtual one. Creations stored online in CoSpaces can then be accessed simply by entering the shared code and downloading the creation into the app.
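If you’d like to sanity-check an exported file before uploading it to Miniverse, here is a minimal sketch (Python, using the numpy-stl package) that loads the STL and prints its triangle count and overall size. The file name model.stl is just a placeholder for whatever you exported from TinkerCAD; this is an optional extra, not part of the official workflow.

```python
# Quick check of a TinkerCAD STL export before uploading it to Miniverse.
# Assumes the numpy-stl package is installed (pip install numpy-stl); the
# file name "model.stl" is illustrative only.
from stl import mesh

def describe_stl(path):
    """Load an STL file and print a few basic facts about the model."""
    model = mesh.Mesh.from_file(path)          # raises an error if the file is not readable STL
    minx, maxx = model.x.min(), model.x.max()  # bounding box along each axis
    miny, maxy = model.y.min(), model.y.max()
    minz, maxz = model.z.min(), model.z.max()
    print(f"Triangles: {len(model.vectors)}")
    print(f"Size: {maxx - minx:.1f} x {maxy - miny:.1f} x {maxz - minz:.1f}")

if __name__ == "__main__":
    describe_stl("model.stl")
```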

So how do I get started?

  1. First you need a MERGE cube. Once a teacher has registered a free MERGE account, verified the email address and entered one activation code (included with the MERGE cube), you will be able to log in on multiple devices with that email address, without needing additional activation codes. You can also make your own additional MERGE cubes from paper or card by downloading a template: click here for a printable PDF of each of the faces of a MERGE cube by Jaime Donally, which you can print and stick onto a cardboard cube (handy for trying out a MERGE cube before purchase), or click for a printable net of the MERGE Cube faces by Clint Carlson, which you can print, cut out and fold into a cube. You can also enlarge these templates to any size of cube (see the short scaling sketch after this list) – click on this video to view Gabe Haydu showing how to make an enormous MERGE cube from cardboard.

2. Then you need the MERGE Object Viewer app on a tablet device to view MERGE cube 3D creations – the MERGE cube is compatible with a wide range of devices (click here for information about compatible devices).

3. View the 3D creations included in the MERGE Object Viewer app – or sign up for a Miniverse account or CoSpaces account where you can find 3D objects/environments created by others. Then all you need to do is take a note of the shared code for the object you wish to view, type it into the MERGE Object Viewer, wait for it to download and start interacting with the 3D creation. Click on this link for some additional object codes ready to try in your MERGE Object Viewer app.
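As mentioned in step 1, the printable templates can be enlarged to any size of cube. Here is a tiny, purely illustrative sketch (Python) for working out the print scale factor and the size of the unfolded net; the 6 cm face size below is a placeholder, not the official MERGE cube measurement.

```python
# Helper for enlarging the printable cube net: given the face size on the original
# template and the face size you want, report the scale factor to enter in the print
# dialog and the overall size of the unfolded net. Sizes below are placeholders.

def net_scaling(template_face_cm, desired_face_cm):
    """Return the print scale factor and the width/height of the unfolded net."""
    scale = desired_face_cm / template_face_cm
    # A standard cross-shaped cube net is 4 faces wide and 3 faces tall.
    net_width = 4 * desired_face_cm
    net_height = 3 * desired_face_cm
    return scale, net_width, net_height

scale, w, h = net_scaling(template_face_cm=6.0, desired_face_cm=30.0)
print(f"Print at {scale * 100:.0f}% - the unfolded net will be about {w:.0f} cm x {h:.0f} cm")
```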

More help for getting going?

Click here for the MERGE cube getting-started guide on the Miniverse website:

https://miniverse.io/cube-start

This getting started guide takes you through the same steps as above with additional videos as well as further information which may be helpful.

So how can a MERGE cube be used in the classroom?

There’s a host of places to have a look at how others are using a MERGE cube in a classroom setting. Click on the links below to browse to find something which might spark the imagination of your learners and fit in with what you’re planning to teach:

  1. Miniverse.io – browse through the range of Miniverse MERGE cube experiences https://miniverse.io/cube
  2. MERGE Educators Facebook group https://www.facebook.com/groups/mergeeducators/
  3. MERGE Educators Activity Plans https://mergevr.com/edu-resources
  4. MERGE VR on Twitter https://twitter.com/mergevr
  5. Guide to the MERGE Cube in the Classroom – presentation by Mary Howard
  6. #ARVRinEDU – a hashtag on Twitter where anyone can share examples of the use of VR or AR in education, including the use of the MERGE cube.



Augmented Reality in the Wild⤴

from @ wwwd – John's World Wide Wall Display

I’ve been using the PeakFinder app for a month or two now. It is a nice app for showing which hills are in view. Basically it gives a ‘live’ wireframe of the hills from your location, or from anywhere you like. All the features are listed on the PeakFinder App page.

Today I opened the app and it must have been updated, because it gave me a message saying:

Augmented reality
For a long time many of you have asked for an option to combine the image of the camera with the panorama drawing. I’ve finally implemented this feature in this newest version and so PeakFinder now also supports true augmented reality.

This is quite amazing, and in my tests it works a treat.

I think this is the first AR I’ve seen that makes me think this could really be useful, and soon. It is not much of a stretch to imagine a botany app that can recognise flowers.

What is cool about PeakFinder is that the data is downloaded in advance, so you do not need a connection to use the application.
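For the curious, here is a rough sketch (Python, purely illustrative and nothing to do with PeakFinder’s actual code) of the core sums an app like this has to do: work out the compass bearing and elevation angle from where you are standing to a peak, which can then be matched against the phone’s compass heading and tilt to place the label. The coordinates at the end are placeholders.

```python
# Illustrative sketch of the calculation a hill-identifying AR app needs: the
# compass bearing and elevation angle from the viewer to a peak.
import math

def bearing_and_elevation(obs_lat, obs_lon, obs_alt_m, peak_lat, peak_lon, peak_alt_m):
    """Approximate bearing (degrees from north) and elevation angle (degrees) to a peak."""
    lat1, lon1 = math.radians(obs_lat), math.radians(obs_lon)
    lat2, lon2 = math.radians(peak_lat), math.radians(peak_lon)
    dlon = lon2 - lon1

    # Initial great-circle bearing from observer to peak
    x = math.sin(dlon) * math.cos(lat2)
    y = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
    bearing = (math.degrees(math.atan2(x, y)) + 360) % 360

    # Ground distance via the haversine formula (Earth radius ~6,371 km)
    a = math.sin((lat2 - lat1) / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    distance_m = 2 * 6371000 * math.asin(math.sqrt(a))

    # Elevation angle, ignoring Earth curvature and refraction over short distances
    elevation = math.degrees(math.atan2(peak_alt_m - obs_alt_m, distance_m))
    return bearing, elevation

# Placeholder coordinates: an observer at 50 m altitude looking towards a 974 m peak
print(bearing_and_elevation(56.0, -3.8, 50, 56.19, -4.63, 974))
```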

Exploring the use of Augmented Reality in Education⤴

from @ Adobe Education Leaders

I have a lab dubbed “The Knowledge Garden” where I jump, feet first, into the unknown with my students. Change comes so fast in the technology landscape that waiting until I have a demonstrable grasp of the subject matter (enough to tailor assets tied to predictable learning outcomes) seems completely at odds with the lay of the land. Instead, the classroom is flattened and my role shifts from being an authority on a technology to being a co-explorer with a few more notches on my belt than my students. Typically, we wade into beta environments where documentation is scarce to non-existent. There are few signposts and worn paths in these environments, and even fewer materials. This allows my students and me to experience a just-in-time (JIT) learning paradigm. What we explore, we map, document, demonstrate, illustrate and publish. It is a form of informal, applied research. My students and I then curate the collective knowledge gleaned from these explorations into a learning repository hosted on a course wiki, searchable and usable by future groups who may wish to repeat what we did or expand the horizon of discovery into areas we did not previously investigate. In this fashion, we put our collective shoulders to the task of moving the ball further up the hill.

Last year my students explored mobile publishing on a beta deployment of Adobe’s Digital Publishing Suite and, for the very first time, we produced learning assets that pre-dated the public release of that software by one month! This meant we had moved from JIT to BIT learning (Before Its Time)! It was a very exciting proof of concept demonstrating how student-based research can be an extremely valuable mechanism for pushing the exploration of new technologies in education.

Testing triggers for Augmented Reality


After my students finished their explorations, we teamed up with interested faculty members to mentor them in using these technologies in their own teaching practice. This resulted in the production of our school’s very first app on the Apple App Store and stood as a use case for integrating the power of the Adobe DPS system as an internal communications vehicle. It has since spawned several knowledge transfer workshops for other stakeholders in the school, including using the platform for academic publishing at our Institute Without Boundaries (http://worldhouse.ca). Students from the Knowledge Garden are providing leadership in the transformation of how we do things by actively promoting and mentoring the use of the technologies that they have explored.

This sort of knowledge transfer represents a complete inversion of the original educational hierarchy. This winter we worked with two Augmented Reality products, Aurasma and Layar, to support an interactive exhibition on The History of Game Design. Students used Adobe After Effects to produce short, two-minute documentaries on seminal games in the evolution of game design. These videos were then “bound” to “trigger” images that were vinyl cut and displayed around the halls in our new School of Game Design. This content was then geo-located on a Google Maps API within an Aurasma channel titled “The History of Game Design” and socialized for discovery. Interested users can “follow” our channel or perform location-based browsing that indicates when there is content nearby. Once they have subscribed to our feed, they are given thumbnails of all the visual triggers or “auras” so that they can look for them on location. Exhibit-goers used smartphones and tablets to access the video content by pointing their devices at the triggers, or auras (Aurasma). We also produced a printed catalogue for the exhibit that a person could read in the conventional manner, yet when they scanned its pages, the video content was pushed to their devices (Layar).
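For readers curious about what sits behind a channel like this, here is a small, purely illustrative sketch (Python, not the Aurasma or Layar API) of the kind of data each aura holds: a trigger image bound to a video overlay and a location used for nearby-content browsing. All names and coordinates are made up for the example.

```python
# Illustrative data model for an AR channel: each printed trigger image is bound
# to a video overlay and a location. Entries below are placeholders, not the real
# exhibit data.
from dataclasses import dataclass

@dataclass
class Aura:
    trigger_image: str   # file name of the vinyl-cut trigger artwork
    overlay_video: str   # video played when the trigger is recognised
    location: tuple      # (latitude, longitude) for location-based browsing

channel = {
    "pong_trigger": Aura("pong_trigger.png", "pong_minidoc.mp4", (43.6506, -79.3660)),
    "halflife_trigger": Aura("halflife_trigger.png", "halflife_minidoc.mp4", (43.6508, -79.3655)),
}

def on_trigger_recognised(trigger_id):
    """Look up and 'play' the overlay bound to a recognised trigger image."""
    aura = channel.get(trigger_id)
    if aura is None:
        return None
    print(f"Playing {aura.overlay_video} over {aura.trigger_image}")
    return aura

on_trigger_recognised("halflife_trigger")
```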


It was amazing to see throngs of people actively engaging in learning that had exploded beyond the traditional confines of the boxed classroom. One student lamented, “I wish we could learn like this,” to which I added, “That is the point of this exercise. This is paving the way for new models of delivery.” It allows us to rethink the locus of learning as well as our conventional notions of time and place. The learning is always there, waiting for the intrepid explorer to find it and uncover its bounty. The notion of geocaching learning invites comparisons to a treasure hunt. Exploring the hallways of our school with a smart device is a little bit like having those X-ray specs that they used to advertise on the back of popular comic books years ago. Our space is bristling with information; you just have to know how to look!

Below is a sequence of short introductions that we shot against a green screen and then rotoscoped in After Effects. We created pixelated avatars of each team member as our trigger images and matched up the video so that when a user pointed their device at the screen (see image above), the video image of the person would dissolve in over the avatar and tell the viewer which video game that person first played and what they are currently playing. CLICK below to learn about MY gaming habits!

Image: Jim Kinney’s pixelated avatar trigger

Below is a short student-made sequence documenting their interactions with, and impressions of, the medium.

Video: students interacting with the AR exhibit

Below is a sample of one of the documentaries, produced by one of my students, Evan Gerber.

Video: mini-documentary on the game Half-Life

If you are ever in the Toronto area, please drop by the George Brown School of Game Design at 241 King Street East, 5th floor, and discover the learning that silently and invisibly clings to our walls!

I am currently working with a small group of Design and Fashion faculty to share what we learned on our journey into AR. I am assisting them with creating short demonstration videos and tying these to trigger images that they will be able to post up in their labs.

I would like to hear from anyone else who is using this technology in a teaching and learning context.

Regards,

Jim