Our Digital Learning blog site has been re-located to
The end of February was the annual Media Team trip to the British Video Expo (BVE) at ExCeL in London. In the past we have used the event to investigate new equipment, chat to other industry experts and learn new techniques from some of the best media professionals in the world. And although the event isn’t aimed at the education sector, it is a very good way to gauge both current and upcoming trends.
This year, however, felt like a bit of a surprise. Before the seminars were announced, we were expecting most of the talks and exhibits on the show floor to be about augmented and virtual reality. We are several years into the technology and it is becoming more established in both the media world and the education sector. It is something (as can be seen from recent blogs) that we are trying to push forward with, but it was represented by only a smattering of products to try out and no talks (at least on the day we were able to visit).
Instead, it was about the use of artificial intelligence (AI) in the development of media (either through AI itself, machine learning or automated workflows). It seemed like an odd thing to be showcasing so strongly at first, but the more we heard and saw, the more revolutionary it seemed.
The applications discussed for these technologies were many, some of them very useful and relevant to what we are doing here at the University. A short list of uses of AI in the media sector includes:
The most interesting one, to us, is the use of machine learning to improve logging of data (adding automatic metadata to the assets we created) and also to automate processes such as transcribing and subtitling videos.
That second one is key to us. A new EU directive has come into place in the last few months that aims to make digital content accessible over the next 2 years. This is potentially a huge amount of work (but also incredibly important). One way of improving the accessibility of video content is to add subtitles and/or a transcript. Over the years we have created lots of videos, so that means lots of subtitles and lots of time!
Machine learning has developed greatly in this area in the last few years, and automatic caption generation has gone from hopeless to impressively accurate in a surprisingly short time. We currently use YouTube to automatically generate subtitles for our videos, and the technology has come on in leaps and bounds in the time we have been using it. It is still an imperfect solution, though (both in terms of the accuracy of the transcription and in the process of getting the videos up on YouTube and then the transcript files back). Hopefully this move towards technologies such as machine learning will help to create a universal platform for transcription that can benefit everyone!
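To give a flavour of what the output side of a transcription workflow involves, here is a minimal Python sketch (not our actual pipeline – the segment timings and text are invented for illustration) that turns a list of timed transcript segments, of the kind a speech-to-text service produces, into a standard SRT subtitle file:

```python
# Minimal sketch: turning timed transcript segments into SRT subtitles.
# The segment data below is invented; a real workflow would receive it
# from a speech-to-text service.

def srt_timestamp(seconds: float) -> str:
    """Format seconds as an SRT timestamp, e.g. 71.5 -> '00:01:11,500'."""
    ms = round(seconds * 1000)
    h, ms = divmod(ms, 3_600_000)
    m, ms = divmod(ms, 60_000)
    s, ms = divmod(ms, 1_000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

def to_srt(segments) -> str:
    """segments: iterable of (start_seconds, end_seconds, text) tuples."""
    blocks = []
    for i, (start, end, text) in enumerate(segments, start=1):
        blocks.append(
            f"{i}\n{srt_timestamp(start)} --> {srt_timestamp(end)}\n{text}\n"
        )
    return "\n".join(blocks)

segments = [
    (0.0, 2.5, "Welcome to this week's lecture."),
    (2.5, 6.0, "Today we look at accessibility."),
]
print(to_srt(segments))
```

The numbered blocks and `HH:MM:SS,mmm` timestamps are what players such as YouTube expect when you upload a caption file alongside a video.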
In January, Sarah Fielding and Chrissie Metcalf attended BETT – an education showcase event hosted by Microsoft. Bringing together 850 leading companies and 103 exciting new edtech start-ups, BETT gives its 34,700 attendees from the global education community an opportunity to find inspiration and discuss the future of education.
Whilst the majority of edtech is focused on compulsory-aged education, it enabled us to delve further into what kind of background the students may have come from, think about how we can take advantage of this within Higher Education and see the expectations and developing role of technology within the education sector. BETT also features a Higher Education area and some inspiring high-level keynote speakers from companies such as Canvas.
Interacting with start-up companies at an early stage is a great way to shape the development of those tools, making them more fit for purpose in HE environments. Notable companies and individuals that we spoke to on the day included VR guru Steve Bambury, pi-top, Thinglink and Google Expeditions.
We also watched informative presentations from other HEIs about their experiences of using Microsoft Teams for education (UCLan) and Minecraft to teach biological sampling techniques (Teesside), providing valuable insight into how they are using technology to engage students and staff in dialogue.
All in all, it was an exciting and productive day, with lots of inspiration brought back into the team. BETT is free to enter, so for the price of a train ticket it’s a fantastic opportunity for us to find out about new technologies and how they are being used.
Nominations are open for our seventh annual Blackboard & VLE Awards. Students have the opportunity to feed back on what they’ve found particularly helpful to their learning, and invite recognition of the staff who are supporting them in this way.
A ‘virtual learning environment’ has the potential to be exactly that: an online space for learning, not just a file store. We are seeking the best examples of online provision that makes a difference, and we want to hear how it helped. What features or aspects of a module’s VLE site do you wish were in all sites? Or, how was your VLE site tailored to best support your module? What contribution did the VLE make to improving your experience?
Students may nominate up to three courses, with each entry submitted to a prize draw for a £50 Amazon voucher.
Go to http://go.soton.ac.uk/724 to submit your nomination.
Each year the awards highlight a wealth of valuable content and inspiring ideas. Shortlisted course representatives sometimes comment that they’re “not doing anything special”, but simple changes can make a difference.
Previous award winners have shown various ways of making their VLE into an active learning space, perhaps getting students involved in designing and developing their course site. View examples from previous winning and shortlisted courses on our VLE Awards page.
While we’d urge all students to nominate, staff can participate too. Why not add a course announcement encouraging your students to have their say (feel free to download the graphic above), or add the course banner below? The guide How to add a course banner to your Blackboard course from the MLE team explains how.
Staff also have the opportunity to put forward their own courses for consideration in our self-nominated category.
Nominations are open until March 1st.
February’s eCoffee saw attendees taken on a series of virtual adventures, using Oculus Go headsets. Recent university Virtual Reality (VR) projects include creating interactive 360 experiences from footage gathered on field trips. This supports students in getting the most out of the trip by preparing them in advance, helping answer any questions or address anxieties they may have. It’s also a way of bringing the field trip experience to students who may not be able to go in person.
There are a range of apps available to download to the headsets, and we explored some healthcare examples, such as working through a patient case following a road accident.
As well as offering a safe environment to experience certain situations and try out procedures, the headsets can also encourage empathy by giving users a different perspective. What does treatment look and feel like from a patient’s perspective?
There are a range of VR tools available, with simple options like Google Cardboard giving users a VR experience via their phones. The next step for the Digital Learning team will be exploring further along the spectrum with an HTC Vive, offering a more immersive experience.
Headsets aren’t essential for VR – we also viewed examples of interactive 360 images in Thinglink.
If you are interested in knowing more about VR, or have an idea you’d like to develop, contact the digital learning team, or join us for the Digital Learning Connect event on Feb 27th, when there will be another opportunity to try out some VR kit.
To find out more about the use of VR in Medicine, see: A Look at VR in Medical and Nursing Student Training and Top 10 Incredible Uses of Medical Virtual Reality.
Our next eCoffee session is April 4th, 10-11am in LF9, when we will be looking at the need to make resources accessible for all users, and sharing some tips for doing this.
I am mindful that quite a few people throughout the University are creating, or have created, content using Articulate but may not be aware of the new legislation and what it entails. I thought I would therefore write a short piece sharing my understanding of the things to consider when designing learning content to make it accessible and inclusive.
Please note that the accessible content I have created and updated so far has not been tested by users, so it will inevitably have its flaws.
Updating existing Storyline content to meet WCAG can be simple or complex, depending on the type of content and its interactivity. For existing content I have focused mainly on adapting it to work with the JAWS screen reader; you may also want to adjust other items listed below in the Creating new content section.
Unfortunately, JAWS is the only screen reader supported by Articulate Storyline. A trial version of JAWS can be downloaded from this link:
Please note that this trial version has a 30-minute restriction: after 30 minutes the PC will need to be restarted for the software to be operable again.
Alternatively, Enabling Services have an ATS room in the library on Highfield campus (level two/2047), please contact email@example.com for more information.
For the JAWS screen reader to read content in the correct order, the Tab Order for each element on the slide must be placed in the correct reading order, and elements that do not need to be read out must be removed.
Tab Order can be found on the Home ribbon.
Select Tab Order to open a new window, which lists all the elements as per the slide timeline. Select an element in the window and the corresponding element will be highlighted with a red box on the slide. Note that the window also lists any elements on layers, but these will not highlight when selected; the layer needs to be selected before the element becomes active.
With an element selected in the Tab Order window, use the up or down button to move the element to its correct reading position. Repeat for all other elements, then save the setting to return to slide view.
To test, preview the slide and press the Tab key on the keyboard; a yellow box will surround the element to be read by the screen reader.
Remember to add Alt text to images.
Alt text will not be read when shapes or buttons have existing states attached. Unfortunately, all states will need to be removed, the alt text added to the shape, and the states then re-added.
Hotspots are not recognised in the Tab Order list and should be replaced with shapes formatted to be transparent, with the triggers attached to those shapes.
I have found this video useful for getting a better understanding of Tab Order.
For existing learning content with complex custom interactions (drag and drop interactions or hotspots) you may wish to consider:
For those who are creating new content, I have compiled a simple table of items to consider and which type of learner it impacts/benefits.
Users with visual impairment will benefit most from high-contrast colours. Although I have indicated that colour contrast is not applicable to the non-inclusive, hearing and mobility groups, this just means it is not vital for these groups.
If unsure, print the page in greyscale to check that all the content is legible; if it is not, a user with poor vision will have difficulty seeing it.
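If you would rather check contrast programmatically than by eye, WCAG 2.1 defines a contrast ratio formula based on relative luminance. This is a minimal sketch of that calculation (the colours chosen are just examples); level AA requires at least 4.5:1 for normal-size text and 3:1 for large text:

```python
# WCAG 2.1 contrast ratio check. The formula combines the relative
# luminance of the two colours; ratios run from 1:1 (identical) to
# 21:1 (black on white).

def relative_luminance(rgb):
    """Relative luminance of an (r, g, b) colour, channels 0-255."""
    def linearise(channel):
        c = channel / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearise(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between foreground and background colours."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white background gives the maximum ratio, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

A quick check like this on your theme colours catches combinations that look fine on a bright monitor but fail for users with low vision.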
All users will benefit from a good choice of font type and size. Although I have indicated that font size and type are not applicable to the non-inclusive, hearing and mobility groups, this again just means they are not vital for these groups.
Use sans-serif fonts such as Arial, Helvetica, Lucida Sans, Tahoma or Verdana. The recommended size is 12 point or larger.
Users can also enlarge the browser window using the scale tool to enhance readability.
Articulate Storyline only supports the JAWS screen reader, which may restrict users with visual, mobility or cognitive impairments if they do not own this particular screen reader.
Adjusting the tab order when authoring your course will enable JAWs to read the page in a particular order so that it makes sense to a user with visual impairment.
Naming objects on the timeline helps with identifying them in the Tab Order list; this is helpful if you want to remove an item so that the screen reader does not read it.
This is useful for all users and can be added to the Notes panel. Users can print the notes panel or copy and paste the text into another document. Visually impaired users can print the transcript and convert it to braille format.
Closed captions are a legal requirement for videos. If videos are embedded in Storyline, the captions will need to be generated. Fortunately, YouTube automatically generates captions for videos hosted on its site; these are generally quite accurate but may misinterpret some technical terms or phrases.
Closed captions are also useful if a computer has no built-in speakers, or if the user is working in a quiet area such as a library.
Users who are unable to use a mouse will use the Tab key to navigate around the page, similar to users with visual impairment. Use the Tab Order feature in Storyline to enable this.
All users will benefit from jargon-free content; if jargon is used, ensure there is a glossary explaining the definition of each term.
The Storyline player is very intuitive. If, however, the built-in interface is not used, ensure that navigation buttons such as Next and Previous are placed in the same position on each slide.
Susi Miller from the IT Training and Development team has put together a very informative Accessibility guide:
For more information about the new regulations – the Public Sector Bodies (Websites and Mobile Applications) (No. 2) Accessibility Regulations 2018 – or about making your eLearning compliant with WCAG 2.1, contact Susi at S.H.Miller@soton.ac.uk.
I have also found this site to be useful.
I hope this blog helps anyone who is creating Articulate Storyline content. Any questions or comments will be gladly received Mimi.firstname.lastname@example.org.
As part of our Christmas Challenge this week, can you locate the Digital Learning ‘reindeer’ in the pictures below? Ten pictures are on Highfield Campus (let us know the location or building number) and two are elsewhere (let us know which campus).
Click on the images for full-size versions.
Email answers (as many as you can guess) to email@example.com by the end of December; we will reveal the answers and announce a winner in January.
Congratulations to all our winners: Trevor Newbury, Steve Barnes, Caroline Stevens and Patsy Appleton.
Panopto: not just lecture capture was our theme for December’s eCoffee, looking at the various ways Panopto can be used to support teaching, learning and assessment.
The university help pages for Panopto at http://go.soton.ac.uk/panopto are a great source to help you get started, and provide step-by-step guides to some of the other uses of Panopto that you might want to try. Panopto is available via Additional Software on university computers, but can also be downloaded to your own laptop – details at the link above.
Panopto integrates with Blackboard, so once a Blackboard module is provisioned with Panopto, recordings can be directed straight into the module site.
Sally Curtis talked about how Panopto supports learning on the BM6 programme, and described herself as a ‘convert’ after her initial concerns about using Panopto proved largely unfounded. Sally had felt she might have to adapt her teaching style, or that students might stop attending. It’s agreed with students that lectures will be recorded as long as: attendance does not fall below an agreed percentage over a few weeks; students are responsible for reminding the instructor to record; and the content-heavy lectures in two modules are recorded, but not the more reflective discussion sessions in the other modules.
Panopto’s statistics show how students are using the recordings, and Sally has found this really helpful. It has also demonstrated that students who watch the recordings are usually those who have attended the lecture. While a lecture recording might be helpful for a student to catch up on a lecture they have missed due to illness, students who regularly don’t attend also don’t watch the recordings.
On the MSc Allergy programme, watching a Panopto recording back enables students to rehearse, reflect on and adjust their presentation style. It is also valuable for lecturers to do the same!
There are various ways that Panopto can be set up so that students can record their own presentations for assessment – details are included under ‘Information for Tutors’ on the help pages here: http://elearn.southampton.ac.uk/panopto/students/.
The MSc Allergy course also includes online tutorials run through Skype for Business. Not all students are able to make these, but the session recordings are uploaded to Panopto so that other students can access these here.
An example of how Panopto is used to support OSPEs in Health Sciences is explained in the linked blog post. Students carrying out a practical assessment are able to watch back the recording with the examiner, reflecting on their performance.
Recording a short video in advance of a face-to-face session can help prepare students, introducing content and questions to be discussed in the session. Panopto quizzes can be embedded into the short video, offering an additional way for students to engage with the material and check their learning.
MP3 audio files can be uploaded to Panopto, with an ‘enable podcast feed’ option available in Panopto’s settings. An RSS feed can be set up to link Panopto with iTunes U, allowing users to subscribe.
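Podcast subscriptions like this rely on a standard RSS 2.0 feed with audio ‘enclosures’. As a rough illustration of what such a feed looks like under the hood (Panopto generates its own; the titles and URLs below are invented), here is a Python sketch that builds a minimal one:

```python
# Sketch of a minimal podcast-style RSS 2.0 feed. Titles and URLs are
# invented for illustration; a real feed comes from the hosting platform.
import xml.etree.ElementTree as ET

def build_feed(title, link, episodes):
    """episodes: iterable of (episode_title, mp3_url, size_bytes)."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = title
    ET.SubElement(channel, "link").text = link
    for ep_title, mp3_url, size in episodes:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = ep_title
        # The enclosure element is what podcast clients actually download.
        ET.SubElement(item, "enclosure", url=mp3_url,
                      length=str(size), type="audio/mpeg")
    return ET.tostring(rss, encoding="unicode")

feed = build_feed(
    "Example Lecture Series",
    "https://example.ac.uk/podcasts",
    [("Week 1 lecture", "https://example.ac.uk/audio/week1.mp3", 1048576)],
)
print(feed)
```

A podcast client polls a feed like this and downloads each enclosure, which is why enabling the feed is all that’s needed for users to subscribe.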
Our next session will be Weds 6th Feb, 10-11 in LF8. All welcome!
Throughout December we would like to invite colleagues to take part in our Digital Learning Christmas Challenge. Each week there will be two challenges – one online, and one based in our Digital Learning office (35/2025). The challenges will give you an opportunity to try out some of our resources, and we will be awarding festive prizes to each week’s winners!
We will be available in the office each day from 10am-3pm, Monday to Friday. Come along to take part in the week’s challenge, or if you prefer, just come and say hello and help us eat our hoard of festive snacks. You are welcome to drop in at other times but we can’t guarantee someone will be there. Directions to our office are here; we are a bit hidden by works outside the building at the moment, but you can still get through.
Can you top our Oculus Go challenge scoreboard? Each participant has two attempts to hit as many targets as they can. The winner will be the leading scorer on Friday at 3pm. We’ll also be offering this in the 1GS kitchen on Monday afternoon.
Follow our Digital Learning Twitter account this week and see if you can spot the Christmas song hidden in our week’s tweets (each tweet will conceal a word from the title). Email answers to firstname.lastname@example.org before the winner is picked at 9am on Monday (10th).
Come and try out our new teaching tech. In return for your feedback, we will enter you into a draw for this week’s prize. (NB. The office challenge will be unavailable from 12-3pm on Thurs 13th)
Add your suggestions to this week’s Padlet walls. We’d like to know what’s top of your TEL wishlist, or what festive deskorations you have to share! Winners will be the most liked suggestions. Your post will need to include your name or office location if you want to receive your prize.
Come and design the University of the Future using our Lego Serious Play sets. All entries will be photographed and tweeted via the Digital Learning account and the most liked tweet will be the winning entry.
Subscribe to our Digital Learning newsletter! The December issue will include a ‘where is the office reindeer?’ photo challenge.
In October’s eCoffee we looked at ways Microsoft Teams can support collaborative working and learning. Teams is part of Office 365 (https://www.southampton.ac.uk/365) and works in conjunction with Office 365’s SharePoint, helping build communication and context around documents stored here.
After creating a team (via the link at the bottom of the menu), it’s straightforward to add members and create ‘channels’ – specific subsections that all members of the team can access.
The ‘Get app’ option on the left is also recommended, as this gives more functionality than the web link.
Teams provides an online space to create and upload shared documents. Colleagues can work on the same documents, editing online or within the usual software, with changes saved automatically.
These documents can be accessed online (using a university login) from any location with web access.
The ‘planner’ app integrated into Teams allows files to be linked together with contextual information, including notes, deadlines, a task checklist and comments. As well as being a useful organisational tool for individual projects, this ensures relevant information is shared between colleagues.
Teams builds on the functionality of Skype for Business, providing a useful tool for online meetings.
Teams integrates with the Outlook calendar, so it is easy to set up an online meeting link (such as the one we used for this session). This allowed us to broadcast a view of the room during the session, as well as sharing the screen used to demonstrate the different aspects of Teams.
Teams includes a Chat function, for private discussions between specified individuals and groups. Each channel also has a Conversations tab, for discussions and links that are accessible to all members in that channel. It’s a good way of sharing useful information without overwhelming colleagues’ mailboxes, and using “@” to tag a particular user or team will ensure they are notified.
Teams can quickly become overwhelming with the creation of multiple channels, so it’s important to agree how it will be used within the team. It takes practice to get used to the way it works, and time to explore all that it offers.
With any shared workspace, organisation can be challenging as even with an agreed structure, not everyone approaches things in the same way. Teams offers a good solution for this though – the search bar at the top of every page is a really helpful way of tracking down the content you need.
The following article gives some useful tips: 19 Microsoft Teams Tips that will help you and save you time. You may also find the following document helpful: Tips for using Teams. Teams tutorials are available through http://go.soton.ac.uk/lynda.
Do join us for our next eCoffee session (and mince pies) on Tues Dec 11th, from 10-11am in LF8.