17th Durham Blackboard Users Conference

Technology Enhanced Learning was represented at the recent 17th Durham Blackboard Users Conference at Durham University. The theme this year was “Ticked Off – Towards Better Assessment and Feedback”, and the aim of the sessions was to show how presenters had improved the student experience in relation to the conference themes.

An interesting keynote entitled “Translating evidence-based principles to improved feedback practices: The interACT case study” by Susie Schofield of the University of Dundee opened the first day. She suggested that without carefully constructed assessment criteria, feedback is useless. In other words, you cannot give appropriate and worthwhile feedback without them, because what exactly are you feeding back on?

We then heard from Wayne Britcliffe, Richard Walker and Amy Eyre of the University of York. They described the various contexts in which the delivery of electronic feedback to students is being facilitated at York through the use of learning technologies. Their main objectives were to improve NSS scores and to make the management of assessment and feedback processes more efficient, and they noted that the electronic management of assessment (EMA) is undoubtedly a hot topic across the higher education sector.

Patrick Viney from Northumbria University described their journey with the PebblePad e-portfolio tool and how they have replaced the paper system for submitting undergraduate dissertation proposals. With over 800 students supported by over 100 academic tutors, the logistical issues in managing such large numbers were significant. Patrick demonstrated how using PebblePad had resulted in a robust, auditable, paper-free process for managing dissertation proposals, ethical approval submissions and tutor support during the dissertation.

Thursday finished with a demonstration and talk by Blackboard on their new product “Ally”. This will make course content more accessible and allow assistive technology such as screen readers (JAWS and Window Eyes, for example) to access the content more easily. In the demo, Nicolaas Matthij from Blackboard took a PDF and converted it on the fly into various formats, including ebooks, on-screen display and output through JAWS. Whilst it is not a fix for badly created content, its use could be seen as advantageous to the university and the student experience.

 

Day 2 commenced with Alan Masson, Head of International Customer Success at Blackboard, presenting on how Blackboard themselves can assist with assessment and feedback. He used examples from the presentations of the day before and also touched on forthcoming ones.

Steve Dawes from Regent’s University talked about the difficulties and challenges of assessment and engagement in a university-wide common module, and how these issues were met using a blend of e-learning tools. He explained how the Learning Technology Team assisted academic staff in using a range of digital tools to maintain engagement, such as Poll Everywhere classroom voting to engage large student audiences, Blackboard Journals for consistent formative feedback, efficiency improvements in the Blackboard Grade Centre, and Turnitin Rubrics for summative assignments.

The next session saw Christian Lawson-Perfect and Chris Graham from Newcastle University demonstrate and discuss “Numbas”, an open source mathematical e-assessment system which is now being used in subject areas outside of maths, such as psychology and biomedicine. Two case studies were discussed, including how, using existing open-access material, course leaders were able within a short period to create a large bank of formative and diagnostic tests and deliver them to students through Blackboard.

Finally, Pete Lonsdale from Keele University discussed and demonstrated a custom in-house solution for assessing nursing students; at the time there was nothing available that fulfilled the requirements identified. He described how the system included features such as audio feedback and the option to take and upload photos. He also explained how, since introducing the system, requests for more complex marking criteria, such as rubrics, have been received and implemented. He concluded that their design and implementation story highlights the appetite for online assessment tools as well as the importance of getting the details of the system just right: they found that off-the-shelf tools just did not work for a variety of reasons, and even the bespoke system required many iterations to reach a version that worked for all.

For me, this conference was significant in that it was the first time I had presented at a conference to peers and others in the academic world. My presentation was about how we have used Blackboard OpenEducation (a free online version of the VLE that we use) as a diagnostic tool in the recruitment process for Health and Social Care programmes. Candidates who got through stage 1 were invited to the University to undertake numeracy and literacy tests before the next stage, and those who failed these tests were rejected at that point. This method was proving expensive in terms of both money and time for candidates and the university alike, so alternatives were sought. In the presentation I discussed how this stage was adapted to work with OpenEducation, considering the likely challenges that lay ahead, how these could be factored in, and how we dealt with those we didn’t foresee.

I would like to thank the organisers and staff of the conference. It had a very relaxed atmosphere and was well worth attending.

An Industry View of Virtual and Augmented Reality

The recent VR & AR World event in London provided an overview of the current state of Virtual and Augmented Reality. Part conference, part trade show, the event offered some visions of the future (such as the market being worth $120 billion by 2020), but mostly provided an opportunity to see products and companies that had already launched.

Two presentations that stood out from an education and training point of view were from Boeing and the company Ubimax. Both provided case studies showing that augmented reality guidance and instruction could reduce the time taken to complete tasks, while at the same time improving the accuracy of those tasks.

Ubimax have a number of augmented-reality-supported applications for manufacturing and maintenance, but the application demonstrated was one to support picking items in warehouses (you can see a demonstration video here). The most beneficial aspect seemed to be the timely presentation of context-specific information – particularly in highlighting errors.

The Boeing presentation described a study comparing different ways of presenting information.  Participants were tasked with assembling part of an aeroplane wing using instructions either on a desktop screen, a mobile tablet, or through augmented reality.  The content was roughly similar to this published recording, and summarised in this online article.

Something that came through from the variety of things on show was how broad the definitions of virtual and augmented reality have become. Two products under the same label could have substantially different features, while something labelled as augmented reality could have very similar characteristics to something else labelled as virtual reality. I’ve produced a summary of the different attributes that either might have, and included it below.

Infographic defining the categories of mixed realities as content, level of activity, presentation, viewpoint, relation to place, connection to place and perception.

The main distinguishing characteristic between augmented and virtual reality is the perception – with VR enclosed and AR transparent. It is also the characteristic that by far has the biggest impact on the user’s experience. Are they separated from their physical environment and transported somewhere else, or is data brought in and added to their surroundings?
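
To make those comparisons easier, the attributes from the infographic could be captured as a simple checklist per product. Below is a minimal sketch in Python; the two example profiles and their attribute values are my own invented illustrations, not descriptions of any specific product from the event.

```python
from dataclasses import dataclass, fields

@dataclass
class MixedRealityProfile:
    """One profile per product, using the attribute categories from the infographic."""
    name: str
    content: str              # e.g. recorded real-world footage vs computer-generated
    level_of_activity: str    # e.g. passive viewing vs interactive
    presentation: str         # e.g. headset, smart glasses, tablet
    viewpoint: str            # e.g. first person vs third person
    relation_to_place: str    # is the content about the user's current task/location?
    connection_to_place: str  # is the content spatially aligned with the surroundings?
    perception: str           # "enclosed" (VR) or "transparent" (AR)

# Two invented products that carry different labels but can be compared attribute by attribute.
vr_demo = MixedRealityProfile(
    name="Showroom VR demo", content="computer-generated", level_of_activity="interactive",
    presentation="headset", viewpoint="first person", relation_to_place="none",
    connection_to_place="aligned to a physical prop", perception="enclosed",
)
ar_guide = MixedRealityProfile(
    name="Warehouse AR guide", content="context-specific instructions", level_of_activity="guided task",
    presentation="smart glasses", viewpoint="first person", relation_to_place="tied to current task",
    connection_to_place="free-floating overlay", perception="transparent",
)

# Print the two profiles side by side, one attribute per row.
for f in fields(MixedRealityProfile):
    print(f"{f.name:20} | {getattr(vr_demo, f.name):28} | {getattr(ar_guide, f.name)}")
```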

With current technology levels, many of the AR projects demonstrated had a relation to place (e.g. guidance on a task that the user is currently engaged in), but the information was detached and free-floating in the display. In contrast, there were HTC Vive demos that included physical objects (e.g. a car seat) that did have a connection to place – with the virtual car perfectly aligned to the physical seat.

Photograph of a car seat in the centre of an open space, with a person using an HTC Vive.

Virtual Reality demo where a virtual car is aligned to a physical car seat. The participant is to the right of the image (partly obscured by the pole).

The VR content that people are most likely to be exposed to is recorded real-world content, where the viewer is mostly passive, whereas VR games generally need to use computer graphics in order to allow interactivity.

It was widely acknowledged that virtual reality in particular had been around for over 30 years in various forms; but that it was the new wave of technology (arguably reignited by the Oculus Rift) that is starting to allow actual experiences to meet user expectations.  However, there were also warnings that poor experiences (with low-budget equipment and/or badly designed environments) could still give people bad impressions; and so there was a responsibility to make each individual’s first use of the technology as positive as possible.

Digital Challenges, New Tech & the Space In-between: Thoughts on JISC Connect More

This week, a few of the TEL team attended the final event of the JISC Connect More series at Nottingham University. The day provided opportunities to connect with peers, share practice and explore new ways to teach and learn using digital technology.

Barriers, Challenges and Aspirations

The first presentation, led by Rachel Challen from Loughborough College, was on the barriers, challenges and aspirations that we face in the field of Learning Technology. Tying together institutional strategies and processes to work effectively within the changing digital landscape requires a lot of people and systems to work together, and it’s a tough job.

This theme continued throughout the day and it was encouraging (I think) to see that we’re all in the same boat – How can we engage with everyone in our institutions to think differently (and cohesively) about Technology Enhanced Learning and digital capabilities? It’s clearly a difficult challenge, and one of the things that’s great about events like this is that we can share the different ways, however successful, that we are trying to solve it. I got the sense that we’re all trying to move away from the perception of Learning Technologists as ‘point and click’ presenters, and embed ourselves much more within the academic community as specialists. Personally I think it’s a great thing, offering better value to the staff and students we work with in a collegial environment.

New Tech!

We also got the chance to try out some new technologies like the HTC Vive and Nao, a programmable robot. The HTC Vive was particularly interesting given the work I’ve already done with virtual reality in the last year. This was the first chance I’d had to use handheld trackers and they enabled me to create something in a 3D space – I was virtually painting, using TiltBrush by Google.

Instead of just having a flat canvas to draw on, I could now interact in all directions – forwards, backwards, up, down and everything in-between. If I drew a three-dimensional shape, I could get inside it. I was able to experience the digital world as an actual space in which I could interact and move around, not confined or separated by a two-dimensional screen. There was a sense I was taking ownership of my own personal virtual space.

And this week, as I’ve watched Pokémon inhabit a shared digital space in the world, I’ve wondered if the convergence of technologies like VR and AR will allow us all to create our own personal digital spaces – They probably will and that’ll provide us with lots of exciting opportunities for creating new digital learning environments.

The Gap In-between

It was interesting to experience new technologies that are heading towards the classroom and at the same time hear how colleagues are meeting the current challenges of embedding digital capabilities within education. There’s clearly a gap in the middle that a lot of us sit in, connecting the dots between ever newer technologies and their educational application. It’ll be fascinating to see what an event like JISC Connect More looks like in 10 years. Over to you Nao…

Digi Know: Learning & Teaching Conference Special!

At the University of Derby Learning and Teaching Conference on 4th July, Lawrie Phipps from JISC delivered the second keynote presentation. The theme of the conference was “What does the future hold” and Lawrie’s presentation was called “Perspectives on Digital: Change isn’t coming, it’s here and it’s permanent”. His presentation largely focused on the JISC Digital Capability Project; Learning Enhancement have been working on embedding its principles within the university through the Digital Derby project. If you would like to find out more about the project or how to improve your digital skills, please contact the TEL team – tel@derby.ac.uk or on ext 1865.

Lawrie also asked the audience about their skills in using Microsoft Word, and their knowledge of styles. To find out more about styles in Word, please see our Help Guide or a more comprehensive training document from IT Services.

Lego for Learning

I recently attended a Lego for Learning workshop hosted by Manchester Metropolitan University and led by Chrissi Nerantzi and Dr Steven Powell. The idea of the workshop was to look at how Lego Serious Play methods can be used in teaching and learning. A range of colleagues from different educational sectors attended, which gave the day a nice, well-rounded perspective.

Rob’s Lego Animal

The day began with a reflective warm-up exercise using Lego – first create an animal, then add something to it which represents yourself, which wasn’t that easy when you’ve only got a minute to do it! The group then shared thoughts on their own creations and the representative elements we had each added. The idea of this was to get us into the mode of thinking about creating metaphorical models using Lego, and also reflecting on what they represented.

One of the interesting things about this process, even in the early part of the day, was that everybody in the group contributed, and this would continue throughout as we took part in the various exercises – it all felt very democratic.

Lego Learning Environment

In the next exercise we were asked to build a model of our ideal learning environment and then draw out shared themes which we could identify in each other’s models. We then constructed a shared model which collated our thoughts on the various themes – ours turned out to be a boat.

Lego Boat

 

Whilst this all might seem a bit fluffy on the surface, it actually led us into a deep discussion around learning environments and helped us find commonalities that we felt were important in their design. It wasn’t really the Lego model which was important, it was how we used it to express our thoughts on learning design, and we articulated our thoughts differently than we would have if we had written them down. It was a very reflective process, and by making something, however abstract, we’d engaged with the thought process in a different way and one which enabled us, very quickly, to engage in a creative discussion and generate a lot of ideas.

There’s plenty of theory around how and why Lego Serious Play methods work and the ArtLab website has lots of information on research projects which have been conducted. This video from Professor David Gauntlett is also well worth watching in which he explains the theory behind some of the methods he used to gather research using Lego:

Overall, the workshop was very enjoyable and I came away with a lot of ideas for different methods for using tools like Lego to engage students with theories, ideas, research or reflection. I also got some free Lego 🙂

Going to the Polls

I had the opportunity to participate in a conference at the University of Birmingham intended to highlight current uses of polling and voting technology in teaching and learning.

The keynote speech was given by Dr Fabio Arico, a senior lecturer in Economics at the University of East Anglia. He talked about how he uses polling data to produce learning analytics and pedagogical research. He has adopted an active learning approach in his practice, which he says took him three years to get working, and he emphasised that planning is key. Participants were able to take part in a demonstration of this methodology: some Economics questions were presented on the screen, participants were given time to respond using TurningPoint/ResponseWare, and the results were displayed on the screen. Participants were then given the opportunity to discuss their responses with their peers or neighbours in the conference room, after which the same question was asked and participants selected their answers again, this time influenced by their discussions. It was interesting to see the change in the results graph, with more participants getting the answers right after the discussions. Dr Arico has successfully influenced more than 50% of his colleagues to use the peer learning approach, and he uses student feedback to improve his teaching.
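
As a rough illustration of the re-vote pattern described above (not Dr Arico’s actual data or tooling), the sketch below tallies an invented set of responses before and after peer discussion and reports how the proportion answering correctly changed.

```python
# Illustrative peer-instruction re-vote analysis with invented responses.
# In practice these lists would come from the polling system's session export.

CORRECT = "B"

first_vote  = ["A", "B", "C", "B", "D", "A", "B", "C", "B", "A"]
second_vote = ["B", "B", "B", "B", "D", "B", "B", "C", "B", "A"]  # after peer discussion

def percent_correct(votes, correct=CORRECT):
    return 100 * sum(v == correct for v in votes) / len(votes)

before = percent_correct(first_vote)
after = percent_correct(second_vote)
print(f"Correct before discussion: {before:.0f}%")
print(f"Correct after discussion:  {after:.0f}%")
print(f"Change: {after - before:+.0f} percentage points")
```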

Another very interesting presentation was given by Professor Prem Kumar, Professor of Physiological Science at the University of Birmingham. He talked about his journey from traditional teaching to using a flipped approach, hence the title of his presentation: ‘If in doubt, try, try and then try again: the very real perils and pleasures of adopting a flipped approach’. He particularly mentioned his use of this approach with medical students. Some of the key points from his talk included that, to successfully use the flipped approach, a lot of preparation has to be invested before the session, and that the lecturer has to be self-confident, believe in the methodology and be very knowledgeable about their subject. He also advised ensuring there is value added to the session after students have already seen the Panopto videos in their own time.

Other contributors included Bob Ridge-Stearn from Newman University, who talked about their attempt to use the OMBEA voting tool to run exams, which presented some challenges. David Mathew from the University of Bedfordshire talked about using TexWall to help shy students participate. Annette Margolis from the University of Birmingham talked about her use of Socrative in her classes to provoke debate and to get students’ feedback. I presented a few quick case studies to highlight the different ways the technology is used here at Derby, including recapping, revision, mock tests, provoking discussion and gathering feedback, among other uses. Both TurningPoint and Socrative are widely used at the University of Derby, and Socrative has also been used with collaborative partners in Malaysia.

Those who are using voting technology are clearly seeing its benefits in helping students to learn and engage with learning material. While the flipped approach was endorsed, the point was also raised that different learning styles should be considered, as some people still prefer the traditional approach.

2016 e-Assessment Question Conference

TEL represented the University at the recent e-Assessment Question conference in London in March.

The theme of this year’s conference was ‘The Future: e-Assessment opportunities and barriers, risks and rewards’.

The presentation that stood out most for me was that of Tim Burnett from BTL Learning and Assessment, who asked “What’s the future for written exams?”.

The context behind this is that education is changing and technology is being used more and more in the classroom by learners and teachers. Although technology is unlocking further potential in learning, it is struggling to do the same for assessment. If learning and assessment fail to develop together, we have to start asking the following questions:

  • Is it fair to ask all students to write in assessments?
  • Are we at risk of assessment being left behind with the growth of educational technology?
  • Is a social divide emerging in assessment technology use?
  • Are we at risk of invalidating assessment?

In the session we looked for evidence to support change, listening to representatives from different parts of the education sector and reviewing the options for change and the challenges that come with them. Delegates from all sectors contributed their viewpoints as assessment specialists, but also as parents and lifelong learners.

Another interesting session was that of Jeremy Carter of Cirrus Assessment, about educational technology (edtech) ecosystems versus standalone solutions, and the role of assessment tools within them. Some interesting discussion points were raised, including:

  • Are we providing the best experience for students?
  • Should we build in house or utilise third-party services in order to meet our objectives?
  • Where does e-assessment fit within learning and teaching?

Although aimed at Level 1-3 qualifications (Apprenticeship/NVQ), Ecom Scotland demonstrated an electronic version of a marking matrix that has replaced paper. Unlike previous methods, this system can work on multiple devices and also offline (useful as many of the locations where apprentices are working do not have internet access or anywhere to use a laptop).

There were also a couple of interesting presentations on how organisations have used e-assessment, including how the British Council have overcome the issues of localisation in places like India, where being chased by a rhino is as likely as power cuts.

Xerte16 Conference


The Xerte16 Conference was the first official Xerte conference, marking a shift from being managed by Nottingham University to being managed by the Apereo Foundation (https://www.apereo.org/). Apereo is a network of institutions that supports software used in thousands of educational institutions worldwide.

The first keynote was from Ian Dolphin, the Executive Director of Apereo. The presentation covered Apereo itself, how Xerte fits into the group of software products supported by Apereo, the future of Xerte, and some of the other projects that come under the Apereo umbrella.

After the first keynote the conference split into parallel sessions, so I can only account for a third of what happened. However, all the sessions were recorded and should be available online soon.

The first session I attended was about the use of Xerte for problem-based learning. It reported on two case studies with medical and veterinary students, which used the non-linear branching option within Xerte to create branching scenarios for the diagnosis of medical conditions.

The second session was about the accessibility features in Xerte and how Xerte was designed from the outset to be accessible. It showed how, now that Xerte is entirely HTML-based, it can take full advantage of all the accessibility features of the browser. A very useful free Chrome extension called ClaroRead was demonstrated, showing how text can be read aloud for dyslexic and visually impaired users simply by activating the plugin and selecting the text.

The next session addressed previous complaints that Xerte was very linear and didn’t look very good, by demonstrating Xerte templates designed to look like Articulate Storyline (a leading commercial package for creating learning objects). While this session did show how attractive Xerte can look, and demonstrated very well the non-linear capabilities offered by some of the Xerte page types, it did seem that to accomplish this effect you had to be a developer with some experience of Xerte, and that basic users would not be able to achieve the same results.

The second keynote was by Sal Cook OBE, about her involvement with the Xerte project from the beginning and how easy Xerte is to use.

The last two parallel sessions I attended were both about student use of Xerte.

One was about how Xerte online toolkits 3.1 and Moodle had been adapted for use as a portfolio tool with school children, as summative assessment for the Welsh Baccalaureate. A pilot has been run and, while there have been complaints from students about having to learn how to use a new product, it seems that the financial advantages of using Xerte outweigh these. They are hoping to roll this method of assessment out to over 54,000 students throughout Wales.

The second use of Xerte with students was a case study from Lincoln University. This showed how first-year undergraduate history students used Xerte online toolkits 2.0 to create a Xerte learning object instead of a text document for a summative assessment. Here again there were student complaints about having to learn how to use Xerte. There were also a number of complaints about problems in using Xerte online toolkits 2.0, most of which could be resolved by upgrading to Xerte online toolkits 3.1. This session ended with a discussion about the problems of getting the IT department to implement Xerte.

The conference ended with a Q&A session with the Xerte development team, the main question being “What do you want us to do next?”.

My thoughts on Xerte online toolkits 3.1

Xerte has come a long way since we last used Xerte online toolkits 1.2 here at the University of Derby. Xerte can now be fully HTML-based, which means it is not reliant on Flash and can be used in all browsers for both viewing and editing. It is also now compatible with most modern mobile devices, provided the content is created with mobile devices in mind. You can now even create and edit Xerte learning objects from your tablet.

There are however still a number of issues.

One is that the basic templates that come with Xerte are not brilliant. This is an issue that has been there since the early days of Xerte and has still not been addressed. One example is that the question types are inconsistent: some of the drag-and-drop questions don’t allow you to put the answer in the wrong place, while others do. While developers can fix these issues and create customised templates (as we did with Xerte 2 during our initial trials), there is no guarantee that these templates will work in the next version of Xerte. The same goes for content: when you upgrade Xerte there is no guarantee that existing content can be moved to the new version without having to re-input it, particularly if that content was created from a customised template. This is also an issue with some of the competitors’ products, but imagine how you would feel if you couldn’t open your PowerPoint 2010 presentations when you upgraded to PowerPoint 2013.

Another issue is ease of use. Although they have improved the text editor and you no longer see the HTML code in the text-editing window, you still have to fill out forms with no real idea of where the text will appear until you press the play button to preview the presentation and navigate to the slide you are working on.

At the conference Xerte was compared to other learning object creation tools such as iSpring and Storyline. Xerte came out on top for cost and output quality, whilst usability was comparable. Although Xerte is comparable to these products, I feel that the Xerte online toolkits are now more comparable to WordPress, a well-established open source web publishing product with many plugins, including plugins for creating learning objects. Compared to WordPress, Xerte is a long way behind, but the area where it does shine is where it started: creating small interactive learning objects for embedding in other content.

With the improvements to Xerte, I no longer see any reason not to have Xerte online toolkits at the University of Derby, provided we are willing to use the standard templates and there is a demand for it.

#openbadgeshe

Leaving the house at 5.30 in the morning meant that the conference I was attending not only had to have nice cakes but also some interesting content to make the commute worthwhile.

Open Badges in HE was hosted by Southampton University, attended by 150 delegates, and featured two parallel sessions.

Badges

The event was opened by Doug Belshaw, an independent consultant specialising in Open Badges and Digital Literacy. Doug provided an overview of Open Badges and some of the challenges.

I attended another seven sessions, and here are some of the key themes and points of interest:

  • The universities that presented were only using badges at a local level; for example, tutors badging a particular module, or a TEL unit badging their CPD offering.
  • Those using badges had simply ‘dived in’ and were refining processes as they went along.
  • There is a steep learning curve, from achieving a badge to sharing it via a profile on social media.
  • Offer badges on a voluntary basis.
  • Communication to the badge earners is key.
  • Need to ‘build in’ rather than ‘bolt on’.

I thought it was a good conference and it gave me lots to think about. The University of Derby is beginning a pilot project to look at awarding badges. Reflecting on the event, it seems that we are in a unique position, in that there is support from management. The Academic Innovation Hub are also developing a bespoke server, which again seems to be an innovative approach.

Further Resources and information

If you would like to know more about Open Badges or would like to be part of our pilot, then please contact tel@derby.ac.uk

EvaSys International Conference

Here is a summary of selected topics covered at the EvaSys International Conference held in London on 5-6 May.

Preamble

EvaSys is an evaluation system, developed by Electric Paper Ltd, that the University of Derby chose in order to get student feedback on their learning experience for each module; it was previously difficult to get this feedback. EvaSys questionnaires are printed and given to students to complete in class, raising the chances of getting the feedback. The completed questionnaires are then scanned using EvaSys Scanstations, the results are stored on the system, and reports are generated and sent to various stakeholders, e.g. module leaders or deans. The system also has an online survey option, which UDOL uses; it was noted that more feedback is obtained via paper-based surveys than online. The University of Derby also uses the EvaExam system for the administration of exams at the partner college in Malaysia. Currently 54 UK and Ireland universities use EvaSys.

Updates

Electric Paper Ltd plans to release the next version of EvaSys (v7.0) in January 2016. With regard to reporting and statistics, improvements include 20 custom fields for Courses and Participants, complex filters for the Report Creator, and merged PDF reports showing a table of sources. Customer-driven improvements include hiding answers to open questions, improved categorisation, flexible naming of reports, an improved maintenance mode, and the ability to export statistical data via CSV (DAL Light), among others. There are also improvements to the EvaExam system in v7.0, including a two-column layout; a reduction in paper consumption (up to 50%); Intelligent Character Recognition (ICR) for calculations and fill-in-the-gap questions; a Formula Editor (MathML, LaTeX); images as backgrounds for open questions; and an adjustments clause for grading scales.
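
The CSV export is worth highlighting because it opens the results up to further analysis outside EvaSys. As a minimal sketch only – the file name and column names below are assumptions for illustration, not the actual DAL Light layout – exported responses could be summarised per module like this:

```python
import pandas as pd

# Hypothetical export with one row per answered question:
# columns assumed to be module_code, question_id, score.
results = pd.read_csv("evasys_export.csv")

# Mean score and number of responses per module and question.
summary = (
    results.groupby(["module_code", "question_id"])["score"]
           .agg(mean_score="mean", responses="count")
           .reset_index()
)
print(summary.head())
```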

 

Other improved services include an updated online support system. Via the Extras tab, users can get live content directly from Electric Paper, as well as direct access to webinars and newsletters. Updates are highlighted with a + symbol.

The Enterprise Option

Electric Paper is offering the Enterprise Option, which is useful for trying the latest release without interfering with the production environment, conducting staged updates, and obtaining a developer licence to program against the Web Services (SOAP API). The Enterprise Option consists of a Test Licence and Software Development Kit (SDK), which are free; 6 hours of developer support for €795 per year; and the operation of Electric Paper Developer Network (EPDN) solutions. The EPDN comprises Electric Paper, the customer’s internal IT, and third-party providers.

EvaSys Dashboard

The EvaSys system can be embedded with Intuitive Business Intelligence dashboards, which provide detailed visual data for reporting purposes. Nottingham Trent University (NTU) has been using the dashboards since 2011, and users have found them useful for supporting a range of monitoring at various levels. Dashboards provide quick and easy access to real-time data, and reporting can be on anything, including VLE usage, attendance, timetabling, etc.


NTU dashboard

Module Benchmarking by EvaSys Governance

Electric Paper is in the process of establishing sector-wide module benchmarking, using questions similar to those used in the NSS survey. As there are now more than 50 universities in the UK using EvaSys, institutions will be able to compare their evaluations with other institutions using the same questions. Each institution will have access to its own data and only to aggregate data for the benchmark questions of other participants. Benchmark outputs from pilot participants are expected in October.

Learning Analytics

There was also a panel discussion on learning analytics, which is thought to be relatively new in the UK but is becoming topical. Dublin City University is looking at engagement with the VLE to predict future performance, and it is hoped that library usage and attendance, among other areas, will be included. The University of Surrey switched to online surveys and got a 60% response rate. Professor John Taylor from the University of Liverpool said that the data collected should be more attention-directing than an end in itself.
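
To give a flavour of what using VLE engagement to predict future performance can look like in practice – this is a generic sketch with invented numbers, not a description of Dublin City University’s actual model – a very simple version might relate weekly activity counts to a pass/fail outcome:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented engagement data for eight students: [logins per week, resources viewed].
X = np.array([[1, 3], [0, 1], [5, 12], [4, 9], [2, 4], [6, 15], [0, 0], [3, 8]])
y = np.array([0, 0, 1, 1, 0, 1, 0, 1])  # 1 = passed the module

model = LogisticRegression().fit(X, y)

# Estimated pass probability for a student with 2 logins and 5 resource views this week.
print(model.predict_proba([[2, 5]])[0, 1])
```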

Electric Paper Research

Here are some of the key outcomes from research that was recently conducted by Electric Paper on how to deliver best practice in course evaluation.

  • HE providers require central visibility to drive teaching quality and enhancement. Course evaluation is tied to the NSS and used as an early warning mechanism.
  • Survey fatigue is one of the main concerns regarding participation. In-class evaluation gives the best response rates as compared to online evaluation.
  • Regarding staff development, course and module evaluation is about supporting and enhancing academic achievement, not highlighting negative results. There should be policy decisions and involvement from departments and academics regarding the use of data.
  • Course evaluation is at the centre of student experience. Student representatives should be involved in all aspects of quality measurement and enhancement efforts.
  • Collaboration between HE providers is important. Electric Paper affords this via user conferences and regional group meetings etc.