Evaluation

Rationale/Background

Although it was determined from the beginning of the Project that we would put together a toolkit, the toolkit only came into being at the very end of the work, since it was only then that we could start to reflect on what we had done. As a distinct piece of work, the toolkit is meant to be something which can assist others in creating a project which uses a mobile app and Augmented Reality to enhance the experience of working with a material object. One of the key things to consider when putting together a toolkit for any context, but particularly an educational one, is how you will evaluate the outcomes from the surveys and focus groups. In a workshop, the SCARLET Team used a dialectical approach: first to set down the key processes we used to evaluate those outputs, and then to imagine that you could somehow talk to yourself in the past – the “you” who is just starting to put together an evaluative process – and tell your temporal doppelgänger what you need to know to make the process as easy as possible.

Reflecting on what is ostensibly a dialectic, what surfaced was the intuitive way we all worked and the difficulty of trying to reify those efforts into something usable by others. We recalled that, in working toward a cohesive evaluative process for a project like SCARLET, it is important to determine methods for making sense of how the project has developed, targeting key success factors and ideas for improvement. There are a number of ways of doing just that outside of more formal, academic assessment, which does not necessarily point to the factors which might influence take-up of a particular project or pedagogical enterprise. In addition, the evaluative process needs to be one which works toward accessing audience views before and after the experience of the project. Surveys and focus groups were determined to be the best means of evaluation, and were used to get an idea of what students knew about Augmented Reality and other technologies before, during, and after the course.

It is important to note that not every factor in the evaluation process carries significant risks, but it is important to consider what risks there are prior to embarking on any kind of project. It may seem obvious, but the toolkit is meant to emphasise the steps necessary to move a project toward completion. It also maps nicely back to a general bid-writing process, which likewise has to consider potential risks and pitfalls.

In the end, we believe that the project is valuable and significant, responding to the Horizon Report’s call to watch AR as a pivotal technology in the coming years (2010). We have produced something which uses technology to enhance the experience of working with a material object and which, importantly, does not replace or get in the way of that experience. Most students felt that the use of AR with Special Collections was indeed valuable to their experience of the texts in Special Collections.

Moving forward, the development of this toolkit has also benefitted directly from the additional courses run as part of the project, where the students exhibited a great deal of interest and enthusiasm for both the content and the means of delivery. If, as Confucius said, “success depends upon previous preparation”, then this toolkit promises to be a useful and compelling aid in the creation of future projects involving Augmented Reality in a variety of places and contexts.



Self-completion survey

Before the course actually started to meet regularly, we felt it was key to gauge the students' abilities and their understanding of Augmented Reality and mobile technologies. We learned that most students are aware of AR, although few really understand what it actually is. To that end, an initial survey helps to determine how much training and demonstration of the app would later be required.


Considerations:

  • Agree the most appropriate format for the survey (paper-based or electronic) dependent on the audience and situation
  • Be careful to create good open-ended questions
  • Include a mixture of open and closed questions. Closed questions are easier to answer and provide more structured data, but open questions may provide more detailed responses (see the tabulation sketch after the risks list below)
  • Split questions which have multiple subjects into simpler questions
  • Create questions which encourage students to participate
  • Plan design carefully to ensure ease of use and maximum survey completion rate
  • Aim for shorter surveys to increase response rate
  • Create clear, unambiguous questions and clear instructions

Risks to consider:

  • Too few students participate
  • Data is irrelevant or incomplete
  • Questions did not encourage engagement, e.g., students simply answered “yes” or “no” without explanation, and there is no opportunity to prompt or probe
  • Self-completion surveys generally yield lower response rates than other methods
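
As a minimal sketch of the open/closed distinction noted above (the CSV layout and the column names here are assumptions for illustration, not the project's actual survey format), closed answers can be tallied directly into structured counts, while open answers are simply collected for manual reading and theming:

    import csv
    from collections import Counter

    # Assumed layout: one row per respondent, one closed question
    # ("used_ar_before": yes/no) and one open question ("ar_expectations").
    closed_counts = Counter()
    open_answers = []

    with open("survey_responses.csv", newline="") as f:
        for row in csv.DictReader(f):
            closed_counts[row["used_ar_before"].strip().lower()] += 1
            if row["ar_expectations"].strip():
                open_answers.append(row["ar_expectations"].strip())

    print(closed_counts)  # structured data, e.g. Counter({'no': 30, 'yes': 8})
    print(len(open_answers), "free-text comments to read and theme by hand")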



Focus groups

The focus group stage is often fraught with some of the biggest problems. Generally, it is difficult to get people to attend, and even when you do, incentives can create problems of their own. With regard to incentives, it is important that they are presented as a “thank you” for participating and not perceived as some sort of payment. More problematic is the make-up of the group; you don’t want a group made up of individuals who are too similar or too different. Liz Spencer, lecturer for the Social Research Association at the University of Essex, notes that the best results generally come from a heterogeneous collective of people who are largely unaware of the subject matter but who are interested in learning more – in our case, the following groups:

  • Guyda Armstrong's 3rd year undergraduates from the pilot course for "The Book and Its Body"
  • Guyda Armstrong's 1st year undergraduates from "Contemporary Italian Culture"
  • Roberta Mazza's 3rd year undergraduates from "Advanced Greek 3"
  • Roberta Mazza's 1st year undergraduates from "The Body and Society: Christianity and the West"
  • In addition, two less formal focus groups were organised and run

The conceptual framework of a focus group, then, is one which helps the Team see and reflect on the design of the project’s platforms, technologies, content, and delivery, as opposed to an exit interview or the simple feedback form we have all had to fill out after a workshop, e.g., “on a scale of 1-10, how likely are you to recommend this course to others?”. That kind of information is valuable, to a degree, but it hardly tells you why somebody thought the course or the app was valuable.

In addition, it is also key to understand that focus groups do not create quantitative data; they are not scientific studies whose results we can point to and say with confidence, “95% of the users believe that brushing with Colgate improves their social standing at the office.” Yes, they belong to a participative branch of sociology, but focus groups do not create that level of precise data. Their value lies in the fact that they can potentially give us an idea of why 95% of the group felt or thought that way about a relatively ordinary toothpaste.
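
To make concrete why a percentage drawn from a focus group carries so little statistical weight, the sketch below (the group size of eight and the agreement count are invented figures, not project data) computes a Wilson score confidence interval for a proportion observed in a group-sized sample:

    import math

    def wilson_interval(successes, n, z=1.96):
        """Wilson score confidence interval for a binomial proportion."""
        p = successes / n
        denom = 1 + z**2 / n
        centre = (p + z**2 / (2 * n)) / denom
        half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
        return max(0.0, centre - half), min(1.0, centre + half)

    # Hypothetical example: 7 of 8 focus-group participants agree.
    low, high = wilson_interval(7, 8)
    print(f"observed 87.5%, but plausibly anywhere from {low:.0%} to {high:.0%}")
    # prints roughly 53% to 98% - far too wide to quote as a precise statistic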


Considerations:

    • Select members who are heterogeneous
    • Consider how many group sessions are required or feasible and the size of the group
    • Develop a session guide or plan
    • If students, ensure that you get them early enough in the module, otherwise they are likely to give skewed answers, influenced by the lecturer or library staff, rather than based on their own perceptions
    • Consider recording and transcribing the session


Risks to consider:

    • The group was too homogeneous, so answers don’t tell you much
    • The group lacked any cohesion – no bonding, no sharing
    • The environment was not conducive to the session, e.g., too warm, too cold, or a room that was too small
    • The incentives offered didn’t work to produce anything meaningful



Final online survey (using Bristol Online Surveys)

After Guyda Armstrong's 3rd year undergraduate course, "The Book and its Body", had finished, we also designed an online survey to get a sense of what the students thought about the course and the project once they had had some time away from it. As a means of again gauging the overall value of the project, in order to improve and revise it for its continuation and for this toolkit, we thought it was important to get feedback at these different stages: before, during, and after. For this final survey, we wanted to know whether the students' abilities to conduct primary research, as well as their experience with specific learning objects, had been helped or hindered by the use of AR and the app, and how they felt about their own growth as end-users of a cutting-edge technology.


Considerations:

    • Create open-ended questions
    • Ensure that each question focuses only on one item/issue
    • Encourage students to participate
    • Get academic buy-in


Risks to consider:

    • Students no longer feel obligated to respond since the module is over
    • Students resent feeling compelled to participate
    • Too few respond regardless of encouragement (a response-rate sketch follows this list)
    • Online survey malfunctions or is inaccessible
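
To make the low-response risk easy to monitor, here is a hedged sketch (the stage names, cohort size, and response counts are invented for illustration) that computes the response rate at each evaluation stage and flags any stage falling below a chosen threshold:

    # Hypothetical figures; a real project would substitute its own counts.
    stages = {
        "initial self-completion survey": (42, 38),  # (cohort size, responses)
        "focus groups": (42, 12),
        "final online survey": (42, 9),
    }

    THRESHOLD = 0.5  # flag stages where fewer than half responded

    for stage, (cohort, responded) in stages.items():
        rate = responded / cohort
        flag = "  <- send reminders / seek academic buy-in" if rate < THRESHOLD else ""
        print(f"{stage}: {responded}/{cohort} = {rate:.0%}{flag}")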



Exit interviews

The last phase of the evaluation process consisted of exit interviews with two of the three academics with whom we have worked. In attempting to evaluate the overall success of a project, this feedback is essential, particularly because there remains a strong emphasis on working with the HE sector.

As The Horizon Report of 2011 suggested, AR remains a technology to watch, both because of its relatively low cost, which makes it attractive to a range of projects, and because it offers so many different kinds of applications, of which education remains one that has only begun to be tapped in the UK market. Finding out how the academics perceived the project, from its inception through to its completion, is important to the continuing design of future projects which involve AR, as well as to the design and construction of the toolkit.

The interview design is straightforward enough and based upon comments that the academics had already made with regard to the project throughout, posing issues which underscore how it has affected their teaching as well as the students’ learning. As with the vast majority of the feedback gathered, these interviews are also being made publicly available through various means, most notably in the toolkit itself with a typed transcript and the video recording.



Roberta Mazza (Lecturer in Ancient History and Early Christianity)


Word Transcript


Guyda Armstrong (Lecturer in Italian Medieval Studies)

Word Transcript



Dissemination

In addition to evaluation, to gather a sense of impact for the project, a dissemination plan was also put together. More about that plan is available in the dissemination section of the toolkit.


Considerations:

    • Consider a variety of dissemination methods and channels including:
      • Blog posts
      • Professional publications
      • Case Studies
      • Conferences, meetings, workshops and events
      • News items, articles and features on websites and in journals
      • Academic journals

Risks to consider:

    • Academic articles require a great deal of time and the direct input of the academic; this form of dissemination is therefore often the most problematic.


Results and highlights:

    • Write up the highlights from the survey and focus group as soon as possible
    • Share with others who may have also been present, to ensure accuracy
    • Work toward getting quotes, although do not attribute them, for confidentiality (see the sketch after this list)
    • Keep it relatively short so that others can re-use it in blog posts, case studies, etc.
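
As a small, hedged illustration of quoting without attribution (the participant names and labels below are invented), quotes can be scrubbed of names before they are shared or re-used:

    import re

    # Hypothetical mapping from participant names to anonymous labels.
    participants = {"Alice": "Student A", "Bob": "Student B"}

    def anonymise(quote):
        """Replace each known participant name with its anonymous label."""
        for name, label in participants.items():
            quote = re.sub(rf"\b{re.escape(name)}\b", label, quote)
        return quote

    print(anonymise("Bob said the app made the manuscripts feel alive."))
    # -> "Student B said the app made the manuscripts feel alive."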



Related blog posts

SCARLET Dissemination Workshop

SCARLET Evaluation

Demonstration du projet SCARLET

Thoughts on the ELI Conference

Demonstration Content

The SCARLET Focus Groups: 2nd and 3rd Year Undergraduates

The SCARLET Project Survey

The SCARLET Project with First Year Undergraduates

SCARLET Focus Group


