Gathering and presenting feedback

Choosing a sample

Random

Best for performances, installations, and other face-to-face live events.

Accost visitors with a clipboard and pen.

Hand-picked

Best for early tests of an online project.

Actively try to get a representative sample.

Create occasions for testing, e.g., schedule demos of installations or performances for your classmates.

Mass-invited

For mailed/emailed surveys, expect a response rate under 10%. So if you want 10 responses, you need to contact at least 100 people.

Requires a lot of promotion.
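The response-rate arithmetic above generalizes: divide the responses you need by the rate you expect, and round up. A minimal sketch (the function name `invites_needed` is invented for illustration, and the 10% default is the pessimistic ceiling from above, not a measured figure):

```python
import math

def invites_needed(target_responses, response_rate=0.10):
    """Estimate how many people to contact for a mailed/emailed survey.

    Assumes each contact responds independently at the given rate.
    """
    return math.ceil(target_responses / response_rate)

print(invites_needed(10))        # 100 contacts for 10 responses
print(invites_needed(25, 0.08))  # a lower rate raises the contact count
```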


Scheduling testing

Schedule each phase separately in your Gantt chart:

Pre-alpha phase

About 1-2 weeks of deployment.

3-10 people.

Don’t worry about a representative sample.

Can be friends, roommates, relatives.

Alpha phase

About 1-2 weeks of deployment.

10-20 people.

Draw from classmates, unless they would not form a representative sample of your intended audience.

May require throwing out any previously accumulated content.

Beta phase

About 4-8 weeks of promotion, followed by 4-8 weeks of deployment.

Can be shorter for live events.

Should not be friends, roommates, or relatives.

May require throwing out any previously accumulated content.

Postgraduate phase(s)

Occur in the year after you graduate.

Goal is not just to improve the project, but also to attract the means to sustain it (funders, advertisers, clients).

Ideally should not require throwing out previous content.


Feedback format

Data types

The more quantitative the data, the easier it will be to compile statistics. But leave some room for comments if possible, to capture issues that matter to your respondents but that your questions didn't anticipate.

Quantitative

Continuous (decimals)

What’s your GPA? (4.0, 3.84, 2.37, …)

Discrete (whole numbers)

How many computers do you own? (1, 2, 3, …)

Qualitative

Ordinal (rank ordered categories)

How much computer experience do you have? (Expert, Moderate, Newbie, …)

Nominal (categories without rank)

What’s your ethnic background? (Latino, White, African-American, …)

Freeform text (comments)

How did you like this course? (“It sucked because…”)
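The data types above differ in how readily they compile into statistics. A sketch with invented responses: the quantitative fields average directly, while ordinal categories need an explicit rank before they can even be sorted.

```python
from statistics import mean

# Hypothetical responses illustrating the data types above.
responses = [
    {"gpa": 3.84,             # continuous
     "computers": 2,          # discrete
     "experience": "Expert",  # ordinal
     "ethnicity": "Latino",   # nominal
     "comment": "Loved the installation."},  # freeform text
    {"gpa": 2.37, "computers": 1, "experience": "Newbie",
     "ethnicity": "White", "comment": "Hard to navigate."},
]

# Quantitative fields compile into statistics with no extra work.
print("Mean GPA:", mean(r["gpa"] for r in responses))

# Ordinal categories only sort once you assign each one a rank.
rank = {"Newbie": 0, "Moderate": 1, "Expert": 2}
least_experienced = min(responses, key=lambda r: rank[r["experience"]])
print("Least experienced respondent:", least_experienced["experience"])
```

Nominal categories and freeform comments resist this kind of summary: the former reduce to counts per category, the latter must be read.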


Explaining differences

One of the main purposes of testing is not to come to a single conclusion (“Everyone likes my capstone!”) but to identify and explain differences.

Differences in respondents

Correlate demographic differences with other responses.

Example: A new media installation asks how technically savvy visitors are, to see if that affected their enjoyment of the work.

Differences between versions of the work

Create two versions of the experience and share them with different visitors to test for a preference or demographic disparity (“A/B” testing).

Example: A Web site randomizes which of two versions of its front page visitors see, then logs which spurred more click-throughs to deeper content.
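The randomize-and-log mechanic can be sketched in a few lines. This simulation invents its own visitors (the click probabilities are made up, with a built-in preference for version B), but the serve/tally structure is what an A/B-testing front page would do:

```python
import random

random.seed(0)  # fixed seed so the simulated run is repeatable

clicks = {"A": 0, "B": 0}  # click-throughs to deeper content, per version
shown  = {"A": 0, "B": 0}  # how many visitors saw each version

def serve_front_page(visitor_clicks_through):
    """Randomly assign a visitor one version and log the outcome."""
    version = random.choice(["A", "B"])
    shown[version] += 1
    if visitor_clicks_through(version):
        clicks[version] += 1

# Simulate 1000 visitors whose (invented) behavior favors version B.
for _ in range(1000):
    serve_front_page(lambda v: random.random() < (0.30 if v == "B" else 0.20))

for v in ("A", "B"):
    print(f"Version {v}: {clicks[v]}/{shown[v]} click-throughs")
```

Because assignment is random, any difference in click-through rate can be attributed to the versions rather than to who happened to visit.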


Helpful tools

Online survey sites

SurveyMonkey.com

Doodle.com

Google Charts

Offline applications

Microsoft Excel

Apple Numbers

Onboard surveys

Best for catching users while their experience and motivation are fresh.

Drupal Polling module


Survey examples

Seeing Double: a test of emulation as a preservation strategy.