Age and Creation

A practice journal of creative experience and aging

Ask Your Seniors: A practical primer on accessible community research tools

Simon Lebrun

In Where are the seniors? Colina Maxwell explains why a community-based artist-run centre needed to reach out to a specific group in the community to meet its goal of being an accessible and vibrant community arts space.

In Arts Experience and Needs Survey, the team behind that project presents the questions that Centre[3] asked, and invites you to ask similar (or identical) questions in your community. What could work for older adults in Guelph, Hamilton and London may not be what works for you.

Artists and community organizations that have connections with academic institutions and professional researchers, often through relationships with universities, have an extremely valuable resource. Those connections are likely to ease their own work, increase the odds of successful outcomes, and maximize the good change that can flow to the community, because researchers bring a significant professional knowledge base covering logistics, ethics, resourcing, communications, incentives, and partnerships before you even get into a researcher’s individual subjects of expertise.

Our project would not have exceeded our expectations like it did without the expertise of our partners at McMaster University and the University of Guelph. We are left with the question: do other artist-run centres need to undertake a formal research study supported by professional researchers in order to consult older adults — or anyone in a population that is not connected to the centre — in their own community?

We don’t think so. And part of our project is sharing not only what we learned, but how we learned it. We hope you can learn from us, avoid our mistakes, and find ways to improve on what we accomplished.

To that end, we present a primer on areas you may want to consider if you’re setting out to consult the older adults in your community.

 

Ask a researcher

Before you test our theory — that cultural organizations can do effective community-based research when professional researchers are not available — make sure you have to. Ask people on your team and in your networks about their experience with academic and community-based research work; you may be surprised where experience and expertise can be found.

Reach out to local institutions, especially those that have a community-focused department or initiative. For example, in Hamilton the McMaster Research Shop is a program of the university’s Office of Community Engagement that connects professionally supervised student researchers with community organizations that have research needs. If you’re near Hamilton, they’re a great option; if you’re not, you may find a local institution with a comparable program.

 

Be led by lived experience

We wanted to know why people aged 60 and older weren’t engaging with the programs of an artist-run centre in the same numbers that younger adults did.

Can you see the catch? If we weren’t reaching certain people with our arts programming, how would we reach them to ask about why that was?

Our project wouldn’t have been so successful if we hadn’t centred the wisdom and experience of older adults. We avoided the trap of idealistic young adults creating “great solutions” for seniors that just don’t work by setting an agenda of evaluation and discovery with our funder instead of presupposing a solution; by establishing a community consultant panel of older adults that represented the three communities we were working in and was paid well for its time; and by providing support and opportunities for consultants to move into development and leadership positions within the project over time.

Be aware of the challenges that come with doing this well:

  • Good community-engaged work takes time, often more time than you might expect. Things as simple as getting everyone together for a meeting can take longer when space is made for many perspectives.
  • People need support in order to excel. When you select someone to join a group because their social location gives them a perspective that your project team lacks, remember why they were brought on board. If team members need soft or professional skills to be successful, consider providing training or mentoring to level the playing field at the project table.
  • Power dynamics can cause friction and may require specific facilitation, collaboration, or conflict resolution training. Sometimes these can be addressed by examining who’s doing the work and who’s getting paid; don’t rely on “the community” to do intellectual and emotional labour for free when other team members are being compensated.

 

Decide what you want to know

Over three years that turned into five, there were times we felt like we were going in a hundred directions at once, and every new lesson seemed to open up another possible path forward. It was important to us that our purpose was set out clearly at the start, and we reviewed it regularly to make sure we were all pulling in the right direction. (We also made changes, as a team, when we felt it was necessary.)

For example, our project goals were:

  • Adopt a community-based research framework to learn the needs and wants of older adults regarding the creation/production of, appreciation of, and engagement with art.
  • Create interactive moments and experiences for older adults to receive information and ask questions or express likes or dislikes about the art they see or experience.
  • Conduct a multi-phase initiative to pilot and evaluate multiple iterations of a prototype.

The concrete quality of these goals was important. We could test whether we’d reached them, or how close we were getting.

Through the project we expanded our definition of art (we often used the phrase arts and crafts so that potential participants wouldn’t self-exclude on the mistaken belief that we didn’t think their artistic practice “qualified”) and we refined our ideas around a prototype based on ongoing evaluations of our experiments.

 

Identify the research concepts that are important to you (and to the people you want to learn from)

Professional researchers introduced us to a set of concepts, framed as ethical considerations in research, that we came to value in this work. These include voluntary participation, informed consent, anonymity, confidentiality, preventing harm, and access to results.

We were fortunate to receive certification from the McMaster and University of Guelph research ethics boards for various interventions within the overall project, something we could not have navigated without the in-house professional researchers. This strengthened our work, though it also charged our wider team with additional learning and introduced some unexpected tensions between the needs of the academy and the needs of the community.

It was important to us from the outset to prevent harm to participants and to respect their privacy, and also to make sure the results of our work were widely available to the community of which the participants were a part.

As we progressed there were times when we wanted to repurpose something we had done earlier in the project based on new findings, and found that our commitment to informed consent and confidentiality meant that we couldn’t move forward. Even though we were confident that the research participants would support the use of their contributions in a way we hadn’t envisioned at the outset, our agreements under our certification for research ethics prevented it.

We learned that ethical considerations were important to many of the individuals in our networks that we relied on to circulate information about the project. These are professionals embedded in the community who receive an onslaught of email and don’t have time to look into every survey, symposium, or poster they’re asked to pass along. The difference between being quietly ignored and getting critical exposure through these networks can come down to highlighting the project’s commitment to ethical research.

 

Use free collaboration tools

A team with many perspectives and voices tends to have people from different parts of a community and with different communications preferences, IT systems, and technical knowledge. Keeping information flowing well and available to all team members when it’s needed is an important job: something that’s invisible when it’s happening well but can grind things to a halt when it isn’t taken care of.

We found a lot of success with Google Docs and Google Sheets, the freely available web-based word processor and spreadsheet applications from Google. Creating a shared folder in Google Drive to hold project files was much easier than allowing outside people into corporate Microsoft 365 web-based tools, and the Google tools seemed to be easier to learn and use than others.

We also used Google Jamboard, another free tool that provides a virtual whiteboard, to support visual note-taking and brainstorming during Zoom meetings.

You may also seek out tools you can access for free or for discounted rates, either because a project partner has access or through a program like Techsoup. (Our project team used Canva Pro donations from Techsoup Canada to create event posters and social media posts.)

We’d recommend that you:

  • Put a stop to emailing documents for review and revision. Share live links to online versions that everyone can edit and comment on simultaneously.
  • Help team members learn to use and navigate tracked changes and comment features on documents that are written or edited collaboratively. It’s a huge help when people are working at different times of day to be able to see how changes are developing.
  • Back up project files regularly, especially to protect shared project files from accidental change while people are learning new tools.
  • Understand how access lists work and give view and modify access to project assets judiciously.
  • Discourage people from creating multiple versions of the same file, even online, in ways that make it hard, later, to figure out which is the correct version. Use Google’s version history feature to keep named historical versions instead.
  • Understand how file ownership in Google Drive works, and protect your project against the loss of important files “owned” by a single project member when they leave the project. (Using shared drives in Google Workspace for Nonprofits instead of Google Drive shared folders resolved this issue for us.)
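If a synced copy of your shared project folder lives on a local disk, the backup advice above can be as simple as zipping that folder with a date stamp. Here is a minimal sketch in Python; the function name and example paths are our own illustration, not part of any project tooling:

```python
import shutil
from datetime import date
from pathlib import Path

def backup_project(project_folder: Path, backup_folder: Path) -> Path:
    """Zip a synced project folder into a date-stamped archive.

    Returns the path of the archive that was created.
    """
    backup_folder.mkdir(parents=True, exist_ok=True)
    stamp = date.today().isoformat()  # e.g. "2024-05-01"
    archive = shutil.make_archive(
        str(backup_folder / f"project-backup-{stamp}"),  # archive name, no extension
        "zip",
        root_dir=project_folder,  # the folder whose contents get zipped
    )
    return Path(archive)

# Hypothetical usage; point at wherever your synced copy actually lives:
# backup_project(Path.home() / "GoogleDrive" / "project", Path.home() / "Backups")
```

Run on a simple schedule (even a weekly calendar reminder), this gives you restorable snapshots while team members are still learning the collaboration tools.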

 

Identify a suitable survey platform

Not all survey platforms are created equal. If you are seeking research ethics certification, you may face restrictions on the system that hosts your survey and houses responses.

We used an instance of LimeSurvey, an open-source platform you can host yourself, maintained by McMaster University for the use of its researchers. It allowed us:

  • to know everyone who had access to project data
  • to use standard security protocols to protect project data
  • to host data in Canada
  • to access the data in many forms, including forms that can be imported directly to analysis software
  • to create charts and graphs from survey data

 

Create or adapt your questions

We recommend waiting until this point to finalize the questions you plan to ask, because all of the earlier decisions (especially research ethics and platform selection) can influence what you ask and how you ask it.

Make sure your team understands all the question types your survey platform supports, so you don’t draft questions you can’t ask and you make use of the features available. Use online word processing (like Google Docs) to draft and refine questions; it’s easier to capture discussion and make changes here than after the questions are loaded into survey software.

Of course we invite you to start with the questions we asked and modify them to suit your needs, but you may want to start from scratch.

Be sure to proofread your questions to catch spelling and grammar mistakes, and invite a few test survey-takers to run through the survey to make sure the questions are clear and the survey flows well. (It’s best if the testers haven’t seen the survey before.)

 

Decide how you’ll ask

We set specific goals for recruitment (10 older adults in each of three cities: Guelph, Hamilton, and London) and we used an equity reference recruitment model to maximize the variety of perspectives in our sample, reach older adults who identified with historically marginalized populations, and adjust our recruitment efforts progressively.

Once we knew who we’d ask, we didn’t send a survey link. The survey was administered by a peer facilitator: an older adult from one of the three project communities who had attended training sessions on the survey and on how we wanted interviews conducted. The peer facilitator connected with the participant, built rapport, worked through the informed consent process, and “interviewed” the participant using the survey questions over the course of three one-hour Zoom sessions.

We decided to ask this way because it allowed us to gather much more data than self-completed electronic survey responses would provide. In addition to the peer facilitator completing the survey using the participant’s responses (in view of the participant via a Zoom shared screen) the interviews were recorded so they could be transcribed and analyzed. It required that we recruit interviewers and train them on the survey and on Zoom hosting before we were ready to start the first survey.

Knowing how you’ll ask the questions in the survey can help focus and refine the survey questions. It’s useful at this point to revisit what you want to know on completion of the project, as it may help answer questions about how and what you ask.

 

Decide how you’ll invite and incentivize

Many have been disappointed after crafting an engaging survey on a subject they find fascinating, sending it out far and wide, and receiving little or no response.

With a brand new survey in hand it’s important to separate why you think it’s a good idea for people to complete the survey and why they might. Remember that you’re asking someone to do work, and few of us do a lot of work for no reward.

  • We had the resources to offer an honorarium of $200 to participants, split up across three survey sessions.
    • Participants could receive a physical gift card, an electronic gift card, a cheque, or an Interac e-transfer.
    • The electronic payment options were preferred by the administrative team but not popular with older adult participants.
    • A high level of coordination was required between the participant, the peer facilitator who interviewed them, the project coordinator, and the administrative team that sent payments. Occasional breakdowns in communication resulted in delayed payments, causing frustration and anxiety for some participants.
    • A participant who dropped out after one or two parts of the survey would still get some of the honorarium.
    • Some participants declined the honorarium.
  • A common incentive for a survey project that doesn’t have the budget for honorariums is to enter participants into a draw for one or more prizes.
  • Some will complete a survey because they want to support the research being done or the organization doing it, and they require no further incentive. In this case it’s often important to clearly explain what good can come from participation. Consider, too, whether there is a segment of the population you want to hear from that may be under-represented because its members are less likely than others to participate without an incentive.

 

Try an “inside test, outside test, confirmation, and launch” approach to catch issues early

We recommend this model for any complicated survey instrument, as we found it worked well:

Inside test

Members of the project team complete the survey from start to finish as though they were a participant. They are asked to make note of any part of the survey (including both content and execution/interface) that is unclear or problematic. A meeting is held to review findings. The survey is revised based on feedback.

Outside test

The project team recruits two to four people who have never seen any part of the survey, will not be participants, and are as close to the target participants’ social location as possible (older adults, in our case). Each tester completes the survey in the presence of a project team member, who does not help or prompt the tester but takes notes of points of difficulty. The survey is revised based on the notes.

Confirmation

A small random subset of participants, e.g. 10%, is identified and the survey is administered to this group. The collected data and any feedback from participants is collected and discussed by the project team to identify any part of the survey that should change before continuing. The survey is revised if necessary.

Launch

The survey is administered to the remainder of the participant pool.
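The random subset for the confirmation step can be drawn with a few lines of Python; the function name and its 10% default below are our own sketch, not a prescribed method:

```python
import random

def confirmation_subset(participants: list[str], fraction: float = 0.10) -> list[str]:
    """Pick a random subset of participants (at least one person)
    for the confirmation round before full launch."""
    k = max(1, round(len(participants) * fraction))
    return random.sample(participants, k)
```

Everyone not drawn here then forms the launch group, surveyed once any confirmation-stage revisions are in place.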

 

Keep on top of demographics in recruitment

We had specific goals for recruiting participants to complete the survey and they were informed by an equity reference recruitment model that identified racial identity, sexual orientation, gender identity, ability, migration, religion, language, urban/rural, and older adult subgroup identities. The model was used to guide decisions on where to send invitations to participants and how to frame them; it was not used as an eligibility evaluation.

Recruitment and administering the survey happened simultaneously, so the project team periodically extracted demographic information from the completed surveys and evaluated it against the model, allowing recruitment to be refined on an ongoing basis.

With this method we got closer to our goal of a participant pool with the widest possible variety of older adult experience than we would have if we’d applied the equity reference recruitment model only once to the initial recruitment activities.

 

Connect with partner agencies

Participant recruitment for our project was greatly strengthened by cultural and seniors-serving organizations in our communities. Of particular note are CreativeAge London, the Immigrants Working Centre (Hamilton), and City Housing Hamilton, each of which has extensive networks reaching older adults of all identities.

It was important that we could clearly explain the purpose of the project and what we wanted from the partner, and demonstrate that older adults would be safe with, and treated with respect by, our project team. It was significantly easier to enlist organizations that had previous experience working with Centre[3] (the artist-run centre sponsoring the research project).

 

Learn to use your tools & use them

Once you’ve collected your data, what can you do with it?

Do you have a data person on your team who can import the raw survey data into a statistical software suite like SPSS or PSPP? Or has someone on the team skilled-up on LimeSurvey to generate charts from the response data to show trends and suggest next steps?

If you audio-recorded administered surveys like we did, do you have the resources to have those recordings transcribed? If so, what do you want to extract from those transcripts, what tools will you use to do it, and who will take that work on?

Look out for a future issue of Age and Creation, where we dive into this stage in more detail.

 

Share what you learned

Did you make any commitments to the participants when they signed up, like that they’d get a copy of the final report? Make sure to follow through and not lose track of plans that may have been made some time ago.

Strengthen the community by sharing back to it all the findings that it helped to uncover. If your project includes a final report, share it far and wide. Give some thought also to who could benefit from your findings, and if they have the time or capacity to dive into that report.

Age and Creation is our experiment in answering that question: if we know that staff at artist-run centres are often pulled in so many directions that they won’t have time to read a dense report from a five-year project, are there other, more accessible ways we can share what we’ve learned? We can’t wait to hear your ideas.

 

Change something

I’m not about to disparage the idea of research for research’s sake, but this is a practice journal. Don’t put your final report into a binder on the shelf and move on to the next big thing just yet. How are you going to make sure all that you’ve learned will make people in your community healthier, happier, stronger, and more connected?

Simon Lebrun was the project coordinator (phases 3 to 5) and a platform developer with the direct[message] project. He works with Centre[3], an artist-run centre in Hamilton, Ontario.