
8 tips on developing church survey questions

By Jeremy Steele

If used properly, research can be a powerful tool in your communicator’s toolkit. It can help you understand your church audience or the surrounding community, identifying needs, perceptions and barriers that affect your congregation and its ministries. Yet research is only as good as the methodology you use and the questions you ask.

Here are some tips to help you set up a church survey that collects the information you need while keeping your frustration, and the frustration of potential respondents, under control.

1. Talk with the right people.

Keep your audience in mind as you create your survey. If you want the opinions of members, use terms they understand and avoid institutional jargon, abbreviations and acronyms whenever possible.

Your audience will also dictate the methodology you will use. Engage as many members as possible. It is easy for members to disagree with project findings if you do not invite them to participate. Consequently, you may need to consider mail or on-site options using a printed questionnaire. If you have a high number of member email addresses, you also may want to set up an email alternative.

Free, simple software programs are available to support email surveys; one is http://www.surveymonkey.com/. If you can handle the basics of Microsoft Word, you can master these packages easily. They automatically enter responses from emailed surveys into a database and let you key in the answers from the printed questionnaires you collect, with no special knowledge or skill required. More importantly, the software can tabulate and cross-tabulate results, so you do not have to do those calculations by hand.
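If the terms are unfamiliar, here is a minimal sketch of what tabulation and cross-tabulation mean, using a handful of made-up responses (the question names and answer choices are hypothetical, not from any particular survey tool):

```python
from collections import Counter

# Hypothetical responses: each dict is one completed survey.
responses = [
    {"age_group": "18-35", "attendance": "Weekly"},
    {"age_group": "18-35", "attendance": "Monthly"},
    {"age_group": "36-55", "attendance": "Weekly"},
    {"age_group": "36-55", "attendance": "Weekly"},
    {"age_group": "56+",   "attendance": "Weekly"},
    {"age_group": "56+",   "attendance": "Rarely"},
]

# Tabulation: count the answers to a single question.
attendance_counts = Counter(r["attendance"] for r in responses)

# Cross-tabulation: count answer pairs across two questions,
# e.g. attendance broken down by age group.
crosstab = Counter((r["age_group"], r["attendance"]) for r in responses)

print(attendance_counts["Weekly"])        # 4
print(crosstab[("36-55", "Weekly")])      # 2
```

Survey packages do exactly this behind the scenes; the value is that a cross-tabulation lets you see, for instance, how attendance patterns differ by age group rather than only overall totals.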

If you opt for a printed survey over a designated period, provide a place for people to return the completed questionnaire, or an envelope so they can return it by mail.

2. Keep it short and sweet.

“Survey creep” refers to the tendency of questionnaires to grow: “need-to-know” information is diluted with “nice-to-know” data, which then extends to questions unrelated to the issue at hand. Survey creep often happens when a committee that includes people with differing agendas develops a questionnaire. Remember, every additional question reduces the likelihood that someone will actually complete the survey.

3. Make it simple for the respondent.

When designing your questionnaire, keep your respondent in mind. Multiple-choice questions reduce the amount of time needed to complete a survey and can help you interpret the answers. A multiple-choice query provides suggested answers that cover the range of potential responses. It assumes, of course, that you understand the appropriate alternatives to include.

An example of a multiple-choice question: If you remain in this community, how likely are you to continue attending this church?

Answer options: Definitely will continue, probably will continue, might or might not continue, probably will not continue, and definitely will not continue.

4. Make your survey interesting.

Respondent fatigue hurts a survey’s ability to reflect audience needs, perceptions and attitudes. Fatigue comes from having too many of the same kinds of questions, including battery after battery of “matrix” questions that drain the life out of a study. Matrix questions are blocks of multiple-choice questions with the same possible set of answers. If you open this survey on church satisfaction, and move down to Question 6, you will see an example of a matrix. The antidote for fatigue involves providing a mixture of questions, including some that ask for opinions and attitudes.

The survey referenced above is an illustration designed to show all the different question types, not for actual use. As a result, it is a bit long and asks more open-ended questions than normally recommended. Feel free to shorten and customize it to your church’s needs.

5. Use open-ended questions sparingly, but appropriately.

Open-ended questions take more time to answer, and, generally, respondents are more likely to skip them. While we want respondents to engage mentally with the questions, we cannot expect them to spend a lot of time determining what answers are applicable. Don’t tax your participants too heavily.

There is also the problem of interpreting the answers. If responses are too broad, you will have difficulty aggregating them consistently; two people reading the same answer may not interpret it the same way. Here is an example of an open-ended question: How could the church better serve your spiritual needs?

6. Be careful when using ranking questions.

Another type of question involves ranking items. An example would be, “In your opinion, which of the following ministries should be the church’s top priority? Which should be second? Which should be third?” These questions assume all items are good.

Please note, however, that ranking questions can distort opinions and obscure the true picture. A long list of items can fragment the vote so that no area receives a majority. A small but dedicated minority can then dominate the decision on what is a priority simply because its members all listed one item as “first.” Even worse, participants may consider every item on your list unimportant: the “best” of the worst pops to the top only because your survey forced them to rank something as “number one.”
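The fragmentation problem is easy to see with numbers. In this sketch (the ministry names and vote counts are invented for illustration), a small bloc wins the “top priority” ranking with barely a quarter of first-place votes:

```python
from collections import Counter

# Hypothetical first-choice votes from a ranking question with many items.
# A small, dedicated bloc all ranks "Building fund" first, while broader
# support is split across several other ministries.
first_choices = (
    ["Building fund"] * 12 +
    ["Youth ministry"] * 9 +
    ["Missions"] * 9 +
    ["Music"] * 8 +
    ["Small groups"] * 7
)

counts = Counter(first_choices)
winner, votes = counts.most_common(1)[0]
share = votes / len(first_choices)

print(f"{winner}: {share:.0%}")  # Building fund: 27%
```

Twelve of forty-five votes is far from a majority, yet the ranking alone would report “Building fund” as the congregation’s top priority.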

7. Follow ranking questions with scaling questions.

One way to ensure that the ranking provides an accurate representation of audience perceptions is to add a section where everyone evaluates each item on a set scale. In this example, the scale could reflect the importance for the church to address each particular ministry. That way, all items receive some evaluation to indicate importance in general, reinforcing whether the ranking fairly represents the attitudes of respondents. Ranking questions have their place for highlighting small differences between items. Just show care in using them.
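Continuing the hypothetical example above, a 1-to-5 importance scale can reveal whether the ranking winner actually has broad support (again, the ministry names and ratings are invented for illustration):

```python
# Hypothetical 1-5 importance ratings for each ministry from every respondent.
ratings = {
    "Building fund":  [5, 5, 5, 1, 1, 2, 1, 2],   # loved by a few, low for most
    "Youth ministry": [4, 4, 5, 4, 3, 4, 5, 4],   # broadly rated important
}

# Average each item's ratings across all respondents.
averages = {item: sum(r) / len(r) for item, r in ratings.items()}

print(averages["Building fund"])   # 2.75
print(averages["Youth ministry"])  # 4.125
```

Here the item a dedicated bloc ranked “first” averages well below the item with broad support, a signal that the ranking result alone would misrepresent the congregation’s attitudes.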

8. Use unbiased questions.

Biased questions produce answers that reflect the researcher’s intent rather than the respondent’s opinion. It is the equivalent of asking, “When was the last time you used creative accounting on your taxes?” Write your questions so they do not “lead” the respondent to a particular answer.

These tips should give you a framework to begin structuring questions. For more tips and tools on designing surveys, read "The power of surveys: Discover needs and opportunities."