Introduction
Conducting surveys is one of the primary ways to collect data in public safety. In order to obtain relevant information that is concise and easy to process, it is vital to adhere to several recommendations that make the process more efficient. The present paper offers an overview and critical analysis of five surveys conducted by local police departments. Even though security agencies have substantial experience in collecting survey data, there are areas in survey design that may need improvement.
Overview
The present paper uses five examples of public safety surveys utilized by local security agencies in different states of the United States. A survey posted by the North Andover Police Department (NAPD, n.d.) aims at monitoring and upgrading the quality of its services; it is a 59-item questionnaire that seems to cover all relevant points. The Cannon Beach Police Department (CBPD, n.d.) created a 40-question survey to obtain similar information in more detail. The University of Rochester Police Department (URPD, n.d.) proposes a list of 42 questions and hopes to “measure the quality of the services we provide, gauge community satisfaction, and use the information to improve services and develop future programs” (para. 4). The City of Burnet Police Department (CBPD, n.d.) developed a 20-item survey that aims at acquiring information about public opinion on the quality of police services. Finally, the Los Angeles Police Department (LAPD, n.d.) created a short safe-parks survey consisting of only four questions. Even though most of the surveys have a similar goal, all of them differ in design.
There are clear leaders among the five questionnaires mentioned above in terms of design quality. The most adequate survey seems to be the one conducted by the City of Burnet Police Department (CBPD, n.d.). Even though no precise aim is identified in its opening paragraph, the survey has clear advantages that make it superior to the others. First, it is concise and yet covers all the aspects of public safety mentioned in the majority of other surveys. Second, it is visually pleasant, which may improve response rates among citizens (Fanning, 2005). Finally, the questions are straightforward and easy for ordinary people to understand. The survey created by LAPD (n.d.) seems to be the weakest in the selection. It is too short, and all the data needs to be entered in plain text, which makes the questionnaire less likely to be completed (Fanning, 2005). Moreover, the unstructured data received will be hard to process and evaluate. In short, there are significant differences in the quality of the surveys mentioned above.
Critical Analysis
Best Practices Examples
Most of the surveys discussed in the present paper have an introductory paragraph that clearly identifies the purpose of the survey. Since all the questionnaires under analysis are supposed to be completed online, there are no cover pages. Therefore, the opening paragraph may be considered a cover page, and it should explain why the survey is conducted and motivate the respondent to complete it (Fanning, 2005). URPD (n.d.) seems to be the most successful in this respect, since its introduction creates a feeling of connectedness and importance. The opening is short, clear, and official-looking, which adheres to the quality standards proposed by Fanning (2005). In short, the first example of best practices is an adequately designed cover page.
The questions in all the discussed surveys are grouped according to their topic. CBPD (n.d.) seems to be the most successful in this respect, since it divides the questionnaire into five parts with subheadings. The adopted design eases the cognitive burden and helps respondents to focus (Fanning, 2005). Most of the surveys mentioned in the present paper also use question groupings; however, they do not include subheadings. Therefore, the second example of best practices is the excellent organization of questions.
The directions in all the surveys are clear and easy to follow. Since the respondents are not specially recruited, they are unlikely to spend time rereading unclear instructions. Therefore, the completion rates and the accuracy of the acquired data may suffer if the directions are ambiguous (Fanning, 2005). All the agencies analyzed in the present paper understand their audience and word the instructions accordingly. The survey designed by NAPD (n.d.) seems to have the most appropriate instructions, since they are concise, adequately placed, and clear. In summary, the third illustration of best practices is the careful design of survey directions.
Areas of Improvement
The surveys are inconsistent in their length, which can negatively influence response rates. On the one hand, the survey designed by NAPD (n.d.), consisting of 59 questions, is too long for an ordinary citizen to complete. The agency fails to appreciate the stamina of its audience, and respondents may start answering questions randomly out of fatigue (Fanning, 2005). Since there is no appropriate introduction that could motivate people to answer all the questions, the quality of the received data may suffer. On the other hand, the survey conducted by LAPD (n.d.) is too short, consisting of only four items. Even though short surveys may yield a large volume of responses, the relevance of the acquired information may be limited (Fanning, 2005). An ideal survey finds a balance between the two extremes. However, since there is no universal solution to the problem, it is difficult to determine the optimal number of questions for public safety inquiries.
Agencies may consider adding color to their surveys to improve response rates. According to Fanning (2005), contrasting colors seem to help respondents complete questionnaires. Most of the surveys discussed in the present paper fail to add contrast to the questions. NAPD (n.d.) would benefit most from this adjustment, since it has the longest survey in the selection. While adding contrasting colors is a minor change, it is worth considering in order to adhere to best practices.
Conclusion
Police departments often use surveys to acquire relevant information about public security issues. While most of the questionnaires published online have a consistent structure, there are some design flaws. Critical analysis of five surveys demonstrates that most agencies are aware of the importance of an adequate introduction, clear instructions, and question grouping. At the same time, the identified areas for improvement include choosing an appropriate survey length and utilizing color.
References
Cannon Beach Police Department. (n.d.). Public safety survey. Web.
City of Burnet Police Department. (n.d.). Public safety survey. Web.
Fanning, E. (2005). Formatting a paper-based survey questionnaire: Best practices. Practical Assessment, Research & Evaluation, 10(12), 1-14. Web.
Los Angeles Police Department. (n.d.). Safe parks survey. Web.
North Andover Police Department. (n.d.). Public safety survey. Web.
University of Rochester Police Department. (n.d.). Department of public safety survey. Web.