Leadership Conference Comments to the Census Bureau on the ACS Methods Panel

December 23, 2024

Sheleen Dumas
Department PRA Clearance Officer
Office of the Under Secretary for Economic Affairs
U.S. Department of Commerce
1401 Constitution Avenue NW
Washington, DC 20230

Submitted via email: [email protected]

RE: Federal Register Notice Docket Number USBC-2024-0027, “American Community Survey Methods Panel Tests”

Dear Ms. Dumas:

On behalf of The Leadership Conference on Civil and Human Rights, a coalition charged by its diverse membership of more than 240 national organizations to promote and protect the rights of all persons in the United States, our Census Task Force co-chairs, Asian Americans Advancing Justice | AAJC and NALEO Educational Fund, and the undersigned organizations, we appreciate this opportunity to provide comments in response to the Census Bureau’s request for public input on the proposed revision of the American Community Survey (ACS) Methods Panel Tests, published in the Federal Register on October 23, 2024 (“notice”) (Docket Number USBC-2024-0027). We welcome the Census Bureau’s ongoing efforts to improve the accuracy, efficiency, and responsiveness of the ACS, particularly as the data collected play a crucial role in informing public policy and resource allocation across the United States.

The Leadership Conference is the nation’s oldest, largest, and most diverse civil and human rights coalition and provides a powerful unified voice for the many constituencies we represent. Our coalition views an accurate and fair census — and the collection of useful, objective data about our nation’s people, housing, economy, and communities generally — as among the most important civil rights issues of our day. The Leadership Conference’s longstanding role as a Census Information Center has allowed us to lift up within our broad civil rights coalition the fundamental importance of comprehensive, high-quality data about our population, communities, and economy. We also have a long history of first-hand experience working in support of the decennial census and the ongoing ACS.

The Leadership Conference Education Fund also engaged in first-of-its-kind national ACS messaging research to inform community outreach and education. This qualitative research included nine focus groups and ten individual interviews with diverse populations representing different geographies, ages, incomes, genders, races, and ethnicities. Some discussions and interviews were conducted in Spanish and Chinese, as well as English.

The findings from this research inform some of our comments below, and also highlight the need for the Census Bureau to conduct additional qualitative and quantitative messaging research, as well as broad ACS public education, particularly among persistently undercounted communities.

Questionnaire Timing Test:
The proposed test to evaluate different timing for sending paper questionnaires is a valuable step toward increasing self-response rates. The sample for this test must ensure adequate representation of diverse respondent groups, particularly those typically underrepresented in ACS data. Specifically, the test should analyze how various demographic groups (e.g., older adults, rural populations, or households with lower internet access) respond to different timing or messaging strategies.

We also support the testing of a Quick Response (QR) code directing respondents to the questionnaire online. We recommend that the Census Bureau also consider developing a QR code that will direct respondents to multilingual resources about the ACS, including plain-language guidance about the purpose of the survey and questions asked.

Internet Instrument Response Option and Error Message Design Test:
Given the increasing importance of internet self-response to improve ACS response rates and contain costs, we strongly support testing to improve the design and usability of the internet-based survey instrument. The Census Bureau must ensure representation of a broad range of demographic groups – particularly persistently underrepresented communities – in these tests to understand how diverse respondents interact with digital surveys. Internet instrument design and testing must also address accessibility requirements for respondents with disabilities, limited English proficiency, and/or limited digital literacy.

In light of the increasing use of internet-enabled mobile devices, we also recommend testing of mobile platforms beyond traditional web instrument designs. Ensuring that the survey design is optimized for small screens and mobile-specific features will likely facilitate greater ACS response, particularly among younger individuals and other groups that favor mobile devices over traditional computers. At the same time, significant numbers of people still have limited digital literacy or face barriers to internet access, and the development and implementation of these new technological features to facilitate ACS completion must not leave them behind.

We support the Census Bureau’s proposal to evaluate different response option formats and error message designs. However, this research and proposed design improvements must also address a broader lack of understanding about the ACS and its value to communities. Participants in our research did not understand the purpose of many of the ACS questions, indicated that they found the survey burdensome and intrusive, and expressed concerns about the security of their responses. However, after receiving information on why the ACS asks each question and how the data benefit communities, they were more willing to respond even to questions that initially concerned them.

Based on this research, we recommend the development of a “hover” or pop-up feature associated with each question on the internet instrument so that respondents can easily understand how the data collected from each question are used. Guidance integrated directly into the online survey response form would likely help to increase both overall and item-specific response rates.

We also recommend using the Methods Panel to examine the following additional considerations:

  • Response Option Design: While the proposed comparison of radio buttons versus larger response buttons is useful, we recommend testing additional alternative design elements, such as dropdown menus or sliders for certain question types (e.g., age ranges and income brackets). These alternative formats might streamline the response process and improve the user interface, particularly for more complex questions.
  • Error Message Clarity: The proposed changes to the error message design, such as modifying the color scheme and display format, are a step in the right direction. However, we suggest conducting additional research into how error messages are perceived across different demographic groups, particularly those with limited digital literacy or English proficiency. Testing the clarity of error messages in all of the languages in which the form is available (see our recommendation on additional languages below) and evaluating how different groups react to various error message formats could help ensure a more inclusive user experience and reduce frustration. Using plain language in error messages is paramount to ensuring that they effectively help respondents complete the survey.

Self-Response Mail Messaging and Contact Strategies:
We fully support the exploration of improved mail messaging and contact strategies to boost response rates. However, we believe that the Census Bureau also needs to strengthen its overall communication and education strategy around the ACS, and offer the following recommendations to do so:

  • Improve ACS “branding.” Communications about the ACS should emphasize that it is part of the decennial census, which is more familiar to many people. The Census Bureau should also spell out the name “American Community Survey” in public communications rather than using the acronym “ACS.” Many people have never heard of the survey, and relying on the acronym may make it even harder for them to recognize its significance.
  • Increase awareness of the ACS through public education. The Census Bureau should develop educational materials and infographics in as many languages as possible that emphasize the importance of ACS data to public programs and infrastructure in communities. These materials should focus on the most compelling uses of ACS data, particularly related to education and other programs benefiting children; first responders and emergency services; housing; and local- and state-specific services. The Census Bureau should work with stakeholder, institutional, and business partners to distribute these materials through social media and community hubs such as libraries, community centers, grocery stores, schools, health clinics, multi-unit housing lobbies, senior centers, barber shops/salons, and post offices.

Brief educational materials should also be included in ACS mailing packets. Our research found that infographics such as this one produced by the Census Bureau are particularly effective for explaining the importance of the ACS; many participants noted that receiving this infographic alongside the survey would have increased their understanding of why they should complete it.

  • Confront concerns about confidentiality and data protection head-on. Many participants in our focus groups expressed concerns about confidentiality, both that the government might misuse the data and that nongovernmental actors could hack and misuse them. Parents are particularly concerned about providing their children’s names and other identifying information. The Census Bureau should design and produce educational materials that address these concerns and include information in ACS mailings and on the survey form itself about why potentially sensitive data are collected and how they are protected.
  • Make the survey form itself and associated guidance materials available in languages beyond English and Spanish. At a minimum, the Census Bureau should make the online ACS form available in the 12 non-English languages that were available for the 2020 Census online form. It should also co-create guidance materials with native speakers rather than translate them directly from the English version, and tailor the content (including graphics, information about confidentiality, and examples of programs and policies that ACS data inform) to be most relevant and appealing to communities that speak each language. These measures will help to make the Census Bureau’s materials more persuasive and build understanding of the ACS among immigrant communities. Every ACS mailing and reminder should include the phone number to call for assistance with completing the survey and should make clear that assistance is available in multiple languages.

In addition to the proposed changes in visual design and messaging, it would be valuable to explore the effectiveness of locally targeted messages developed to appeal to households in specific geographic locations or types of communities (rural, urban, tribal lands, etc.) as a way to increase participation. Research has also shown that reminder messages and follow-ups are effective in increasing response rates; messages sent by email or text would be particularly beneficial for households that are more digitally engaged. Respondents want to know how their contributions will affect their communities, so reminders or follow-up messages that assure them their community will benefit directly from their participation could, in turn, have a positive effect on response rates.

Content Testing:
The inclusion of content testing for potential changes to questions or response categories is an important aspect of ensuring the accuracy and relevance of the data. However, prior to proposing any changes, the Census Bureau must prioritize consultation with a broad range of external stakeholders, including subject matter experts from relevant communities and individuals with lived experience. This will help to ensure that questions are inclusive and serve the needs of affected communities.

Content testing and follow-up interviews to assess the effects of question changes are vital tools for understanding response variance. Content testing should explicitly address how changes in question wording or formats affect respondents with different educational backgrounds or those with limited English proficiency. Tailoring follow-up interviews to better capture the experiences of these groups could further reduce biases in the data.

Nonresponse Follow-up Data Collection Testing:
We support the proposed tests for improving nonresponse follow-up operations. Given the challenges associated with reaching hard-to-contact populations, the Census Bureau should explore whether there are ways to reduce respondent burden through alternative nonresponse follow-up approaches, such as more flexible hours for interviews or a broader range of languages for in-person or phone interviews.

ACS materials should also explain why a Census Bureau field representative might follow up with a personal call or a visit if the household does not complete the survey initially. This explanation should make it clear that the personal contact is meant to be helpful, to answer questions and assist the household in filling out the survey. Without further explanation, some participants in our focus groups viewed the statement about an in-person visit or call negatively, especially those who expressed concern about government interactions.

We commend the Census Bureau for its ongoing work to refine ACS methodology. These proposed tests will provide valuable insights into improving the survey’s effectiveness, data quality, and respondent experience. We encourage the Bureau to continue evaluating and refining the approaches to data collection and to consider integrating new methodologies, technologies, and partnerships to reduce costs while improving response rates and data accuracy.

Thank you for considering our views. Please direct any questions about these comments to Meeta Anand, senior director of the census and data equity program, The Leadership Conference, at [email protected].

Sincerely,

The Leadership Conference on Civil and Human Rights
The Leadership Conference Education Fund
Asian Americans Advancing Justice | AAJC
NALEO Educational Fund
Coalition for Asian American Children and Families (CACF)
Coalition on Human Needs
MACS – Minnesotans for the American Community Survey and 2030 Census
National Partnership for Women & Families
Project On Government Oversight
Whitman-Walker Institute