

connecting communities
Improving the Broadband Connectivity Assessment Tool
Working with Karen Perry from the NTIA and Will Saunders from the State of Washington, my partner Nina Showell and I conducted extensive qualitative interviews to inform a redesign and restructuring of the Broadband Connectivity Assessment Tool (BCAT) for our master's capstone project.



overview
The Broadband Connectivity Assessment Tool (BCAT) was developed by the National Telecommunications and Information Administration’s BroadbandUSA initiative as an online version of a planning tool created with input from local and state broadband leaders, advocates, and industry members. The assessment serves as a gathering place for voices in the community to come together and share their knowledge about local broadband, helping communities understand and assess their needs and preparing them for advocacy. The BCAT is designed to facilitate the process of bringing a team together to work on and complete their community’s assessment.

Team
Ashley Callaway, user research & design
Nina Showell, user research & writing
Method
Remote semi-structured interviews
Outcome
Suggested redesign that gave policymakers guidance for improving the BCAT; our feedback was incorporated into the new version of the tool.

What is the Design Problem?
During the initial beta launch of the tool (15 communities around the US), fewer than half of the teams successfully completed the assessment using the BCAT. Our goal was to find out why these teams were unsuccessful, and how we could increase participation and completion rates by improving the experience of using the tool.
Who are the users?
The tool is designed to be used by communities of all types, although many of the communities involved in the beta were small and rural.
The users who typically make up a BCAT team include librarians, government officials, internet service providers, tribal leaders, business owners, and concerned community members. They will likely use the tool over a period of several months, with involvement ranging from around an hour total (a less-involved participant) to every day (an administrator).
planning
We chose interviews as our research method as they would provide rich, qualitative data that would enable us to dive deeper where needed to explore the issues users encountered.
We created a semi-structured interview protocol to guide our interviews. Because we had background information on the participants ahead of time, we also added questions specifically tailored to their experiences.
The semi-structured method worked well: it ensured we were consistent in what we asked and covered the important information we set out to uncover, while still letting us explore the nuances of each participant’s experience in a more conversational way.
Research Questions
- What were you hoping to achieve, or what was your goal in completing the assessment?
- Was there a problem you were trying to solve?
- What were your overall thoughts on the process? What worked? What didn’t? How would you have changed it?
- What parts of the BCAT were more or less interesting?
- Which modules did you find most useful? Which were not?
- Have you been able to take action because of the BCAT?
We also included more specific questions about how the logistics of using the BCAT worked, such as:
- How did the BCAT tool fit into the work you were already doing?
- How did you encourage participation from your team members?
- How did you collaborate?
- How did you determine who to include on the team? Is there anyone else you would have selected in hindsight?
- Were you (and your team) able to answer all of the questions relevant to your community?
- Was there anything you didn’t have the information to answer?
execution
We conducted ten interviews with eleven participants from seven of the fifteen communities that participated in the BCAT pilot project.
Recruitment was done through our project sponsors, who were able to connect us to participants. Participants were located in rural counties and the Seattle metro area in Washington state, Kansas City, MO, Vinalhaven, ME, and Louisville, KY.
We had planned to use Zoom for our interviews, as video often helps build rapport with participants, but some participants did not have reliable internet access, so on several occasions we switched to a phone interview out of necessity.
“Because it was so big, and many of our rural communities do not have robust broadband, that’s why they wanted to be involved in this project…. we’ve got make it available in a format that is not so bandwidth intense.”
“Everyone wants to be involved, and then real life gets in the way.”
“We pulled sections…specific to certain interests. For instance, getting a county commissioner to fill out a particular survey”
“Each question asked in three ways- scale of 1-7, scenarios to choose from, then ask how you would describe the scenarios- so you are answering the question three different ways”
“Technically, we really don’t have broadband...Part of what the tool ended up showing, by supplying some of the FCC information was that according to the FCC, we have broadband 100% in this county…[the BCAT]... gave us information about what the FCC thinks about us, and it gave us the chance to document from several perspectives what we felt were low points or what we needed”
analysis
By the time we completed all of our scheduled interviews, we had begun to hear a lot of repetition in responses, so we were satisfied with the number of participants. We analyzed our data through affinity mapping, which gave us a high-level understanding of the key themes that emerged in the interviews.
Key themes included:
- Assessment too long
- Final report too dense/long
- Flexibility/customization
- Want to compare
- Confusion on what questions mean
Affinity mapping worked well for our purposes: we weren't going for the level of rigor thematic coding would enable, and it allowed us to get at the major issues quickly, as we were under time pressure for our project.

results
After we completed analysis, we brainstormed recommendations that would help our sponsors address the issues we found. Some issues were low-hanging fruit that would require very little investment to fix: reorganizing the modules, consolidating questions, providing a downloadable version to complete offline, and updating the reading level. Others, such as the need to gather accurate, up-to-date data, tools to allow comparison with similar communities, and administrative features such as reminders and team-assignment management, would be more complex and difficult to implement with limited funding resources.
To communicate our recommendations, we created a written report and a presentation, which we delivered to our sponsors and other stakeholders in Washington, D.C.
In the report, we outlined in greater detail the following issues and proposed solutions:

Provide Support for Offline Access
- Create a way for users to download the assessment & complete it offline

Encourage Data-driven Work
- Due to some problems with existing data sources, encourage communities to collect their own data

Shorten the Final Report
- Team leaders need assistance creating a summary of the findings
- The final report should be action-oriented

Shorten the Entire Assessment
- Rephrase and/or delete repetitive questions
- Related questions should be grouped together on a single page
- Not all participants will want to answer every question, nor should they be expected to
Enable Team Leads to Assign Sections to Participants
- Team administrators (leads) should be able to assign modules to specific participants.
- Break the entire assessment into two completion paths:
  - A path that focuses on Broadband Adoption & Digital Inclusion
  - A path that focuses on Infrastructure & Broadband Availability

Ensure Reading Level is Appropriate & Simplify Answer Choices
- Decrease the reading level of the questions to make sure they’re at a high school reading level, not a college reading level
- Consider the target audience for each question & rephrase questions as needed
- Reformat answer choices from a 7-point scale to a 5-point scale and add answers for “I don’t know” & “Not Applicable”
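To make the reading-level recommendation concrete: a common way to estimate grade level is the Flesch-Kincaid grade formula, which combines average sentence length with average syllables per word. The sketch below is my own illustration (not part of the BCAT tooling), using a deliberately crude syllable heuristic:

```python
import re

def count_syllables(word):
    # Crude heuristic: count runs of consecutive vowels as syllables,
    # dropping one for a (usually silent) trailing 'e'. Every word
    # counts as at least one syllable.
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_kincaid_grade(text):
    # Flesch-Kincaid grade level:
    #   0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    words = [w.strip(".,!?;:\"'()") for w in text.split()]
    words = [w for w in words if w]
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / sentences)
            + 11.8 * (syllables / len(words))
            - 15.59)
```

A survey question could then be screened against a target (a grade of roughly 9-12 corresponds to high-school-level text); long, polysyllabic policy language scores far higher than plain phrasing.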
Encourage Participation & Celebrate Success
- Have a way to send reminders to participants
- Allow participants to clearly see their progress and what they need to do next
- Display a congratulatory message after the completion of each module to help participants stay motivated

Encourage Collaboration
- Allow teams to add additional participants (more than fifteen people)
- Create a way for team members to upload documents to a shared drive/portal.
Recognize that Urban and Rural Communities Have Different Needs
- Provide a choice of module pathways:
  - A path that focuses on Broadband Adoption & Digital Inclusion
  - A path that focuses on Infrastructure & Broadband Availability
- Teams should also have the ability to work on any modules they would like to complete (regardless of pathway)

Support Work Across Communities
- In any area where data is presented, provide additional columns for comparison, e.g. “your community” vs. “your state” vs. “other similar communities.”
- Retain conference calls and the cohort model so teams can lean on each other for support
- Provide examples or case studies for new teams

Improve Interface Aesthetics & Technical Functionality
- Use the U.S. Web Design System Style Guide to make sure the visual style is appropriate
design
In addition to providing the report, we prepared design artifacts for a potential redesign of the digital portion of the tool.
We provided user stories to convey key tasks and user needs for the different user groups (BCAT team admin, group participants). When working with a governmental entity, funding is often limited and dependent on politics, so we knew the resources for development wouldn't be available in the near future. User stories are an effective way to capture the important parts of the user's experience without necessarily specifying a visual design.
We also included select wireframes for a few key flows to help stakeholders visualize what a 'goal state' of the tool might be.





outcome & key takeaways
Reflecting on the project, this was my first experience conducting research in a domain I was unfamiliar with. The ability to quickly grasp a basic understanding of a topic, enough to effectively ask the right questions on the fly, has served me well in my research work since.
This was my first extensive user research project, lasting around six months, and it taught me a lot about budgeting time and effort to meet deadlines and stakeholder expectations. If I were to change anything about this project, I would focus less on the flashy design side of things; while our sponsors liked the designs as an inspirational 'goal state,' the effort put into visual designs could have been better spent elsewhere, perhaps prioritizing the recommendations or seeking out additional urban communities to increase their voice in our feedback.
Overall, though, our team felt this project was a success. We delivered what we had set out to accomplish, on time and in scope, and our sponsors were very happy with our work. We presented to two groups of broadband stakeholders, in Seattle and in Washington, D.C., and received positive feedback there as well. It was rewarding to work on something that felt like it would help make a difference in advocating for communities being left behind by the digital divide. Our sponsors have since been able to incorporate many of our recommendations, with noted improvements in the experience of their users.

