Cinthia Schuman, Chris Tuttle, Christopher Noessel, Darrell Malone, David A Colarusso, France Q. Hoang, Heejae Lim, Iain De Jong, Jake Garcia, Jake Maguire, Jill Finlayson, John Mayer, Julie Cordua, Kevin Bromer, Leah Post, Leila Toplic, Mohammad Radiyat, Nancy Smyth, Nick Bailey, Nick Hamlin, Ravindar Gujral, Rhodri Davies, Rita Ko, Shalini Kantayya, Steve MacLaughlin, Sue Citro, and Woodrow Rosenbaum.
We want to give a special thanks to friends and colleagues who read parts of this book, answered questions, and gave us advice (when we asked for it and when we didn't!). In particular, we'd like to thank: Tamara Gropper, Mark Polisar, Lucy Bernholz, Johanna Morariu, Lisa Belkin, and Amy Sample Ward for their input and advice.
CHAPTER 1 Becoming a Smart Nonprofit
INTRODUCTION
Leah Post has a keen sense of other people's pain. As a program manager at a Seattle social service nonprofit, she uses her gifts to help people who are homeless, or at high risk of homelessness, enter the local support system. An integral part of the intake process is a required assessment tool with the tongue-twisting name VI-SPDAT.
Every day, Leah asked her clients questions from the VI-SPDAT and inputted their answers into the computer. And every day the results didn't match the picture of despair she saw in front of her, results that should have made her clients top priorities for receiving emergency housing.
Leah knew the basic statistics for the homeless population in King County, home to Seattle. Black people are 6% of the general population but over a third of the homeless population. For Native Americans and Alaska Natives, the disparity is similar: roughly 1% of the general population but 10% of the homeless population. Most of Leah's clients were Black, and yet time and again white applicants scored higher on the VI-SPDAT, meaning they would receive services first. Leah knew in her gut that something was wrong, and yet automated systems are supposed to be impartial, aren't they?
With over a decade of experience as a social worker, Leah knows that asking people who are scared, in pain, possibly struggling with mental illness, and at your mercy to self-report their personal struggles is unlikely to yield accurate results. Similarly, victims of domestic violence are unlikely to self-report an abusive relationship. But that's not how the VI-SPDAT worked. For instance, one of the questions was: “Has your drinking or drug use led you to being kicked out of an apartment or program where you were staying in the past?” Single adult applicants who were Black, Indigenous, or People of Color (BIPOC) were 62% less likely than white applicants to answer yes.1 In general, denying drinking and drug use is the smarter and safer answer for people of color when applying for public benefits. Except when taking the VI-SPDAT. The assessment is intended to measure vulnerability, which means the higher the score, the more urgently a client needs housing. But, Leah says, the VI-SPDAT “just doesn't allow the space for any interpretation of answers.”2
Leah was not the only person noticing skewed results. Dozens of social workers joined her in signing a petition in Seattle asking for a review of the process. Other social workers around the country also raised concerns. Finally, researchers at C4 Innovations dug into the data from King County, as well as counties in Oregon, Virginia, and Washington, and found that BIPOC “were 32% less likely than their White counterparts to receive a high prioritization score, despite their overrepresentation in the homeless population.”
There were red flags about the VI-SPDAT from the beginning. It was evidence-informed, not evidence-based, meaning it was built on information and experiences from past efforts but neither rigorously designed nor tested. It was intended for quick triage but was most often used as an overall assessment tool by social service agencies. No training was required to use it. Oh, and it was free.3
Why was King County, or any county, using a tool with so many red flags? Some of the answer is found in its development history.
The Department of Housing and Urban Development (HUD) provides homelessness funding to local communities through Continuums of Care (CoCs), consortia of local agencies. This system was created in the 1990s to provide multiple access points for people who are homeless, or at risk of homelessness, through, say, food banks, homeless shelters, or mental health clinics.
In 2009, HUD began to require CoCs to use a standardized assessment tool to prioritize the most vulnerable people. This was an important switch from the traditional “first come, first served” model. The wait for emergency housing can be years long, and having an opportunity to get to the top of the list is a very big deal for clients. The choice of which tool to use was left up to each CoC.
Years earlier, Community Solutions, a New York nonprofit specializing in using data to reduce homelessness, created the Vulnerability Index (VI) based on peer-reviewed research. The goal of the VI was to lower barriers for people with physical or mental health vulnerabilities that might prevent them from seeking services. Soon afterward, OrgCode Consulting, Inc., created the Service Prioritization Decision Assistance Tool (SPDAT). Finally, in 2013, OrgCode released a combination of these tools, the VI-SPDAT.
The president of OrgCode, Iain De Jong, told us that time was of the essence in launching the VI-SPDAT, which precluded more robust testing and training materials.4 By 2015 more than one thousand communities across the United States, Canada, and Australia were using the VI-SPDAT.
The VI-SPDAT was initially released as a downloadable document with a manual scoring index because, contrary to its name, OrgCode isn't a tech company. Two years after its release, multiple software companies serving homeless agencies asked to incorporate the VI-SPDAT into their products, and OrgCode consented.
Incorporating the VI-SPDAT into software programs automated it, which meant that instead of scoring the assessment by hand, administrators were now restricted to inputting data into screens and leaving the rest up to the computer. The VI-SPDAT became a smart tech tool. The power of decision-making shifted from people to computers. This gave the VI-SPDAT a patina of infallibility and impartiality. Jake Maguire of Community Solutions said, “There are people who have divorced the scoring tool from the basic infrastructure required for meaningful community problem solving. It is complex. What we need to do is to equip people with the skills and permissions that give them informed flexibility. Don't automatically surrender your better judgment and clinical judgment. We can't put our brains on autopilot when we use these tools.”5 As a result, thousands of BIPOC people didn't get the priority spot they deserved or the access they needed to vital services.
You may be waiting for some bad guy to emerge in this story: a company gathering data to sell to pharmaceutical companies or a government agency intentionally blocking access to services. There will be stories like that later in this book, but this isn't one of them.
All the actors here had good intentions. HUD wanted to ease access into the homeless system by using multiple access points and placing local organizations in charge of the assessment. OrgCode was trying to create a standard tool for social workers and disseminate it easily, freely, and quickly. Leah and her colleagues were dedicated to helping the most vulnerable people in their communities receive appropriate services quickly. And, of course, clients who were walking in off the street just wanted to be safe at least for one night.
And yet, the VI-SPDAT was so fundamentally flawed that OrgCode announced in 2021 that it would no longer recommend or support it.
UNDERSTANDING SMART TECH
We use “smart tech” as an umbrella term for advanced digital technologies that make decisions for people, instead of people making decisions themselves. It includes artificial intelligence (AI) and its subsets and cousins such as machine learning,