Turning Service Data into System Change
How Integrated Case Management Drives Collective Impact
Across Canada, social service organizations collect substantial data on intake volumes, service durations, referral patterns, and client demographics. Under current conditions, however, much of this data sits in disconnected systems, is reported upward to funders in isolation, and is rarely translated into the systemic analysis that collective impact requires. The gap between data collection and data use is not a technical failure; it is a structural one. Imagine Canada and the Federal Nonprofit Data Coalition have identified this data deficit as one of the most significant barriers to effective decision-making across the sector.
At the same time, the demand for evidence of outcomes has intensified. A 2023 Statistics Canada survey of more than 8,000 charities and nonprofits found that 46% of organizations reported increased demand for services, yet only 24% reported a corresponding increase in their capacity to meet that demand. Funders, government partners, and boards are increasingly asking not just whether services were delivered, but whether they produced measurable change: in housing stability, family cohesion, income security, or health. The organizations best positioned to answer those questions are those that have invested in an integrated case management infrastructure that connects service events to longer-term results.
This brief examines how integrated data infrastructure, when designed for interoperability and used in coordinated practice, transforms service records into evidence of real outcomes and positions organizations and their funders to direct resources where systemic change is most achievable.
Why Service Data, Alone, Does Not Show Outcomes
Service data, in its most common form, describes inputs and outputs: how many people were served, for how long, and under which program envelope. These figures are necessary for accountability, but they are not sufficient for understanding impact. An organization may report ten thousand service contacts in a fiscal year without being able to demonstrate whether those contacts reduced emergency room visits, stabilized housing, or supported a child's readiness for school.
The limitation, then, is not in the volume of data collected, but in its fragmentation. When a client receives employment support from one organization, transitional housing from a second, and mental health services from a third, the full arc of their experience, and the cumulative effect of those interventions, is visible to none of the three providers independently. Each records its own outputs. No one records the outcome. Research on data challenges in Canada's nonprofit sector consistently points to this siloed architecture as a structural weakness, rather than a failure of individual organizations.
In practice, what funders and boards receive is a mosaic of partial pictures: utilization reports, satisfaction surveys, and output tallies that, taken individually, offer little insight into whether the system as a whole is moving people from crisis to stability. As Imagine Canada has noted, at the outset of the COVID-19 pandemic, government officials seeking basic information about which organizations served specific populations found that neither the sector nor government could answer confidently, because the data to do so had never been connected. The structure of reporting, rather than the commitment of organizations, produces this limitation.
The structural gap: When service records are siloed by organization or program, they capture transactions rather than trajectories. A client's journey across housing, employment, and health services may span years and multiple providers, but without a shared data infrastructure, that journey remains invisible to the system responsible for supporting it. Statistics Canada's 2023 review of Canadian homelessness data illustrates this directly: even where administrative data exists across multiple systems, linkage between those systems is required before trajectory-level analysis becomes possible.
What Integrated Case Management Makes Possible
Integrated case management systems, when designed with shared data standards and interoperability as core requirements, produce a fundamentally different kind of evidence. Rather than capturing service contacts in isolation, they connect those contacts to individual trajectories, allowing organizations, funders, and system planners to observe patterns of movement across services over time.
Where fragmented systems produce utilization counts, integrated infrastructure can produce trajectory analysis: the proportion of clients who exit crisis services and stabilize in permanent housing within twelve months; the relationship between early childhood service intensity and school readiness at kindergarten entry; the degree to which coordinated outreach reduces cycling through emergency shelter. The National Shelter Study 2023, made possible by integrated data, estimated that 118,329 people accessed emergency shelters nationally in 2023 and was able to report on chronicity, average stay durations, and demographic trends across time, precisely because the data infrastructure to support that analysis had been built.
Taken together, the shift from siloed reporting to integrated data infrastructure is a shift from accountability for activities to accountability for results. Kania and Kramer's foundational work on collective impact identified shared measurement systems as one of the five essential conditions for large-scale social change, noting that common reporting across organizations enables participants to identify patterns, find solutions, and course-correct rapidly. This matters not only for demonstrating value to funders but for enabling organizations to identify where their interventions are most effective, where gaps in the service continuum produce poor outcomes, and where additional investment is required.
The Conditions Required for Data to Drive Collective Impact
The potential of integrated data does not materialize automatically. Several structural conditions are required for service data to generate the kind of system-level evidence that drives collective impact.
Shared Outcomes Frameworks
Organizations operating within a common funding envelope or service catchment area must align on a shared set of outcomes before integrated data can be meaningfully aggregated. Without this alignment, each organization measures what it values individually, and cross-organizational analysis produces comparisons of incompatible indicators. Innoweave, the McConnell Foundation's capacity-building initiative, has emphasized that establishing a clear population-level outcome goal is the prerequisite for any collective impact effort, and that this alignment, while time-intensive, is achievable where governance structures support it. In communities where this alignment has been established, through coordinated planning tables, funder-led initiatives, or sector-wide frameworks, the resulting data is significantly more actionable for system planners and funders alike.
Interoperable Data Infrastructure
Shared outcomes require shared data standards, and shared data standards require case management platforms that support interoperability. This does not necessarily mean a single system for all organizations; it means that the systems organizations use must be capable of exchanging data in formats that allow cross-organizational analysis. ESDC's 2023–2026 Data Strategy frames this explicitly as a horizontal integration challenge, requiring alignment across policies, programs, services, and channels to achieve the kind of population-level insight that informs effective decision-making. In Canada, this is an area where investment has been uneven: some provincial and municipal systems have made significant progress, while others continue to operate with legacy infrastructure that limits data exchange to manual processes and aggregated spreadsheets.
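What "exchanging data in formats that allow cross-organizational analysis" looks like in practice can be sketched in a few lines. The sketch below is purely illustrative: the two export formats, field names, and service codes are hypothetical, standing in for the legacy systems described above, and the shared schema is an assumption, not a real sector standard. The point is only that each system needs a mapping into one common shape (stable field names, ISO 8601 dates) before records can be pooled.

```python
import json

# Two hypothetical export formats from different case management systems.
system_a_record = {"ClientID": "A-1043", "SvcType": "EMPLOY", "StartDt": "2023-01-12"}
system_b_record = {"person_ref": "B/88", "service": "employment_support",
                   "opened": "12/01/2023"}  # day/month/year

# Illustrative mapping from System A's internal service codes to shared terms.
SERVICE_CODES = {"EMPLOY": "employment_support"}

def from_system_a(rec):
    """Normalize a System A export row to the shared exchange schema."""
    return {"client_id": rec["ClientID"],
            "service_type": SERVICE_CODES.get(rec["SvcType"], rec["SvcType"]),
            "start_date": rec["StartDt"]}  # already ISO 8601

def from_system_b(rec):
    """Normalize a System B export row, converting day/month/year to ISO 8601."""
    day, month, year = rec["opened"].split("/")
    return {"client_id": rec["person_ref"],
            "service_type": rec["service"],
            "start_date": f"{year}-{month}-{day}"}

# Once normalized, records from either system can be pooled for analysis.
pooled = [from_system_a(system_a_record), from_system_b(system_b_record)]
print(json.dumps(pooled, indent=2))
```

The design choice worth noting is that interoperability here lives in the mappings, not in a single shared platform: each organization keeps its own system and maintains only a translation into the common schema.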
Analytical Capacity at the Organizational Level
Even where integrated infrastructure exists, organizations need the internal capacity to translate data into analysis. Research on data culture in Canada's non-profit sector suggests that data literacy remains a relative weakness, and that the sector has been slow to adopt the quantitative methodologies required to take advantage of integrated data assets. This includes not only technical skills (data literacy, basic statistical reasoning, familiarity with reporting tools) but also organizational cultures that treat evidence as a resource for improvement rather than a compliance obligation. Where this capacity is present, organizations use data to refine their practice, identify populations not well-served by current models, and make the case to funders for program evolution. Where it is absent, data collection continues without informing decisions.
Funder Requirements That Incentivize Integration
Funders play a structurally determinative role in whether an integrated data infrastructure develops. The State of Outcomes-Based Finance in Canada (2023) documents that, as of September 2023, Canada had invested more than USD 14.5 million toward outcomes-based financing mechanisms, structured precisely around the principle that funders pay for results rather than activities delivered. Funding agreements that require outcome reporting against shared indicators, that support the cost of data infrastructure and capacity building, and that allow multi-year investment timelines for systems change work, create the conditions under which integration becomes possible. Conversely, funding relationships that reward output volume and tolerate siloed reporting implicitly disincentivize the investment that integration requires. The structure of funder expectations, as much as the technical environment, shapes what is achievable.
What Real Outcomes Evidence Looks Like in Practice
Real outcomes evidence does not look like a dashboard of coloured indicators. It looks like a structured answer to a structured question: Among people who accessed coordinated housing supports in this region between 2021 and 2023, what proportion achieved stable housing after twelve months, and what service combination was associated with the strongest results?
Answering that question requires, at minimum: a shared definition of housing stability; a case management system that tracks individual movement across service types and time; a client identifier that allows matching across organizational records; and an analyst with the time and skill to build the analysis. These are not exceptional requirements. Research on the implementation of Coordinated Access in Ontario found, however, that structural and systemic challenges, including staff turnover, housing stock limitations, and data system reliability, continue to undermine the effectiveness of these processes in practice, even where the infrastructure nominally exists. This indicates that infrastructure investment alone is not sufficient; ongoing governance, staffing, and alignment work are required to sustain the conditions under which trajectory-level data becomes analytically usable.
Organizations that have built this capacity describe a qualitative shift in their relationship with funders and community partners. Rather than defending activity volumes in accountability conversations, they bring evidence of movement: how many people moved from emergency shelter to supportive housing, how quickly, and what conditions were associated with sustained exits. An evaluation of collective impact practice conducted by ORS Impact and the Spark Policy Institute across 25 initiatives, including three in Canada, found that those sites paying more attention to data and shared measurement could draw a stronger link between their work and system and population change. This evidence changes the nature of strategic dialogue with boards and funders, and it provides the analytical basis for resource allocation decisions that are otherwise made on assumption.
What funders can ask for: Rather than requesting output reports, funders are well-positioned to require that grantees adopt shared outcome indicators, participate in coordinated data infrastructure, and provide trajectory-level analysis annually. The Treasury Board's guidance on data linking for program evaluation provides a federal-level framework for this approach, recommending that program officials explore data linkage options before initiating new data collection, precisely because linked administrative data can answer outcome questions that siloed program data cannot. This shifts the accountability relationship from one that measures activity to one that measures change, and creates the conditions for collective impact evidence to accumulate over time.
The Role of Boards and Executive Leadership
For boards and executive directors, the implications of this analysis are structural rather than technical. The decision to invest in an integrated data infrastructure is a governance decision, not an IT decision. It involves committing organizational resources (time, budget, and staff attention) to a form of infrastructure that produces returns over multiple years and across organizational boundaries.
In practice, organizations that have made this commitment report that the investment is justified not primarily by funder requirements, but by the improvement in internal decision-making it enables. When case managers can see a client's service history across programs, they make better referral decisions. When managers can observe outcomes by program stream, they can identify where additional support is needed and where resources are underutilized. When executive directors can demonstrate trajectory-level outcomes to funders, they operate from a position of analytical credibility rather than narrative advocacy alone.
At the same time, this investment is not achievable in isolation. Tamarack Institute's synthesis of collective impact practice in Canada consistently identifies backbone organization support and shared governance of data infrastructure as prerequisites for sustained system change, not optional additions to individual organizational strategies. Organizations that have advanced most significantly in integrated data practice have typically done so in the context of coordinated sector-level initiatives: community planning tables, regional data collaboratives, or provincial system-change investments that share the cost and governance of common infrastructure. Executive leadership engagement in those sector-level bodies is, therefore, a precondition for realizing the benefits at the organizational level.
Taken together, the evidence from communities that have pursued integrated data infrastructure consistently indicates that the organizations and funders most capable of demonstrating real outcomes are those that have treated data infrastructure as a strategic asset, invested in analytical capacity alongside technical systems, and committed to shared frameworks that make cross-organizational analysis possible. Evaluating Collective Impact: Five Simple Rules, a widely used framework in the Canadian sector, argues that the process of settling on shared outcomes and measures is itself a valuable outcome of collective impact work, sharpening organizational thinking about what is actually being pursued. These are not optional enhancements to service delivery. Under current conditions, they are the structural requirements for demonstrating that investment in social services produces the system change it is intended to achieve.