What Your Board Could Be Asking About Data

Picture your last board meeting. Someone reviewed the financials. Someone flagged a liability. Someone asked about program reach or how many people you served last quarter. Maybe there was a brief discussion about a government contract or a funding renewal.

Now think about whether anyone asked this: Do we actually trust the data we're reporting?

What Gets Asked in the Boardroom (and What Gets Left Out)

The typical board agenda for a social service organization is predictable for good reason. Financial statements. Risk and compliance. Program updates. HR matters. Occasionally, a strategic planning item. These things matter. None of them should come off the agenda.

But there's a category of questions that almost never makes it onto the agenda: questions about information quality, data systems, and measurement credibility. Boards routinely review outputs (clients served, programs delivered, dollars spent) without asking whether the numbers behind those outputs are reliable, consistent, or actually connected to anything meaningful.

This isn't a criticism of boards. Most board members were never trained to ask about data governance. The sector hasn't normalized it as an expectation. And executive directors, many of whom are themselves uncertain about their data infrastructure, don't always volunteer the conversation.

The result is a quiet governance gap. Organizations are making strategic decisions, writing funding proposals, and publishing impact reports based on data that hasn't been validated, standardized, or seriously examined at the leadership level. As we've explored in our post on rethinking efficiency in social services, the sector has too long measured the wrong things, and boards are one of the places where that pattern can and should be interrupted.

Why Data Governance Is a Governance Issue

There's a tendency in the sector to treat data quality as an operational matter: something for the program manager, the data analyst, or the IT consultant to sort out. The board is there for strategy and oversight, the thinking goes, not to worry about whether the intake form in one program is consistent with the one in another.

That framing made sense ten years ago. It doesn't anymore.

Funders across Canada are shifting toward outcome-based funding, and the pressure is structural, not cyclical. Fiscal constraint is tightening accountability expectations at every level of government. Canada's Social Finance Fund and the broader social innovation ecosystem it supports are built on the premise that impact can be measured, compared, and used to inform investment decisions. Digital infrastructure across the sector is improving, which means funders are recalibrating what they consider reasonable to ask for. As we've written about in our post on why funders are asking for more outcome data (link to be confirmed), the organizations that can't respond credibly to those demands aren't facing a data problem. They're facing a governance problem.

Beyond funding, there are legal and ethical dimensions to data governance that have clear board-level implications. Canada's Personal Information Protection and Electronic Documents Act (PIPEDA) and its provincial equivalents create accountability for how client data is collected, stored, and used. When an organization mishandles client data, even inadvertently, the liability doesn't sit with the database administrator. It sits with the organization and its leadership. As we outlined in our analysis of data infrastructure as a policy enabler, the governance and the technical are inseparable.

Taken together, the funding environment, legal obligations, and the reputational stakes of outcome credibility make data governance a fiduciary responsibility. It belongs in the same category as financial oversight, not below it.

The Questions a Good Board Should Be Asking

Boards don't need to become data experts. They do need to ask questions that hold executive leadership accountable for data quality and data governance. These aren't technical questions. They're organizational health questions.

Here's a practical set to start with:

What outcomes are we actually tracking, and why those ones?

This question forces clarity about whether the organization has a coherent theory of change or whether it's measuring whatever was easiest to count when someone set up the spreadsheet years ago. The answer should reference a logic model that connects program activities to intended changes in people's lives. If the executive director can't articulate that connection, the data being collected may not be telling the organization anything useful.

The Common Approach to Impact Measurement, developed through a Canadian community of practice, offers a flexible set of standards specifically designed to help organizations choose indicators that are meaningful to their own work while remaining compatible with funder reporting requirements. Boards should know whether their organization is working from a framework like this, or simply tracking what's convenient.

Do we trust our own numbers?

This is the most direct version of the question, and the most revealing. A confident "yes" should be followed by "why?" and "what's our process for verifying that?" A hesitant answer, or one that quickly redirects to technology or staff capacity, is useful information for the board.

Strong organizations have documented data collection protocols, staff training on data entry, and regular audits of data quality. They know where their data comes from, who enters it, and how inconsistencies get flagged and resolved. If none of that infrastructure exists, the numbers in the annual report are more aspirational than evidential. Our post on how to turn service data into actionable insights (link to be confirmed) walks through what that kind of infrastructure actually looks like in practice.

Is our data consistent across programs and locations?

Many organizations run multiple programs, sometimes across multiple sites, with different staff using different tools and different definitions of the same terms. "Housing stability" means something different in one program than it does in another. "Successful exit" is counted differently depending on who's completing the file.

Boards should know whether the organization has standardized its data definitions across programs, or whether its aggregated numbers are stitching together information that was collected in fundamentally incompatible ways.
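To make the aggregation problem concrete, here is a minimal sketch of how two programs counting "successful exit" under different rules produce a combined number that means nothing until the definition is standardized. Every field name, rule, and record below is invented for illustration; no real program data or system is implied.

```python
# Hypothetical illustration: two programs define "successful exit"
# differently, so their counts cannot simply be added together.
# All field names and definitions are invented for this example.

program_a = [  # Program A counts any exit away from shelter as successful
    {"client": 1, "exit_to": "market_rental"},
    {"client": 2, "exit_to": "shelter"},
]
program_b = [  # Program B only counts exits stable for 90+ days
    {"client": 3, "exit_to": "market_rental", "days_stable": 120},
    {"client": 4, "exit_to": "market_rental", "days_stable": 30},
]

def successful_a(record):
    return record["exit_to"] != "shelter"

def successful_b(record):
    return record["exit_to"] != "shelter" and record["days_stable"] >= 90

# A naive aggregate stitches together two incompatible definitions:
naive_total = (sum(successful_a(r) for r in program_a)
               + sum(successful_b(r) for r in program_b))

# One standardized definition applied to both programs yields a
# different, and defensible, number (records without a stability
# field fail the standard until that data is collected):
def successful_standard(record):
    return record["exit_to"] != "shelter" and record.get("days_stable", 0) >= 90

standard_total = sum(successful_standard(r) for r in program_a + program_b)

print(naive_total, standard_total)
```

The point of the sketch is not the arithmetic but the governance question it surfaces: until both programs collect the fields the standard definition requires, the organization cannot honestly report a single "successful exits" figure.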

What would we do differently if we had better information?

This question reveals the actual relationship between data and decision-making. If the answer is "nothing, we'd make the same choices either way," that's a signal that data isn't integrated into how the organization learns and adapts. If the answer generates a substantive list, it identifies concrete investments worth making.

The best executive directors can name the specific decisions, program pivots, resource allocations, or funder conversations that would change if they had higher-quality information. That kind of clarity is what board-level data conversations should be working toward.

How does our data governance stack up against where funders and policymakers are heading?

Most boards are reasonably current on financial compliance requirements. Very few are tracking where data and outcome expectations in the sector are moving. Boards should be asking whether the organization's data practices are keeping pace with what major funders, collective impact tables, and government partners are beginning to require. Imagine Canada's Standards Program offers one useful benchmark: governance accountability, including how boards exercise oversight of organizational data, is embedded directly in the standards framework.

Being behind the curve isn't always disqualifying, but not knowing you're behind the curve is.

What It Means When the ED Can't Answer These Questions

No executive director should be expected to have flawless answers to all of these questions, especially in organizations with limited data infrastructure. Some of these questions are genuinely hard. What matters is whether the organization is working on them.

There are a few answers that should give a board pause, not because they indicate failure, but because they indicate a gap that needs to be actively closed.

"We report what funders ask for" suggests that the organization's measurement choices are driven entirely by external compliance rather than internal learning. That's not inherently wrong, but it's incomplete. An organization that only tracks what funders require is likely missing information that would help it improve its own programming. As we've written about in outcome-focused reporting for Canadian nonprofits, outcome data serves the organization first and the funder second.

"Our database handles that" is a version of the technology-as-solution framing that often masks the absence of underlying standards. Technology can organize and display data. It can't make inconsistent data consistent, or give meaning to outcomes that were never clearly defined. Our piece on what modern case management software should really do addresses exactly this distinction: good software supports good governance, but it doesn't replace it.

"We're working on it" is a fine answer, as long as it's followed by a timeline, a resource commitment, and someone accountable for the work. As a permanent holding pattern, it isn't.

Boards that hear these answers have an opportunity, not a crisis. They can ask what support the ED needs to make progress, what resources would accelerate the work, and what the organization should prioritize first. That's what governance looks like when it takes data seriously.

What a Data-Literate Board Looks Like in Practice

Introducing data governance as a board-level responsibility doesn't require restructuring the governance model or hiring a data specialist. It requires a few deliberate changes to how the board does its work.

Build a standing data report into the board package. Alongside the financial statements, include a brief summary of the organization's key outcome indicators, data quality flags for the period, and any system changes. This doesn't have to be lengthy. Two or three pages that answer "what are we tracking, what did we find, and what does it mean" establish a baseline expectation that data is a governance concern.
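As one hedged sketch of what "data quality flags for the period" might mean in practice, the snippet below counts missing values in a handful of client records. The record structure and field names are entirely hypothetical; a real organization would adapt this to its own case management export.

```python
# Hypothetical sketch: computing simple data quality flags for a
# board package summary. Records and fields are invented for
# illustration, not drawn from any real system.
records = [
    {"id": 1, "intake_date": "2024-01-10", "outcome": "housed"},
    {"id": 2, "intake_date": None, "outcome": "housed"},
    {"id": 3, "intake_date": "2024-02-01", "outcome": None},
]

flags = {
    "total_records": len(records),
    "missing_intake_date": sum(1 for r in records if not r["intake_date"]),
    "missing_outcome": sum(1 for r in records if not r["outcome"]),
}

# A two-line summary like this is all a board package needs:
for name, count in flags.items():
    print(f"{name}: {count}")
```

Even a summary this simple gives the board a trend line: if "missing_outcome" climbs quarter over quarter, that is a governance signal, not a technical one.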

Ask the questions above at least annually. A dedicated governance review of data practices once a year, separate from the financial audit cycle, normalizes the expectation without adding significant meeting time. Some organizations formalize this as part of a broader organizational health review.

Include data governance in executive performance conversations. If an ED is evaluated on financial stewardship, program reach, and funder relationships, they should also be evaluated on whether the organization's data practices are improving. That inclusion signals to the entire organization that data quality is leadership's responsibility.

Know where your client data lives and what jurisdiction it falls under. This is particularly important for organizations using software platforms developed or hosted outside Canada. Under PIPEDA, organizations have specific obligations around cross-border data transfers and client consent. Boards have a responsibility to understand these, not as technical detail, but as governance accountability.

The Real Question Behind All of These Questions

The questions listed here aren't really about data. They're about whether the organization has a credible account of its own impact, whether its claims about outcomes can be trusted by the people making decisions based on them, and whether its leadership is governing one of its most important assets with real discipline.

Boards ask about finances because financial mismanagement threatens the organization's survival. Data mismanagement is increasingly in the same category: not because of dramatic failure, but because of slow, quiet erosion where funding proposals can't be substantiated, program decisions are made without evidence, and outcome reports describe activity rather than change.

The organizations that will be best positioned in the next five years are the ones whose boards are already asking these questions, and whose executive directors are building the capacity to answer them well.
