2015 Legislative Session: Fourth Session, 40th Parliament
SELECT STANDING COMMITTEE ON PUBLIC ACCOUNTS
Wednesday, February 3, 2016
9:30 a.m.
Strategy Room 420, Morris J. Wosk Centre for Dialogue
580 W. Hastings Street, Vancouver, B.C.
Present: Bruce Ralston, MLA (Chair); Sam Sullivan, MLA (Deputy Chair); Kathy Corrigan, MLA; David Eby, MLA; Simon Gibson, MLA; George Heyman, MLA; Vicki Huntington, MLA; Greg Kyllo, MLA; John Martin, MLA; Lana Popham, MLA; Linda Reimer, MLA; Selina Robinson, MLA; Ralph Sultan, MLA; Laurie Throness, MLA
Unavoidably Absent: Marvin Hunt, MLA
Others Present: Carol Bellringer, Auditor General; Stuart Newton, Comptroller General
1. The Chair called the Committee to order at 9:32 a.m.
2. The following witnesses appeared before the Committee and answered questions regarding the Office of the Auditor General Report: An Audit of the Education of Aboriginal Students in the B.C. Public School System (October 2015)
Office of the Auditor General:
• Carol Bellringer, Auditor General
• Sheila Dodds, Assistant Auditor General
• Jessica Schafer, Assistant Manager, Performance Audit
Ministry of Education:
• Dave Byng, Deputy Minister
• Jennifer McCrea, Assistant Deputy Minister, Learning Division
• Ted Cadwallader, Provincial Director, Aboriginal Education, Learning Division
3. The Committee recessed from 11:13 a.m. to 11:28 a.m. and from 12:18 p.m. to 1:23 p.m.
4. The following witnesses appeared before the Committee and answered questions regarding the Office of the Auditor General Report: The Status of Government's General Computing Controls: 2014 (December 2015)
Office of the Auditor General:
• Carol Bellringer, Auditor General
• Cornell Dover, Assistant Auditor General
• David Lau, Director, IT Audit
Office of the Chief Information Officer:
• Ian Bailey, Assistant Deputy Minister, Technology Solutions
• Philip Twyford, Executive Director, IM/IT Capital Investment
5. The Committee adjourned to the call of the Chair at 2:58 p.m.
Bruce Ralston, MLA, Chair
Kate Ryan-Lloyd, Clerk
The following electronic version is for informational purposes only.
The printed version remains the official version.
WEDNESDAY, FEBRUARY 3, 2016
Issue No. 24
ISSN 1499-4240 (Print)
ISSN 1499-4259 (Online)
CONTENTS

Auditor General Report: An Audit of the Education of Aboriginal Students in the B.C. Public School System (page 837)
C. Bellringer
J. Schafer
D. Byng
T. Cadwallader
S. Dodds
J. McCrea

Auditor General Report: The Status of Government’s General Computing Controls: 2014 (page 862)
C. Bellringer
D. Lau
I. Bailey
C. Dover
P. Twyford
Chair: Bruce Ralston (Surrey-Whalley NDP)
Deputy Chair: Sam Sullivan (Vancouver–False Creek BC Liberal)
Members: Kathy Corrigan (Burnaby–Deer Lake NDP)
David Eby (Vancouver–Point Grey NDP)
Simon Gibson (Abbotsford-Mission BC Liberal)
George Heyman (Vancouver-Fairview NDP)
Marvin Hunt (Surrey-Panorama BC Liberal)
Vicki Huntington (Delta South Ind.)
Greg Kyllo (Shuswap BC Liberal)
John Martin (Chilliwack BC Liberal)
Lana Popham (Saanich South NDP)
Linda Reimer (Port Moody–Coquitlam BC Liberal)
Selina Robinson (Coquitlam-Maillardville NDP)
Ralph Sultan (West Vancouver–Capilano BC Liberal)
Laurie Throness (Chilliwack-Hope BC Liberal)
Clerk: Kate Ryan-Lloyd
WEDNESDAY, FEBRUARY 3, 2016
The committee met at 9:32 a.m.
[B. Ralston in the chair.]
B. Ralston (Chair): Good morning, Members. We have a busy agenda today.
We’re about to begin consideration of the report of the Auditor General entitled An Audit of the Education of Aboriginal Students in the British Columbia Public School System. It dates from October 2015. Present here to comment on the report are Carol Bellringer, the Auditor General; Sheila Dodds, the assistant Auditor General; Jessica Schafer, the assistant manager of performance audit; on behalf of the Minister of Education, Dave Byng, who is the deputy minister; and Jennifer McCrea, who is from the Ministry of Education.
I’m sorry, Jennifer, I didn’t get your title, but perhaps I can add that later.
Thank you all for coming. I’ll turn it over to the Auditor General, and we’ll begin.
Auditor General Report:
An Audit of the Education of
Aboriginal Students
in the B.C. Public School System
C. Bellringer: Thank you, Mr. Chair, and good morning, everyone.
Ten years ago, the British Columbia government committed to closing the social and economic gaps between First Nations and other British Columbians, including an explicit commitment to close the gaps in education by 2015. At the time, the graduation rate for aboriginal students was less than 50 percent compared to 80 percent for non-aboriginal students. In 2014, 63 percent of aboriginal students graduated with their peers.
The Ministry of Education is currently working with educators and education partners across British Columbia to revise the accountability framework for student success. This is a good opening for the ministry to use our audit findings to inform their new framework. As part of the audit, we visited school districts, spoke with board of education trustees and consulted with and included aboriginal people in our work.
Since we released the report in early November, we presented our findings and discussed the report with a number of key stakeholder groups, such as the First Nations Education Steering Committee, the 800-plus attendees at its annual conference, and the B.C. School Superintendents Association and the B.C. School Trustees Association.
The ministry accepted all 11 recommendations in the report, and we are encouraged by the ministry’s actions since the report came out. They dedicated this year’s provincial teacher professional development day to aboriginal education and are revising the school completion certificate policy.
Jessica will take you through the report, and I believe Sheila is going to give her some assistance with that this morning. And sorry, I don’t think you mention this in your presentation. I just want to point out that the photograph on the front cover was taken from one of the visits that was made in the course of the audit.
J. Schafer: Good morning, Chair, committee members. I’m really pleased to have this opportunity to present a brief overview of our audit of the education of aboriginal students in the B.C. public school system, which we reported this past November 2015.
Across Canada, the aboriginal population has been growing faster and is younger than the non-aboriginal population. This raises the significance of education and the education system’s success for aboriginal students. In B.C., aboriginal students account for just over 11 percent of all students in the public school system, and their enrolment generates more than $500 million in annual funding from the ministry to boards of education.
Improving education brings better life chances, such as jobs, income and even better health and well-being. Better aboriginal graduation rates will support aboriginal peoples’ contributions to B.C.’s economy and society.
Why did we do this audit? There’s a wide and historically persistent gap between education outcomes for aboriginal and non-aboriginal people in Canada. In B.C., education gaps include lower scores in provincial assessments for reading, writing and numeracy; higher rates of special needs designations and enrolment in courses, such as communications 12, that limit options for post-secondary education; and lower graduation rates.
In 2005, the graduation rate for aboriginal students was 49 percent compared with 82 percent for non-aboriginal students. As a result, the provincial government committed to work with First Nations to close the gaps in education outcomes between aboriginal and other students by 2015. Gaps in education outcomes include both academic and social outcomes. Researchers agree that the best way to improve aboriginal peoples’ lives is through improving education. Since 2000, outcomes for aboriginal students have improved, but more can be done.
Our audit objective was to determine whether the ministry had exercised its duties and powers to meet the commitment it made in the transformative change accord to close the gaps in education outcomes between aboriginal and non-aboriginal students by 2015.
We expected that the ministry, in order to achieve its commitment to close the gaps, would provide leadership and direction to the public education system to close the gaps, monitor and analyze outcomes and trends to inform its strategies to close the gaps, and publicly report on outcomes and the effectiveness of strategies.
[ Page 838 ]
These expectations were based on our understanding of the ministry’s role and mandate, based on the School Act and the statement of education policy order.
The ministry’s target in 2005 to achieve parity by 2015 was ambitious. Despite that long-term goal, the ministry had not fully exercised its duties and powers to close the gaps for aboriginal students.
Between 2000 and 2014, the provincial graduation rate improved from 39 percent to 62 percent for aboriginal students, which was a significant improvement, but there continue to be persistent and significant gaps in some districts and for certain groups of aboriginal students. For example, in ten school districts, less than 50 percent of aboriginal students graduated in 2014. As well, aboriginal children living in care and First Nations students living on reserve, attending provincial public school, have had notably lower graduation rates.
The ministry can do more to lead and direct the public education system to close the gaps for aboriginal students.
Following the ministry’s 2005 announcement to close the gap, the ministry did not develop a shared, systemwide strategy with input from boards, superintendents and aboriginal leadership and communities. A collaborative strategy would focus the system on a common goal and allow the ministry to follow up if expectations were not met.
We also found that the ministry has not yet evaluated its key strategies for aboriginal education, the targeted funding approach and the enhancement agreements. This would allow it to identify what’s been effective and what might need to change. A clear direction and focused effort would help to ensure that the gaps close as rapidly as possible and across all districts.
Through its leadership, the ministry can do more to support a public school system where learning environments are safe, supportive and culturally relevant. We heard of a number of obstacles to ensuring culturally appropriate curriculum and non-racist learning environments that the ministry can help to reduce or remove for the education system.
For example, ensure that educators have the confidence and knowledge to deliver aboriginal curriculum content and that school districts can hire the best people to work with aboriginal students. Educators need to expect that all students will meet their full potential.
Educators, administrators and decision-makers need to know what’s working to inform change. For decades, the ministry has monitored student data, trends and results, and it has shared this information with boards and with the public. Although the ministry has a wealth of valuable information, it’s done limited analysis of that data. Greater analysis of the information would provide a greater understanding of why aboriginal students have had poor outcomes in some contexts and better outcomes in others.
We found that the ministry could do more to support superintendents in their work with boards, staff and aboriginal communities to understand and use the data to inform their strategies.
The ministry could also monitor additional indicators to better understand outcomes — for example, readiness to learn at school entry, attendance, mobility patterns, trends for First Nations students on reserve attending provincial public schools and self-identification patterns.
The ministry also needs to ensure that districts carry out regular assessments of students’ academic and social progress and, crucially, that aboriginal students participate in those assessments.
Performance reporting helps stakeholders make decisions about future goals and strategies. To be useful, performance reporting must be clear and accessible to the people who need it.
Overall, we found that the ministry has reported on results for over 15 years but not on how effective the system has been in closing the gaps between aboriginal and non-aboriginal students or on the effectiveness of the strategies used.
In addition, the ministry had not ensured that boards were meeting ministry requirements for public reporting, particularly with respect to the enhancement agreement annual reports. The ministry expects boards to report each year on progress toward meeting the goals of their enhancement agreement. We found that less than a third of boards had fully met expectations.
Aboriginal communities told our audit team that they would like more information than they have been receiving. For example, First Nations felt that it would be helpful to receive information on outcomes for students from their First Nation separate from the district’s overall aboriginal student population. Others felt that it would be useful to distinguish between results for First Nation, Métis and Inuit students.
We made 11 recommendations to the Ministry of Education. To implement all of these recommendations, the ministry will require ongoing collaboration with boards of education, superintendents and aboriginal leadership and communities.
We made four recommendations related to developing, supporting and implementing a systemwide strategy to close the gaps.
We made two recommendations where the ministry can further support safe, non-racist learning environments.
Four of our recommendations focus on ways to improve practices around data monitoring and analysis and building district capacity to use data to inform strategies.
Lastly, we recommended that the ministry provide leadership on the quality of reporting at both district and provincial levels to strengthen accountability for results.
That concludes our presentation.
[ Page 839 ]
B. Ralston (Chair): Thank you very much.
I’ll turn it over, then, to representatives of the Ministry of Education — Mr. Byng.
D. Byng: Thank you, Chair.
I’ll start by acknowledging our opportunity to gather here today on the traditional territory of the Musqueam, Tsleil-Waututh and the Squamish First Nations. We’re appreciative that we have that opportunity.
I had wanted to introduce a couple of other of my staff members, if that’s all right.
B. Ralston (Chair): Oh certainly, please.
D. Byng: I have Jen McCrea here, who is our assistant deputy minister, who will be leading this work. Ted Cadwallader is here as well. Ted is our provincial director of aboriginal education.
Sitting in the front of the group on the side is Dean Goodman. Dean is our director of accountability. The relevance of having Dean here today in particular is, as we talk about the framework agreement, going forward, he’ll be our staff member leading that, so I thought it would be helpful to have him here to answer specific questions.
B. Ralston (Chair): Great. I’m glad you introduced them. I’m sorry I wasn’t able to give Ms. McCrea’s title. Being an assistant deputy minister is an important position, so I’d wanted to recognize it, but I’m glad you’ve remedied that.
D. Byng: All right, well, thank you very much.
I thought I would start the conversation here today with the committee by very briefly providing a quick overview of our perspective of the Auditor General’s work. I just wanted to preface my remarks by saying that we really felt that the Auditor General’s team worked well with the Ministry of Education as the audit was undertaken.
We worked closely together, and it was a very positive experience. Our view is that it was good work done, and of course, you saw that the ministry accepted the findings and recommendations of the Auditor General, and we do think that this is really helpful for the ministry as we work forward to continue to drive positive change in the education system.
While the Auditor General and her staff have noted that this work was focused on the public school system, I would want to note two things. One is that our work, as we adopt and implement the Auditor General’s recommendations going forward, will be utilized across the education system, not only in British Columbia, but with our offshore schools as well.
We do have the public school system, independent schools, a couple of thousand young people in a home school environment, as well as offshore schools, so the curriculum, supports, teacher training, resources, etc., will actually be utilized in a broader context than just the public school setting.
The other piece for us — just to make the committee aware, and I think most members will be — is that there is a division of statutory responsibilities between the Ministry of Education and locally elected boards of education. With that in mind, and to re-emphasize, I suppose, the collaborative requirements of the work ahead, we will be working very closely with school boards and trustees and locally elected officials as we implement these recommendations as well.
We take our responsibility to aboriginal students, First Nations learners and their parents and communities very seriously. I wouldn’t want to leave any doubt about that, and we recognize that there continues to be a gap, albeit closing, that continues to need to be remedied, and we will be putting significant emphasis on that moving forward as a ministry.
It is certainly one of our top priorities, and we’ve been working well with…. I note we’ve got a representative from FNESC, Starleigh Grass — it’s a pleasure to see you — here with us, who works as a very close partner with the province and the First Nations community as we move forward to implement positive change in the system.
I would also like to note that the findings and recommendations of the Auditor General align very closely with the ministry’s objectives, and quite a significant amount of work was underway, I think, at the time that the audit was undertaken. As the Auditor General and her staff have noted, we’ve continued to put a high priority on implementing recommendations in a very timely manner moving forward.
The importance of doing all this — I wouldn’t want it to be lost — is not so much about process and organizational structure and policy. It’s about ensuring that those young First Nations learners and aboriginal students have every opportunity to succeed in the society that we are a part of. We’re very focused on what it means for individual people on the ground, so it’s not just an exercise for us, if you will.
One of the quotes that we heard — and you’ll see it on the board — that really spoke to us at the last All Chiefs meeting was from Chief Littlechild. For those of you who were present, you’ll remember him speaking around — and this was with regard to the residential school system and the impact that it had — how education got us into this mess and how education will get us out of it.
We really, truly believe that, and we do think that this is where it really all starts. We take that obligation very seriously.
Moving on. What we’ve done is we’ve grouped our presentation into a number of themes that we saw in the Auditor General’s report. If you look at “Provincial Strategy,” we’ll speak to that. “Accountability” within the system, both for the province as well as school boards and districts. Evidence-based or “Data-Driven Decisions” — how we actually evaluate success and what the importance of that is and then how the province or the ministry can provide support to boards of education and school board officials to ensure that they’re accountable and successful going forward.
[ Page 840 ]
The first place I’d like to start is around the strategic plan. The Auditor General correctly pointed out that the ministry, most certainly over the last decade or so, has collected information and disseminated it back out to the public, the First Nations community and school boards around performance of the system. We would further agree that there is a strong need to have a broad provincial strategy.
We have, in the past, tried to strike the balance around local autonomy of school districts and school boards. But we do agree that at this juncture having a well-understood provincial strategy in place that all parties understand — what the goals and objectives are, what the criteria is that we’re going to be using to evaluate success and how that’s going to happen and be reported out on — is fundamentally important.
You’ll see that that strategy is currently under development, and we expect to have it completed within the next six to eight months. A key component of that — and I’ll re-emphasize — is the collaborative nature of it with the First Nations community and boards of education as well as ourselves. And that’s not to leave out students and parents, who we’ll also be consulting with as we build it.
On the accountability piece, one of the things that we understand, and we believe we have an obligation to address as well, is how we are going to take action when school districts don’t achieve the results that they’re committed to. Then in addition, there was the recommendation around — well, I’ll call it school-leaving certificates or what we call the Evergreen certificate and how that is going to be utilized going forward.
I’ll start with the second one first, as far as actions. We completely agree with the Auditor General’s observations in that regard, and I know that the First Nations community does as well. Of course, we have been in conversations with FNESC about that and understand their views on this very strongly, and we’ve been working well together. You’ll see that the minister will be making public very shortly the ministry’s very specific actions to limit the utilization of the Evergreen leaving certificate, which was originally intended for learners with an individual education plan — special needs learners. We agree that its use has become too prevalent in the system and that it limits First Nations learners’ opportunities going forward without an actual Dogwood graduation certificate.
I might build on that a little bit and just say, further to that, we’re very focused on First Nations learners with a Dogwood certificate — that they actually have the education they require to move forward.
With regard to actions taken, we do expect fully — once we have the reporting from school districts for those that are not achieving the standards that we have agreed to — that the ministry, first, will work with them, providing advice and support from their colleagues, as well as ourselves, as being successful around best practices to ensure that they have the benefit of what’s working provincially. We have a couple of staff members who are focused in a very positive way on working with boards that haven’t been successful. Having said that, we have the tools, particularly with the passage of Bill 11, to compel school districts to make changes if they haven’t on their own to achieve the results that they need to.
The next slide is around data-driven decisions. One of the things that we’re in the process right now of doing in the ministry is restructuring and realigning our resources in the work unit that actually collects data, does the analysis and disseminates the results. With this particular recommendation in mind, we’re actually putting additional staff into that area that will be doing the data analysis. And to your point of more than just sharing results, we’re doing the analysis around: what are the key drivers of success? Where should we invest the resources, both people and financial, that are required to drive positive change in the system? That work is underway as we speak.
Around evaluation. I’ve touched on it a little bit earlier, around targeted funding. I’ll speak to enhancement agreements as well and expectations for regular provincial and district reporting and how we report on progress.
You’ll know and understand from having read our action plan that we have a framework agreement that’s under development this year with school districts. Part of that framework agreement that we negotiate, I’ll say, between the parties on a local level is the development of clear criteria of what it takes to be successful. Districts will be compelled to report out on it on an annual basis.
With regard to First Nation student achievement, that will not only be reported publicly — just to note that — on an annual basis, but it will also be rolled up by the province and reported on a provincial or aggregated level as well, along with the actions that will be taken provincially. We’ll be expecting the same at a local level to really move the needle forward a little more aggressively than perhaps has occurred in the past.
With regard to district supports, I think one of the things that I would like to start by saying is that we’ve worked very hard with the First Nations community to develop the new curriculum. So you’ll have all heard, of course, that the Ministry of Education, in conjunction with school districts and independent schools, has rolled out the new kindergarten-to-grade-9 curriculum. Ten-to-12 is on its way.
There’s been considerable work done on the new curriculum to honour the Truth and Reconciliation Commission’s recommendations around clearly and accurately depicting First Nations’ history, tradition, culture and the impact of residential schools from a First Nations perspective. I think that we’ve done a significant job on that in the K-to-9 curriculum. You’ll see the same sort of thing in 10-to-12. Learning resources have been developed in conjunction with our partners at FNESC, and the Auditor General noted earlier that we had a dedicated professional development day focused on aboriginal learners and the curriculum.
[ Page 841 ]
Further to that, and with regard to expectations around anti-racism work going on in the school system, we’re working with FNESC around an anti-racism research report, which is underway now. We expect to have it in April of 2016 — so coming this spring — to provide us with some insight and advice around what we may do in that regard. It forms a key component of our ERASE Bullying strategy, which is being implemented, and has been for the last year or so, across the province.
This is, actually, one of the findings that we take particularly seriously, as we know and understand that all learners need to learn in a safe environment. It’s not just good enough to be going to school and to be attending school. If young people don’t feel safe in their learning environment, it really handicaps or hinders their opportunity to learn effectively. We realize that, and we know that this element of your report is particularly significant and important. We’ll be putting a strong emphasis on this going forward as well.
So perhaps just in closing, we would like to emphasize that this is important work. The recommendations of the Auditor General, we think, are fair and reasonable and in keeping with our thinking as well — that the ministry not only is committed to taking action on all the recommendations and has been diligent, both prior to the audit report being made public and subsequently since. We’re continuing to see improving results, and that encourages us. But we know that we want to see more, more quickly and at higher rates.
We’re in a time of implementing fundamental change in the education system in British Columbia. We’re really transforming not only the curriculum in the kindergarten-to-grade-12 arena but also how we assess learners, how we report out on learning and how we support learners and teaching methodology. So it’s a time of rapid change and, in that, we want to be absolutely certain that First Nations learners’ experiences are better than they’ve ever been and that their success is higher than it’s ever been.
If we do that, we have the ability to change the life trajectory of these young people that not only benefits them but our society as a whole.
With that, I’ll close.
B. Ralston (Chair): Thank you very much.
I’ve had a couple of members who’ve already indicated they want to ask questions.
Anyone else? We’ll begin with Kathy.
K. Corrigan: Thank you very much for the report and for the response to the report as well. It was a really excellent and well-thought-out report and very useful.
I think it’s encapsulated on page 28, in the key findings and recommendations, where the report says:
“We found that ministry policies for aboriginal education did not change in response to this 2005 commitment. Despite ministry and board intentions to improve aboriginal student results, the ministry did not work with boards, superintendents and aboriginal leaders and communities to develop a shared, systemwide strategy to close the gaps with distinct responsibilities and accountabilities for the ministry and boards and specific actions, targets and timelines. Further, the ministry did not evaluate existing policies for aboriginal education to understand their effectiveness in closing the gaps.”
I think that really says it all.
My question is with regard…. I have many questions, but I’ll ask whatever number the Chair will let me and then get back on the list again.
School completion certificates or the Evergreen or school-leaving certificates. I’m wondering: how significant is this among aboriginal students? Do we know what the number is of students that were finishing with the school completion certificates? Then I have a question about exactly what that means.
D. Byng: We certainly know not only the aggregate number, but we also know the school districts and the number per district. Just in general terms, our observation is that while this isn’t prevalent throughout the province, there are some specific districts in the province where it seems to be predominantly occurring.
It would be 65 First Nations learners who left the school system in ’13-14 out of a total student population of about a half a million, I guess, young people in British Columbia. But I don’t want to minimize that. These are 65 individuals whose lives are potentially being changed by doing this. We still take it very seriously, even if the numbers aren’t high per se.
K. Corrigan: I appreciate that answer, and I agree. We don’t want to minimize it. But I’m glad to hear it’s not more prevalent.
My next question following on that is — and maybe it’s not as concerning now: do you believe that in some districts, the school completion certificate option was being pursued, however it happened, in order to remove those students from the graduation stats? Are those students included in the stats, or are they just removed? Does that count as graduation?
D. Byng: No, it wouldn’t count as graduation because this is not a graduation certificate like a Dogwood.
[ Page 842 ]
K. Corrigan: Would those students count as non-graduates, or would they simply be taken out of the mix?
T. Cadwallader: They’d be counted separately from our six-year Dogwood completion rate.
K. Corrigan: Okay. So they’d be separate.
B. Ralston (Chair): Sorry. I’m not sure that answer was recorded, so perhaps we could just….
D. Byng: Maybe, Ted, I’ll just get you to give the specific answer.
T. Cadwallader: When we look at our graduation results, Evergreens are not included in that result. They’re reported separately.
K. Corrigan: There could be, but it doesn’t sound like it’s a huge number of students. Potentially, in some districts, it could be a way to ensure that your graduation rates are a little bit higher if you remove those students from the mix, if you didn’t think they were going to be successful in graduating with a Dogwood. You don’t think that’s a factor, though, I see.
D. Byng: With the numbers being so low, we haven’t analyzed it from that perspective. We’ve been more focused on: “Let’s make sure that these kids graduate with the right credential.”
K. Corrigan: One more follow-up? Okay. Thank you, Mr. Chair.
On that same page, where you talk about the school completion certificates, which is page 38, the report says: “Another potential indicator of the racism of low expectations is that aboriginal students were almost twice as likely as non-aboriginal students to complete courses that limit their options for entry to post-secondary education. As a result, those students may need to take additional courses to upgrade, should they choose to attend post-secondary institutions, at a cost to themselves and/or their First Nation.”
Would the ministry agree — maybe Mr. Byng — that the move by the government to not have free adult basic education could have a disproportionate impact on aboriginal students, given that comment in the report?
D. Byng: I’m not sure that’s the case, but what I would underscore and agree with is that we need to be diligent to ensure that First Nations learners, like all other students, know and understand their options going forward and what they’ll need, in terms of an education, leaving grade 12. Perhaps, if you don’t mind, MLA Corrigan, I’ll add to your line of thinking: they need not only the right courses but also to have learned the required material in the courses that they’ve taken, such that when they apply to go to BCIT or UBC or wherever it may be, they have the knowledge required to be successful in those programs.
We’re looking at both elements of it, ensuring that they know and understand what the future options are, that they’re able to take the courses that are required to achieve that and that they learn the knowledge required in those courses to enable them to be successful. We’ve seen issues crop up in each of those facets, so we’re focused on all of those things.
L. Reimer: My question is around data-driven decisions and standardized monitoring and assessment. We have talked a lot about this, and I was very involved with my own school district in the Tri-Cities prior to becoming an elected official. I know that there has previously been push-back when you’re talking about standardized assessment and data-driven decisions. So I’m just wondering: have things changed, and is there buy-in by our educators who are on the front lines, essentially, around this sort of thing?
D. Byng: That’s an excellent and a timely question. We’re going through the process right now with not only academics — deans of education and the like from universities — but with members from the education community, practitioners, teachers, folks like FNESC as well, really looking at how we assess learners going forward. So while we’re building the new curriculum, of course there has to be an assessment methodology that mirrors that and then reporting. Most certainly, we’re looking at that element of it now, and it’s currently under development with the broader group.
D. Eby: I don’t know, Mr. Chair, if Mr. Byng is familiar with Cindy Blackstock and the First Nations Caring Society and their advocacy around equal funding from the federal government for First Nations students on reserve, relative to students who are off reserve. The core of the society’s concern is that if you’re a First Nations child on reserve, the federal government funds your education and may fund it, in some provinces, at a level lower than what students who live off reserve receive, functionally establishing a racist policy whereby First Nations children receive less funding for education than other children.
The fact that this policy exists federally is, obviously, in my opinion, an abomination, but I don’t know whether that’s the case in B.C. I haven’t heard this government talk about it. That doesn’t mean they haven’t talked about it, but it might mean that in our province, the federal government funds First Nations children on reserve to a level equal to our provincial government. Whether that’s adequate is another debate. I’m just curious about whether or not First Nations children on reserve receive the same level of funding as children who live off reserve who are not First Nations children in British Columbia.
D. Byng: Well, thank you for that. Of course, you’ll see that I’m getting passed some cheat sheet here.
We do get the same level of funding. There is the same level of funding that exists in British Columbia on and off reserve. Perhaps, maybe, I could, if you don’t mind me adding to the answer…. The province’s philosophy in the Ministry of Education is that there should be equal access to education for all learners in British Columbia regardless of who they are and where they happen to be.
The province is very focused now on how we enable that to occur — it goes, actually, beyond funding on a per-pupil basis — and on how you ensure that individuals in remote and rural locations, where a lot of these First Nations learners tend to live, are able to do that. We’re very focused right now on providing things like the next-generation network, which provides technological access to high-speed Internet and the like and will enable more equitable access to education.
We’re working with FNESC and the federal government on the possibility of expanding those same opportunities to on-reserve schools that are outside of the provincial jurisdiction but nonetheless have young people in British Columbia learning at those schools. So we, I would say, share your views that there should be equal access to education, regardless of who you are and where you live in B.C.
B. Ralston (Chair): Before you go, I’m just wondering. I don’t think this was the subject of consideration by the Auditor General in the report. Is there any additional comment that the Auditor General wanted to make at this time on that aspect of the question? It’s not required, but if you did, I just wanted to give you the opportunity.
S. Dodds: I think just one point to highlight is the targeted funding for aboriginal students in the B.C. public school system. Even for First Nations students living on reserve — although the responsibility for funding their education is federal — when those students attend public schools, school districts are provided the same targeted funding as for First Nations or non–First Nations students living off reserve.
D. Eby: That’s certainly good to hear.
The other question I had was more in relation to my personal experience and some reading outside of the report in relation to the disproportionate representation of First Nations children in low-income families. This is because of the legacy of residential schools, because of racism in employment in the province and other systemic issues.
It seems to me that a kid who goes to school hungry or who doesn’t have stable housing is going to really struggle to graduate. I don’t know how we get there through the measures that are analyzed in great detail in this report. Where does that consideration come into the data which is gathered by the ministry — that there’s a large number of aboriginal kids that don’t have stable housing in our province? This could be affecting graduation rates. Or if we don’t have a breakfast program at the school, that affects graduation rates. Where does that come into your programming?
D. Byng: Thank you for the question. I’m going to turn to Jen McCrea, who has been involved in exactly the same types of issues that you’ve spoken to, and who has probably better firsthand knowledge than I do on that one.
J. McCrea: We agree completely. Kids need to come to school ready to learn, and sometimes that means making sure that they’ve got clothing and food and have had a place to sleep the night before. Some of the indicators and data that we do use in the ministry, early indicators, are for our youngest learners. Some of that information comes through those — so our early learning file. We’re working with UBC on that data and on how we pull from that data and then provide supports and resources.
Schools do have breakfast and lunch programs in order to support students, and we know that teachers go well above and beyond to support kids. Sometimes it means giving them some clean clothes for the day, just to ensure that there aren’t any unfair comments made by other students.
That is a big focus of our Safe, Caring and Orderly Schools file: to make sure that kids have the very best life chances to learn. There are so many facets of that. We use the data that’s available to the ministry. We work with our safe schools coordinators and our CommunityLINK workers, and really try and help understand what they need and what the best practices in other areas are, to be able to share those too.
J. Schafer: Just to add to that, in our work we did take into consideration the influence of socioeconomic factors on student outcomes. A lot of that was thanks to data from the Ministry of Education. They do collect that, and they factor some of that into the funding formula for boards of education. But one of the key points that was motivating us early on in the audit as we started to gather knowledge of business in the area was that the education system has a significant impact on outcomes above and beyond socioeconomic factors.
Factoring those out of the…. All other things being equal, socioeconomic factors determine outcomes to a certain level, but the education system has an impact above and beyond that. That’s why we were really looking at the importance of policy and decision-making in the education system.
On the ground, we certainly saw, within the education system, people working very hard to make those connections and do what they could within their sphere of influence to address some of the broader issues around poverty that students were facing.
S. Robinson: Thank you very much for the report and for the response.
I just have some questions about the ERASE Bullying program and appreciate that there’ll be some research happening around it. I have a couple of questions about it. First of all, do you have baseline data prior to actually implementing it so that you can get some really good sense of the value of the program?
D. Byng: Jen McCrea has been leading that work, and she’ll respond to that.
J. McCrea: We have baseline data, as far as our training goes, around school and educator readiness and ability to respond to issues as they come through the school or as the child reports. What is extremely hard to measure — and something we’re working with researchers on — is how the program has made an impact on an individual child. It is extremely hard to be able to know that a teacher, through the training, has made a connection with a student that may have stopped that child from ending their life that day.
We know it’s extremely important work. We know that there are very complex cases that school districts face every single day, and we have supports in place to help that. We have the training data to be able to guide it, and we are looking at a full program evaluation.
S. Robinson: It’s always hard to measure something not happening.
J. McCrea: Which is also a good thing.
S. Robinson: Right.
If I might, Mr. Chair, I have one other question around that. I think it’s harder to measure, and that’s around systemic racism that is so ingrained that it’s even hard to notice. What caught my attention in the report was the idea that First Nations children were being directed in a certain way. I don’t think anyone would say: “Well, that was racism.” It’s not how people see the world or see, perhaps, those children as having the same kinds of opportunities. So — wanting to do good but really, in fact, limiting their options.
I’m wondering if you’re going to be capturing that data in terms of how the system itself can become the barrier and how to address that.
D. Byng: Maybe I’ll start, Jen. And if you’ve got more to add….
We certainly are looking at that as a potential factor, for sure. We’re looking at the prevalence of First Nations or aboriginal learners in various programs and in various courses and that sort of thing, to see whether they represent the general student body in proportion to the group that they’re in, or whether there’s an overrepresentation in particular areas, like some of the non-academic studies or skilled trades versus higher education or whatever it might be. Although there’ll be a lot of argument nowadays that it’s not so bad to be a plumber.
In any case, we do, do that analysis and look at it to ensure that we’re not seeing trends that might be taking kids in a particular direction.
S. Robinson: If I could just ask…?
B. Ralston (Chair): Ms. McCrea, anything you wanted to add?
J. McCrea: I just think the other piece around it is that if it’s perceived by a student or perceived by their parent, it’s real. So we need to take action on that. And we are, in a number of different ways, ensuring that it’s integrated with school districts.
S. Robinson: Along that line, is that part of what’s being measured? It’s one thing to measure, for the students, what the student experience is on the ground, student to student. But I would really be interested to learn more about how our institutions…. Are they improving around this? How are we measuring that we’re getting better, institutionally, at ensuring that we’re not engaged in racist policy or behaviours or actions that contribute to that?
D. Byng: I suppose the answer that I would give you would be twofold. Most certainly, we’re looking at student outcomes, for sure. Really, that’s where it starts and ends. At the same time, we are actually working quite closely with, as we call them, our partners in education, if you will, on exactly these issues.
Just a specific example. I’ve had direct conversations with Jim Iker from the BCTF, who represents public school teachers, on these issues. And we have with other senior members of the education community who, I’ll note, are on the same page.
We understand that we don’t want this to be occurring in the system and are working collectively to that end. Our measurement of success, obviously, is what’s going on with the students themselves.
J. Schafer: Might I add something?
B. Ralston (Chair): Go ahead.
J. Schafer: On the data side, there is also the student satisfaction survey that the ministry does every year with students in grades 4, 7, 10 and 12, and there are a few questions on that survey that do get to some of these issues. That might be a source — with your ERASE Bullying research — that you could also draw on to dig down into what the changes and impacts might be of any initiatives.
D. Byng: Chair, if you don’t mind, may I add a comment to that?
B. Ralston (Chair): Sure. I don’t want to inhibit discussion, unless it’s off base. But that’s a rare circumstance.
D. Byng: In keeping with the comments from the Auditor General’s staff, what I would say in that regard is we’re actually looking at that exact survey right now and how we’re going to get underneath, if you will, some of that information. We are going to be going out and undertaking further research with those students from this perspective and then actually start getting into focus group discussions and perhaps one-on-one interviews and things like that as well. So thank you for providing the opportunity to add to that. That’s what’s in the works at the moment.
L. Throness: Just some questions, perhaps for Ms. Schafer, some number questions — to clarify the numbers. There are 62,000 self-identified First Nations students in the population. This study focused on the 9,000 that are in B.C. public schools. Is that correct?
J. Schafer: There are 62,000 students of self-identified aboriginal ancestry in B.C. public schools, 9,000 of whom are from First Nations on reserve. We were looking at the entire 62,000 and the entire school system. That 9,000 is just to indicate the proportion of First Nations students living on reserve attending public schools. We weren’t just looking at First Nations students. We were looking at aboriginal students across the public school system.
L. Throness: Did you parse out the numbers, graduation rates for those on reserve versus those in public schools?
J. Schafer: Yes.
L. Throness: What are those numbers?
J. Schafer: The graduation rate for First Nations students on reserve for the same year that we were looking at, 2012-13, was 50 percent, compared with 62 percent for aboriginal students overall.
L. Throness: The promise that the government had made….
J. Schafer: That’s exhibit 8 in your report, for reference.
L. Throness: Okay, thank you. The ambition that the government had in 2005 was to increase the graduation rate by 2015. The figures, I think, that you show are for 2014. Do you know what the rate was for 2015?
J. Schafer: The graduation rate data for 2014-15 just came out in November, right after our report was published, and it was 63 percent for aboriginal students. So it’s gone up by 1 percent from the data that we were looking at.
L. Throness: So we continue to make progress, but we have a ways to go.
J. Schafer: Yes.
R. Sultan: Well, let me add my congratulations to both the Auditor General and the ministry for well-thought-out observations on what I’m sure all of us would agree is one of the biggest and most important challenges we face as a society — namely, educating our young people. I sometimes feel overwhelmed as I try to understand the challenges that you have to overcome.
Turning to the report, the report refers frequently to the gap. As an engineer, it almost sounds to me like an engineering concept. If we just put more steel in there, more energy or something or push harder, the gap will be closed. I’m not saying that facetiously. It’s almost as though what’s required is simply more effort.
I harken back to President Bush, I think it was — senior or junior, I’m not sure — who brought forward No Child Left Behind, with all sorts of rigorous state-by-state measurement standards that had to be met — or else. And by golly, more or less, they did. Now, I guess, with hindsight, it turns out that teachers are teaching the test more than the regular curriculum. There’s a certain amount of hanky-panky in the numbers. The headline in The Economist recently was: “No Child Left Behind is Being Left Behind.” It didn’t work out very well.
My hunch is that if you press the system to show comparable graduation rates over time, by golly, you will. But does that mean that education has been brought up to the same standard? I would doubt it. I would think, as a former teacher myself, the easiest thing to do is just notch those grades up a little bit more to get those pesky people from the ministry out of my life.
My first question is: are you confident that the pressure to produce these better numbers doesn’t merely force the system to say, “Okay, you want better numbers? By golly, we’re going to give them to you”?
D. Byng: All right. Chair, if I can respond to that?
B. Ralston (Chair): Sure. Go ahead.
D. Byng: I would start by saying that we’re conscious of the observation that you brought forward — that this is not purely or even at all a numbers game per se. These are real people that have real lives that we need to influence in a positive way.
I’m married to a teacher, so I get some personal insight into how this may go from time to time as well.
Part of what we’re looking at are not only the graduation rates and transition rates and things like that that we see are key indicators. It’s one of the reasons why we’re actually now working to start to track students post–grade 12. What have they gone on to do? Have they gone on to live successful lives as they’ve determined them? If we’ve prepared them for a particular course of studies or an apprenticeship in a particular trade, did they go on to pursue that?
We are looking at: how do we ensure that they not only have the credentials that they need — to your point — but actually have the knowledge that they need and are well prepared to go off to live their life and pursue a career or pursue post-secondary education in the way that they had intended to? So we’re looking at those pieces, as well, to perhaps minimize the opportunity for manipulation of numbers without actual, tangible change in the system.
R. Sultan: Again, building on some comments that Selina and David made about the environment in which learning is going on, I walked into this room convinced that these differences — the gap, and so on — are primarily a function of the social condition of the community from which these students come. Intrinsically, they’re probably as bright as anybody else, but they start out life with a lot of handicaps — not having breakfast being perhaps the most obvious one on a school day.
Then I learned this morning that only 9,000 of the 62,000 First Nations kids are actually living on reserve, which I guess forces me to think: “Well, maybe I’m falling into the trap of thinking it’s the reserves that have severe social problems and everything in the city is just fine.” But I guess that’s not true either.
I guess my question is: do you try to do correlations and analyses of the environmental factors that are going into these results?
D. Byng: Maybe just a point of clarification as well, MLA Sultan. When we talk about the 9,000 and the 62,000: while there may be 9,000 students in schools on reserves, a large number of those attending public schools may well still be living on reserve and attending a public school that’s off reserve or in the community, right? So it’s not so simple to extrapolate from those numbers where they might reside.
Having said that, we most certainly do look at the other factors that would influence a learner’s education. It’s one of the pieces, actually, that we’ve been working with our colleagues over at FNESC on as well. They’re really driving the message with us that those factors in a young person’s life may very well be a barrier to them in their education and we should be….
This is where it comes back to sort of the data that’s being collected. Now it’s being disseminated and utilized in ways which then point to actions that may be taken on the ground, on a school district by school district level, to help mitigate some of these issues.
R. Sultan: Well, another way of putting it perhaps would be…. Instead of putting more money into the schools, maybe we should be putting more money into the communities. We’d have better results.
D. Byng: It may be. And it may be as simple sometimes as tracking a student’s attendance and, if they’re not showing up at school, having somebody talk to their parents or provide supports to enable them to get to school if transportation is an issue or some such thing. We’re most certainly looking at doing that.
R. Sultan: If the Chair would allow, I’d like a third question, if I may.
B. Ralston (Chair): That would be a fourth one, but given your eminence, I’ll defer to that. Go ahead.
R. Sultan: I still don’t understand clearly the role of the federal government. I mean, periodically we get posturing out of Ottawa, usually around election time, that they’re really going to change the world. At the end of the day, it seems to me as a British Columbian that we are the people who do the heavy lifting with respect to educating, policing, doing all of the socially necessary things that a government does with respect to the First Nations communities. The feds are way off in Ottawa, and I’m not even sure I believe that they send the money, but maybe they do.
Do you have a comment?
D. Byng: Well, I won’t comment on the politics of it, but what I will say is that…
B. Ralston (Chair): That’s wise.
D. Byng: Thank you.
…the federal government is responsible for First Nations education on reserve. No question about that. Maybe just as a follow-up to your observation: there has obviously been a change of government, with a renewed focus on First Nations in Canada, so we are engaging with our federal counterparts now around the level of support that they provide to First Nations learners in British Columbia and whether there are ways in which we can work with them more fully, financially and otherwise, to increase learner outcomes and provide better supports.
J. McCrea: Could I just add something?
B. Ralston (Chair): Go ahead.
J. McCrea: I think it’s also very important to note that B.C. is the only jurisdiction in Canada that has a tripartite education framework agreement, and that is an agreement with the federal government, the First Nations Education Steering Committee and the province. We don’t speak for First Nations students as a province. That is very clearly FNESC’s role. But we do work together.
B. Ralston (Chair): Next I had Greg. I’ve got a long list here, so I hope members aren’t getting impatient, but I’ve noted all your names, so we’ll get to you.
G. Kyllo: One of my questions has already been canvassed.
When it comes to the actual statistics, do we track status, non-status and Métis? The reason I raise that is that my wife was born and raised in England, but she’s Métis. All four of my daughters are Métis. Where do they fit? They are, obviously, of First Nations ancestry. Where do they fit in the actual data?
T. Cadwallader: I can speak to that as well. The reason that we track First Nations status students is that, under a funding agreement with the federal government, those students are a fiduciary responsibility of the federal government. That’s one of the reasons that we track First Nations status and are able to disaggregate them. It’s only been recently, through our discussions around developing enhancement agreements, that we have brought Métis content into the curriculum in certain areas of the province as well, working with school districts.
We don’t disaggregate Métis students, because those students self-identify as being of aboriginal ancestry when they register at the school. On our registration forms, there may be, at the local school level, a box to say, “I’m of Métis ancestry,” to disaggregate that for the purposes of delivering service at the school district and school level but not at the provincial level.
G. Kyllo: The data that’s provided — is it just for status First Nations, or is it all?
T. Cadwallader: The data that we collect is for students who self-identify as being of aboriginal ancestry, and in some cases, in working with the First Nations Education Steering Committee, we are able to disaggregate First Nations status academic performance levels as well.
G. Kyllo: Do we report out, then, on the actual success rates of actual First Nations, on status?
T. Cadwallader: As a province, we don’t do that, no.
G. Heyman: I have three, maybe four questions.
The first one is just one of clarification, as the report notes or states that in 2005, the government committed to close the gap in education outcomes. Laurie Throness referred to that as a government ambition. I just want to clarify with Mr. Byng that in fact this was a commitment and therefore part of the mandate.
D. Byng: It was a commitment made by the Premier of the province at that time.
G. Heyman: And it then became part of the mandate of the ministry in some official written manner?
D. Byng: I can’t answer that question, George. I can certainly come back to the committee with the specifics of that. I just don’t have that information, back to 2005, with me right here.
G. Heyman: I look at the conclusions of the Auditor General’s report — I’ll ask the Auditor General, or staff of your choice, to respond first and then the deputy minister — that the ministry had not provided the education system with sufficient leadership and direction to close the gaps.
In your observation…. You may not have been in a position to observe this, but it’s certainly the nature of our discussions here and reports by your office. One of the things we tend to focus on is failures of ministries or public bodies or particular programs to provide sufficient rationale or monitoring of any number of initiatives. That’s essentially one of the things you look at.
In this case, would you say that the ministry’s lack of sufficient leadership and direction with this particular initiative was different than the leadership and direction they would provide to other initiatives within the ministry? Or is this systemic throughout the ministry?
S. Dodds: We just focused on the aboriginal education piece, but what you’re looking at is the mandate from the School Act and the education policy order. It’s looking at the collaboration with boards and providing that leadership on a focus. We were looking at the aboriginal education piece. We can’t really speak to any of the other programs.
G. Heyman: Well, then let me follow up and ask Mr. Byng whether the level of leadership and direction of other ministry programs and initiatives is more concrete, tangible and fulsome than that provided to this program or if it’s a systemic issue.
D. Byng: Thank you for the opportunity to comment on that. I think it’s actually a material question worth considering, for sure. What I’d harken back to is the philosophy of the province historically as it relates to school districts and school boards that there’s a clear division of statutory responsibilities — legislation that clearly outlines our respective roles, responsibilities, accountabilities.
One of the premises around that has been the autonomy of decision-making of local school districts and school boards by locally elected officials. With that in mind, the province has tended to take — and this is a broad statement, so I would say it applies equally across the variety of issues that the ministry engages with school districts on — typically, a collaborative approach on issues such as this and others around education.
I would point to even the development of the new curriculum, where we engaged about 100 teachers from around the province to help build the new curriculum. It wasn’t sort of a top-down initiative by the Ministry of Education. I think, philosophically, our view and engagement with school districts is pretty consistent across the piece. I would say what’s shifting a little bit now, as the Auditor General correctly points out….
Having said all that, we do have a leadership role that the Ministry of Education needs to take with education in British Columbia, so we do need to not only monitor and track and disseminate information around performance, which is what we’ve done historically, but also to provide support to school districts that aren’t achieving the outcomes that we all agree are appropriate.
We are working on that element of things and how we do that in an effective way provincially. Further to that, I think, the legislation that was passed recently, Bill 11 — which gives the minister the ability to provide direction to school districts — is a tool in the toolbox, sort of an option of last resort, if you will, for engaging with school districts that continue to be unsuccessful. It’s our expectation and our hope that that won’t be necessary, but we do have that option, should we need it, moving into the future.
You should expect to see us continue to work with the First Nations community and school districts very much in a collaborative manner. The outcomes that we collectively want to achieve are well understood and agreed to, so it’s a matter of how to get down to business and make that happen.
G. Heyman: I have a follow-up to this, and then I have a third question, if that’s all right. Or do you want to put me back on…?
B. Ralston (Chair): Well, maybe a follow-up to this topic, and then we’ll put you back on the list.
G. Heyman: My follow-up to this is, first of all, a slightly editorial comment.
I regularly attend Vancouver school board meetings and meetings of parent advisory committees where they are regularly discussing the use, by the ministry and the minister, of financial levers that effectively curtail their autonomy.
Notwithstanding that, you gave a very long answer to my question, but you didn’t answer it. The question was whether, in this instance, aboriginal education and closing the outcomes gap, you provide, in your opinion, the same level of leadership and direction that you do in other initiatives and programs of the ministry. That should be a relatively simple and short answer.
D. Byng: Sure. I’ll give you one. I would say that the answer to that is yes, we do, but perhaps with greater emphasis than we do in many other areas.
G. Heyman: So in other words, if the observation by the Auditor General is that there was insufficient leadership and direction in this program, we can assume that may be the case with other programs as well?
D. Byng: I wouldn’t make that correlation.
V. Huntington: I have three or four questions.
B. Ralston (Chair): Everyone has got four questions. I think I’ve gone too far in this now. I’m going to ask members…. I think one or two follow-ups are reasonable. Four — I think then we’re starting to cut into each other’s time.
I’m sorry to pick on you, Vicki, but I feel I’ve made a mistake as Chair, and I’m going to try to reel it back in, if I can.
V. Huntington: It’s always the independent that gets picked on. [Laughter.]
What is the percentage of aboriginal students that are graduating with options for post-secondary? Do you have that refined data?
D. Byng: No, I don’t. We do have a 63 percent graduation rate, of course, that then allows any student that graduates to go on to pursue a post-secondary education.
V. Huntington: That doesn’t include the completion….
D. Byng: That does not include the Evergreen certificate, no.
V. Huntington: Okay. I know B.C. was an early leader in placing an elementary school, leased, on a reserve. I don’t know how many provincial public schools are on reserve throughout the province. Certainly, John Field in Gitanmaax has been there for many, many years. Are you tracking data from those schools on what the performance level is of students who are in a provincial public school on a reserve, and how many of those are on reserves?
D. Byng: Sure. Chair, we can provide the specific detailed answer to the question of the member. We don’t have that information available to us right now, if that’s acceptable.
B. Ralston (Chair): Certainly. Those responses are noted, and there will be a letter coming to you just to remind you of your commitment.
D. Byng: Sure. We’d be happy to do it.
What I would say, as just an anecdotal answer, is that I was just on Haida Gwaii. You may be familiar with it, I’m sure — the little community of Queen Charlotte and then Skidegate. There’s a rather unique arrangement there where there’s a public school on leased land from the First Nations. There’s a primary elementary school in Skidegate. So all students from both communities attend that school.
Conversely, there’s a high school in Queen Charlotte that all students — First Nations, local kids — get bused back and forth and attend that school as well. So it’s not unique, although, I would say, it’s not prevalent either. But I would say that school districts and school boards are open to those kinds of ideas. Most certainly, we’re looking at learner outcomes in those areas as well.
V. Huntington: I’d be really interested in learner outcomes. That’s exactly the situation I’m talking about in Gitanmaax. I was the manager in Gitanmaax and lived on the reserve for a number of years. The elementary school for the whole catchment area is on the reserve, and likewise, right near it is the high school for the catchment area. It would really be interesting to know what the outcomes were in that situation.
Similarly, in the more rural and remote areas, other than the traditional occupations — fishing, to some extent, forestry, trapping — the vocational role models are really restricted. Like, they are teaching, nursing — if you see the nurse very often — and police. Those are the three vocational role models. So there are quite a few First Nation teachers coming through the system.
I’m wondering if you’re tracking data about the success rate of aboriginal students, the outcomes with First Nations teachers, whether there’s a noticeable difference in outcome.
D. Byng: We don’t have that data.
V. Huntington: I think it would be really interesting.
D. Byng: That’s not one of the data points that we collect.
What I will say is this. We’re working with deans of education at the post-secondary institutions to put a greater emphasis on encouraging First Nations learners to become teachers, because we do see a real benefit from having members of those communities actually teaching in those communities.
The biggest challenge so far has been, actually, that once you get them into schools, they get recruited to other positions — probably not unlike the one that you were in — in the band office, and others. It will continue to be a big priority of ours.
V. Huntington: That whole issue of vocational role model is a very restrictive world view when you’re on a reserve. I don’t know what the system can do to broaden that, but it might provide more opportunity, ultimately.
Can I ask another one?
B. Ralston (Chair): Okay. We’ll have a response to that, and then we’ll move on.
D. Byng: What I would say, Vicki, is this. We do see access technology helping to bridge that gap in rural communities as well as on reserve. It’s one of the reasons why we’re putting a real push on high-speed Internet access — to allow greater reach outside that community to other opportunities.
V. Huntington: The other….
B. Ralston (Chair): I think that’s about five now. I think I’ve been….
V. Huntington: No, they’re all numbered, right here.
B. Ralston (Chair): Well, there’s the intermediate comments. I’ll put you on the list. We’ve got a couple of hours here, so we’ll get back to you. There are a number of other members who are second-time questioners.
I still have Lana and Simon, and then I’ll begin with a list of a number of people who asked one question already — or had one opportunity, I should say.
L. Popham: I just have one question. It goes back to my colleagues Robinson, Eby and Sultan and the idea of outside influences that affect learning outcomes. I’ve always been interested in this topic, generally, in the education system. Specifically, I’m wondering if there are any records or studies being done on eye health and dental health as they affect learning outcomes.
T. Cadwallader: We’re not doing any research on that, but I’m pretty positive, knowing the scope of North America, that somebody is — but not that we’re aware of.
L. Popham: There’s nothing being tracked?
T. Cadwallader: No, not through the Ministry of Education.
S. Gibson: I just have one question.
B. Ralston (Chair): You’re allowed two.
S. Gibson: It’s a three-part…. No, I’m just kidding.
B. Ralston (Chair): I was waiting for that.
S. Gibson: I taught First Nations students for many years at the university level and was certainly conscious of some of the additional challenges to work with those students to ensure success. I think, indirectly, that your ministry discovers that as well in the challenges that they face.
I guess my question is maybe a little more philosophical. The idea of high school graduation is a colonial concept, obviously. I mean, that wasn’t something that…. Before we got here, the First Nations didn’t worry about that very much. But now that we’re all a collective and we’ve got this desire to succeed as an aggregation, which we believe in — I totally believe it; I’m not disputing that — my question is: how do we tailor the pedagogical style to ensure that the success that we aspire to is culturally sensitive but also matches our expectations in western civilization? How do we juxtapose those? That’s my question.
D. Byng: Sure. That’s a good question as well.
J. McCrea: Absolutely, there needs to be a number of supports in place. We’ve started that work right with the new curriculum, in the redesigned curriculum, ensuring that there was aboriginal representation on the curriculum teams and that we also have accompanying documents, such as the aboriginal perspectives document that we’ve put out, which is a teacher-training resource.
How to incorporate aboriginal perspectives into every lesson plan. Inviting First Nations community into school districts to look at how we can use the lessons from the First Nations community in a math class. Really challenging educators to think differently. Also looking at the assessment piece of work, because we have to know how all kids in the province are doing.
Balancing out those different pieces around the aboriginal education agreements, on the ground in communities, between First Nation school districts and the province, but also taking it through the curriculum work and the redesign work that we’re doing with FNESC, ensuring that the Evergreen is used appropriately. There are a number of pieces that all are coming together in that work.
T. Cadwallader: I’d like to also recognize our colleagues in the post-secondary world — their teacher-training programs requiring all teachers who are in training to take courses on aboriginal education. Also, as we said, around the K-to-12 aboriginal education table, we have representatives of the B.C. School Trustees Association, the Principals and Vice-Principals Association, and the post-secondary First Nations Education Steering Committee. We’ve all recognized that each of us has a role to play in changing our society. From all of our conversations around there, all of us recognize that as an issue, and we all have a role to play in making it different.
B. Ralston (Chair): Thank you.
Now I’m going to move to a list of those who have asked one question. It would be Kathy, Laurie, David and Vicki. Is there anyone else who wants to get on that list? George. Go ahead.
K. Corrigan: I wanted to ask a couple of questions about recommendations No. 4 and 3. On page 34 of the report, in the key findings, the report says: “We found that the ministry had not evaluated the effectiveness of targeted funding or enhancement agreements.” Then a little bit later: “We found that the ministry had not evaluated either the relationship between the use of targeted funding and outcomes for aboriginal students or the effectiveness of its funding model.” Then a bit later: “After 2005, when government made its commitment to close the gaps by 2015, the ministry made no change to its targeted funding policy.”
The response to the recommendation that comes out of that, to evaluate the effectiveness, is on page 13. Two things are mentioned there. Well, a few things are but two main things. One is that there has been a contract awarded, I assume, to evaluate the aboriginal education enhancement agreements.
Just as an aside, I was a school trustee for nine years in Burnaby and certainly was really pleased when we finalized and worked with the community to sign our aboriginal education enhancement agreement. It was a long process, but it was very worthwhile and rewarding.
So that’s happening, and I’d like to hear a little bit more about exactly what that is. But also, it says: “Furthermore, in keeping with this recommendation, the ministry is undertaking greater evaluation and assessment of the programs it funds to ensure the desired outcomes are achieved and future funding is targeted at those areas delivering the best results.” Recognizing the importance of a certain amount of autonomy of school districts, and I certainly do appreciate that, I’m pretty surprised that it has gone on so many years with targeted funding and no real evaluation of whether or not that targeted funding is working.
I still don’t see in the response, other than the “look at the education enhancement agreements,” much that is concrete there when it says that “the ministry is undertaking greater evaluation and assessment.” That kind of concerns me. So I wonder if you could explain a bit more about that.
D. Byng: Sure. I’d be happy to. I’ll pass the baton over to Jen in a little bit to talk about what we’re doing on the enhancement agreement side.
Just quite specifically, what we’re doing within the ministry is restructuring one of our business units and actually adding a couple of individuals to it. So we’re putting more resources of a technical nature to the data collection and analysis work that’s done by the province.
So in very specific terms, we’re doing that. The analysis that they’re going to be doing is not only going to be reporting out on, “Here’s what the performance of the system was for individual school districts,” but actually driving to the point that the Auditor General was making in her report around: what are the levers that are being utilized. And back to your point around the effectiveness of this funding, what are the levers that are actually driving change within the system, and how can we actually target or be more specific with our funding and activities and policies and the like? We are quite specifically doing those pieces in addition to doing the analysis and evaluation of the enhancement agreements themselves, working with FNESC and others in the sector.
I don’t know, Jen, if you want to add to that bit of work.
J. McCrea: Yeah. A couple of specific pieces that the enhancement agreement review will be looking at are ensuring that aboriginal voice and aboriginal people are involved in all decision-making at the district level — and we extend that into the provincial level as well — that aboriginal students are seeing greater success as a result of the enhancement agreements, and that aboriginal culture is reflected in the classroom. With all of those pieces working together, we expect an increase in completion rates.
K. Corrigan: Can I have another question?
Okay. Thank you for that. My next question has to do with recommendation No. 3: “We recommend that the Ministry of Education take action when districts have not achieved expected results for aboriginal students.” It sort of fits with No. 4.
Maybe it’s because of the desire to respect the autonomy of school districts, but again, it seems a little nebulous — what is being proposed. It says:
“The ministry currently works collaboratively with school districts to analyze results and provide opportunities for districts to share and discuss successful strategies. These opportunities include provincial gatherings during the year and learning and resources shared by the aboriginal education enhancement agreement coordinators as they travel the province.”
Is it because of wanting to respect the autonomy of school districts that there are not what seems to me should be more concrete strategies? Or is it just that these strategies are being identified as we speak?
D. Byng: I’ll start by saying that there are some specific things that we’re doing and actions being taken going forward. Perhaps I’ll put a little more meat on the bones of what you read in the action plan.
K. Corrigan: Thank you. I’d appreciate that.
D. Byng: Yeah, no problem.
Within the ministry, we do have two individuals whose jobs are to look at where we’re being successful provincially — so really sort of look at the school districts. We’ve got some that are doing very well — that aboriginal learner outcomes are on par with the general population, for example. We really want to get underneath that and know and understand what’s working well.
The other part we are mindful of is that situations are unique around the province as well, so what works in one area may or may not work in another. However, we do get underneath the successful practices in those school districts. Then the idea is, obviously, to share those learnings with those that are not being that successful.
What we do try and do is two things: have our staff sit down with the districts that are having challenges, work with them to figure out where they’re falling short, so to speak, and try and share with them not only tools and resources from the other districts but marry them up together so that there can be that learning within the system going forward. We try and do that and work in a collaborative way, so to speak, intentionally, in that regard. That’s certainly our intent and will be our starting point as we go forward.
Having said that, there are tools that the ministry has to direct districts, if need be. And you’re quite correct in your observation around a desire to certainly start by respecting the autonomy of school districts and school boards. However, also recognizing that the province has an overarching responsibility and a mandate to provide a good education to all learners in British Columbia whereby, if we need to, we can bring a special adviser on board in a school district to take a look at what’s going on and provide advice and recommendations to the board around what to do or for the minister to, if necessary, direct that school district.
As we go down that sort of ladder, if you will, obviously we would expect these to be very infrequent or rare occurrences, but there are the tools in place. I think the reason we were starting where we did in the action plan is that we really do expect to be successful with the collaborative approach. But we do have other strategies to employ if need be.
K. Corrigan: Thank you.
I’ve got more, but maybe you could just put me on the list again, Chair.
L. Throness: Just a couple more clarifications about numbers. I don’t know who can answer these.
According to page 24, there are 1,200 students who live off reserve who are eligible to attend school on reserve and choose to do so, at a cost of about $14 million. I assume that those are considered with the 6,000 that are there. What is the graduation rate of our provincially funded band schools on reserve? Do we know?
S. Dodds: The graduation rate is aboriginal students across the province in provincial schools. So as you’ve noted earlier, the 9,000 students that are living on reserve attending public schools are part of that graduation rate. But we don’t have the graduation rate for First Nation students in reserve schools.
L. Throness: I was told that it was 50 percent.
J. Schafer: Those are attending provincial public schools. So the question…. The 1,200 students are attending schools on reserve that are not provincial. They are funded federally, so the data on those students is not provincial.
L. Throness: I thought you said the 6,000 on-reserve students had a graduation rate of 50 percent. Is that wrong?
J. Schafer: No. The 9,000 students living on reserve attending provincial public schools had a graduation rate of 50 percent. The 6,000 who are attending on-reserve schools, federally funded, we don’t have data on. That’s federal. They aren’t publishing that.
L. Throness: Okay. The other question I had was a bit of a disturbing….
B. Ralston (Chair): Perhaps if I can stop you there. It looks like Ted had a response on that question.
T. Cadwallader: No, that’s fine, Chair.
L. Throness: The other figure I saw, which is a bit disturbing, is on page 19. It said that in 2011, 29 percent of aboriginal people in Canada had not completed secondary school, which implies a 71 percent graduation rate, which is far higher than B.C.’s.
Is that a correct analysis? Why would there be such a discrepancy? Are we not meeting Canada-wide rates?
J. Schafer: That’s for people ages 25 to 64. The graduation rate that we’re referring to is the six-year completion rate. That’s students who entered grade 8 and graduated within six years. After that time, students may still graduate — adult Dogwood or going back to adult education. So by the time they’re 25 to 64, they have graduated in other ways. But the education system’s measure for success within B.C. is that six years. The idea is there’s no reason that aboriginal students should have a lower rate than non-aboriginal students for that six-year completion.
D. Eby: I was intrigued by a comment of Ms. McCrea’s about the tripartite agreement that is unique in British Columbia. I try to catch up a little bit in between the breaks and questions. In particular, Ms. McCrea said that the ministry couldn’t really speak for aboriginal children as a result of the tripartite agreement. There’s a guest in the room who may be able to do that. I don’t know.
I guess the question I have is for Ms. McCrea. Is there somebody the committee should be hearing from in relation to the recommendations that have been made here? The Auditor General said they’ve consulted very broadly in terms of the ministry’s compliance with the recommendations and the recommendations generally from the First Nations Education Steering Committee.
J. McCrea: We are absolutely working hand in hand with the First Nations Education Steering Committee in this province related to education.
I know, Ted, you’ve got a little bit more detailed information on this one.
T. Cadwallader: The First Nations in British Columbia, as all of you would know, have elected governments, chiefs and councils, who speak for those chiefs and councils. When we sit at the tripartite education framework agreement table, British Columbia — our representatives there — speaks on behalf of the Ministry of Education and the education system, the students who are in those education systems. We represent those students who are attending provincial schools. First Nations, when they sit at that table, are fully capable, confident and able to speak on behalf of First Nations in the province, and Canada speaks on behalf of the government of Canada.
To clarify, when we’re talking about students who are attending our schools, we’re fully capable and confident of speaking about our students who are attending public schools. When the questions arise that are specific to First Nations–status students, the representation is there to speak on their behalf.
Frankly, what happens at that table is the three groups are also trying to collaborate in such a way that we don’t have jurisdictional issues, and we’re trying to do the best that we possibly can, wherever students attend school in British Columbia.
D. Eby: I guess in terms of my understanding of the recommendations from the Auditor General, whether they’re being followed or not by the ministry, it’s unusual to have a situation where we’re talking about First Nations without the actual First Nations representatives at the table and participating and providing that feedback. In fact, it seems to me that the ministry has brought someone along who has some role in this. I don’t fully understand what, because I’m just learning about this myself. It’s not meant as a criticism.
I guess what I’m asking is: to make our understanding complete, should we be endeavouring as a committee to bring a person from the First Nations part of the tri-part group to come and to comment on the recommendations and the implementation? Are we missing something at the table, is my question.
D. Byng: I think I would just, perhaps, wrap up the ministry’s answer by saying this. While we work very closely with FNESC and we value the relationship — and I think we have a strong working relationship — they speak on their own behalf.
If you have a desire to hear directly from them and their perspective on this report…. I know that they worked very closely with the Auditor General during their review and certainly with us as we put forward our response to the Auditor General’s report. If you want to hear their individual voice, certainly that’s the committee’s prerogative to consider.
B. Ralston (Chair): That’s something, perhaps, that the Deputy Chair and I, then, can take under advisement and consider. I don’t want to put anyone on the spot in this particular proceeding, but we do have the discretion of meeting again, and it’s a suggestion that we’ll entertain.
V. Huntington: It’s a question for the Auditor General. In looking at the action plan, there is a great deal of narrative in it. I’m wondering. Is it specific enough for your purposes to provide a good evaluation of progress?
S. Dodds: We have reviewed the ministry’s action plan. We recognize the report came out in November, so it is an early response, and the request for an action plan will be coming out again at a later date. We would like there to be some time for the ministry to make progress on the plans that they’ve identified and to see where they’re at with the next action plan and then to make that decision.
V. Huntington: On first reading, it’s kind of difficult to pick out precisely what the action is going to be from some of the narrative. So I just wondered whether it was going to be something that would be easily followed up.
D. Byng: Thank you very much, MLA Huntington. One thing I would note is, with our response to the action plan, what we’re really trying to do at this point in time…. We do want to get to measurable actions with timelines associated with them. Our intention is to be at that place. Where we could, we attempted to put those in now, just for the committee’s benefit and also the Auditor General’s.
However, with regard to some of them, we’re involved in a collaborative conversation with school districts and FNESC and others to really hammer down exactly what’s the right sort of response. We have a broad response to the recommendations, and we agree with it. We are consulting with the education community and the First Nations community to get into the specifics of it. We’ll certainly be happy to provide those to the Auditor General and back to the committee at some point in the future, if that’s desired.
V. Huntington: I’d be interested in seeing some of that.
If I could just make a comment. The whole report is, basically, from my reading of it, a discussion of the difference between collecting data and evaluating data. I think that evaluation and how you’re doing it would be a very important component of the action plan.
G. Heyman: Picking up on Vicki’s comment, my question will start with the Auditor General’s staff and ask the ministry for a response, as well, to the conclusion that the ministry undertook a limited analysis of the wide range of student outcome data.
Is that because the analysis was consistent with the analytic framework that’s used for outcome data in other areas of the education system and that’s not adequate or because different levels of analysis were applied than are normally applied or because, in this particular instance of outcomes of education for aboriginal children, a different analytical framework is required that perhaps takes in a different social and cultural context?
S. Dodds: I think I’ll start with this, and I’ll pass it to Jessica.
The findings in the report…. The ministry as we know it has collected really extensive data for years and has made that data available. But what we were looking for was using that data to understand the performance around the province — as the deputy minister has noted, there are some districts that are seeing very successful results, to understand why, and for districts that are not, to understand why not. There is the ability to do that analysis to have better information. It’s not going to have all of the answers — but to better inform that.
As the ministry has pointed out, they’ve been saying that they’re identifying additional staff to do that analysis. What we were seeing was just limited analysis from a capacity, a time perspective to really get underneath why the results are the way they are.
Jessica, do you want to add something?
J. Schafer: We did look back historically at what the ministry was putting forward publicly with its data. What we found was that the ministry gathers a wide range of data on student outcomes and also on the profile of students in the system and publishes that data.
Early on, there were reports that analyzed and commented on the results and drew out the implications of that. Then there was a move away from that commentary around the mid-2000s. Since then, it’s really been the ministry providing a system with the data without commentary. That would be across the system — not just for aboriginal students.
All of the data is there. It’s presented, but it hasn’t been analyzed and interpreted for public use.
D. Byng: My comments would actually echo those of the Auditor General’s staff as well. Your observations are consistent with, certainly, our thinking as well.
MLA Heyman, just to respond specifically to your question around: is it the same in the First Nations arena or aboriginal learner arena versus the rest of the ministry, or different? It would really be the same approach to data analysis. I really believe that this is based on the philosophical thinking around the ministry, of course, collecting and reporting out on the system’s performance, sharing that data with school districts to allow districts to formulate their plans on a district-by-district basis.
Having said that, at this point, just to reiterate my previous comments and commitments, we have added additional resources and are taking a stronger leadership role, if you will, at looking at the drivers in the education system more broadly as well as within the context of aboriginal learner outcome and to work with school districts in perhaps a little bit more fulsome way than we have in the past to drive more positive change.
B. Ralston (Chair): I’m going to suggest we take a brief recess here, about five minutes. Then there are a couple of more questioners. We may well conclude early, but I just want to take a quick break. So if we can take five minutes, and we’ll reconvene in five minutes or so.
The committee recessed from 11:13 a.m. to 11:28 a.m.
[B. Ralston in the chair.]
K. Corrigan: This is not directly related to this, but I think it’s certainly relevant. I believe one of the recommendations of the Truth and Reconciliation Commission was that it would be an improvement and that it’s important that there be more teachers of aboriginal background teaching in our schools.
There seem to be some targeted measures by First Nations and other organizations to support aboriginal youth going to law school and other places — business school, and so on — but maybe not so much in terms of teachers. I’m just wondering whether the ministry is involved or has thought about any strategies in order to increase the number of aboriginal teachers in our province.
D. Byng: I’ll turn to my staff here in a moment, but the short answer is yes, most certainly. We agree with the finding of the Truth and Reconciliation Commission in that regard.
On a personal level, I’ve been actively engaged in conversations with the heads of post-secondary institutions, as well as deans of education, around: how do we recruit and equip aboriginal citizens to become teachers in the system and then retain them? Retention is actually one of our biggest challenges as well. So most certainly we share that philosophy, and there’ve been discussions at a senior level around it with specifics.
I’ll turn to some of the staff here.
T. Cadwallader: I don’t know if I can add much to that. We sit at the K-12 Education Partners Table. We recognize that’s an issue, and Deputy Minister Byng indicated the recruitment and retention piece of it.
The retention piece of it is quite difficult, but we do work with our colleagues at BCTF and BCSTA at the K-to-12 partners table. We have a working group around that. We have looked to find out the numbers of aboriginal teachers who have graduated, how many are currently in the system, what roles they are playing within our education system. So we do recognize that as an issue.
K. Corrigan: I’ve got an endless number of questions, so just stop me….
B. Ralston (Chair): Okay. Well, I don’t have any other questions on the list. So I’m not going to let you go on endlessly, but please feel free.
Okay, we’ve got another question.
K. Corrigan: I have a question about readiness to learn in kindergarten. On page 40 of the report, it says: “There is no ministry requirement for districts to assess student readiness to learn when they first enter school or to carry out any assessments of student skills in the primary grades. We found that district practices for early assessment varied….”
I know that in the school district that I was a board member in, in Burnaby, that we had the early pre-K assessment done, and it was fantastic in terms of identifying even specific neighbourhoods where students were not ready in those six categories — social and, you know, the six categories.
Is this not done across the province, and if not, why not?
D. Byng: I’ll let Jen McCrea speak to that.
J. McCrea: Yeah, there are a number of programs that we do run provincially, starting right with our StrongStart program, and we have a number of outreach programs around the province that go right onto reserves in looking at some of those hard-to-reach rural communities.
We also have what’s called Ready, Set, Learn. This is an introduction for students to come right into the school before they enter it. And yes, absolutely, districts and especially kindergarten teachers do a number of assessments with the kids around readiness to learn at district levels. There is nothing at this point provincially.
K. Corrigan: So in my district, where we did…. Every student was assessed when they came into the kindergarten system, and it provided very rich information.
Again, respecting the autonomy and doing things through best practices, I suppose, but is there any thought of trying to get that data, because it’s so important to get that data when a child is young. It can provide supports and work — I mean really make a difference — all through the lives of those students.
J. McCrea: We are currently reviewing our StrongStart B.C. program, where we expect that type of data to be coming up and identified.
K. Corrigan: Okay.
Page 43 of the report talks about analyzing data to understand outcomes. I know we’ve talked a little bit about it, but there was an original report called How Are We Doing? in 1999, and that report says it “explained the data on aboriginal student outcomes and identified what needed to improve. Over time, this analysis decreased, and by 2007, the ministry no longer provided an interpretation of the data.”
I’m just wondering why it is that that data just stopped being analyzed and then the information distributed to boards. What was the history of that?
T. Cadwallader: The Ministry of Education produces a provincial report each year — the How Are We Doing? report — and has since the date that you cited there. Our most recent report was released in November.
We also distribute and make public school districts’ How Are We Doing? reports on aboriginal student performance each year. The Ministry of Education, by department, will analyze that, and my department in particular, because it’s the aboriginal education department, will look at those results.
Our two secondees, who work with our enhancement agreement coordinators, will distribute those results to school districts. So we do use that in our analysis and in our discussions directly with school districts on request of school districts.
Sometimes, by invite by the Ministry of Education, school districts will come and attend the Ministry of Education to analyze those results. But as a practice, we don’t provide an analysis of those results back to districts in a report. It’s somewhat ad hoc. If it’s a district that we’re concerned about or a district that’s performing at a very high level, we’ll have those one-on-one conversations and share that information that we have with other school districts, but not as a systemic practice across the ministry.
K. Corrigan: In another part of the report — I can’t remember where it is — it talks about the accountability reporting from school districts. It intimates that it’s not working very well, that there’s a huge amount of reporting. I remember this from being a trustee as well, and it may or may not have changed. You had to do accountability reports. You had to submit mountains and mountains of data. This report says that that’s not necessarily working, and we need to improve that.
What’s the disconnect? The report here says the “analysis decreased, and by 2007, the ministry no longer provided an interpretation of the data.” Why would that have happened? You say one-on-one things can happen, but one of the whole premises of this report — or conclusions of this report, I guess — is that providing that data would be important to aboriginal students’ success. You just used the words “ad hoc.” I don’t quite understand that. Is this an issue of resources?
D. Byng: Maybe I’ll jump on that one, Ted.
I’ll bring it back, actually, to comments we made earlier about shifting the work that we’re doing from collecting data and reporting out on the performance of the system to doing the analytics and understanding what the drivers are, and then starting to work systemically to implement those. We have shifted resources and our perspective on that as well.
S. Robinson: It’s always interesting how someone’s questions sort of trigger another question. It’s this idea of retention as an issue, combined with…. On page 37, there’s a little box that talks about providing cultural awareness training. It appears from this, it looks like, that each district gets to do their own.
I guess a couple questions arise from both the question and that box, around who’s monitoring the retention issue and what strategies are in place to address that. Is that sort of at the ministry level? Or is that each school district looking at that on their own? It’s just having a sense of who’s paying attention.
J. McCrea: I’m looking to Ted to give us a specific example.
T. Cadwallader: Your question around retention — was it about retention of students in the system? Or was it retention of aboriginal teachers in the system?
S. Robinson: Aboriginal teachers.
T. Cadwallader: I’ve mentioned earlier that we don’t track that. So it’s hard to say right now how many aboriginal teachers we have in the system. I’ll just leave it at that. It’s hard to talk about that. If we were talking about retention of students and cultural awareness, then I could talk more about that.
S. Robinson: If I might, Mr. Chair…. Given the comments earlier from the Truth and Reconciliation Commission around the importance of having aboriginal teachers as part of the system, if we’re not tracking, we don’t currently know how many. Is that something that we’re looking to address and identify and work towards, given the value it would bring to the system?
T. Cadwallader: Where we start from is that we recognize that we have thousands of teachers across the system who are currently working. They’ve expressed a need to understand better how to implement aboriginal content in their classrooms. They need to raise their cultural awareness from their local territory. That’s the place that we’ve started from. It’s to say: “Okay. Our teachers have expressed a need for this. So I’ll be in Invermere on Monday night and spend Tuesday with them doing exactly that.”
I’ve had staff members sit with districts and their principals to raise their understanding of what it is their role is as educational leaders within their system. We were in Port Hardy and spent all day with all of the teachers in their school district — with a number of keynote speakers, myself included, and other members from the Ministry of Education — to raise that cultural awareness.
That’s the track that we’ve taken: recognizing that our teaching staff across the province has recognized that they have a need. We’ve tried to fill that need, working with our school districts to do that.
S. Robinson: If I could just come back to the Truth and Reconciliation around the idea of having or putting together a concerted effort to identify and support aboriginal teachers, making sure that they’re represented appropriately in our school districts across the province. Is that something that is currently being undertaken by the ministry to address, to identify, to build, so that we can say that we have put in a concerted effort?
I’m concerned that there might be some barriers. If we don’t pay attention, then…. We can have all the cultural awareness we want, but unless we see aboriginal teachers reflected as part of our school community, then we’re still not doing the best we can.
D. Byng: Maybe I’ll pop in on this one. The issue is germane. It’s most certainly one that’s under discussion, as I was noting earlier, at a high level. The head of the BCTF and I have been discussing it, like I said, with educational institutions and, of course, with FNESC as well. We’re all on the same page for certain with the Truth and Reconciliation Commission around the need for more aboriginal teachers, for sure.
We’re looking at a couple of things, not only how we recruit more appropriately to have them teaching in the local community but if there are systemic barriers to getting them there, not only educational — and I’m talking post-secondary education — but on the retention side, to keeping them in the system.
As we talked about earlier, the comments about bias and racism sort of being elements that need to be dealt with in the school system…. Certainly, when I talk to members of the First Nations community, First Nations teachers at times feel that same stigma, if you will, as well. That formed part of the conversations I was having with Jim over at the BCTF.
The conversation is alive, for sure. Your point is well taken around the need to have a baseline understanding of how many you’ve got in the system to know whether you’re winning or losing going forward, right? I think that’s the point you were trying to make, and it’s well taken, for sure.
S. Robinson: Yeah, that is the point I’m making — exactly the point.
V. Huntington: I just want to follow up a little bit, first on Lana’s comment about eye care and dental care and its impacts on children and the lack of its impact on children. That ties into Kathy’s issue of readiness.
I’d like to illustrate how important this issue might be to young aboriginal learners coming into the system. I’d like to tell a bit of a story. A friend of mine was in residential school on the Capilano reserve — a Catholic residential school. It was two blocks from her home, and between August and December, she never saw her family.
She happened to become the first graduate with a bachelor’s on the Capilano reserve. She eventually became a lawyer. I met her in Ottawa, and she was executive assistant to the Deputy of Indian and Northern Affairs. She became an exchange person with the Senate in Washington, D.C., on the Senate interior affairs committee. I visited her when she was down in Washington.
At that time, she was just beside herself about her children and was about to take them to a doctor. She had been called into the school and asked what the problem was in getting care for these kids. She was taking her children, some of whom were in their teens, to the doctor for the first time in their lives. I just looked, and I said: “How could this be that you’ve never taken them to a doctor before?”
She said: “Vicki, I grew up in the residential school. The nuns did all of that for us, and I just assumed that the school was looking after all of that.” And her kids couldn’t see, and they were failing school. They had poor teeth. She didn’t realize it, because they had never commented on it. For me, it was the most classic example of an impact of the residential school system.
It may not be as applicable today, as the younger families are going through the system. But I think that you would see in that whole readiness structure of preschool especially and getting into elementary school…. If you had some way of working with FNESC to develop a mobile capacity of some sort to have these children tested as they get into the system, especially in remote areas….
They don’t have the access that we enjoy, and some of their parents don’t understand the necessity of obtaining that access. I just think that the ministry ought to look at what that readiness means, perhaps in different areas of the province — and just to tell you that story.
D. Byng: I certainly accept the advice.
G. Kyllo: A large part of the discussion today has been about data driving decisions. Further to MLA Corrigan’s comments about evaluating students at the very front end, I really strongly believe it should be at the front end, and it should be consistent across all students in the entire province — you know, the ounce of prevention, pound of cure. The earlier we can identify if there are students that are entering K to 12 that have challenges, we can look further into those communities and provide additional supports that are there, like with StrongStart and those sorts of programs.
I’m also wondering what the Auditor General or the ministry views are on standardized testing. There seems to be lots of discussion and dialogue about getting away from standardized testing, and I think that is a real concern if we really want to be able to monitor the readiness of students when they start K to 12.
It’s fine to be looking at graduation rates, but we also need to be monitoring them through grades 8, 9, 10, 11, to try and identify where there are some challenges. If kids are having challenges, and we can identify that through standardized testing, say, in grade 9, we can provide those additional supports so that we are better able to keep them in the school system right through to graduation.
I’m just wondering if the Auditor General or the ministry have any comments about standardized testing. Personally, I feel it’s extremely important, and it really concerns me when you see schools moving away from standardized testing.
C. Bellringer: Just as a general comment from our office, we have taken the position — and it’s recommendation 7 — that there should be something around standardized monitoring and assessment. We didn’t go so far as to suggest what it should look like, but we have taken that position.
S. Dodds: I just wanted to add — and leave a moment for Jessica too — there is a challenge with standardized testing, as it was raised in an earlier comment about preparing for the test. What we’ve identified in the report is that it’s part of understanding where students are at, and it’s one tool. What we have pointed out is that if there is that tool, then participation is important.
We have found that with the standardized assessment that’s in place, there is a concern around the participation rate, and that means not understanding where those students are at on that tool. But when we’re looking at outcomes, it’s academic and social, so there’s a broad range of indicators.
G. Kyllo: My comment was around testing, but I think evaluation may be a better way. It needs to be standardized in order to have that baseline. Otherwise, it would be very difficult to interpret the data.
S. Dodds: We have identified a number of potential indicators that the ministry could consider in looking at it across the province, and the readiness to learn and attendance are listed as part of those in the report.
Let’s see if Jessica wants to add anything.
J. Schafer: Yeah. In our work, one of the districts that we visited that had had significant improvement for their aboriginal student education outcomes across the board from grade 4 up to graduation was using standardized assessment, really using it to inform their decision-making. It’s not enough to just have the assessments. They have to be used. They have to be understood.
They also had developed standardized testing at a district level across their schools, and they were using that for decision-making purposes as well.
The provincial standardized testing has a role to play and needs to be used well, but it’s not the only piece of assessment that should be happening, because it happens at particular points, and in between that, there has to be other assessment going on. But it did seem to be playing a significant role.
The ministry has also done…. The piece of analysis that we highlighted in our report that we felt was an example of good practice that could inform other types of analysis that they do was also looking at the drivers of success and the connection between succeeding in the grade 4 reading assessment and graduation. They had provided that tool to some districts, and we saw some districts really using that — again, to drill down to a student level and figure out what to do based on those results.
B. Ralston (Chair): The district that you’re referring to — there’s a reference in the report to Fort Nelson. I don’t know whether you are comfortable with identifying individual districts where the standardized testing had been used. Is that the district you’re referring to?
J. Schafer: That’s the district I was referring to in the first instance, and then other districts — for example, Vancouver Island North was working with the analytical tool that the ministry had developed.
B. Ralston (Chair): Greg, any further question?
G. Kyllo: Just if the ministry had any further comments.
D. Byng: We do, so I’ll turn to Jen, who can describe, I guess, two elements, Greg. One, what do we do with individual students, and then how do we ensure that the system continues to track its performance in a way so that we understand what’s going on and how things are working?
J. McCrea: At the earliest stages, we do work with the Ministry of Health and the Ministry of Children and Family Development, both with the early-years centres and with the early-indicator tool that will help — a number of screening processes in place in districts exactly around eyesight, dental. That is the first piece of work that needs to happen at the very, very early years, and the kindergarten assessment that teachers do is vital.
Moving from there, absolutely the foundation skills assessment. That is the provincial standardized test in both grades 4 and 7. The grade 4 exam, as mentioned by the Auditor’s team, is the first indicator for success for graduation — so a very, very important tool that we have as a system but also as individual results that go right home to parents. We have to make sure that that assessment tool is aligning with the new curriculum and the very best research that we have around assessment.
Currently there is a design team working. The First Nations Education Steering Committee, the B.C. Teachers Federation, independent schools, the ministry and other assessment experts are at the table right now building the new model to ensure that we are getting the level of information that we need, right down to the student level. Very important for parents as well.
G. Kyllo: Just a follow-up. Is it consistently applied? Are there some school districts that are not embracing the standardized testing or not participating so that we potentially could have some holes in the data that we’re trying to collect?
D. Byng: Why don’t we speak to the current situation.
J. McCrea: Currently there are some districts and specific teachers that are not supportive of the foundation skills assessment, and that’s exactly why we’re working at the provincial table around developing a new model. But what we also know is that we have about an 80 percent completion rate on the foundation skills assessment.
G. Kyllo: So with 20 percent of students, we really have very little information on how they’re doing through their K to 12.
J. McCrea: Well, provincially, yes. But there are classroom assessments done by teachers.
D. Byng: There are provincial exams that they have to all take as well.
J. McCrea: At the grade 10 level, yeah.
G. Kyllo: Do you see that as a risk? Are there reasons why it cannot be compulsory for all of the teachers to participate in providing that data? I think the data is extremely important.
B. Ralston (Chair): This is a really live area of public policy. I think you’re right to pursue it, but I don’t know whether we have the scope within this report to deal with it. Standardized testing is a somewhat controversial political issue, with BCTF taking a position, certain school boards taking a position. I’m not going to prevent anyone from answering. I just want to make sure that we stay focused a bit on this report. But you’re right. It does relate to outcomes.
G. Kyllo: Yeah. I think the method — whether it’s a test or an evaluation, whether it’s an assessment — needs to be consistently applied. Otherwise, the data that we’re collecting is not going to properly direct or advise us on changes that might need to be made.
D. Byng: Just a minute, perhaps, to summarize where the ministry’s perspective is on all this, MLA Kyllo. We do think that it’s fundamentally important, at an individual student level, that both the students and their parents have a clear understanding how well their son or daughter is doing as they’re moving their way through their education and that the FSA is part of that.
One of the reasons why, as Jen pointed out, we’re working with all the partners is to bring some commonality to the nature and application of the assessments happening. You’ll see us driving to that end as we go forward. Apart from being beneficial to individual students, we need to know and understand what the health of the system is, so to speak, and the performance of the system — you know, back to the Auditor General’s recommendations around collecting data, doing the analysis and then actually understanding the drivers of success and then funding them appropriately.
G. Kyllo: Just one quick follow-up. Back to the Auditor General, how important do you feel the collection of the data is, consistently across the entire student population, in order to properly, I guess, identify challenges that might be existing and the successes? Do you have any further comment on the importance or the need, necessity, to ensure that that’s happening across the entire student body?
S. Dodds: It’s part of that information. How the ministry and government choose to do it is a policy issue, but it is important to understand how districts are doing and how individual students are doing. As you’ve pointed out, it’s across time, and it’s different indicators, because you’re looking at different outcomes.
K. Corrigan: Just a couple more questions. I wanted to check. On page 30, under “Setting targets,” it’s talking about the Fort Nelson school district. It said, “Between 2008-09 and 2013-14, the aboriginal student graduation rate increased from 55 percent to 100 percent,” and that it was 2 percent higher for aboriginal students over non-aboriginal students.
I think maybe that’s not reading the way it’s meant. Is the graduation rate 100 percent in Fort Nelson, or was there an increase? The way it reads, it looks like there’s a 100 percent graduation rate for aboriginal students in Fort Nelson, which would be fabulous, but I don’t think that’s probably the case. I would imagine that there was a 55 percent increase or a 100 percent increase in the number of students who graduated. I just want clarification on that.
J. Schafer: Yes, you are reading it correctly. There was a 100 percent graduation rate for aboriginal students in that year.
K. Corrigan: How many students are we talking about?
J. Schafer: It’s a small student population. The numbers are small, so the fluctuation from year to year can be quite significant. That’s why we added that last sentence — to explain that. We did do some data analysis over a long period of time, and we took five-year means and rolling averages to make sure that we were smoothing out for those kinds of fluctuations for districts with very small cohorts.
The analysis did show that the district was performing better than expected, based on the various factors that influence outcomes consistently. So 100 percent in that one year — take that within the context of a very small and fluctuating rate. But the improvement over time and the results better than predicted, based on the data analysis, suggest that the practices that they’ve implemented and the changes that they’ve made and the focus that they’ve had on closing that gap has made a difference.
K. Corrigan: That’s impressive.
I have one more question. Going back to the foundation skills assessment…. I’m not going to get into the debate about data one way or the other, collection of data.
B. Ralston (Chair): Thank you.
K. Corrigan: You’re welcome.
But it does say on page 43: “The ministry’s analysis showed a strong correlation between participating in the foundation skills assessment for reading in grade 4, writing the English 10 exam on time, and graduating from secondary school within six years of entering grade 8.” Is that strong correlation because if a student, an individual student, was to write the FSA and challenges were found with their achievement to date, that that would be identified and interventions would happen?
I’m just trying to figure out why there would be that correlation.
B. Ralston (Chair): I think the question is directed to Ms. Schafer.
J. Schafer: Yeah, that’s certainly a plausible hypothesis. There would, I think, need to be more analysis to really understand that. But in the literature, it seems to show that assessments bring attention to student progress, and once it’s demonstrated that interventions can make a difference, you have to identify when you need an intervention. That is certainly a hypothesis.
D. Byng: Maybe just to add to that. Most certainly, when we do the foundational skills assessment work, we share it with school districts, and we expect, of course, that it’s going to be shared with teachers and students. And to your point that where children are identified as being lower than what would be expected, there is indeed work that goes on with that student to look at their individual needs and to build a plan around that.
With that in mind, the FSA is only one piece of that puzzle, right? You’ve got, of course, the teacher’s classroom experience and assessment as well, so you have to couple those pieces together. But most certainly, it is expected to be utilized as a tool to not only look at the health of the system but where learners are at and then build individual education plans around that.
R. Sultan: A question, both to the Auditor and the deputy. I slowly became aware of the fact that there’s very little direct feedback from the First Nations community on their response and reaction to this report and the ministry’s response to the report. Maybe I just missed it.
Have they had any reaction? We have the First Nations Education Steering Committee citing further protocols, etc., but have they…? If they were here today, for example, what would they say about it? What would they tell us? Maybe this has been self-evident, and I just missed it this morning.
B. Ralston (Chair): Just before you give an answer, that was the subject of the question that David Eby asked. Sam, as Deputy Chair, and I will be considering issuing an invitation to the First Nations Education Steering Committee to respond. That’s something that we’ll have to decide, and I wouldn’t want to prejudge that.
R. Sultan: Well taken. I wasn’t paying sufficient attention.
B. Ralston (Chair): Okay. I don’t want to, again, bar any comments, but if Mr. Byng or Ms. Bellringer have any comments, then please feel free to offer them.
D. Byng: Perhaps I’ll start, then. FNESC did issue a public response to the Auditor General’s report and provide a public statement around their thinking on that, so that is out in the public arena and on record. I’ll leave that part at that.
R. Sultan: What was the gist of it?
D. Byng: I think the gist of it was that, given that they were involved, of course, with the work that the Auditor General did, they were…. They had two comments: that they were supportive of the findings and recommendations, as I recall — it’s unfortunate that Starleigh is not here — and that they enjoyed a good and productive working relationship with the province, and that they were hopeful, of course, that the gap would continue to be closed moving forward, recognizing what had transpired.
I guess that was going to be the second part of my answer. FNESC has been deeply involved in all facets of the ministry’s operation, from curriculum development to teacher resources to how we were responding to the Auditor General’s report on an ongoing basis.
I’ll let the Auditor General’s team speak to their involvement in the work that you did. I know that they played a role in that as well.
S. Dodds: I’d just like to say, and I’m sure Carol has a couple of comments…. As part of our audit process, we met with members of school boards and members of district staff. As we toured the four districts, we met with representatives of First Nations communities in the districts that we visited as a way to understand aboriginal education on the ground and the ministry’s role and districts’ role.
Because of the role of the First Nations Education Steering Committee, they’re an important stakeholder. We did include them as part of the interview group, and we did discuss the findings of the report with them at the end of the process to understand their input and their reaction to the findings.
D. Byng: And we did the same with our response.
R. Sultan: So from the Auditor’s point of view, was it positive feedback, or did they say: “No, you got it all wrong”?
C. Bellringer: I’ve just got the statement from when they released…. The news release was on November 5. “The First Nations Education Steering Committee is calling for the rapid and meaningful implementation of the B.C. Office of the Auditor General’s recommendations,” and then they name the report.
Then they go on to describe the report. As the deputy minister mentioned, they note that they have a positive working relationship with the ministry, and it looks forward to working together to fully implement the recommendations. It goes on in more detail, but it’s a positive response from that perspective.
R. Sultan: Sounds very positive, indeed. Congratulations.
S. Robinson: Once again, as questions are asked, there are always more questions that come to light.
On page 30, there’s an inset about the accountability for transportation for First Nations students. Just in reading that inset, it seems clear that there is no clarity on whose jurisdiction it is for remote communities. I want to acknowledge the creativity of the superintendent who thought that one of the ways to address a broken-down bus was to send a teacher to the reserve, but it turns out that it was outside district boundaries, and so the question became who was going to pay for it.
If there is some sort of comment around…. I didn’t see it in a recommendation, to address transportation issues for isolated communities. Is there a plan for addressing the access piece? We can talk all we want about making sure that we address institutionalized racism and making sure we have good assessment tools and all that — that’s really excellent — but if students can’t get to school, then that’s just part of: they’re hungry and they don’t have a house and a safe bed. This is another piece, and I haven’t heard anyone address that.
J. McCrea: Absolutely we recognize it’s an issue, and it is part of the tripartite education framework agreement, that we have identified that and that we’re working towards. A piece of work is local education agreements that happen between First Nation bands and the school district. That’s where transportation is addressed. So we are actively working with all of our partners on that issue.
S. Robinson: Do you have any idea, now, how many students, for example, have to travel more than 40 minutes? I don’t know what the marker would be or what would be acceptable. Do we have a sense of how many students have to travel and for how long?
I’m trying to imagine putting a five-year-old on a bus for an hour and a half each way, and that just seems completely unmanageable. I could barely tolerate it. I can’t imagine a child tolerating it. So do we have standards for what that looks like? Are we going to be measuring it like all the other things that we ought to be measuring?
D. Byng: Well, I’ll hop in first and then let staff fill in the details.
Most certainly, when you’re dealing with rural and remote locations, school districts have to make decisions — and we work with them on this — around: do you provide a school, or do you bus students? Dependent on the geography of the area, that starts to drive it, right? So you’ve heard lots of commentary lately around utilization rates of schools and things like that. Well, I can assure you we’ve got low utilization rates of schools out in Anahim Lake, but it’s completely untenable to think about busing those students, given the distances involved.
We do work with school districts on a local level to establish what those standards would be. While the number of times that issues are raised around students’ access to education by busing is relatively small, we do have, I think, five that are in play right now. Perhaps Ted can speak to that.
T. Cadwallader: This issue came up. We recognize that some are remote First Nations communities with access to a school. I’ll use Moberly Lake and Dawson Creek. That’s quite a trip for students on a bus. Does the school district take that bus away? Unlikely. Are students having to travel for 40 minutes? Perhaps. The conversation goes on with the First Nation and the school district. With our advice, our direction, sometimes our problem-solving skills, we may get involved, but we recognize that a school district can do that.
There are also the same sorts of incidents that might happen in an urban area, around busing — south Delta, Coquitlam — those sorts of things too. Again, our ministry’s role is to recognize that that is an issue but also recognize that through conversations between a First Nation directly with a school district…. That does two things. One is it allows them to form a relationship. And on the other side, it allows them to problem-solve jointly.
S. Robinson: I just have two follow-up questions to that. I’m interested in sort of the transportation piece in relation to First Nations students. I recognize that in Delta and Maple Ridge and in Coquitlam, we certainly have some transportation challenges around getting kids to school. Because we’re talking about aboriginal education and given the existing challenges, this is another layer on another layer on another layer. I would be really interested — I don’t know if you’re collecting the data — in the number of aboriginal students who are particularly challenged by transportation — tracking that.
Also, who’s paying for that? Is that coming out of school board budgets, that they’re expected to address? I’m going to imagine — I would love some confirmation — that that’s built into their funding model because they have to serve these remote locations. That it’s built into the funding, I’m hoping, is the answer.
D. Byng: There is a transportation factor, yeah. Or geographic factor, actually.
K. Corrigan: I’ve got two questions. I wanted to ask about the superintendent of aboriginal achievement. The position was created, or came into being, in 2012. The position became vacant in 2014 and then was eliminated as part of the legislative changes in 2015. The report says we didn’t see evidence that the ministry gave the position the opportunity to exercise its authority to require action when school districts had poor results for aboriginal student achievement.
I’m not necessarily supportive — I wasn’t particularly supportive — of the superintendents of achievement coming in and imposing their will. But I do wonder about what…. It’s kind of tantalizing. Why did we only have a superintendent of aboriginal achievement for two years? Presumably, when the position was created, it was seen as important. It became vacant in 2014, and then they were all eliminated. What happened?
D. Byng: Why don’t I speak to where we’re currently at with that? We do agree and believe that we in the Ministry of Education need to have someone in a senior leadership position that is going to be responsible for doing things like ensuring the Auditor General’s report gets actioned, as appropriate, and working with school districts and bands around the province. We’re working with FNESC — they’re on the panel — to recruit an individual for that role.
We do think that it’s fundamentally important. The process is underway to recruit someone. In the meantime, we’re hiring an individual on a temporary basis to work in that capacity until we hire someone on a permanent basis.
K. Corrigan: My final question — really, really….
B. Ralston (Chair): Is there a drum roll?
K. Corrigan: On page 32, you’ve got the graduation rates for aboriginal and non-aboriginal students. For 2013, although there’s improvement — you can see that from how much closer districts are to the equity line — you’ve got a couple of real outliers. I see, on the bottom right-hand side there, that you have a district — I don’t need to have that district named — that has more than a 90 percent graduation rate for non-aboriginal students and what looks like around 30 percent for aboriginal students. Then there are a couple that are pretty close to that.
What specifically, at this time, is the ministry doing — or did the ministry do — when we knew that there was a district that had such low graduation rates, particularly with the gap being so wide? Can you explain to me how the ministry would deal with that specific instance?
J. McCrea: Maybe I’ll start, and Ted, please jump in.
Absolutely. That kind of data we receive, and our staff go out and have face-to-face conversations. Students, by name, are addressed in those conversations about what’s going on, what’s happening, and using the data right back to the FSA, about ensuring supports are in place.
Ted, I think you’re probably going to want to add.
T. Cadwallader: Sure. Thank you.
I love that question. I really like to start with that data, because when you can put data into the conversation when we’re talking about the school district — and we do use that data to do exactly that — then we have a place to start. Whenever I’ve had that conversation, nobody wants to say 30 percent is good enough. If what we’re doing for the majority of students is working really well to get a 90 percent graduation rate, then why is that not working for that 30 percent?
I’d also put in the qualifier that that 30 percent might represent seven students — right? — so keep that in mind too. We also work from the numbers, as well, to say: how many students are we talking about? If it’s small like that, we talk, as Jen said, about those individual students.
I’m really proud of our school districts when I go out there, because all of them are trying to do the very best they can to try and improve those results. It may be the first time that that person has actually seen those results like that in that conversation. Each time we do that, I always come away from those encouraged that our school districts are doing the very best they can to improve those results.
Sometimes those attempts are very effective, and other times they’re not. But that’s what we expect, right? You want to make a mistake. You want to try something. Always, through our enhancement agreement process, we have aboriginal people, First Nations people, Métis people, off reserve and on reserve, involved in those conversations, because the conversation around that data is through those education councils that we generally meet with.
B. Ralston (Chair): I think that concludes the questions. I want to thank the Auditor General and her team and the deputy minister and his for both the very thorough and considered report and the responses. I think it’s been an excellent discussion. But we’re not going to conclude consideration of this report, because there is that issue about inviting the First Nations Education Steering Committee before this committee.
That’s something that Sam, as Deputy Chair, and I will have a discussion about. We may be able to schedule something at some point in the future. I want to make sure that I follow the appropriate protocol in issuing that invitation so that we have a good discussion, if we choose to do that.
Given that, I’m going to suggest that we recess. Given that it’s about 20 after 12 now, maybe we could take an hour and come back at 20 after one. I know there are some members who are anxious that we get started on the afternoon’s agenda. Thank you very much, everyone. We’ll reconvene at 1:20.
The committee recessed from 12:18 p.m. to 1:23 p.m.
[B. Ralston in the chair.]
B. Ralston (Chair): Good afternoon, Members. We’re considering the next item on our agenda, which is the Auditor General’s report entitled The Status of Government’s General Computing Controls. It’s a report that dates from 2014.
From the Office of the Auditor General, there’s the Auditor General herself; Cornell Dover, assistant Auditor General; David Lau, who is the director of IT audit in the Office of the Auditor General; from the office of the chief information officer, Ian Bailey, the assistant deputy minister of technology solutions; and Philip Twyford, executive director, IM/IT capital investment.
Welcome, all. I’ll turn it over to the Auditor General to begin a presentation about the report.
Auditor General Report:
The Status of Government’s General
Computing Controls: 2014
C. Bellringer: Thank you very much, Mr. Chair, and good afternoon.
All IT systems are vulnerable to threats like hacking and theft and to disruption from power outages, sabotage or physical damage. For government IT systems, there is often a lot at stake because these systems hold substantial and sensitive information. So the focus of this report was looking at general computing controls, which are, basically, government’s first line of defence against potential threats. They help maintain confidentiality, integrity and availability of government’s IT systems and related data.
We all need to be vigilant in ensuring strong controls are in place to protect our information assets and to
continuously monitor and improve our computer controls. In a recent example in December of 2015, hackers brought down B.C. Transit’s web services for several days. It shows that government organizations, despite huge amounts of investment in IT security, are not immune to cyber threats.
For this report, we asked every provincial government organization in British Columbia, 148 in all, to assess how well developed and capable their general computing controls are and to rate themselves on a scale of 1 to 5. We used a methodology for that that David will describe further. We looked at nine areas. It was a very large undertaking.
We then selected 13 of those 148 self-assessments, and we did an audit on those 13. Cornell and David will go through the results of that.
I will add that as a result of having gone through the results and when we were thinking about what we may do next time, we would appreciate your comments and suggestions. It is a bit of a different methodology that we used with this. We could not go out and audit all 148, and in selecting 13, you’ll see from the results what…. There’s a pattern that you’ll see from that.
Also, the self-assessment process is useful, but we are looking at whether or not more information would be even more useful. Should we name names? That’s one specific question to keep in mind. We did not choose to do that in this report, but we did provide feedback to the individual organizations.
With that, I’ll leave it to David to give you more details about the report.
D. Lau: Thank you, Carol.
Good afternoon, Chair and committee members. Just a little bit of background to begin. Information technology is critical in delivering key government services such as health care and education. As well, government processes billions of dollars in transactions each year, often involving sensitive and significant information.
Increasingly, government relies on third parties to develop IT systems and provide IT services. Currently there are over 600 outsourced IT systems and services across government.
The use of IT comes with risks, such as fraud, human error and downtime — downtime being unexpected disruptions, such as from a power outage. Government needs strong controls to reduce the impact of these risks. Specifically, a subset of these controls, called general computing controls, ensures that systems and services can help organizations meet their business objectives. General computing controls are so important that 78 percent of the recommendations in our IT audits over the last ten years focused on improving these controls.
Here’s what we did. We asked 148 B.C. government organizations to complete self-assessment forms. The organizations included ministries, Crown corporations, universities, colleges, school districts, health authorities and other organizations controlled by or accountable to the provincial government.
The self-assessment focused on nine critical IT processes that are essential for maintaining confidentiality, which is all about protecting the information they manage; integrity, or ensuring that transactions are processed correctly; and availability, ensuring that critical government services are always up and running.
We audited 13 of the 148 completed self-assessment forms. We also issued a report to the head of every organization showing a comparison of the results with similar organizations. As well, we sent a report to government’s chief information officer, providing her with the summarized results by general computing control area and type of entity.
Overall, the average self-assessed maturity level was between 2.3 and 3.4 on a five-point scale. In the 2013 self-assessments, the average was between 2.2 and 3.3. Exhibit 2 in the report shows the range and average self-assessment maturity levels for each IT process, and appendix 8 breaks the results down by type of organization.
Health authorities, ministries and Crown corporations rated themselves higher than universities, colleges and school districts. The majority of organizations assessed themselves at level 3 and above in eight of the nine IT processes, and 69 percent of the 13 organizations whose results we validated did not have sufficient evidence to support the self-assessed maturity level in one or as many as all nine IT processes. This indicates that these organizations have deficiencies in their general computing controls, as outlined in exhibit 1 of the report.
Here are some of the common deficiencies in general computing controls that we noted during the validation: policies, standards and procedures were lacking; compliance with established policies, standards and procedures was not periodically monitored; roles and responsibilities were undefined; and staff training was missing or limited.
Maturity levels may be different for each organization, depending on the organization’s business objectives, the complexity of its computing systems and IT environment, and the value of the information it manages. Organizations should aim for a baseline of maturity level 3. This means that procedures are standardized and documented, compliance with standardized procedures is mandated, and staff are trained.
However, organizations that have complex computer needs or handle sensitive information should have a higher target maturity level. For example, a government organization that has the personal information of every person in British Columbia or that provides critical services should likely have high maturity levels.
In conclusion, we recommend that with regard to general computing controls, organizations in the B.C. government reporting entity should periodically (1) review their business and IT goals and determine the target maturity level, (2) analyze the controls necessary for meeting the target maturity level, (3) determine what needs to be done to achieve the target maturity level and (4) monitor the progress in achieving the target maturity level.
We also recommend that the B.C. office of the government chief information officer continue to promote strong general computing controls and assist government organizations in achieving and improving their target maturity level.
B. Ralston (Chair): Thank you very much. Then, over to the office of the chief information officer.
I. Bailey: Thank you, Chair and Members, and thank you, David and Carol, for the introduction and report.
Overall, the self-assessed maturity level for general IT controls has increased across the GRE — all organizations — from between 2.2 and 3.3 in 2013 to between 2.3 and 3.4 in 2014.
Generally, a maturity level of level 3 is considered the baseline. We should note that the maturity levels were higher than average in ministries, health authorities and Crown corporations and lower than average at universities, colleges and school districts.
Regarding the office of the government chief information officer, our mandate. Under chapter 12 of the Core Policy and Procedures Manual, the office of the CIO is responsible for IM/IT policy for ministries. We do not have authority over health authorities, school districts, colleges, universities, Crown corporations or other GRE organizations. They have their own governance structures, policies and procedures.
However, we actively work with these organizations to ensure that their IT controls are regularly reviewed and are maturing. We provide a lot of assistance, not only in the passive sense, if they ask for this information, but we are actively engaged. For example, we have an information security working group that all members of the GRE are invited to attend, and our chief information security officer chairs that working group.
It reports to a council of chief information officers, which is really the large GRE organizations — the Crowns, the health authorities and core government.
In terms of general IT controls on security and privacy, protection of government data and networks is our top priority. We do have a comprehensive information and security policy and security standards that meet or exceed international standards.
We have a large team within the office of the CIO: 41 staff dedicated to continuing to develop policy, standards and procedures regarding IT controls, as well as tools to assist in ensuring that IT controls are properly implemented. We also make these tools and information available to all of the GRE organizations. There are also information security staff within all of these GRE organizations and ministries whom we work closely with.
Government has worked to improve IT general controls across our organization, and this is reflected in increased scores in the audit. We have put a number of measures in place to improve these controls. Additional measures to improve are currently underway, and further measures are planned.
Generally, ministries, health authorities and Crown corporations meet or exceed the Auditor General’s recommended baseline of level 3. We are working closely with the Ministries of Education and Advanced Education to improve IT controls within their two sectors.
The key actions that we’ve completed to improve these controls. We do an annual information security review across all ministries. We have implemented advanced cybersecurity and network security tools and vulnerability scanning tools for applications in government.
Our chief information security officer has also created a vulnerability and risk management team within our office to identify, mitigate and manage risks. As well, our information security team has integrated formal security requirements into schedules for our service procurements to improve the controls there. In addition, we’ve introduced critical security infrastructure into our data centres to better protect systems and data.
Now, the key actions that we have planned or that are currently underway. We are developing a cloud security standard so that, as the GRE moves to take advantage of the most modern cloud technologies available today, we have the right standards in place.
We are implementing a security operations centre to manage event monitoring and incident response. I think you may remember that when I was here last, I talked about the number of attacks that core government is facing every day. We are successfully repelling those with our security team.
We have made a major investment within our data centres to improve the security infrastructure there as well, and we are currently implementing a new service management framework for measurement and reporting, and also responding to incidents effectively and tracking problems so that we can resolve those proactively. We are also strengthening our IT performance monitoring and evaluation of outsourced services.
In summary, the OCIO recognizes the importance of IT general controls and is committed to the security, integrity and availability of systems and data under its mandate. We work closely with ministries to continuously assess and manage risks to the organization and improve IT management.
We provide advice to other organizations in the government reporting entity when requested, and I’d like to expand on that point. We are actively engaged with the GRE entities to improve their security levels.
Finally, we welcome the work of the Auditor General and her staff and the valuable information provided in the report.
I’ll leave it at that and turn it over for questions.
B. Ralston (Chair): Thank you very much.
Questions?
K. Corrigan: Well, I’m going to start with just a process question. The Auditor General spoke about how the various entities had self-assessed — in many, many cases — at a higher level than they actually achieved. The numbers that we’re getting in this report, indicating….
For example, on page 11, it starts with: “The majority of organizations self-assessed at maturity level 3 and above.” Is that after they’ve been adjusted? Or was that what they self-assessed at, and then…? Maybe I’ve got this wrong, Madam Auditor General, but were there not specific adjustments made to those maturity levels after you looked at the 13 entities? I’m just trying to figure out what’s actually reflected in this report.
C. Bellringer: I’ll have Cornell answer it specifically. It was only 13 that we did the audit on.
C. Dover: On those 13, we did re-evaluate them, and we adjusted them to what we thought the appropriate maturity level rating was for those 13 organizations.
K. Corrigan: If that’s the case, then, my guess is that you believe that that sample was a sufficient sample to get a pretty good picture of how good the self-assessment process was. When we see that there was a self-assessment at a higher level than really was true, then, did you have any feelings that you should go back and do some more checking with the other entities, because there was so much overassessment that probably that would be the case across all the entities?
C. Dover: Well, we selected a sample that was across all the different sectors in government. It was, we felt, a good sample of all the entities in government. We had planned on doing this on an annual basis so that in future years we’d go back and we’d look at a different set of samples. But with the time and effort that it takes, we decided we’re going to do it every two years rather than on an annual basis. So 2013 was the first time we did it, and it set the baseline.
In 2014, when we did the work, we started with the baseline, and then we got the self-assessments and decided to do a sample of 13, to see how people were doing the self-assessments. I think now that we’ve had a report and we’ve stated that we think some people might have been a bit optimistic in how they were reporting, we might see a more realistic assessment, in the future, of their maturity levels.
K. Corrigan: One more question on that specific point. Given that there is a self-assessed maturity rating that has gone up through all of the different factors — “assess and manage IT” and all of those different factors — do you believe that the maturity level actually has increased? Or is it more optimism, as reflected in your report?
C. Bellringer: I’ll go for it. I’m very concerned with finding out that for the self-assessments, the ones that we audited, so many were overly optimistic. It was very disappointing. It’s beyond disappointing; it’s serious. The feedback that we gave went back to the governing bodies. So they’ll have the opportunity to…. Right across the board, we’re expecting all of the organizations to just take it more seriously. It needs to come from the board. It needs to be seen as something they should be assessing on a regular basis, even more frequently than annually if they need to.
The part of it that I’m even more concerned about is the organizations that are self-assessing even lower than a three. We did not get into enough detail to be able to say that that’s at a satisfactory level.
Again, each of the organizations has to go in. They should be looking at that very carefully, and they should be determining whether or not it’s sufficient. I’m not going to name names on this because this was just merely observations as we were looking at them. There were a few organizations that we thought: “Well, maybe theirs should be a little bit stronger, but they’re actually self-assessing quite low.” So we thought: “Well, their self-assessment is probably quite accurate.”
That’s still not a good thing, so we are looking at taking this…. As I was saying, we’re looking at what angle to take in the next series. Do we do another self-assessment? Do we now select a few for the same kind of thing, where we’ll do the audit? Or should we take a slightly different approach and go as far as doing an assessment of whether or not we think the levels they’re at are sufficient?
K. Corrigan: Well, I was just going to say…. I mean, I share your concern. I’ve been on this committee for a while now, and I remember the CORNET report and the JUSTIN report. Both provided solutions and answers and raised all the issues with regard to security of our computer systems. JUSTIN was two years, almost three years ago, and CORNET was a couple of years before that.
L. Throness: I just wanted the Auditor General to confirm. The CIO says that IT controls have improved. Do you agree with that?
C. Bellringer: Not necessarily. I think the…. What we did find was that the self-assessment had increased. That doesn’t necessarily mean that the controls have improved. And I do….
L. Throness: Do you know?
C. Bellringer: What we know is that the self-assessments that we verified weren’t as…. They didn’t have the documentation to support the self-assessment, so we can’t be sure that the increase in the self-assessment is justified by an increase in improved controls. We just don’t know. I’ll leave it at that. I mean, we can’t say one way or another.
L. Throness: Thank you.
I wanted to ask our CIO: are there any fives? There’s a three, there’s a four. There’s one four that was assessed. Are there any fives in the system?
I. Bailey: I’m not aware of that. David, do you want to…?
D. Lau: None.
L. Throness: There are no fives.
D. Lau: No fives. If there were a five, we would definitely validate it.
L. Throness: My question was…. In choosing the 13 entities, there was only one four — most-vulnerable institution — that you looked at. Why wouldn’t you have chosen more vulnerable institutions that maybe could use…? It’s more critical that they be audited to see their level of security.
C. Bellringer: The assessment that we did was to verify whether or not the level that they assessed at was accurately reflected.
L. Throness: But it would be more important, I would think, to assess a four over, say, one that would be rated a two.
C. Bellringer: Fair enough. But I’m just saying, at the same time, it also informs us when we’re selecting a system that we want to now go in and do an actual assessment of how strong the controls are and whether that should improve. We would have to go in and look at it from a different angle.
I don’t know if you want to add something to the member’s point, though, in the selection of the 13.
C. Dover: Exactly. We selected 13. We didn’t select based on whether we had concerns over the controls in the organizations themselves. We selected the 13 to see, like you said, whether the assessment was realistic and whether we could rely on those assessments going forward.
The next stage may be, in future audits, that we take some that we think are having issues and go specifically to that organization and then do the audit of that organization. That wasn’t the intent of this work.
L. Throness: It would just seem to me that you asked for suggestions in choosing which ones to audit — that you would choose the ones on a risk basis. That’s all.
G. Heyman: I note several places in the report…. It states that organizations did not have sufficient evidence to support their self-assessed maturity level in one or as many as all nine IT processes.
I thought I saw or heard in the presentation the words “did not have the capacity to assess,” but I can’t immediately find that in the report. Did I hear that correctly, or was it strictly about evidence?
D. Lau: Strictly about evidence. We never comment on capacity. In my experience, when I was dealing with the officials doing the self-assessments, we found that some of these entities don’t even have an understanding of what general IT control is all about. There was a lot of education on our part to explain to them: “This is what you’re looking for.” When we’re out there, it’s a really large undertaking.
While we’re there, we’re looking for evidence, and while we’re there, we have to explain to them what general computing control is all about. We never commented about that because it’s out of the scope. We’re not commenting about the competence of the people. We just want to see: do they have the evidence to support it?
G. Heyman: I have a follow-up. I understand the problems with trying to have expertise in every area, rather than seeking expertise, for instance in IT, from outside. I’ve had conversations over the years and quite recently, actually, at the B.C. Tech Summit with a representative within the public service who commented that a decade or more of contracts and the loss of internal expertise has resulted in a pretty thorough loss of corporate analytical capability.
I’m wondering if you can comment on that or, if you can’t comment on that directly, whether you think, as a follow-up to your comment, that a certain amount of internal capacity to at least knowledgably assess the work and contract compliance of contractors would help with the issues that you’re raising in this report.
D. Lau: Many organizations that we audit outsource their IT services, and these organizations all rate themselves fairly high. They thought that once it was outsourced, they could take care of business. But they forgot about…. There are many general controls that have to be in place to monitor the providers to make sure they’re providing the service that you need and meeting objectives. That’s the problem with…. If they don’t have the resources, they outsource. And people thought, “Hey, I’m done. We don’t need these types of vigorous IT security systems in place,” and so on — the personnel that we need.
Yeah, it’s something that we try to educate those organizations on and say: “Once it’s outsourced, you still have to make sure these folks are doing the right thing.” With a few of them, we found out that they even allowed their service provider to remotely log into the systems and roam around in the servers without monitoring what they’re doing. These kinds of things are really bothersome once they’re outsourced to external parties.
S. Gibson: Two quick questions. How many people do we have on the ground in the government that have either direct or indirect responsibility for IT security? They might be in ministries, or they might be in a department. Maybe you don’t want to answer that question. The question really revolves around how many people are charged with this as a responsibility. For example, you mention that universities and school districts score low. I worked at a couple of universities. I don’t remember anybody, that I know of, actually doing this kind of work in those environments. That’s really the reason I’m asking that first question.
I. Bailey: I don’t know about organizations like universities and colleges. I mean, we do work with them, but of course, we don’t see all of their people. Within core government, like I mentioned, there are 41 dedicated staff within our office, and then each ministry would typically have three to five people working directly on this. The fact that we have outsourced the operation and provision of services for our data centres and networks, desktop computers…. Actually, I don’t think that’s true for core government. I think that for our organization, we have maintained it, if not enhanced the number of staff.
S. Gibson: I guess my other question is…. I was looking at somebody the other day. They had their bank accounts on their cell phone. They didn’t even bother putting a password on it — just opened it up. I’m thinking: “Not too smart.” Now, you would say right away: “That’s kind of dumb.”
I guess my question, based on that anecdotal observation, is: do we train our employees to realize the gravity of the issues if they do something irresponsible or even careless, and that the implications for government could be quite problematic? How do we equip them? How do we guide them with the tools they need to realize the gravity of what could happen if our security system was, in fact, compromised?
I. Bailey: We do have a dedicated information security awareness team. There are three individuals that are dedicated to training and making all the employees of government aware of the risks associated with…. Whether it’s weak passwords or clicking on emails from outside, we have that dedicated team.
We also undertake our information security and privacy conference every year. This is the 17th year of that conference where we have many people from across government attending to learn about best practice for non-IT people — to learn about what the best behaviours are, really. That’s probably the most effective control we have: keeping our employees informed of what is risky behaviour and what’s good behaviour.
S. Robinson: If I understand the whole purpose of this exercise, this is really a test of the self-assessment system rather than the IT systems. What it suggests — what you’ve learned — is that these entities doing the self-assessments believe that they are doing better than they actually are doing. So there’s some sort of disconnect. Therefore, it suggests that we can’t rely on their self-assessment because it’s inaccurate. It’s just not a good tool.
I wanted to get a sense of: is this because they don’t perceive it as important, because it’s a value? Is this about the entity’s value for protecting this data? Or is this more about competence and capacity, or a little of both, or we don’t know? I’m just trying to get a sense of where the limitations might be around having accurate self-assessment.
C. Bellringer: The one thing I would add is, while definitely…. I think it was not so much the design but the results of having received back…. We went in and did the audits and realized that the self-assessments were not as accurate as we thought they should be. At that point, it becomes a test of the self-assessment process.
There is an element of the overall process that we’re using that should also be moving practice forward. It should be improving the controls themselves. It is drawing attention to this area, to all of the boards of directors of all those various organizations, to the ministries, to the independent offices. We’re actually asking the chief information officer in each of those organizations to sign off on the self-assessment. There’s a rigorous process behind it that we do believe does improve their operations in and of itself.
It is a little bit more than a test of it in terms of the whole process itself. But we don’t know why it’s coming across that way, other than I think there’s a natural tendency to believe things are better than they really are.
B. Ralston (Chair): How many of the people assessing are men, and how many are women? I suppose that would be the question.
C. Bellringer: No, we didn’t go there. But it is giving us pause for thought in terms of whether or not it’s the best way to go forward for the next round.
S. Robinson: I’m inclined to agree. If the idea is continuous improvement — if that is what we’re trying to get at…. We don’t have any fives, and the idea is to get to five, as close as you can get to five, ideally. I mean, in an ideal world, that’s, I think, what we want. I think three is a standard, but we want to push people along. But I think it’s important to try to understand a little bit about where the problem is. If it’s a value issue, if they don’t value it, then that’s one kind of issue.
But if it’s a capacity and competence issue, then it just requires, I think, different testing or different understanding around what the issue is. You can’t fix the problem if you’re not quite sure what it is.
I would be interested in trying to find out: is this about a value that they don’t…? There’s no appreciation for how important this is. If these entities recognize that it’s really important to try to get to five, then I would imagine that they would make that a priority and put resources and energy into that. If they do recognize it’s important but there really is no competence around the table to even ask the questions, or if there’s no capacity, then it’s a different challenge. I’m still interested and curious about where the problem is.
G. Kyllo: One of my questions has already been canvassed, but I’m also wondering: when you had a look at the actual rating — a self-assessment — were you able to draw any conclusions between the quality or the performance of the contractor and their ability to potentially sway or have an impact on the internal controls that are in place versus the culture that exists within the ministry directly? Was there any correlation between the quality of the contract service and its influence on internal behaviour versus internal behaviour having the ability to influence the performance of the contractor?
D. Lau: Based on our experiences, every time organizations outsource their IT services, the first thing we ask for is: how do you know that they are providing value? How do you know that they are meeting business needs?
Usually we ask the organizations to ask those providers to provide a report. The technical term is called 3416. It’s an external internal-controls report, an independent assessor assessing the outsource company to report back to the entity and say that they are doing okay in terms of confidentiality, integrity and availability, and so on.
Most government organizations are doing that. It’s the small organizations that just don’t have the resources. It requires money, and to do this kind of report is very expensive. So that’s how we approach them. We always ask for external assessors against external providers.
C. Dover: Unfortunately, some smaller organizations that are constrained with resources think that when they outsource their services, whether it’s IT or some others, they’ve outsourced all the risk as well, which is not entirely true. They still maintain the risk, and they have to manage and oversee what the outsourcer is doing. Sometimes, like I said, they don’t have the resources or the capability to do it.
G. Kyllo: Were you able to observe that for the terms of reference or the contracts or the way the contracts were negotiated there could potentially be changes with the way those contracts are let to provide further protections for the organization without necessarily having to rely on increasing all of the knowledge within the entity itself?
D. Lau: May I add that most of the contract…. In some of the small organizations, the external service contract is so poorly written that there is no provision there to require some sort of independent assessing or assurance type of report. So we make recommendations to those organizations, saying: “Either you go and adjust those agreements to give you that kind of a right or privilege to acquire this information, or look for someone else.”
G. Kyllo: So the opportunity definitely exists for improving the terms of the contracts. Is that potentially an opportunity for the chief information officer to lend a hand in maybe vetting those contracts?
Every organization is going to have varying experience with respect to terms and the conditions of those contracts. Is there a role that the information officer may be able to play in helping vet those contracts to prevent or put some of those safeguards in place at the writing of the contract rather than trying to mop it up or manage it at the back end?
C. Bellringer: Some of this goes back to what was brought up yesterday. We had mentioned that in the report that’s coming out on the challenges in large IT systems, we do start to explore some questions around the overall governance.
My personal view, and one that you’ll see in one form or another in that report, is that I think there’s a role for government as a whole to play. I think there’s a role for ministries individually to play, as well as the organizations. How it gets put together is up to government, but I’m not comfortable that….
This is common with a number of systems and in a number of jurisdictions, so I’m not looking at it, saying B.C. is standing out there alone on this one. But the number of organizations out there, and the varying capacities of those various organizations, to me, means that you’ve got to have something somewhere to provide support in a structured way. While the informal processes in place are useful, you can’t be sure — by ministry or by organization, depending on how you slice it — whether or not each organization has got the ability to do all of the things that they need to be doing to provide that assurance.
B. Ralston (Chair): Mr. Bailey, did you want to comment on this?
I. Bailey: Just to say that within core government we do have a mandatory process in place called the security threat and risk assessment, which is a robust methodology for assessing risks and identifying the right solutions for those risks. It’s a mandatory process. It’s also an automated system that all ministries use, including outsourced business initiatives. That’s required. It’s mandatory for all, whether it’s insourced or outsourced. My staff act as consultants in completing those assessments.
I will say that over the past few years, our experience is that it’s similar in results to overestimating their maturity level on their response, but we do work with them to actually ensure that there is evidence in place. For our most critical systems, we are now requiring that they have independent, third-party assessment of those controls to ensure that they actually are doing what they said they were going to do to address those risks.
Recently, we implemented the critical systems standard for government, and we’re now working across all of our applications that are deemed critical to address those issues, whether they are insourced or outsourced. We will make all of that available to any of the broader GRE area organizations, but again, it’s on a voluntary basis.
G. Kyllo: Just a quick follow-up. Further to what David had indicated, that there were some very poorly written contracts — and I guess having identified the inconsistencies maybe with the wording and the strength of some of those contracts — is that now something that is an initiative? You’ll actually be going out or encouraging the opportunity to vet some of these contracts, to see what you can do on modifying them or cleaning them up, so to speak?
I. Bailey: We do that on our core government contracts now, so we have staff that do that. Plus we have our strategic partnerships office, which provides guidance.
Maybe, Phil, if you want to talk to the guidance that your office provides, as well as the strategic partnerships office.
B. Ralston (Chair): This is to other entities outside of core government?
I. Bailey: No, they’re within the office of the CIO.
P. Twyford: The strategic partnerships office provides advice and guidance to ministries on all of our big outsource deals, so they actually help the ministries, reviewing their contracts, making sure we’ve got best practices in place.
We work with an organization called CORE, the Centre for Outsourcing Research. They’re based in Toronto. We’re actually a member of that organization, and we regularly put on training courses for not only ministry representatives, but we do invite representatives from the broader public sector who do want to participate. Health authorities, Crowns have sent people to the training courses that we host in Victoria, so we bring the trainers in and actually run the courses here.
We do offer those courses up. We provide an opportunity to organizations to come in and work with us. We’re not in a position, though, where we can mandate that they must use those services, and that is a challenge. It is a governance challenge, but we do make best efforts.
We do the same thing on many of our IT projects and many of our initiatives as well. We develop best practice, and we offer it up. We say, working with independent experts: “This is what we think best practice, good practice, looks like.”
Again, balancing off to the question of: should we be aiming for a five? Quite frankly, a five is extremely expensive and very difficult to maintain, but we’re kind of aiming for that three-to-four range, providing the advice and guidance so that ministries and organizations can use that. And it’s clear the responsibility is for ministries. We do work with other organizations to say: “These are the standards we’ve built. We recommend that you use them.”
G. Kyllo: Would it be fair that those entities that are not availing themselves of your service are potentially at a higher risk than those that are, and would that be maybe of interest to the Auditor General? When you’re looking at which entities you look at in the future, those entities and organizations that are not availing themselves of the service and the trainings available are likely, potentially, to be higher risk and may be an area where you want to focus your attention. Would that be fair?
B. Ralston (Chair): I see an acknowledgment.
R. Sultan: Listening to the evidence of the Auditor General and the representatives of the chief information officer, I form the conclusion that the Auditor General uses words like “uncomfortable” and “concerned.” I’m not even sure if the word “alarmed” crept into her vocabulary. Perhaps it did.
On the other hand, the representatives of the chief information officer portray a calm, unperturbed, perhaps even quite satisfied view of how things are going. Is there some discord between the views of the office and the views of the Auditor General?
I. Bailey: I don’t believe so.
B. Ralston (Chair): It’s a shorter answer than usual, but it’s a good one.
R. Sultan: Is that your full answer?
I. Bailey: Believe me, I’m well aware of the risks that all organizations are facing, whether that’s private sector or public sector. That’s a fact. And when I was here last, I talked about the number of attacks that government infrastructure faces every day.
If I don’t appear alarmed, that’s probably more of my personality, because we are under attack constantly, every day. But I have competent staff that are working, and our suppliers as well are competent, at addressing those.
I’m not going to comment on the broader GRE entities in that regard, but I think any smaller organization is not going to have the resources available to it that a large one will. Large organizations can, just by their very nature, have specialists and dedicated staff, whereas it’s more challenging…. That’s just natural. Whether that’s private or public sector, that would be true.
R. Sultan: I would have thought that a meeting of the Public Accounts Committee of the British Columbia Legislature to review a report on computer and information systems could be honoured with the presence of the chief information officer herself. Do you have any comment on that?
I. Bailey: I’ll pass that on to her. I do have responsibility for information security and all of the operation of the IT infrastructure for government. As well, I am recognized, at least previously, as an IT expert and have that ability to respond to your questions in that regard.
R. Sultan: But according to her bio I just checked out on the iPhone, she is in charge of all policy and direction, and I suppose you must report to this lady. Is that correct?
I. Bailey: That’s correct.
R. Sultan: Would you say her background in the field that we’ve been discussing isn’t perhaps as extensive as yours?
I. Bailey: I’m not going to comment on that.
B. Ralston (Chair): Good idea.
I. Bailey: But I will pass on your request to her.
B. Ralston (Chair): I should just say, on behalf of the committee, that generally what I request as the Chair is the responsible person. Typically, in a ministry, it’s the deputy minister. That’s not always possible. And without some….
There are some procedures that we could take if we wanted to compel the attendance of a given individual, and that would require a vote of the committee. Generally, I think most entities are cooperative and send the most qualified and capable person that they view as being able to answer the detailed questions of the committee. It’s not always the case that we have the CEO or the deputy minister here, but it is desirable.
I don’t say that out of any disrespect for Mr. Bailey. I think he’s demonstrated competence and a capacity to answer all the questions. I don’t think there’s anything that he hasn’t answered, so I’m not sure it’s a real issue here. But just as a general note about who we try to get before the committee, I just wanted to assure the committee that we do strive for the most senior and responsible person.
There’s a principle in the British Public Accounts Committee that they look for the responsible officer or the accounting officer — the person that’s deemed to be answerable for that department — at the highest level. I don’t want to sound too defensive, but I think I’ve made my views known, and that’s generally what takes place when we’re soliciting people to come before the committee.
I’ve probably gone on long enough, and I’ll move to the next person, unless you have anything further.
R. Sultan: I just was going to offer a final comment to the Auditor General, who solicited suggestions as to how she might conduct her office’s affairs in what I think we all agree is a very complex, very rapidly moving, very high-risk, subject to daily attacks, complicated world where even federal governments around the world are not immune to invasion and theft.
WikiLeaks comes to mind. I think the perpetrator is hiding in Moscow and leaking the personal chats between Mrs. Merkel and….
B. Ralston (Chair): He’s not in Victoria anyway.
R. Sultan: That’s right.
We don’t want to underestimate the point you make that this is a tough area, and we appreciate the skills you bring to bear. But I would suggest to the Auditor General that she might want to conduct two tests.
One is to one day call up the chief information officer and say, “In my capacity, and charged with the responsibility for testing the integrity of our various systems, I would like just to shut down facility X,” wherever it might be — Kamloops, some office building in Victoria, whatever. Simulate a seismic event which just puts that particular facility out of business, and see how rapidly the rest of the system responds. That’s point 1.
Point 2. Go to London Drugs and buy, for $25, a thumb drive. Find one of your more James Bond–inclined employees to go in and try and steal a big bunch of data and download it and get the hell out of there in a hurry, and just see how difficult that is. I suspect it’s really not that difficult.
These are two ways you can test the security of the system upon which the mere existence of government relies.
B. Ralston (Chair): Mr. Bailey, did you want to comment on either of those two suggestions? Or the Auditor General?
C. Bellringer: We’ve noted them. I’m not going to….
B. Ralston (Chair): Not going to give away your methods. Fair enough.
V. Huntington: My concerns, I think, somewhat follow on Mr. Sultan’s.
Yesterday, as the Auditor General said, we were discussing, some of us, anyway, what we felt was an accountability gap in the development of IT systems and programs, big and small — and we were talking big yesterday — throughout the system, that there is no accountability anywhere for oversight, control, compliance. It’s left to all of the different entities on their own.
On page 7, the Auditor General says: “The B.C. office of the government chief information officer is mandated with governance authority for standards setting, oversight and approvals for the province’s information and communications technology.” B.C. government organizations are responsible for following the spirit and intent of that.
Well, that does what? What is the process for following all of the different entities as they try to develop the spirit and intent within their IT systems? Do you approve their processes? Do you assist them in developing them? Do you have compliance requirements? Do you have people that go in and say: “Just hold it. We’re testing this before you proceed”? Or do you say, “Does the spirit and intent of our systems apply?” and if they say yes, that’s it?
What I’m trying to find out is: what are the tentacles that reach into the ministries and through the ministries into the entities? You say you work with the ministries. You don’t work with some of the entities, but you work with the ministries responsible for the entities. What are the tentacles you have at your disposal for moving down into the system to control it, to really provide approval, to provide the oversight that the system obviously needs?
It’s in real trouble in some ministries, and there’s nobody that we can find — not through the comptroller general’s office…. There’s nobody in government to whom we can say: “What went wrong?” Nobody that I can find, yet it would seem to me your office is it.
I. Bailey: Within ministries, there is, and that’s our office. I did speak of the security threat and risk analysis process that we undertake with all projects — that’s mandatory — and we review all of those. I think within core government, we have a process. We do make the same framework tools available to the broader government GRE entities.
But you’re right. It’s for them. They have to take that upon themselves. We will help them, but it’s their choice.
V. Huntington: I think what we’re sensing around this table and perhaps what…. I can’t speak for the….
I. Bailey: I think I’m agreeing with you, maybe.
V. Huntington: Well, yeah. But then, it would seem to me that if the chief information officer were here, perhaps we could say to her: “Are you approaching the decision-makers within government and talking to them about this accountability gap?” How can something like Panorama happen with a chief information officer that is supposed to be setting up approval systems? There’s a gap in my understanding of what you do.
You talk about risk and assessment. That has nothing to do with the development of multi-million-dollar programs and whether they’re being effectively developed or not. Nobody’s accountable for that. Nobody oversees that opportunity within ministries to develop things that aren’t working. It’s happening ministry after ministry after ministry — multi-million-dollar expenditures that go fizzle. And who’s responsible for ensuring that doesn’t happen on such a regular basis?
I’m sorry if I sound frustrated. I don’t mean to be rude. It’s obviously not you guys. What would be your suggestion to the system for reforming it?
I. Bailey: I’ll have Philip talk about our IT capital management program that we have implemented over the last couple of years to address these concerns. But I want to address your governance concern in the broader entities.
It might be useful for you to hear, so Philip, did you want to talk about your program to address this?
P. Twyford: We have spent the last two years working quite diligently. We’ve brought in quite a number of experts, very large international organizations, to help us with a review of how we manage IT projects. So to MLA Heyman’s earlier question, we have actually reset the balance, in some cases, between the level of outsourcing and the level of insourcing.
We actually hired 100 net new staff last year into IT organizations across ministries. We’ve developed a very robust framework. There’s a group of deputy ministers that reviews all of the IT projects.
What we’ve also done is every one of our IT projects…. There are 52 of them underway right now in ministries, but every one of those is broken down into small pieces. We have a diligent review. In quite a number of cases, we’ve brought in external third-party organizations to do a review before it comes forward for assessment. We have a very robust business case process. We have a robust analysis and financial review process that we put these through.
Some of our current projects that are underway right now, we actually have mentoring agreements in. We’re working with the Public Service Agency to build that capacity and build the talent. We’re working with the comptroller general’s office on policies and procedures around those. We’ve made all of that information available to our colleagues and Treasury Board staff, who have the ability to, through Treasury Board, actually provide direction to some of the broader organizations within the GRE.
I think we’ve done quite a bit of work. Certainly, for the projects that have started within the last couple of years, we are seeing progress. We have brought in those third parties — Ernst and Young, KPMG. We brought in PwC to do independent assessments and reviews, and those reviews are very positive.
I understand what you’re saying about the large projects. We, too, get quite frustrated by those. Certainly, we have put in a very clear governance structure. There is an individual for each one of those projects that is named and held responsible. They are always a business person, because these are business projects with an IT component. They are not truly IT projects anymore.
We have shifted the governance framework. We have put in place policies and tools. We’ve brought in outside experts to help us. We’ve built capacity to do that, and we’ve hired new staff to rebuild some of those organizations.
V. Huntington: I guess I would like the Auditor General to do a test of all of that, because we’re not seeing it here — unless I completely misunderstand your role within the reporting structure for the entities.
B. Ralston (Chair): I suppose within a parliamentary system, the ultimate responsibility is the minister. I think the office of chief information officer reports to the Minister of Technology, Innovation and Citizens’ Services. So at the level of political accountability, that, in a parliamentary system, would be, I suppose, where the buck stopped in that respect.
I think your question is a good one in terms of peeling away where the accountability at the senior staff level is being assigned.
V. Huntington: Excuse me, Mr. Chair. Could we hear the governance side of it, which was going to be commented on by you?
B. Ralston (Chair): I thought that was just a fairly extensive answer on that very issue.
V. Huntington: Was that the answer you were giving? That’s fine, then.
I. Bailey: Well, I don’t think I can answer the overall governance question.
V. Huntington: Oh, I’m sorry. You said you wanted to comment on that.
I. Bailey: No, I’m sorry.
B. Ralston (Chair): Okay. I’ll move on to the next person.
George, you’re next.
G. Heyman: Thank you. I have two questions — one for the Auditor General and one for Mr. Bailey and Mr. Twyford.
The first one is…. Mr. Lau, in answer to my earlier question, said that he encountered, or your office encountered, far too often the assumption that once something was contracted out, that’s all that needed to be done, that it would be taken care of, and there wasn’t particularly the capacity or knowledge to evaluate compliance on any number of measurements or to ensure that it happened.
We’ve reviewed a number of IT projects here, whether it’s health records, where we’ve found that security promises in the contract weren’t kept, hardware upgrades that were the whole premise of the contract weren’t met. We’ve seen contracts in Panorama where contract deliverables weren’t met, and the contracts were either extended or rewritten. We’ve seen other IT systems that simply have failed to work or gone massively over budget.
There’s quite a litany of failure of security measures, performance measurement or delivery, contract compliance and a number of other things, and you’ve reported to us on that.
My question is: as a result of this audit and what you found in this audit, does that point you in the direction of any further investigations that you think you might want to undertake to give a better sense of the functioning of government IT systems and contracting and contract management in general or that you think we should be asking for in order to properly perform the due diligence that comes with being a member of this committee?
C. Bellringer: One of the things…. I’m just jotting down the number of different topics that are coming out of the discussion. One of the major distinctions is between projects where there’s…. It’s a new project. It’s an implementation. It’s a new contract in order to acquire a new system. Those sorts of things were the subject of many of the audits that have come forward to this committee.
The next one on the list in that category that we’ve already identified is not an audit. It’s a study, an information piece, into the challenges in implementing large IT projects. That one is coming up.
Then we identified Cerner within the Ministry of Health to look at the actual implementation and whether or not, in effect, lessons have been learned from previous learnings. Have those improvements actually been made, and can we see that in that detail, when you really are getting into the detail and doing a full audit on that particular implementation?
Then there’s the maintenance piece. Now you’ve got all these systems out there. I’d say the governance issue is over both of those things. Again, we’re hearing quite a lot of activity taking place within ministries that the CIO office is doing oversight around.
My greater concern…. I’m not giving a blessing on everything else, but I still have a concern with the obvious gap in the organizations outside of core government, outside of ministries — all of the Crowns, agencies, boards, commissions that are all out there and each individual organization. While I respect the board’s role, at the end of the day, it’s still government’s responsibility to ensure those boards are doing their job. Are they getting adequate information — let’s keep it to the IT area — on both the implementation side and on the maintenance side?
From this exercise, on the maintenance side, I do have concerns about a number of the organizations we were out at, and I would suggest David’s reflections are from a number of organizations he would have been out doing work with. Whether it was this exercise or some other, we do controls work when we’re doing the audit of the public accounts, so we have quite a broad exposure to a lot of these organizations. But I’m still not seeing any…. It doesn’t need to be in one single organization, but there does need to be some accountability in that framework piece. As I say, we’ll bring that out going forward.
When we’re doing our…. We do the update of our performance audit coverage plan every year. I have to say I’m feeling more and more like we have to devote more resources into the IT area. While we were doing our planning this year…. We haven’t made that final decision, but I am considering shifting some resources from other places to have more IT people in the office. We have quite a large staff, and we do quite a bit of work as it is, but I have enough concern to suggest we need to put more resources into it.
G. Heyman: Thank you.
My question is for Mr. Twyford. In terms of the response to the maturity level, you said that reaching maturity level 5 would be very expensive.
I think we all know that perfection is hard to achieve, but you said you’re aiming for three, and three is: “Procedures have been standardized and documented and communicated…. It is mandated that these processes should be followed; however, it is unlikely that deviations will be detected.” No. 4 is: “Managed and measurable: management monitors and measures compliance.” Frankly, given this report, it’s beyond me why you would aim for three and not level 4.
My question is: why? Do you think that aiming for three is an appropriate response to the report we’ve just reviewed?
P. Twyford: I did say between three and four, and it depends….
G. Heyman: You said three. You can check it with Hansard.
P. Twyford: I believe I did say three to four. I’m happy to check that.
B. Ralston (Chair): We don’t do read backs here like they do in court.
P. Twyford: Okay. Our target is between three and four, and it depends on the organization. It depends on risk assessment. For organizations with a relatively low degree of risk, a three may be acceptable in terms of a baseline. Four is ideal. Five is generally, for public sector organizations, very expensive.
I. Bailey: I’ll respond to that as well. As you know, government has many different programs with different risk profiles. I agree that there are some organizations that should be getting as high as they possibly can. There are others where they are not managing critical information or critical services, and it’s of less importance. I don’t think we have decided on a targeted number at this point in time.
B. Ralston (Chair): Carol, did you…?
I. Bailey: I would actually appreciate an answer from your office.
C. Dover: Our recommendations are, basically, stating that the organizations have a look at their risk within the organization and determine what the target maturity level is. It’s based on the complexity and sensitivity of the information that they’re gathering and maintaining.
We went with three as a baseline. It’s a good place to start, because you have some procedures in place. You have people who understand what it is that they need to do, and it reduces the risk tremendously compared to a level 2 or a zero. Ideally, again, five is what we strive for. It’s not always what we can attain. Four is definitely a process where it is managed and you’re in a better place.
Again, it depends on the type of organization and the type of information and the types of systems that are being run.
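The maturity scale the witnesses are debating can be sketched as follows. This is an illustrative sketch, not material from the hearing: the level names paraphrase the generic COBIT-style maturity model quoted earlier, and the risk-to-target mapping is an assumption drawn from the testimony that a level 3 baseline may suffice for low-risk organizations while higher-risk ones should aim for level 4.

```python
# Illustrative sketch of a COBIT-style 0-5 maturity scale and a
# risk-based target check. Level wording paraphrases the model quoted
# in the hearing; the risk-to-target mapping is a hypothetical example.

MATURITY_LEVELS = {
    0: "Non-existent",
    1: "Initial / ad hoc",
    2: "Repeatable but intuitive",
    3: "Defined: procedures standardized, documented, communicated",
    4: "Managed and measurable: compliance monitored and measured",
    5: "Optimized",
}

# Hypothetical mapping reflecting the testimony: low-risk organizations
# may accept level 3 as a baseline; higher-risk ones should aim for 4.
TARGET_BY_RISK = {"low": 3, "medium": 4, "high": 4}

def meets_target(risk: str, assessed_level: int) -> bool:
    """True if an organization's assessed maturity meets the target
    level implied by its risk profile."""
    return assessed_level >= TARGET_BY_RISK[risk]

print(meets_target("low", 3))   # a level 3 baseline suffices at low risk
print(meets_target("high", 3))  # but falls short for a high-risk organization
```

The point of the check is the one Mr. Dover makes: the target is not a single number but a function of each organization’s risk and the sensitivity of its information.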
K. Corrigan: With regard to managing third-party services, I believe it was said that there are 600 contracts out there for managing IT. Is that correct? I heard a number — 600, I thought. No? Okay. I’m getting a nod and a shake.
I. Bailey: You did. That was in the Auditor General’s report, and I think they could explain exactly what the 600 means. I think it’s actually services and not contracts.
B. Ralston (Chair): Did you want to explain that, then, Mr. Dover or Mr. Lau?
D. Lau: When we send out a self-assessment form, we ask the entity to list how many outsourced IT services they have. Ian was right. It’s really outsourced services, not always systems development or anything.
K. Corrigan: Okay. I’m wondering whether or not…. We’ve talked a little bit about this. If IT services are outsourced, there’s a certain responsibility to make sure that the standards still are being met. I note that the office has said that some services are being brought in, or maybe there’s more oversight. There has been a hiring of 100 IT specialists.
I’m wondering if there’s any comment about whether a comparison of standards and safety, I guess — a comparison between when things are outsourced or when they’re kept in-house…. Can you check it? Like, how can you check it? Has your office, the Office of the Auditor General, tried to dig down and see whether or not those outsourced services have the same level of assurance as in-house? Also, what proportion of all IT…? Is that 600 a third, half or what of government IT services? That’s a bunch of questions.
D. Lau: Yeah, okay. The 600 is really just a number that we collected from them, from the 148 organizations that we surveyed.
C. Dover: As for the assurance, with government entities, we can actually go in and do the audit ourselves, and we can get direct assurance on what they’re doing. With third parties, we rely on the report that David mentioned earlier — the 3416, which is a service auditor’s report. That’s our only mechanism for getting assurance from third-party entities. So we’re relying on the organization to ask their service provider to provide them that report. If they don’t receive that report, then we really don’t have a good handle on what assurance they’re getting from the service provider.
K. Corrigan: Are you comfortable with that level of assurance in that they’re doing their own report on how well they’re doing? Or they’re getting an external person to do it.
C. Dover: It’s an external audit done on the service provider.
C. Bellringer: It’s a mechanism that was put in place…. I sort of remember. Frighteningly, we probably both remember the genesis of much of that in the days where….
The external organization is serving many clients, and each of those clients wants assurance. So a mechanism has been developed where they go in and get an audit done, and then that audit report is available to all the various users. It gives you a great deal of information about the strengths of the controls, and it’ll also identify any weaknesses in the controls, if they’re significant.
K. Corrigan: So you’re comfortable with that.
C. Bellringer: With the process, yes, but not when it’s missing.
K. Corrigan: Right. No, that would not be helpful.
B. Ralston (Chair): That might be a given, yes.
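The shared-report mechanism Ms. Bellringer describes can be sketched in miniature. This is an assumed structure for illustration, not the audit office’s own tooling: one service auditor’s report (the CSAE 3416 report mentioned above) is produced per service provider and reused by all of that provider’s client organizations, and a client whose provider has no current report is exactly the “missing” case the witnesses flag.

```python
# Illustrative sketch (hypothetical names and structure): one 3416
# service auditor's report per provider, reusable by every client;
# clients whose provider lacks a report have an assurance gap.

from dataclasses import dataclass, field

@dataclass
class ServiceProvider:
    name: str
    has_3416_report: bool = False  # a current service auditor's report exists

@dataclass
class ClientOrg:
    name: str
    providers: list = field(default_factory=list)

    def assurance_gaps(self):
        """Providers for which this organization holds no 3416 report."""
        return [p.name for p in self.providers if not p.has_3416_report]

# Hypothetical example: one audited provider serves the organization,
# a second provider has never been audited.
host = ServiceProvider("HostingCo", has_3416_report=True)
dev = ServiceProvider("DevShop", has_3416_report=False)
org = ClientOrg("Agency A", providers=[host, dev])
print(org.assurance_gaps())  # ['DevShop'] — no third-party assurance here
```

The design mirrors the economics described in the testimony: the provider is audited once, and the single report serves many clients, so assurance fails only where no report was obtained at all.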
K. Corrigan: I wanted to ask about the overestimation of the sophistication with regard to the computer controls and the number of times that they overestimated — some of those specifics. When you go through table 1 on pages 12 to 15, the column on the right lists some of the deficiencies in general computing controls. Those deficiencies would have been just in the 13, right? Because those are the ones that you checked into.
Just as a comment, I guess, I find it kind of scary that out of 13, you would have four, for example, whose procedures with regard to installing and accrediting solutions and changes were “ad hoc, informally documented, still being developed.”
Change management processes were “not established, not formally documented, in the process of being developed, in the early stage of implementation.”
And then with regard to ensuring continuous service, IT continuity plans were “non-existent” or “in existence but neither updated nor regularly tested.”
We haven’t talked about those specific findings, but I think it’s important to get it on the record that these are
really seriously lacking, in many of them. Did that concern you, that they were so far off in some of these control mechanisms?
C. Dover: Well, the concern was that they thought they were addressing a risk that they really hadn’t. If we talk about ensuring continuous service, they might have had a business continuity plan, but they never tested it. So they don’t know if the business continuity plan was even going to function if they had an emergency. They were kind of in a position of being comfortable with where they were at but not identifying that there may be higher risks within the organization that they hadn’t really identified.
K. Corrigan: It was certainly concerning to me. Like, seven out of 13 organizations didn’t have sufficient evidence with regard to managing the physical environment. Managing the physical environment was the subject of, or certainly part of, for example, the JUSTIN report, saying so many people had access to very sensitive information. Same with CORNET. So I’m surprised that this is still continuing. I guess it’s more a comment than a question.
L. Reimer: I guess my question is for Mr. Bailey and for the Office of the Auditor General. We’ve heard lots about our own systems and the maturity levels of various ministries and post-secondary institutions, etc. Those maturity levels are determined by an internationally accepted framework.
I guess my question is: how are other governments handling this? How are other provincial governments handling this? How are governments down in the United States handling this? We certainly know that everybody is threatened every day, both private and public agencies, internationally. There must be some form of best practice out there that’s being utilized, possibly, by another government. How do we compare? Technology is kind of a new area for everyone, but how are we comparing against other governments that face these same issues and challenges?
C. Bellringer: We didn’t do an interjurisdictional or international comparison on this one, so we don’t have that information.
C. Dover: But I can comment on standards. There are a number of international standards, like…. The ISO 27001 is an international standard for security. There is COBIT, which is a standard for governance for information technology. There’s ITIL, which is a standard for data centres. There are a number of them out there that are international and are followed by most organizations. Just having the time and resources to be able to implement them all, I think, is one of the issues.
I. Bailey: In terms of those standards, our information security policy is based on the ISO 27001. My organization is also an ITIL standard–based organization, and the toolset we use is ITIL-based. We aren’t, I would say, at sufficient maturity level yet on that, because that’s new for us. We are also a TOGAF standards…. We follow the TOGAF standard for business enterprise architecture. That’s ensuring that your IT implementation is meeting business needs. So we have that as well.
Philip, did you want to talk to the standards that you are…?
P. Twyford: We also follow the COBIT standard, COBIT 5 — control objectives for IT, as it used to be referred to — and a framework called Val IT for our projects and project management. We do adopt quite a number of the international standards and adhere to them, as Ian says.
L. Reimer: Other governments are also utilizing those?
I. Bailey: In terms of across Canada, I’m not aware of a quantitative comparison of the provinces and territories and the federal government. Our information security team works all across Canada and with the federal government on an operational basis to identify risks and share information. I suspect it’s similar across Canada. That would be my suspicion, based on kind of what’s going on.
B. Ralston (Chair): That was quite a litany of acronyms. I hope Hansard will get some help in transcribing those.
I. Bailey: This gets to what Carol was talking about. I would say that in the private sector there is significant maturity being achieved in setting standards and in compliance and audit. It is far better, in their interest — and I have these conversations with some of our suppliers — for them to adhere to standards, be audited and have these reports available for all of their customers so that they are not having to treat these as one-offs and do this work over and over again.
That’s something that we’re seeing in the marketplace, as our large suppliers are getting there. I think there are some that are setting the gold standard for that, like Microsoft and Amazon and big companies like that, which have tremendous resources to achieve those. I would say that that’s very encouraging for the industry.
I don’t know if you’d agree with that, but that’s our assessment in our dealings.
V. Huntington: Speaking of acronyms, you mentioned Val IT as one of the processes, and standards for projects and project management. Could you explain a little bit
about what that is and whether you monitor whether ministries are…? Or is it a requirement that ministries use that standard? How do you monitor that for any compliance or approval systems?
P. Twyford: Again, looking at projects within ministries, we have a framework for IT project management and how it’s being approached, and it was built on the Val IT 2.0 framework, which, quite frankly, meets the same objectives as COBIT.
Really what it’s about is aligning to business objectives, ensuring that you have the right governance structures for those and then ensuring that you have the right processes in place to deliver on what you have set out to deliver and to correct issues as they come up, quickly.
In a nutshell, that’s what that framework is about.
V. Huntington: What is your expectation of ministries when they’re undertaking a major project?
P. Twyford: Under the new framework, as of August 2015, there was actually a directive that came out. Any new project within ministries, regardless of size, must come through the OCIO. We do assess it. We review it.
We provide advice to our colleagues and Treasury Board staff and into Treasury Board, looking at all of the different elements where we see risk. We do bring in independent experts to review some of those key risk areas.
Once the project is underway, we actually go in and we assess those projects. So the project manager in the ministry is responsible for undertaking the project and managing the project. We don’t dilute that responsibility, but we do come in with an oversight role to say: “Is the project being effectively managed?” If we feel that there’s a concern, then we will bring in other parties — they may be internal or external — and we will work with the organization to address those issues and make sure that the project gets back on track.
V. Huntington: That holds true for every major project in the core government? And nothing like that was in place prior to last August?
P. Twyford: I can only speak to what we’re doing recently, what we’ve put in place with the major projects and the minor projects. It was applied to the major projects in 2015, so that is the new process that’s been extended. It has been in place with the minor projects — anything under $20 million is the threshold — for about a year now.
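The oversight cycle Mr. Twyford describes — every new ministry project coming through the OCIO for assessment, periodic review once under way, and escalation to independent experts if concerns arise — can be sketched as a simple state machine. The stage names here are hypothetical labels, not OCIO terminology; the flow is an editorial paraphrase of the testimony.

```python
# Illustrative sketch (hypothetical stage names): the OCIO oversight flow
# described in the testimony. The ministry retains delivery responsibility
# throughout; the OCIO assesses, reviews and, where concerns arise, brings
# in independent reviewers to get the project back on track.

from enum import Enum, auto

class Stage(Enum):
    SUBMITTED = auto()   # ministry submits the project (per the Aug. 2015 directive)
    ASSESSED = auto()    # OCIO assesses risk and advises Treasury Board
    IN_FLIGHT = auto()   # ministry delivers; OCIO periodically reviews
    ESCALATED = auto()   # concerns found: internal or external experts brought in
    ON_TRACK = auto()    # review found no concerns

def next_stage(stage: Stage, concerns_found: bool = False) -> Stage:
    """Advance a project one step through the sketched oversight cycle."""
    if stage is Stage.SUBMITTED:
        return Stage.ASSESSED
    if stage is Stage.ASSESSED:
        return Stage.IN_FLIGHT
    if stage is Stage.IN_FLIGHT:
        return Stage.ESCALATED if concerns_found else Stage.ON_TRACK
    if stage is Stage.ESCALATED:
        return Stage.IN_FLIGHT  # work with the ministry, then review again
    return stage
```

Note the loop from escalation back to in-flight review: in the testimony, escalation does not take the project away from the ministry but adds reviewers until the project is back on track.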
K. Corrigan: I wanted to just ask. You mentioned earlier that there was…. I may have gotten this wrong, but for two large IT projects, government engaged Ernst and Young and KPMG to do assessments. I’m wondering if you could give us a little more information on that. Is it possible for us to see those reports?
P. Twyford: I’ll answer the second part first.
I would need to check the specific terms and conditions. I don’t see any reason why we couldn’t share them. What we did when we looked at the projects…. We have two projects that are currently in flight that are major projects starting up. I’ll treat each of them independently, because they’re at different places.
The first project. When we looked at the business case, what we actually did…. This is based on a framework out of Oregon. You were asking about best practices. Oregon has a framework called the quality assurance framework. We have modelled our process on that. We also adopted a U.K. process as well — so looking at different jurisdictions.
Once we had the business case done, we actually brought in PricewaterhouseCoopers, and we had them go through and look at all of the key elements based on what’s called the PRINCE2 framework for project management. There are different frameworks, but we used the PRINCE2, looking at all of the key elements and the key risk areas.
They identified a number of them as they were going through the project, and we actually worked with the project team to, what we call, cure them — to address them and fix them and put the right processes in place.
We were sufficiently comfortable that we had either reduced the risks or put risk mitigation processes in place to recommend that the project go forward. But best practice is also breaking projects into really small pieces.
To look at an example, we only approve up to the first deliverable — the first piece that you could capitalize and say that that is an asset. That’s what we call the minimum viable product. It’s getting it as small as you possibly can.
Once that project was then up and running, we actually brought PricewaterhouseCoopers, PwC, back in to have another look at it. We have now done three of those reviews on the one project, and what we’re seeing is progressive improvement.
On the other project that we have on the go, we did work with Ernst and Young and with KPMG looking at a risk framework. We’ve also brought Gartner Consulting in to have a look at one of the projects as well.
So we have developed the framework. We’ve developed the process. We’ve developed the standards. Now what we’re doing is bringing in independent experts to actually have a look at those projects and provide us with a third-party review.
What we do, quite frankly, is we bring in people who’ve been through a lot of big projects, so they’ve actually got experience. It’s not just another set of eyes, but it’s another set of pragmatic eyes to, again, look at some of those issues, address some of those risks and help us make the determination: what’s the most cost-effective way to address the risk?
K. Corrigan: Are you able to let us know what those two IT projects are that you’re talking about?
P. Twyford: Unfortunately, they’re before Treasury Board right now, so I can’t speak specifically to them. I’m sure they’ll be coming out fairly soon, at which point in time I would be happy to go into more detail.
B. Ralston (Chair): It’s too bad you weren’t here yesterday. We had the secretary to the Treasury Board here.
I think that’s fair enough. That’s obviously confidential at this stage.
Any further questions?
K. Corrigan: No, I’m okay.
B. Ralston (Chair): Unless there’s anyone else who wants to dive in, I think we’ve exhausted the questions. That was, I think, a good and fruitful discussion.
I’m going to thank the Auditor General and her team, Mr. Dover and Mr. Lau, and Mr. Bailey and Mr. Twyford from the office of the chief information officer. Thank you very much. I’m sure this topic will continue to attract the attention of members in the future, so I think that was a helpful discussion for some future consideration of these issues.
Before members go — you probably know this — on prorogation, which will take place in the morning on Tuesday, all the committees are dissolved. This is, strictly speaking, a sessional committee. Then there will have to be a motion passed in the Legislature to reconstitute the committee, and that will take place usually within a week or ten days, by the time the House Leaders get around to it. Then we’ll devise our future schedule during the course of the session.
Thank you very much. I think that was a fruitful day and a half. See you all on Tuesday, I expect.
We’re adjourned.
The committee adjourned at 2:58 p.m.
Copyright © 2016: British Columbia Hansard Services, Victoria, British Columbia, Canada