2016 Legislative Session: Fifth Session, 40th Parliament
SELECT STANDING COMMITTEE ON PUBLIC ACCOUNTS
Wednesday, November 23, 2016
9:00 a.m.
West Meeting Rooms 111 and 112, Vancouver Convention Centre
1055 Canada Place, Vancouver, B.C.
Present: Bruce Ralston, MLA (Chair); Sam Sullivan, MLA (Deputy Chair); Dan Ashton, MLA; Kathy Corrigan, MLA; David Eby, MLA; Simon Gibson, MLA; George Heyman, MLA; Marvin Hunt, MLA; Vicki Huntington, MLA; John Martin, MLA; Lana Popham, MLA; Linda Reimer, MLA; Selina Robinson, MLA; Ralph Sultan, MLA; Laurie Throness, MLA
Others Present: Carol Bellringer, Auditor General; Stuart Newton, Comptroller General
1. The Chair called the Committee to order at 9:02 a.m.
2. The following witnesses appeared before the Committee and answered questions regarding the Office of the Auditor General Progress Audit Report: Effectiveness of B.C. Community Corrections (May 2016).
Office of the Auditor General:
• Carol Bellringer, Auditor General
• Malcolm Gaston, Assistant Auditor General
• Laura Pierce, Manager, Performance Audit
Ministry of Public Safety and Solicitor General:
• Brent Merchant, Assistant Deputy Minister, Corrections Branch
• Bill Small, Provincial Director, Community Corrections Division
• Elenore Clark, Provincial Director, Strategic Operations Division
• Leigh Greiner, A/Director, Research Planning and Offender Programming
3. The following witnesses appeared before the Committee and answered questions regarding the Office of the Auditor General Report: An Audit of Mid-Size Capital Procurement in Post-Secondary Institutions (May 2016).
Office of the Auditor General:
• Carol Bellringer, Auditor General
• Christopher Thomas, Senior Manager, Financial Audit
Government:
• Kevin Brewster, Assistant Deputy Minister, Ministry of Advanced Education
• David Galbraith, Deputy Secretary to Treasury Board, Ministry of Finance
• Heather Hill, Executive Director, Capital, Ministry of Finance
• James Postans, Director, Ministry of Advanced Education
4. The Committee recessed from 11:59 a.m. to 1:01 p.m.
5. The Committee resumed consideration of Office of the Auditor General Report: An Audit of Mid-Size Capital Procurement in Post-Secondary Institutions (May 2016).
6. The Committee recessed from 1:43 p.m. to 1:47 p.m.
7. The following witnesses appeared before the Committee and answered questions regarding the Office of the Auditor General Report: Getting IT Right: Achieving Value from Government Information Technology Investments (October 2016).
Office of the Auditor General:
• Carol Bellringer, Auditor General
• Sheila Dodds, Assistant Auditor General
• Kevin Keates, Manager, Performance Audit
Government:
• Cheryl Wenezenki-Yolland, Associate Deputy Minister and Government Chief Records Officer, Ministry of Finance
• David Galbraith, Deputy Secretary to Treasury Board, Ministry of Finance
• Bette-Jo Hughes, Associate Deputy Minister and Government CIO, Office of the Chief Information Officer, Ministry of Technology, Innovation and Citizens’ Services
• Heather Hill, Executive Director, Capital, Ministry of Finance
• Philip Twyford, Executive Director IM/IT Capital, Office of the Chief Information Officer, Ministry of Technology, Innovation and Citizens’ Services
8. The following witnesses appeared before the Committee and answered questions regarding the Office of the Auditor General Report: Management of Mobile Devices: Assessing the Moving Target in B.C. (October 2016).
Office of the Auditor General:
• Carol Bellringer, Auditor General
• Sheila Dodds, Assistant Auditor General
• John Bullock, Senior IT Audit Specialist
Government:
• Bette-Jo Hughes, Associate Deputy Minister and Government CIO, Office of the Chief Information Officer, Ministry of Technology, Innovation and Citizens’ Services
• Ian Bailey, Assistant Deputy Minister, Technology Solutions, Ministry of Technology, Innovation and Citizens’ Services
• Cheryl Wenezenki-Yolland, Associate Deputy Minister and Government Chief Records Officer, Ministry of Finance
• David Curtis, Assistant Deputy Minister, Corporate Information and Records Management, Ministry of Finance
• Sharon Plater, Executive Director, Privacy, Compliance and Training Branch, Ministry of Finance
9. The Committee adjourned to the call of the Chair at 4:02 p.m.
Bruce Ralston, MLA, Chair
Kate Ryan-Lloyd, Clerk
The following electronic version is for informational purposes only.
The printed version remains the official version.
WEDNESDAY, NOVEMBER 23, 2016
Issue No. 30
ISSN 1499-4240 (Print)
ISSN 1499-4259 (Online)
CONTENTS

Auditor General Progress Audit: Effectiveness of B.C. Community Corrections (Page 1015)
C. Bellringer
L. Pierce
B. Merchant
B. Small
M. Gaston
E. Clark
L. Greiner

Auditor General Report: An Audit of Mid-Size Capital Procurement in Post-Secondary Institutions (Page 1036)
C. Bellringer
C. Thomas
D. Galbraith
K. Brewster
S. Newton

Auditor General Report: Getting IT Right: Achieving Value from Government Information Technology Investments (Page 1046)
C. Bellringer
K. Keates
B. Hughes
D. Galbraith
C. Wenezenki-Yolland
P. Twyford
S. Dodds

Auditor General Report: Management of Mobile Devices: Assessing the Moving Target in B.C. (Page 1062)
C. Bellringer
J. Bullock
B. Hughes
I. Bailey
C. Wenezenki-Yolland
Chair: Bruce Ralston (Surrey-Whalley NDP)
Deputy Chair: Sam Sullivan (Vancouver–False Creek BC Liberal)
Members:
Dan Ashton (Penticton BC Liberal)
Kathy Corrigan (Burnaby–Deer Lake NDP)
David Eby (Vancouver–Point Grey NDP)
Simon Gibson (Abbotsford-Mission BC Liberal)
George Heyman (Vancouver-Fairview NDP)
Marvin Hunt (Surrey-Panorama BC Liberal)
Vicki Huntington (Delta South Ind.)
John Martin (Chilliwack BC Liberal)
Lana Popham (Saanich South NDP)
Linda Reimer (Port Moody–Coquitlam BC Liberal)
Selina Robinson (Coquitlam-Maillardville NDP)
Ralph Sultan (West Vancouver–Capilano BC Liberal)
Laurie Throness (Chilliwack-Hope BC Liberal)
Clerk: Kate Ryan-Lloyd
The committee met at 9:02 a.m.
[B. Ralston in the chair.]
B. Ralston (Chair): Good morning, Members. We have an agenda before us. The first item on the agenda is the Office of the Auditor General, a progress audit report on a previous report from May 2016, Effectiveness of B.C. Community Corrections.
Representing the Office of the Auditor General: Carol Bellringer, the Auditor General; Malcolm Gaston, the assistant Auditor General; and Laura Pierce, who is the manager of performance audit.
Representing the ministry, the auditee: from the Ministry of Public Safety and Solicitor General, Brent Merchant, who is the assistant deputy minister, corrections branch; Bill Small, provincial director, community corrections division; Elenore Clark, provincial director, strategic operations division; and Leigh Greiner, acting director, research planning and offender programming.
I’ll turn it over to the Auditor General to begin.
Auditor General Progress Audit:
Effectiveness of
B.C. Community Corrections
C. Bellringer: Good morning, Members.
In 2011, we audited the B.C. community corrections division to determine its success in reducing the rate of reoffending among offenders under community supervision. We made several recommendations to address risk factors that we found were hurting the division’s performance. Since that time, we’ve continued to monitor the progress the division has made in addressing our recommendations, and in 2015, we determined the timing was right for a progress audit.
This was the first audit conducted under the new follow-up audit approach, although staff from my office presented on the results of a similar audit, the FICOM progress audit, back in October. We’re encouraged by the progress the division has made but recognize that more work is needed to fully address a number of our recommendations.
I’ll turn it over now to Laura, who will walk you through the results of the work.
L. Pierce: Good morning, Members.
Community corrections is an important part of our criminal justice system. Of all the offenders under correctional supervision in B.C., approximately 90 percent of them are supervised in the community. Only 10 percent are sent to jail.
The focus of community supervision is rehabilitation. When done right, it can reduce the rate of reoffending, providing a much-needed boost to public safety and saving the system money. In 2011, the difference was $7 a day in the community versus $194 a day in custody. It’s a much cheaper form of supervision. Community supervision also allows offenders to stay in their communities where they have access to their families, can remain employed and retain their housing.
In 2011, we published a report on the effectiveness of the B.C. community corrections division. We made eight recommendations to address key risk areas that were impacting the division’s performance. Since that time, the division has completed three self-assessments and one action plan of its progress in implementing our recommendations.
With this progress audit, we were looking to see if its assessment against the recommendations was accurate. Overall, we found that the community corrections division has made progress in addressing our recommendations, but more can be done. The division's self-assessments and action plan stated that it had fully or substantially implemented all of our recommendations, but we found that only one met this standard. I should clarify that we only looked at six of the eight recommendations. Of the remaining recommendations, we found that the division has partially implemented four and taken no action on one.
Although the division’s assessment of its progress differed from ours, we’d like to point out that this was largely a difference in interpretation. The division based its assessment on both the effort it put into addressing each recommendation as well as the actions that were within its control. We were looking at the extent to which it had achieved the recommendation. We have worked with the comptroller general’s office to help ensure that expectations are clear for future action plans.
As I mentioned previously, we found that the division has substantially implemented one recommendation. This is the first recommendation — that they publicly report their performance in reducing the overall rate of reoffending. The reported rates are now broken down by community and custody corrections, and their website describes the role that the division plays in influencing reoffending.
Of the five recommendations we audited that are still outstanding, the division has committed to further work on three of these by looking to see that the changes it has made are working. You can see from the wording above that each has a strong quality assurance component to it.
For the other two recommendations, we are concerned that the division has no plans to implement them. Staff told us that there is nothing more they can do. These two recommendations are around evaluating the effectiveness of contracted services in community programs and identifying gaps in staff capacity and caseload level.
In our 2011 report, we talked about how the division provides programming to offenders through contractors
[ Page 1016 ]
and community resources — programming such as counselling, substance detoxification, housing and employment services. We recommended that the division extend its evaluation framework to understand the effectiveness of these, but we found that it has yet to do so. We also looked at staffing levels because the division reported being under-resourced.
In the end, we recommended that the division complete a comprehensive impact assessment to see if there are gaps between its staff capacity and caseload. The division has done some work in this area but feels that there is nothing more it can do to reach full implementation. By not implementing these recommendations, there is a risk that the division will not have a complete understanding of the impact its programs and resources have on its effectiveness.
This concludes our summary of the report.
B. Ralston (Chair): Thank you. I’ll now turn to the representatives of the Minister of Public Safety and Solicitor General.
B. Merchant: As the Chair had mentioned earlier, my name is Brent Merchant. I’m the assistant deputy minister of the B.C. corrections branch.
I thank the Office of the Auditor General for their report and their following up over the years. I think we’ve had four follow-ups now, or three follow-ups and the actual audit.
As Laura brought up, we have addressed all of the eight recommendations in this presentation, even though I think recommendations 5 and 8 were not part of the follow-up. As Laura had also said, there was a bit of a…. We had problems with the metrics for the ratings. There were some, I guess, conversations about how we rated and how the Auditor General rated those things. As we go through those, I’ll try to explain how we came to a different conclusion and rating than what they did.
The focus of the corrections branch, primarily — our main focus — is to ensure that the orders of the court are enforced and appropriately supervised. That’s our number one goal. By doing that, we also engage in programs that help address criminal behaviour.
I think most of you know that the corrections branch is one of the largest branches in B.C. government. We have over 2,300 staff. The work with the corrections branch is divided into four divisions. One of those divisions is our capital projects division. Another is the adult custody division, and also the strategic operation division. Bill leads the community division.
As Laura said, of the number of people, only 10 percent are in our correctional centres. But the numbers, I think, always surprise people. On any given day, like today, we supervise 25,000 individuals across 428 communities in the province, some of them very remote aboriginal communities and some of them in the Downtown Eastside.
Of those 25,000 each day, right now we probably have 2,750 inmates in our ten correctional centres. The remainder, which is around 22,500, are supervised by our probation officers in the community. The bulk of the people that we look after are in the community. Everybody thinks they’re all in jail, but they’re not.
With recommendation No. 1, I think there’s harmony there with the rating with the Auditor General that we have addressed this recommendation. I don’t know if…. Later you’ll probably have questions about it, but we do report out on our recidivism rates.
Before we had the audit, we did a kind of average, and we reported out on the recidivism rates. What we’ve done now is we report the recidivism rates for community and the recidivism rates for custody. They’re embedded in our ministry strategic plans, our branch strategic plans. We report out on the open data of that information.
This recommendation, to do with our evaluation of contracted service providers and community programs in reducing reoffending…. The Auditor General said that there’s no action taken. In a sense, we saw it as we have taken alternative action. In trying to explain that….
We evaluate our programs that we offer through our probation officers. Those are evaluated. We also have programs such as the downtown community court and the drug treatment court, where we are engaged with other partners to provide a service to the clients that we have. Those are evaluated.
We partner in a lot of ways. When clients come in, they may be presenting alcohol and drug problems. We work with the Ministry of Health to try to get services for those individuals. Do we evaluate the Ministry of Health programs? No, we don’t. To tell you the truth, I’m not sure how we could.
We have programs — more so, I think, in aboriginal communities — where you have one person doing the service. It’s a contract. We have a host of those kinds of contracts throughout the province. If you look at research and evaluation, given the sample size, it’s just about impossible to do an effective evaluation of those kinds of services.
That’s why we’re saying we have taken a different approach to it. We’re not trying to avoid doing work and trying to find out, because we are an evidence-based organization. But there are certain limits that we’re faced with in that area.
Recommendation No. 3 is to do with our staff capacity and our caseload. It’s not that it’s a difficult one. In the history of the corrections branch, there was a tool that was used called the staff planning technique. It did things like…. It said that if you did a pre-sentence report, it would take you nine hours. If you did other pieces of work, there were hourly amounts associated with that. Then you could total all those hours up and figure out what the capacity is of the program.
[ Page 1017 ]
But over time, we’ve evolved. We do what’s called risk-need-responsivity. The first part of that is risk. We evaluate the risk and the needs of the people that we supervise, and then we apply appropriate interventions to meet those risks and those needs.
Bill and his team are constantly — like local managers and regional directors in the community offices throughout the province always are — looking at the caseload and the workload of the probation officers. Then they make adjustments in that. You do have ebbs and flows, just like population growth. Some communities have a higher rate, and some communities have lower rates. We adjust our staffing based on those levels. So we do have a good tool.
We haven’t found a tool across Canada that addresses exactly what that recommendation does. Bill will probably explain. I’m sure you’ll have questions about this one. Bill will explain that in more detail after we finish going through these slides.
Recommendation No. 4 — to update our policies. Bill and his team have reviewed the policies and the training programs and confirmed the specific courses people need, depending on what they’re doing in their job. If you’re a sex offender specialist, you need to take certain courses, and that’s documented. Depending on the risk of the client and what they need, it’s determined by a number of courses that the probation officers…. We have gone through that extensively, and I think we’ve met it well.
Recommendation No. 5 — that's one that was not part of the audit but in terms of the quality assurance system that we have in place. It says what we're doing on the slide behind me. We have strengthened our quality assurance model by incorporating it as an aspect of a larger quality management framework. These structures and enhanced communication with staff have been implemented to ensure policy requirements are being met and that the quality and quantity of work is in keeping with existing policy.
Recommendation No. 6 — that we thoroughly document the rationale for risk assessment ratings and how offenders’ risks and needs be effectively addressed. We have made, I think, considerable changes in how we do our business. The Auditor General said that…. For us, we said we partially implemented. We’re in agreement with that — that we have partially implemented that one.
Community corrections ensures offenders receive and complete the interventions required in their case management plans. I remember, after the audit, the papers kept quoting part of the audit, saying that only 35 percent of our clients received the interventions appropriate to their risk and needs.
Was that correct? Yeah, it was correct, but it’s funny how different people will look at that differently. We are actually, in a sense, pleased with it. But that doesn’t sound very good, I don’t think, when you say only 35 percent. The people that come in under our supervision come in with a host of presenting problems: mental health, addictions, just a whole host of issues that have to be dealt with.
We do have interventions for different parts of those, but one of the problems that you have is…. I don’t know if it’s easier to relate this to a family, but if you have a teenager, and the teenager has a whole host of problems in the household, do you try to cure all of them at once, or do you take it in priority? Which is the one that is the biggest risk?
You deal with that, and then you move down the list. That’s how you try to do it, and that’s why that number, if taken out of context, that 35 percent, sounds so terrible. But for us, it’s actually how we work through. We prioritize the risk to the needs, and we work through each one of them. You can’t do everything to everybody all at once and expect success.
We do have people there for…. And on probation, the average is around 350 days, so it’s almost a year. But some of these things have been ingrained with them for years and years and years, and it takes time to change those kinds of behaviours.
With the last one, the last recommendation, No. 8, we’ve made policy changes, we have done training with our staff, and we are moving in the direction that the recommendation had suggested.
I think, when Laura started, it was, you know, that we’ve only met one, and there are a lot that are only partially completed and things like that. I just want you to understand that with a lot of the things that we’ve done, we’ve moved it ahead quite a bit, but the metrics are a bit different.
When I was in university, I used to get a nine-point rating scale, and that kind of helped to figure out where I was at, at university anyway. But we only have a few ratings here, so unless you’re at, like, 98 percent, you’re not going to get “fully completed.” That was kind of the hard thing for us.
I hope that’s helpful, and if you have any questions….
D. Eby: Mr. Merchant, the last time you appeared in front of this committee, you were talking about a different corrections report in relation to corrections activities inside the prison. Today we’re reviewing community corrections.
During that last time that you appeared in front of us, one of the issues that was raised with you was staff ratios in prison. You assured this committee that a 29 percent budget cut and the ratios that we presented to you, of staff to prisoner, within B.C. corrections institutions were not an issue — that you had dealt with these things, that there was smoked glass that allowed guards to keep track of what was happening, that ratios were an urban legend. This committee, frankly, gave you a lot of…. You’re the expert. You’re in the prison. You know what’s happening. You talk to your staff.
[ Page 1018 ]
During the break, I asked you about violent incidents in prisons, whether they were increasing or decreasing. You assured me that they were decreasing. I read that into the record in a question to you. You didn’t correct me. Then following the meeting, you sent to this committee a letter that indicated that not only are violent incidents increasing at five out of the eight facilities under your leadership but violent incidents actually doubled at Surrey Pretrial, and they were up significantly at every single institution except for Prince George.
My first question to you is — before we get into today’s report and the evidence you provided to the committee today — how is it that last time you were here, you could provide us with such incorrect information about the state of what’s happening inside the prisons in relation to violent incidents and staff ratios and their impact on safety in the prisons?
B. Merchant: Well, my understanding was that when I came here today, we were going to be addressing the community corrections audit, and the adult custody audit was not part of that. So do I have the materials in front of me to answer your questions accurately? No, I do not. But what you’re saying is that somehow I purposely misled you, and I absolutely did not.
D. Eby: I did not say that. I said: “How could you be so wrong?”
B. Merchant: Well, when you’re taking those things, when you’re talking about violence…. This year violence has gone up in the institutions. Yes, it has. You’re talking about Surrey. Surrey at that point…. If I’m not mistaken, at that time period, we just opened…. On the other side of Surrey, we built an addition to Surrey which increased the count in Surrey. So if you’re going to see the violence go up, you’ve got to take into consideration the number of inmates in there.
I’m not sure exactly the time period of that letter, so I don’t have that information in front of me, but I think that’s what you were probably seeing in that letter. But as I said, I don’t have it in front of me. I wasn’t prepared for these questions.
In terms of when you’re talking about staff levels, it is typically reported in the paper that we have ratios in our correctional centres. They use 1 to 60, and more currently they have said 1 to 72 — 1 to 72 in the Okanagan correctional centre. We had its official opening a few weeks ago, but we don’t even have any inmates in it, so it can’t be 1 to 72.
There are 36 cells in each unit. We do have one officer in every unit. In addition to that one officer in every unit, we have people that are assigned to rotate through those units. They go in and out of those units on a frequent, irregular basis. We have supervisors — included and excluded supervisors — that go in and out of those units. We have program staff that go in and out of those units.
We have radios, we have personal alarm transmitters, and we do have people at control desks with cameras that look down on them. We have put cameras in all of our correctional centres.
People start referring to one officer to 20 inmates. Then we used to get two inmates…. When we got one more inmate in, we got two officers with 21. That was a time when we didn’t have all this technology. We didn’t have all these cameras to oversee what’s going on.
The ratio of one to X number of inmates is very misleading unless you take the whole scope of what happens within a correctional centre. Many of the people that have toured in correctional centres around the province have seen that this is not what it looks like in the paper. I would invite you to come to the centres, and I’ll take you through them. You’ll see for yourself.
D. Eby: I may take the witness up on that.
My question. Year over year, five years in a row, increases at institutions you oversee. Surrey Pretrial, 300 incidents in 2013 and 900 violent incidents in 2014.
We are going to ask you about community corrections today, but what I’m trying to get my head around is whether or not you actually knew this information coming into the last meeting or whether you’re completely out of touch with what’s happening in the prisons. If you knew this information coming into the meeting last time, you should have raised it with the committee. You should have addressed it with the committee when you were asked about it. You didn’t.
Today you tell us that violent incidents are up. What is it you’re not telling us about community corrections in your response to the Auditor General today that we might not ask about because we just don’t know?
I have difficulty understanding how we’re supposed to get answers from you, especially when we hear from the Auditor General that you’re interpreting your own compliance with the Auditor General’s report by how much work, on a subjective basis, you feel that you’ve put into it, as opposed to whether or not you’ve actually achieved the goals. You said: “We did all the things you asked us to do.” And then they go and look, and you haven’t done five of the six things they asked you to do.
So I wonder if you can explain to me how you could be so wrong about violent incidents, how you could have five of the six things the Auditor General asked you to do that you report, “We did them all,” and actually five of the six you didn’t do, and two of the six you didn’t do anything on.
B. Merchant: Well, I just find it unfortunate that, for you, I have no credibility. That’s unfortunate. I have not presented a case where you’re believing what I say.
[ Page 1019 ]
I didn’t bring the material for the adult custody part, so I do feel, to a degree, a bit blindsided by this. I can give you that information. I’ll gladly meet with you outside of this or on another occasion when I have all the material, and we can have, I think, an informed discussion. Right now it’s not an informed discussion. You have your material in front of you, and I don’t have material in front of me.
B. Ralston (Chair): I’m sure we could arrange to have you back before the committee on that topic, if you choose to appear.
B. Merchant: Okay. That would be good.
B. Ralston (Chair): Anyway, why don’t you deal with the question of the community corrections report? There was a question about the recommendations and whether or not you’d met them.
B. Merchant: The metrics?
B. Ralston (Chair): Yes.
B. Merchant: Well, I think there was just a different…. When we reported out, we were asked to rate ourselves on what we did. That’s what we did. We rated ourselves. The Auditor General had their rating. Bill and his team met with them, and they had the discussion.
Maybe you want to talk about….
B. Small: Thanks, Brent.
The degree of disconnect between what the Auditor General found when they came back to visit us and where we were at was not as profound as the colour changes on some of the guides would lead us to believe. In discussions with Laura and her team….
What much of the disconnect turned on started at the beginning with the self-assessments. The scale that was provided was fairly limited in its application. When we launched out, we identified that we had partially implemented many of the recommendations and that we would endeavour to work towards full implementation where that was a viable option and realistic, so the work began. Through the course of the self-assessments, at various points…. We undertook a substantial amount of work.
By way of example, the feedback regarding our training was that we had good training courses but that we hadn’t yet developed, in their view, a very coherent alignment of the various modules of training and the corresponding tasks to which probation officers could be assigned at the conclusion of that training. On a principled basis, from the beginning, we knew that that was what we wanted to see happen. What the Auditor pointed out to us, very helpfully, was that, in some cases, we had probation officers who had not yet completed training. They would then be assigned work prior to the conclusion of that training, and we needed a better document for tracking that process.
Following the first audit, we undertook a very broad review of all of our courses, ensuring that we had a strong connection between the task and the training that precedes it and that we had a more effective tool for tracking, in real time, probation officers’ progress through that training continuum. That was, for us, a very…. I thought it was a very important piece of work that we needed to refine to make sure that we were more accurately tracking that.
All the tools were in place. We were very pleased that we had developed a much more comprehensive tool — the feedback from the Auditor was that we, indeed, had a good tool — but, through the course of our review, realized that some managers were using an alternate tool or an adapted tool and, in fact, were not using the tool exactly as it was intended.
For us, we then set out to…. At the point that we had developed all of the necessary tools, we saw ourselves as having moved from partial to substantial completion. But in the absence of a defined metric differentiating between substantial and full, we were left with no choice but to occupy that somewhat nebulous space. As the auditors noted to us, when we met with them on their return, they had not provided us with any markers whatsoever to define what the differences were.
It was really up to the program areas to define the tipping point between partial and substantial implementation. For us, we knew we couldn’t say with certainty that we had fully implemented until such time as we could be assured that the tools were in place, that they were being used and that we had enough time to follow up and ensure that they were being used on a consistent and ongoing basis.
B. Ralston (Chair): Just to the Auditor General, Mr. Small has attributed a number of interpretations of your recommendations. Do you wish to respond to that, or are you content with his interpretation of your efforts?
M. Gaston: The comments that Bill’s made in relation to the meetings that we had with him are absolutely accurate in terms of the conversations that we had.
In terms of the interpretation of fully or substantially, that was something that came up in our discussion. It became apparent that there was a different understanding as to what these different phrases meant. It’s something that we’ve picked up in this follow-up a lot more than we have in any other ones. As a result, working with the comptroller general’s office, the instructions that now go out with the follow-ups — with that clarification in terms of the different categories — would have been provided for the action plan that the division provided just last week, I think, in the material for today’s meeting.
[ Page 1020 ]
If you look across the self-assessment that they’ve made of progress against the recommendations, there’s much more alignment in how they’ve assessed their progress against the recommendations — in relation to where we had assessed the six recommendations that we’d followed up before.
On recommendations 2 and 3, they’ve gone into a slightly different assessment, which is alternative action taken. That’s where the assessment by the division is that the recommendation is no longer appropriate. I think the guidance that goes out is that that’s down to any changes that have taken place either within the ministry or externally. That’s, obviously, a different assessment from the ones that we were looking at, at the time of our audit.
B. Ralston (Chair): Just, then, for me. The action plans are a relatively new innovation that the committee is attempting to implement and monitor with the Office of the Auditor General.
Looking at recommendation No. 2 — and I’ll get back to Mr. Eby shortly — you’re, then, content with…. You’ve made some recommendations, and you’ve asked the auditee to implement them. The auditee has come back and described other actions that they’ve taken. In recommendation No. 2 in the detailed action plan, there is a column of action taken.
Are you saying that you are content with the ultimate explanation? Or are you simply acknowledging that the assertion has been made, that an alternate action has been taken, and you’re not commenting one way or the other? Or are you specifically disapproving? I suppose there are three categories: disapproval, no position whatsoever or approval. Can you explain that to me as we work our way through the action plans?
M. Gaston: I would say that we probably need to do some more work to conclude as to whether the new self-assessment is an appropriate assessment or not. Our work previously was based on a different assessment, and it may be that other action has been taken since that work concluded in March, I think, of this year. There are, obviously, six months since then.
B. Ralston (Chair): I’m reading from the detailed action plan. Action plan update is the first of the nine.
Recommendation No. 2, action taken — when does that date from? There’s alternate action taken, assessment of progress by the entity and then action taken. When was that action reported to the Office of the Auditor General?
I’m just interested in the sequence here. My impression is that this process is very protracted. I don’t think that serves anyone if it’s very protracted. I understand some work has to be done to sometimes implement these things, but it seems to be exceptionally protracted in this case. If someone is proposing an alternate action and then there’s no evaluation of that, that could drag on, literally — and, in this case, it looks like — for years. I think that defeats the supervisory jurisdiction of this committee and of the Auditor General.
C. Bellringer: I’ll ask Malcolm to jump in if I’m interpreting this incorrectly. At the time that we did this progress audit that this report is based on, I’d say that, no, we weren’t satisfied that what was done addressed the recommendation. What I’m hearing is that things have happened since then. We have not had a chance to have a look at those, so we don’t know whether or not it’s going to satisfy it.
Now, protracted? To a degree, yes. I mean, we ran into that problem with a point in time needing to stop…. Things go on after and during when we were doing an audit, so that time issue is always going to be a bit of a problem.
B. Ralston (Chair): Okay, thank you.
We’ll turn it back over to David.
D. Eby: This original report was done in 2011, and there were recommendations that were made to Corrections at that time. Between 2011 and Corrections being contacted by the Auditor General for the follow-up audit, what discussions took place to say: “This piece that you’ve identified, that we should follow up with our contractors in the community to make sure, when we have a contractor in a community that’s in a residential neighbourhood, that they’re ensuring safety for the families that live there”?
At what point did you contact the Auditor General and say, “We can’t actually follow up with these contractors; it’s too difficult; we don’t totally understand what you’re saying there,” and ask them for advice on how to address this issue that they identified for you?
B. Small: The recommendation that the Auditor provided to us at the time of the initial report — and through subsequent discussions we had had with them in preparation for that report as well as this audit — related to the effectiveness of the contractors as it relates to their impact on recidivism. This was an effectiveness audit looking at the impact that the community corrections division has on reoffending. To the extent that the contractors were involved, it was to what degree community corrections can say with certainty that each and every one of the programs they contract with is having an impact — preferably a downward impact — on reoffending rates.
Without delving too deeply into research and evaluation design — my background isn’t in that — what we do know is that there are limits on where we can have an effective research design in working with some of our contracted programs. We have an extensive array of programs with whom we do contract and partner where we have completed appropriately designed research and evaluations. There are many others, as Brent referenced, which are much smaller, scattered throughout the province.
[ Page 1021 ]
For those programs, in and of themselves, it is difficult to design a research or evaluation approach that would effectively measure that particular metric, because it relates to reoffending. That’s where the challenge was for us. Indeed, as we look at some of the smaller programs scattered around the province, they don’t, in and of themselves, have a direct impact on reoffending but, in fact, support our staff to deliver the types of programs and services that we’ve seen, through the data that we are releasing, are having that suppressive or downward impact on reoffending.
For us, our partners in small communities are actually working with us, and they are trained in the delivery of our programs so that they can work with our staff to deliver the programs that we have evaluated and that have had that, at times, very significant impact on reoffending rates in the community.
D. Eby: I don’t understand why it’s hard to evaluate how many people are reoffending out of a particular program, no matter how small it is. I don’t understand that. But you have an explanation.
When did you contact the Auditor General after the 2011 audit to say: “You’ve asked us to do this thing. For these reasons, it’s impossible or impractical for us to do. What do you recommend that we do?” When did you contact them and talk that through?
B. Small: We actually had those discussions throughout the entire course of the audit itself in 2011. We met routinely, and we were challenged by the notional difference between what we knew to be the limits of our research design and what they were asking us to do.
They challenged us to continue to look for ways to develop a way of doing evaluation that really isolates a fairly small variable — i.e., a single worker in a small community and what impact they are having relative to reoffending. For us, it’s the broad spectrum of services and programs that we provide that is having those impacts.
We had those conversations during the initial reviews, the clearance meetings and the final clearance meeting prior to the publication of the report in 2011.
In each and every instance, we registered with them the challenges that we were having with the notion that we could design an effective research design to address those types of contracted programs.
D. Eby: Mr. Chair, this is my last question. This is to the effectiveness of this committee and the Auditor General’s processes.
The last thing the Auditor General hears from you is at the clearance meeting in 2011, saying: “We have concerns about this.” Then they hear from you: “Okay, we’ve implemented it, fully implemented your recommendation.” They don’t hear from you again until the follow-up audit, when they come back and say: “What are you doing about these contractors in the community? Are you looking at whether they’re affecting recidivism? You say you’ve fully implemented it. Show us your work.” And you say: “Well, actually, it turns out that it’s too difficult to do.”
Is that the chronology?
B. Small: No. What we did throughout that entire series of self-assessments is work with the metrics that we had, which was that we had substantially completed the work that we could complete within the areas that we could. At the conclusion of that period of self-assessments, we sat down and that’s where we unpacked all of it and had a look at the amount of work we had completed, which by our measure was fairly substantial. It takes up the better part of a page on the action plan.
But the Auditor quite helpfully pointed out that at the end of the day, from their measure, that didn’t meet the test in terms of our ability to measure the effectiveness of the full spectrum of programs that we intersect with.
G. Heyman: I want to follow up on this question a bit, because there are a couple of things that jump out for me with respect to recommendation 2.
First of all, there appears to be a contradiction between the first point in your response, although you may have explained this a bit, where you say you’re assessing “the effectiveness of our own programs and programs in which we utilize contractors or partner with other agencies.” Then the next point says: “Evaluation of contracted services and community programs to which we refer clients is outside the scope of our responsibility.” That, in itself, appears to be a contradiction.
The second point, of course, is where, effectively, you say you have no intention of doing it. Then in your answer to Mr. Eby, you say it’s outside your capacity to design some appropriate evaluation or research program on programs that are essentially small, to which I’d ask you to explain: how do you justify contracting, with the use of taxpayers’ dollars, to agencies, no matter how small? So (a) you’re spending taxpayer dollars; (b) the programs are supposed to reduce reoffending, and if they’re not successful, there’s a tremendous cost to both society and taxpayers as well as to the individuals who haven’t been rehabilitated.
A number of us in this room have run organizations, supervised staff and had to do performance appraisals and evaluations on individuals. I just don’t get it. Like, (a) how can you say that it’s not possible to do an evaluation of a contracted agency, and (b) how can you contract if you can’t do that evaluation? What is the justification for contracting when you have no mechanism in place to assure taxpayers you’re spending dollars wisely? It just doesn’t make sense.
[ Page 1022 ]
E. Clark: I can address that.
In terms of the contracted agencies that we work with and the evaluation pieces, where we have the ability to conduct evaluations, we do do that. An example of that is our domestic violence programming. We partner with a contracted agency, and we have mechanisms in place to obtain data and have evaluated the effectiveness of that program in terms of reducing reoffending.
Other partner programs, such as — we mentioned earlier — the drug treatment court of Vancouver…. There is an interministry database that sits at SFU, and anonymized data from both Health and Corrections as well as SDSI is put together there. We’ve had the ability to link data there and conduct an evaluation of the drug treatment court.
The challenge comes where the Auditor asked us to put an evaluation framework in place related to our contracted service providers and community programs. In some cases, such as alcohol and drug treatment, we rely on the services provided through the health authorities to provide addiction services that are available to all of the citizens of British Columbia, including individuals under our supervision. There we do not have the linked data, nor do we have the expertise in the area of addictions to conduct such an evaluation.
With respect to some of the smaller contracted agencies, I think Bill did speak to the challenges related to conducting an evaluation of a very small service provider. Perhaps Dr. Greiner can speak to some of the statistical challenges in that.
L. Greiner: Generally speaking, you need a certain sample size, obviously, when you’re going to evaluate the effectiveness of any program. You also generally need some sort of comparison group. When you have a very distinct community with very distinct needs — an aboriginal liaison officer, for example, running a program — and they’re tailoring that to the clients of that community, developing a comparison group to demonstrate the effectiveness of that really small program, with a very small sample size, is next to impossible. I don’t know if that answers the question.
G. Heyman: A follow-up to this, and one other question, if you’ll allow that.
First of all, frankly, it just sounds to me like what you’re saying is: “We need to use a certain number of contractors and community agencies. Where we can’t do any research or evaluation, we just roll the dice and hope we’re right.”
So my question to the Auditor General is: do you think it’s reasonable, in this case, in community corrections, that contractors and community agencies are being used without any system in place to evaluate their effectiveness, as opposed to either looking for a way to evaluate effectiveness or looking for another service delivery model?
L. Pierce: We keep hearing from the division that we were prompting them to conduct those evaluations. But we recognize the resource limitations and the challenges in doing so when they don’t have direct oversight for those community programs.
In fact, on page 18 of the original report, we emphasize this. We say that if a direct evaluation cannot be undertaken, we would expect the division to be seeking assurance that the community program they’re referring to has some kind of evaluation done. They could get that evaluation themselves or understand the mechanisms that that organization is using to ensure that they’re providing appropriate services.
G. Heyman: Thank you.
My other question has to do with recommendation 3. If I understood the Auditor General report correctly, it was that the ministry says it has no intention to implement this recommendation, although it’s listed as a partial response. And the branch response says it “has completed the development of a comprehensive assessment tool to examine staff capacity and caseload level” as well as a tool to allow local managers to measure average workload.
I have a question for the ministry staff and a question for Auditor General staff. The question for the Auditor General staff is: do you believe that the actions being taken are responsive to the recommendation?
To the ministry staff: how far along are you in implementing the use of these tools? And depending on the findings of the tools, how do you plan to address any deficiencies that are found, either in terms of excessive average workload or excessive caseloads?
M. Gaston: Yes, we assess the progress on this one as being partially implemented. We certainly see the action that’s taken has been taken to address at least part of the recommendation that’s been made. But as we commented in the report, we didn’t feel that it was fully or substantially met.
B. Small: Following the audit in 2011, Corrections undertook work to develop a far more sophisticated tool for analyzing and assessing relative workload across the entire division. That tool informs all aspects of the work that’s undertaken by our staff. It has enhanced the level of measurement to include things like risk — the assessed risk that Brent referenced at the outset of our presentation.
[ Page 1023 ]
In doing so, it allows us to make a far more informed decision about where the work pressures are and allows us to be, in real time, more responsive to those shifting demographics in terms of the work presented by the clients we serve and our ability to be responsive to that by moving our resourcing to where it’s needed the most to address that workload. On that point, we were in agreement that we had done a substantial amount of work. I don’t know if that answers your question, though.
G. Heyman: Well, if you shift resources from an area where you have adequate capacity to an area where you don’t, how do you assess whether you’ve created a situation where you no longer have adequate capacity for the area in which you’ve moved resources?
I guess the follow-up is to the Auditor General staff. Because you say this recommendation is only partially implemented, where are the glaring deficiencies, from your perspective?
M. Gaston: The comments that we made in the report — this was from the work that we’d done — is that to fully implement the recommendation, the division needs to understand what the true capacity of a probation officer is if he or she is to be effective in supervising and rehabilitating offenders.
Part of this issue came…. We mentioned it in our follow-up report, and it was certainly covered in the original report. We had looked at documents that the division had produced where they had actually stated that caseloads had reached an unmanageable level and that public safety was at risk.
This was an issue that the division had highlighted themselves as a risk and a concern, so our recommendation, or part of that recommendation, was that they should understand what the workload capacity of their staff was and to make sure that the two were matched.
G. Heyman: So if I understand your comment correctly, simply shifting resources around is not addressing the core of the problem. It’s a capacity issue overall.
M. Gaston: Well, they have looked at…. At least from the conversations that we’ve had with them, they’re collecting information on workloads and looking at a link there in terms of any issues that are arising. But we hadn’t seen something that basically quantified, if you like, what the workload should be of individual staff members — recognizing, obviously, the different complexity of different cases.
S. Robinson: I want to start with a specific question, and then I actually want to go a bit broader and a bit more thematic.
I want to go back to this question about the use of contractors and community service providers. I have been a community service provider, and I promise you there are tons of evaluation that happen at community programs. So I want to understand how it is, especially when you’re working with, for example, alcohol and drug services — which is another government service, paid for by taxpayers — that you don’t have enough relationship with the ministry to get some of the evaluation information that you need.
B. Small: Again, starting from the question that was put to us at the time. It was to look at the effectiveness of these programs relative to our stated goal of reducing recidivism. We, all along, have worked very closely with our partners, including alcohol and drug programs and others, to better understand their services. But the challenge at the outset, at the beginning of our journey through this audit, was to look at the degree to which we could directly influence measurement or metrics or evaluation of those programs relative to our stated goals of reducing recidivism.
We have a very strong working relationship with our ministry partners. In fact, when I look at the drug treatment court of Vancouver evaluation results, the results of the work on the interministry database, those are all by-products of that strong partnership. It was really centred on our ability to produce, on an ongoing basis, the kind of data that would support the recidivism measurements, as we’ve done with the programs where we have the steering wheel and we’re driving it.
S. Robinson: I just want to follow up. Then how do you do accountability? If the recommendation from a probation officer is, “You should do these programs in the community,” where’s the accountability that they’re actually following? It’s not enough that there’s just the output, that the client reports back, “I went to 15 counselling sessions, so I’m better,” rather than what the behaviour change is.
B. Merchant: Well, one of the things, maybe, is partly that we need a shared definition. When we talk about evaluations, we talk about it, as Leigh had mentioned. It’s a scientific definition of evaluation and how you do that.
S. Robinson: I know what that is.
B. Merchant: Then MLA Heyman had mentioned about — almost like performance appraisals — people in the community. We go to those communities — our probation officers and their local managers in the communities — where there’s maybe a single service provider. They do talk to the community. If it’s an aboriginal band, they talk to the members there, the elders. They talk to the police, Crown counsel. They do those kinds of things.
That is not a formal evaluation, but it is gathering information about the program to see if it’s being handled in an appropriate manner, if it’s effective. We have terminated some contracts that weren’t doing it, and we’ve put new ones in there. Just so we’re clear on the evaluation and what ongoing work there is.
[ Page 1024 ]
S. Robinson: I thoroughly understand evaluation. I’m quite comfortable with the scientific method. I appreciate that there’s certainly a challenge when you have a small N, a small number. But this isn’t about getting it peer-reviewed or published in some journal. This is about accountability to the taxpayers and, certainly, accountability to this committee — making sure that you have some system in place that says: “Listen, we’re not 100 percent sure that everything meets all of our criteria, but we have some things in place that say that this is effective.” There needs to be some sort of way to monitor. If it’s not measured, then it’s at least monitored.
The overall tone of the report I found rather frustrating, because what I heard was: “Well, we did a substantial amount of work, and that should count for something.” My husband teaches students at a university, and he gets frustrated. He comes home and says: “Students want an A because they worked really hard on something. But it doesn’t matter if they didn’t learn anything or if they didn’t produce anything.” I certainly saw that tone in this report. That was actually quite disappointing. “We worked on it, but we still didn’t meet the goals.” It doesn’t really matter how hard you worked; it’s whether you delivered.
Part of what I want to address or ask — this goes back to Mr. Merchant’s earlier comments — has to do with the statement that you made earlier around the goal of B.C. community corrections: to uphold the orders of the court, that they are followed. That was the number one goal.
I’m just pulling up a document that I was looking at earlier. I want to ask around the document called A Profile of B.C. Corrections: Protect Communities, Reduce Reoffending. It’s an October 2013 document put out…. Mr. Merchant, you’re at the head of the entire system. It says that the mandate of B.C. corrections “protects communities through the safe control and behavioural change of adults.” I didn’t see here that the number one goal is orders of court, but that it’s about the safe control and behavioural change of adults.
Then as I scroll down, it talks about the strategic operations division. “The goals of the division are as follows: ensure that case management of clients and inmates is focused on changing criminal behaviour and protecting communities.” What I’m hearing, actually, is that the goal isn’t so much about enforcing the orders but certainly about changing behaviour. It’s fairly consistent.
So what I wanted to check in with you around is: is that what you adhere to? Is that really the number one goal? Is it about changing behaviour, or is it about making sure that the orders of the court are followed? I’m hearing two different messages.
B. Merchant: Well, we would have no clients if the courts didn’t make an order. We wouldn’t have clients. The order of the court is something that we have to legally adhere to. That’s really at the basis of it. But the other part of that is the rehabilitative part, and we try to focus on that. I’m just saying they go hand in hand.
I’m not saying the only thing we do is make sure that a person reports to his probation officer twice a week and signs in and they go on their merry way. That’s not what we try to do. But the reason clients come and see us is because of orders of the court. Then, when they come and see us, what do we do with them? We try to engage in addressing their criminogenic behaviours and reducing the likelihood of them reoffending.
That’s why we call it “reduce reoffending” and “community safety.” They go hand in hand. I’m not trying to make one greater than the other.
S. Robinson: Well, you did earlier. You actually said, “The number one goal is to enforce the orders of the court,” which is very different than saying: “The number one goal is to work with people to reduce recidivism or to work with people to change behaviour.” It’s a very different focus.
I’m reminded of an earlier conversation we had with an earlier report that my colleague, David Eby, responded to. We had a similar discussion that I’m recalling, which is about, “Well, we only have people two years less a day, and there’s really not much we can do with them,” which is again, I think, a philosophical difference around what’s the role of corrections.
And it is corrections, which is about changing behaviour. It’s not about enforcing rules, necessarily. It’s really about changing behaviour. That’s not a lot of what I’ve been seeing, philosophically, as it comes through. It really does need to be about changing behaviour.
B. Merchant: I don’t disagree.
S. Robinson: Glad to hear it.
B. Merchant: When I added that clarification, I hope that made sense to you.
M. Hunt: I want to thank the committee members who have gone before me, because they’ve asked most of my questions on recommendation 2. But I guess I slightly want to turn the question, because they have been dealing with the accountability to the taxpayer for the dollars spent. Why are we wasting money on something if it’s not effective? Real simple question, right?
Can I turn this around the opposite way and ask the question? The judges are making these orders. Where is the accountability back to informing the judges as to whether they’re making good orders or not? If we’re not evaluating — and I’m assuming Corrections is, over Corrections — whether the stuff is coming out right at the far end of it all, where’s the feedback back to the judges that says: “This is a useless thing. Why are you sending people to do this?” Yeah, it’s a waste of taxpayers’ money, but we’re not accomplishing anything.
[ Page 1025 ]
I just think that this whole evaluation thing is absolutely, totally critical to you effectively doing your job for the judges. True or false?
B. Merchant: I’m not sure I can give you a true or false, but I can tell you what we have done to try to inform the judiciary and others in the system. The corrections branch is somewhat of a closed environment. It has been historically. What we have started trying to do, dating back…. It’s a little over a year ago that we started a process called “Making connections, creating better outcomes.” We developed an informational presentation, shall we say.
In doing that, the provincial directors and myself have met with the Chief Justice and the Associate Chief Justice of the Supreme Court, 17 justices of the Appeal Court, the associate chief judge and the Chief Judge of the Provincial Court. We’ve addressed 135 of the Provincial Court judges. We’ve met with the chiefs of police. We’ve met with Crown counsel, administrative counsel, in the province.
Our staff have gone through the same routine. In each of the communities throughout the province, the wardens and the regional directors are getting together. They’re inviting the judiciary, Crown counsel and police — all the people in the criminal justice system — to get together. We explain what we do, how we get there and what we think is effective.
The criminal justice system, I think you all know, is a complex organization. I wouldn’t say it’s nimble in that sense. Change takes some time.
That’s what we’re trying to do. We’re trying to inform the judiciary, the Crown and so forth, of how criminal behaviours can be addressed. That’s through suggestions on what can be on orders, probation, bail orders, things like that. It is a process that does take time, but I think we’ve been moving in the right direction with that in trying to inform people about what the corrections branch does and some of the limitations we have. We’ve been trying to inform it.
We will spread that out to local mayors and councillors and the community, and will continue to do that so people have a better understanding of who we are, what we do and what we’re trying to do. Through that educational process, we hope that we can help change some behaviours that will impact how people are treated.
I don’t know if I answered your question or not, though.
M. Hunt: Well, yeah, sort of. But it’s an illusion that the public has. I’m from Surrey, so let’s just pick on history. We’ll do a history, and we’ll deal with the auto theft capital of the English-speaking world. You know? There’s an illusion given to the public that the RCMP have done a great job going after these prolific offenders and that they’re locking them up. You’re telling us that, statistically, they’re not getting locked up. They’re going off to supervision, and they’re right back out on the street again — at least numerically.
How are we dealing with these? Again, this goes back to the issue of: are we just running a revolving door here until finally somebody gets fed up and says, “You’re a prolific offender. We’re locking you up instead of keeping you back there”? Because isn’t that the point of this audit, to see: are we being effective? Are we, in fact, changing the course of these individuals’ lives?
B. Merchant: Well, on the community side, 75 percent of the people that we supervise don’t come back to us in a two-year period. That’s the evaluation period. So 75 percent; 25 percent do reoffend.
I know that people in the community would like to see 100 percent. I know that. But 75 percent — that’s a lot of people that aren’t coming back. Does that mean our programs are effective? We think they are, and the evaluations that we’ve done of other parts of those programs show that they are effective.
Would I like to see that number continue to rise and get better? Of course I would. But there are some people that…. They don’t change as much as you’d…. Later in life they will change, through passage of time and things like that.
M. Hunt: Yeah, age and maturity. Time works.
Final question, on recommendation 4. I just find recommendation 4 to be a relatively simple, straightforward recommendation. It says to make sure you confirm what courses are necessary for people to supervise certain cases. Make sure they have the training before they get somebody like that.
To me, this is checking the boxes off, and it’s yes-no. Now, again, recognizing small communities, rural issues…. It may be a training issue, and that might take time. But when I hear your response to the Auditor’s report, it’s: “Yes, you’ve thoroughly reviewed. You’ve decided what’s required.” Good. And then you said, “And we’ve got this wonderful tool,” but I don’t see anywhere here where you’ve said, “Are people using the tool, and is the tool effective” — actually doing this.
In other words, is it working? Are people getting the training before they get them, other than the exceptional situation of out in the middle of nowhere, where it takes time for them to get the training and they will, closer to the time?
B. Small: The turning point between what the recommendation was and what we had achieved was just
whether or not we’re effectively implementing the tool. I’m pleased to say that we have agreed that until such time as it was being used consistently and ongoing, we couldn’t rate ourselves as anything other than partially implemented, based on our conversations with the auditors.
I’ve confirmed with our staff that, in fact, it is being used universally. Every manager in every office in the province of British Columbia now has a common understanding of and a common application of the policies and the tools necessary to assure us that those policies are being followed.
This is one that I’m very pleased to say…. If we’re using the colour coding, it would be green for us as well as for the Auditor, assuming that I’m being truthful, which I assure the committee that I am.
M. Hunt: And assuming that they came and audited you today instead of X number of months ago when they actually had the process.
B. Small: It is, as our colleagues said, an ongoing, iterative process for us. We are continuing to action the items that are in this report, and we’re not letting up on the gas. We continue to advance it. So with each passing month, my expectation is, and so far what we’ve seen is, increasing alignment between what has been expected of us, mindful of the couple of areas that we have discussed here, and what we’re actually doing. Those are coming right together. And yeah, we’re very happy with that.
So it is about the time even since this report was issued.
M. Hunt: Mr. Chair, I guess I should have said at the very beginning that I want to thank the Auditor General for this report. I think this is great, and I think this is the kind of stuff that we need more of — to hold all of us accountable to the report so that they don’t just sit and collect dust on the shelf. Thank you.
K. Corrigan: So the point of all of this is that we want to reduce the number of people who reoffend. So the question is — because it’s not in the report, and I know there are challenges with measuring…. But it sounds like we have a measurement now that everybody has agreed on of two years no reoffending, 75 percent. Are the numbers going up or down?
L. Greiner: I don’t have the numbers on me exactly, but they’ve gone down approximately a percentage point for the last three years.
K. Corrigan: Okay. Wait. The numbers that we are talking about were….
L. Greiner: They’re roughly the same, so statistically speaking, there hasn’t been a lot of change.
K. Corrigan: Okay. I understand.
B. Ralston (Chair): Is that a percentage point a year or a percentage point over three years?
L. Greiner: Again, I don’t have the numbers on me, so I don’t want to say anything that I’m not….
B. Ralston (Chair): Well, maybe you can just respond to the committee in writing so that we get that.
I mean, given that recidivism is one of the measures…. I know there are social finance initiatives in Britain where programs are set up, basically, specifically to measure recidivism, and the profit or not to the private provider flows on the basis of recidivism, so there are evaluative mechanisms out there.
I think it’s important just to have the statistics correct. Frankly, I would have expected that they would have been brought here before the committee anticipating that answer, since that is one of the very focuses of this report.
Anyway, back to Kathy.
K. Corrigan: I understand that statistically a percentage is, perhaps, not significant, depending on how large the sample is, I guess. But when you say “a point change,” if that is what it is, are you saying a point of improvement? Mr. Merchant was talking about 75 percent not, so are you saying it’s 76 percent? Is it going in the right direction or not, when you talked about, “about a point change”?
L. Greiner: Yeah, it’s….
K. Corrigan: An improvement slightly.
L. Greiner: Yes, I believe so.
K. Corrigan: Okay. I just wanted to make sure we’re talking about the same thing.
B. Merchant: Well, it’s a difficult one to do, because over years, you do have this going like this, so the changes aren’t that significant that you can say: “Oh wow, it’s really going up,” or “Wow, it’s really going down.” It varies from year to year.
Just as a sidebar, across Canada, and even in European countries, the recidivism rate is not common across Canada. If you go to Alberta, Saskatchewan and Ontario, they calculate their recidivism rates differently. Right now in Saskatchewan, Public Safety Canada and the Saskatchewan government are doing what’s called recontact. It’s a more fulsome look at it — do people have interventions with the police? — and the whole nine yards, so to speak. And they’re looking at a four-year time period. But that study is just underway.
Across Canada, I co-chair the Heads of Corrections for Canada. We’re trying to standardize those kinds of definitions. I think, for the Auditor General and for yourselves, when you ask us questions — what are they doing in Ontario? What are they doing in Quebec? — we’d be able to give you information that is consistent across Canada. Unfortunately, and surprisingly, everybody defines everything differently, so our comparisons to other jurisdictions sometimes are very difficult to do.
K. Corrigan: Obviously, having people in community is a lot more cost-effective, and probably there are lots of other benefits as well. Not being in a prison is a good thing for people, generally, in terms of future behaviour. But it is important, I think, overall, for us to say that while people are getting in trouble and coming before the courts again and having some kind of order, those numbers are improving. To me, that is perhaps the best measure. I know there can be other things. So we don’t really have any evidence of any change despite having community corrections for — how long?
B. Merchant: Well, we’ve had community corrections for a long time. The corrections branch was formed with the introduction of the Correction Act in 1970. That’s when it became the corrections branch. There was the probation service of B.C., and that dates back to the 1930s or so. It’s been around for a long time.
K. Corrigan: This whole audit is talking about the effectiveness and accountability for the measurements, and so on. We would hope that, over time, as we’ve become more mature and able to look at these various measurements, we should see improvement. I know we’re dealing with a difficult population. I understand that.
I wanted to ask about a couple of other numbers as well. The original report — and you referenced it earlier — said that only 35 percent of the interventions are ever completed. These are the interventions that are programs and so on for the offenders. Do we know what that number is now? I know you’ve talked about improving by prioritizing, but I would assume that every single one of those recommended interventions for an offender are considered to be worthwhile. So what is the percentage of interventions that are completed now?
B. Small: We were just quickly conferring to look at whether we’ve actually developed that form of metric. We haven’t done that kind of measurement because, frankly, we…. Going back to the original report, the statement to us at the outset that only 35 percent…. It wasn’t clients. It was about 35 percent of all interventions that are identified as reasonable and possible to leverage during the course of supervision are ever completed, to which we said: “That sounds not bad.” When I say that, it sounds provocative. But the reality is, as Brent mentioned, our clients present with an enormous array of complex needs, criminogenic needs, needs that influence and impact reoffending, as they come through our door.
The average period of supervision is, ballpark, about a year in length. The average client is under our supervision for about one year, and they may present with three, four, five areas of their life that require attention in order to try to mitigate their overall level of risk.
We recognize that the literature, the research in the area of correctional practice, talks about the need to prioritize those needs and ensure that we’re addressing them in the right fashion. If you have a client who is coming through the door presenting as psychotic, has a very substantial crystal methamphetamine addiction, has a history of mental health needs, and they have an antisocial belief system that says, “I’m entitled to take anything I wish to take in order to meet my needs” — sort of a general antisocial approach to theft — we’re not going to be able to address all of those, in shotgun fashion, at the same time.
Rather, we need to, through the application of a validated risk assessment tool, identify the broad spectrum of needs and then begin to work in a very deliberate — sequential, at times, and, at other times, simultaneous — fashion at addressing those needs in a way that allows us to ultimately tackle the attitudes, beliefs and thoughts that support their criminal behaviour.
There’s agreement between ourselves and the auditors about the evidence that cognitive behavioural approaches are effective at addressing criminal behaviour. For our staff, the challenge is that we need to apply the assessment tools effectively and then develop a case management plan that addresses those needs in an appropriate manner, both in terms of sequence and focus. At the time of the initial audit, we were very pleased that the Auditor contracted with Dr. Steve Wormith, who is considered to be one of the pre-eminent experts on correctional practice and correctional case management.
Dr. Wormith sat in Victoria with a sample of files, I believe 60 in number — if I’m incorrect, I apologize, but a number of files — and he, himself, went through each one of those and applied those risk assessment tools to determine whether our staff were effectively doing so. Our staff were, in excess of 70 percent of the time, perfectly aligned with Dr. Wormith’s findings. By any measure, that’s a very strong inter-rater reliability. He found that our staff were applying those tools very, very well.
There wasn’t an issue about efficacy in that regard, but, in fact, just the expectation inside of a policy framework for us that suggested that for each and every need that could possibly be contributing to criminal behaviour, we needed to identify an intervention to address it, which we still do. It left it at that, so there was the suggestion that our staff would be, potentially, expected to complete eight interventions in the course of one year. For anyone involved in this business, they know that one year is hardly
time to effectively leverage and complete — which was the measure that they held us to — those interventions.
We responded and said that absolutely, we need to update the expectations that we provide to our staff through the policies that we provide them. Not to suggest that they don’t identify the types of interventions that we need to apply, but to make sure that they apply them thoughtfully, without this unrealistic — and, I dare say, wouldn’t be supported in terms of the research — application of a shotgun of all of these interventions at one time, flooding the client with all of these programs and services.
We recognized that that was a shortcoming, so we addressed it. We feel much better, not just about the risk assessment tools we’re applying but the way in which we’re applying the case management tools in terms of moving forward with case management plans to address those needs.
K. Corrigan: I don’t know if the Auditor General or others of the Auditor General’s office want to weigh in on any of this.
The suggestion, then, it seems to me, is that although that did stand out from the original report — only 35 percent of interventions are ever completed — what you’re saying is that that isn’t really a fair measurement of the effectiveness. I guess I’ll also ask, just to make sure that there’s been no suggestion, then, that the recommended interventions — the volume of them or the number of them — should be reduced in order that the completion rate goes up…. I assume there’s been no order that’s come down to that effect.
B. Small: There’s been none whatsoever. It is about sequencing.
K. Corrigan: It’s about sequencing.
I wanted to ask about one more number. You said that 35 percent…. The completion of interventions is not something that you’re measuring, which I find kind of surprising. Perhaps it’s a quality, but I do find it…. It did stand out so much in the report.
Another way of ensuring there’s oversight is the file reviews, which include looking at whether the interventions are effective. This progress audit says that less than half, or 47 percent, of the expected file reviews were completed by local managers between September 2014 and August 2015. Why would that be?
B. Small: For us, we had undertaken a substantial amount of work across each of the recommendations in terms of putting the tools and the supports in place to measure, refine or change approach to respond to the recommendations. When we met with the auditors for the preparation of this follow-up audit, what they indicated to us….
Again, that goes back to our feeling that we had a substantial amount of work. But the final piece for us is assuring ourselves, over time, that these tools are being used. So yes, the quality assurance piece by managers is, in our view, the fundamental conclusion of that entire process. What we realized is that, yeah, they are doing a substantial number of file reviews but not at the level that we want them to be.
Again, as the changes have gone into place, those numbers have continued to grow. We have seen that during the period the Auditor looked at in the most recent review, the figures, as you noted, were slightly under half of what the target should have been. We have seen since then — and this is a measurement that was taken this week — a 49 percent increase over the number of file reviews that were completed, and I expect those numbers to continue to grow.
Moving along. As I said, the number of file reviews completed this year already exceeds the number of reviews that were noted by the Auditor by some 50 percent, and I expect those numbers to continue to grow.
The other piece that we were very pleased with…. Ultimately, it’s not just about the numbers but indeed the quality of the work that’s being completed. We’ve noted that in each of those rounds of assessment, which are continuous and ongoing by managers, in excess of 90 percent of those files are meeting all of the necessary measures of performance by our staff — 93 percent the last time we did that extraction.
We’re seeing ever-growing numbers of quality assurance reviews, and the findings of those reviews tell us a very positive message — that our staff are actually using the tools, are applying risk assessment tools properly, that their case management plans reflect those risk assessment tools and that the ongoing case management is concurrent with that case management plan. Those are all trends that we’re very proud of.
B. Ralston (Chair): Just to the Auditor General: were you aware of these recent changes in the self-evaluation that have just been put before us?
M. Gaston: No, we weren’t, actually. But obviously, the figures that we have in the report are those that we found at the time of our audit, which was very much earlier this year.
K. Corrigan: One more question?
B. Ralston (Chair): Okay, one more.
K. Corrigan: I’ll ask one more. I had a question for the Auditor General. We’ve been talking a lot about the progress, but I’ve got a question about process. What are lessons? What do you take away — I know some of it is
definition — from the fairly stark difference between the self-evaluation in terms of implementation and your progress audit?
C. Bellringer: I’m going to answer it with, and what I’m struck with is…. There have been a couple of audits that we’ve issued recently where you’ll note the recommendations were actually already addressed before we even issued the report. I guess that’s the ideal for us — that during the course of the audit, there’s a better alignment. We are talking several years. We note the progress. That was a genuine positive comment that we’re seeing that progress.
But I’m not quite capturing the same…. I don’t know that we share the same sense of urgency around the completion. That’s my observation.
V. Huntington: It’s been canvassed pretty well, but I go back to a lot of Marvin’s comments. I feel like I’ve fallen down a technocratic hole here. All we’re discussing in the midst of one of the most important ministries and departments in the province, at least in the minds of many of our citizens….
I feel like, instead of metrics and research design…. Those components asked by the Auditor General don’t fit with the way we like to measure — what was the language? — “the effectiveness of the full spectrum of services offered by the organization.”
I just feel like what we’ve got, when you boil it down to common sense, is a bit of, possibly — and I say this with great respect to both the Auditor General’s office and Corrections — a communication problem here, and we’ve got, perhaps, a management and communication problem.
Surely you cannot refer individuals to a program without understanding what that program is. I can’t for a minute think that, over the years, Corrections hasn’t developed an understanding of which programs are successful, which programs aren’t successful, who’s offering successful programs. Why would you or could you possibly refer somebody to a program that wasn’t considered successful in meeting the goal that you have?
Now, say your goal is 75 percent lack of recidivism over two years. I think that’s a minimal. That’s not a bad goal, but I think it should be better. So if you’re upping your goals…. Say you want to take a look at…. Well, let’s try, “Which programs are so successful that we could expect maybe 76, 78,” and start referring.
No referrals should take place, in my mind, without a reporting requirement — none — and no service provider should ever be contracted that isn’t capable of providing you with a full and complete reporting assessment. The research in the ministry would then be looking at that statistical data and determining the programs and the success rate of the programs.
I just feel like there’s a lot of upside down management here, and that’s because I’ve fallen into this technocratic hole. It’s all metrics and research design.
Now we’re talking about files that are being reviewed. Well, surely managers in each unit and each community have, by now, got fairly sophisticated templates by which to manage. They’ve got fairly sophisticated reporting.
It’s as Marvin says. It can be checklist reporting at this stage, because your research and analysis in Victoria should have given you the understanding of which programs are successful. I just feel like we’re not talking rocket science here, and if there is a failure or an inability to understand how to evaluate a program effectively, then I think it’s a management problem.
I say that respectfully, because it’s too big. I haven’t figured out today whether this thing is being managed properly or not. It’s because we’re talking metrics, and the metrics don’t correlate with what the Auditor General wants. I just….
B. Ralston (Chair): Would you like someone to react to your statement?
V. Huntington: No. I’ll leave it at that, because I’ve got a page full of comments here. I just feel that not to evaluate a program at this stage in the development of the corrections branch is impossible to believe.
I just hope that you can work this out with the Auditor General. Or maybe the Auditor General is asking for evaluations that aren’t appropriate for the work you’re doing. But somehow or other, there has to be a reaching of a common goal here, and I don’t think you’ve reached it yet.
B. Ralston (Chair): Mr. Merchant, do you want to respond? Or Auditor General? Who would like to respond?
C. Bellringer: I’m just thinking through. In terms of what we would normally be doing, there’d be another update provided. We’d decide whether to go back in.
It does point, to a certain degree, to the limitations of follow-ups in general. We don’t go back in and re-audit with a new starting point. The benefit to that is it’s less time-consuming than doing a full audit. So we think it’s still worth doing. When we do end up with disagreements, it does end up being…. In order to resolve the disagreement, you have to go back in and redesign the whole test, if you will.
I don’t think that the nature of the recommendations that remain in progress are that complex to resolve. I do think it’s come much closer than it was five years ago. So we’ll take another look at it with this next round to see if we can report back to you. But it won’t be a reopening up of the whole audit.
B. Merchant: I think I agree with what you’re saying. We do know what programs work and what are effective, and we do that through our evaluations.
Our relationship violence treatment plan is very effective for those domestic violence cases. It’s effective enough that the introductory part of that program, called Respect for Relationships…. We’re working with the Ministry of Education, and they’re going to incorporate the principles and how we do this into the educational system. That, to me, is a real indication of how good that program is.
We wanted to not only address the people that we supervise, but we want to help it earlier in life with the programs we know work with offenders. So if we can transform our programs into the school system and get it so kids understand what a respectful relationship is all about, that’s a good thing for everybody. It also has a preventative aspect.
Our drug treatment court. It’s been in operation for 12 years. We’ve evaluated. It’s effective. We know that. There’s a host of those things that we know are effective, and they’re all our programs.
We are nimble. We just did an evaluation of our substance abuse management, and what we found out was that for men and women in the community, it’s effective. For women in custody, it’s effective. For men in custody, it’s not effective. It’s even worse for aboriginal males. So we have ceased providing that, and now what we’re doing is we’re going through a process of adjusting the program so it will work.
So we do evaluate. That’s why we need to do the evaluation, because if things aren’t working, we want to change. We do want to do that rehabilitation part.
I know we get into these talks about the metrics and all that other stuff, but that’s what we want to do. That’s what community does extremely well. Every probation officer in this province is faced with just a host of people that most people don’t want to meet. They have presenting problems. But what they do and how they do it with those people…. What is it based on? It’s risk-need-responsivity. Those are principles that are all over Europe — they’re all over Canada — on how we deal with people.
It’s just sometimes, you know, when you go through evaluations and stuff like that, you feel, kind of, you’re attacked for what you do. Like, I understand that. It makes us better by doing that, so I’m okay with it. But you also take it personally as well. Sometimes that’s kind of hard to take, and you get a little angry about it. But our staff really do a good job.
V. Huntington: We understand that on this side of the House, yes.
B. Merchant: I mean, the main thing is that we’re trying to change criminal behaviour. There’s a top eight, but I mean, it’s people that…. You try to change their thinking patterns about criminal ideation, their criminal friends, their families. Are they employed, or are they not employed? What about their substance abuse problems and their mental health issues? Those are the big things, and cognitive behavioural programs are the ones that change those behaviours.
Could we get the numbers up? That’s what we try to do. We try to get the numbers up. I’d like them to be a lot higher, and we keep introducing new programs. The Strategic Training Initiative in Community Supervision — we’ve rolled it out over the last three years. We’ve added 36 FTEs to do that, plus we’ve rolled out another 12 FTEs for staff and caseloads. But before we know the results of that, what we call STICS….
We won’t know for three years, because it takes that long to study it. But we’re the only jurisdiction in the world that’s rolled this out. And it has been studied, and it’s said it’s very effective. We are doing things. But some of those things — to get the results that you’ve asked for, that people ask for, to show that we’re doing the right thing — do take time.
We do meet with those smaller community service providers. It’s more like a performance appraisal. We do rate what they do, and we look at what they do and see if it’s effective. We talk to the community members, and we talk to people in the criminal justice system, and we try to make it better. If they’re not doing what we expect them to do under their contracts and things like that, we do change them. And we do that in consultation with the community.
B. Ralston (Chair): We’re replowing here, so I’m going to go to Lana.
L. Popham: My question is specific around placement of sex offenders into communities. I’m certainly not going to hash out policy, but as MLAs, we are not informed when a sex offender is placed into our communities. My question is going to be directed around evaluating effectiveness of programs and the cost of programs and the allocation of resources.
When a sex offender is placed into our communities, some people are informed, and some aren’t. In my community, for some reason, private schools are informed but public schools aren’t informed. Some neighbours are informed by a walkabout by probation officers, and some aren’t. Mostly it happens if somebody happens to be home at the time.
I don’t know why those decisions are made, and that’s not what this conversation is about. What I want to bring up is that this turns into a completely full-time job for my office and all of our offices — anyone who’s had this experience. We are the conduit to the community. When an evaluation of how a program is working for the community happens, we are not brought into that conversation. I think there is input that we could give, and also an evaluation of how many resources are put into each file.
My office becomes a place where the community vents its anger about an issue, mostly out of frustration with
lack of information. I understand there are privacy issues, but I think there’s a chance for us to be brought in. We are a great resource. Our office ends up having three or more employees spending weeks on this file because the community is so up in arms.
I’m just wondering. When you’re evaluating, is that perceived as part of the program, or is it just the way it happens? It falls out.
B. Merchant: It’s a difficult one. I’ve never met a community that said: “We really want sex offenders in our community.” It rankles everybody, a sex offender. But sex offenders are released from custody, and they’re also released directly from court into the community. We don’t have any say on that matter. It’s whatever the court order says.
The key one is that some court orders say “live as directed.” Bill’s staff would then do an assessment of where they could live. If it doesn’t say “live as directed,” then that person can live wherever they want. We have no say in that at all. There’s no legal manner to do that.
We have different notification procedures. One that you see more often…. Well, you don’t see it more often, but it’s bigger. It’s when we do a public notification. There’s a process through that. We can’t give your office a heads-up on it. We work with the local police force on that, and then we do a notification. We cannot notify anybody else. Under the Freedom of Information Act, we cannot notify. I cannot give you a heads-up. That’s the law.
We do other kinds of notifications. In that sense, it’s one of the harder ones to deal with. I understand the work it causes you, because it all ends up on my desk, as well, and it builds. We had that one case in your riding that Bill helped out on.
I don’t know if you have more to say about it, because you’re the ones that….
B. Small: No. I think communication is really critical. MLA Popham and I had…. I recall the case in question, and one of the most important components was that we can be as responsive as possible to the inquiries as they come through.
Certainly, I appreciated the letter of thanks at the conclusion of that particular case around the level of work that our staff did to ensure that your staff were properly supplied the right information to answer the questions and also to direct some of those questions back to us. At the end of the day, we were charged with the responsibility of supervising these individuals, and as Brent says, our hands are as tied or as free as the court order allows us.
Not to delve back into the court order again too deeply, because that creates its own little energy, but the court order permits and restricts our staff in terms of the degree of intrusiveness they can apply in their supervision of these clients. That said, our primary focus is the public safety, and that’s the reason why we pull the trigger on notifications to the community to make sure that they’re apprised of who it is that’s living in their community. We do it within the bounds of the law, and we do it with the intent of ensuring that the public is aware of who’s around them.
That is a double-edged sword, because invariably, there will be a response from that community, and I understand that response. It’s based out of fear. It’s based out of worry. And our job is to work with those community members and with your office to ensure that everybody has as much information as we can provide without breaking the law and that we continue to work with those stakeholders and support them as we work with the clients.
Our job is to work with those clients to deliver the kinds of programs and services to reduce that risk so that it moves from the realm of perceived to actual risk. If we do our jobs well, that risk goes down. Certainly, the evidence we’re getting is that we’re heading in the right direction.
L. Popham: I have a follow-up, if you don’t mind. I appreciate all of that, for sure, but I guess my point is that because we are unofficially involved in something like that, if you’re going to be evaluating the effectiveness of a program, it would be great if at some point, even after the fact, we could be brought in as part of the assessment, because I think that our feedback would be valuable. We’re using our resources, so there is money being spent from our end on this, for sure. I think just involving us in the evaluation at some point would be helpful for you but also for us.
V. Huntington: That goes for almost every ministry.
L. Popham: Yeah.
D. Eby: My first question is to the Auditor General. Can you clarify for me again why recommendation 8 wasn’t part of your analysis? It seemed to be a really important part of the initial report. Recommendation 8 deals with the failure to adequately document breaches of conditions and the failure to enforce breaches when they happen.
A specific incident — or a number of incidents — was outlined by Auditor General Doyle at the time, in particular where men who had been identified as being at medium or high risk to reoffend had contacted their victims of domestic violence in violation of their conditions, yet for some reason there hadn’t been enforcement around that breach by the supervising probation officer. Why was it that you didn’t look at whether or not this was happening in this particular review?
M. Gaston: As I say, the Auditor General mentioned it earlier. Obviously, we need to think of the resources that we're using in each of these jobs. We made an assessment
at the time that there was a suitable overlap or some overlap between recommendations, particularly the last four. A lot of them were around quality assurance processes in place to try and make sure that the division was achieving its goal. So we targeted our resources on some of the recommendations but not all of them. It was really just recognizing that that allowed resources to be used, looking at other issues elsewhere.
I would just highlight that the more recent action plan certainly shows that there’s a recognition that there’s still work to be done on recommendations 5 and 8, which we didn’t actually follow up on. The division’s own assessment of its progress there has changed.
D. Eby: The second question. I wanted to clarify this with Mr. Small, with respect to Kathy Corrigan's questions about interventions completed at the 35 percent rate. You were very unambiguous that it is your opinion that this is a sequencing-of-interventions issue as opposed to an availability-of-programming issue. I want to let you know before we wrap today that I will write to Elizabeth Fry, John Howard and the president of our legal services. I will ask them about the availability of programming, because this is something that, in my experience, has been identified as an ongoing issue in provincial corrections.
Before I do that, I was wondering whether you wanted to take any opportunity to clarify whether or not offenders are able to get into programming in provincial prisons related to addiction, mental health, and so on that are necessary interventions to avoid recidivism or whether this is a sequencing issue.
B. Merchant: Well, I’ll answer that, because you’re talking about the correctional centres. Within correctional centres right now, 65 percent of the people…. Of the 2,750 that are housed there today, probably 63 percent of them are there on remand status.
D. Eby: Maybe I used some terminology that suggested I was talking about inside the prison. Certainly, I’m glad to hear about that. If someone needs detox, drug treatment or mental health treatment in the community, I want to know whether or not they can get that, if they want it, or whether or not there’s an availability-of-programming issue here. You’re suggesting that this is simply sequencing. I’m wondering if someone who requires these interventions can get them.
B. Small: Backing up slightly from that, the original audit suggested that we were failing to complete just under two-thirds of all of the interventions that we identified that we wanted to do. That really is, at its core, a volume issue. We went through many of the cases during the time that we were debriefing the Auditor’s original report. We would identify six criminogenic needs or needs that required attention to reduce the likelihood of reoffending. We were obliging ourselves to identify correspondingly, at minimum, six programs or interventions to address those six needs, and over the course of less than a year, we were not completing all six of them.
That really did, at its very core, relate to the physical capacity of one individual to take those kinds of meaningful strides across the full range of all of the things that they need to work on. These are highly complex, multi-needs, often very multibarrier clients. By barrier, I am referring to everything from willingness to be involved to personal capacity.
We have staff who work on the Downtown Eastside of Vancouver, as I know you're aware, and those clients are very, very challenging to work with. For us, if those clients are completing more than a third, from start to finish, of programs in the course of less than a year, we take a great deal of comfort from that.
My comment today related to our response to that. We were, quite unknowingly, holding our staff to a standard in our policy framework that was out of step with reality in terms of our ability to deliver that full range of intensive, intrusive services in a short window of time.
The literature tells us that correctional interventions need to be applied both in a targeted way, focusing on the things that can influence behaviour, and in a thoughtful way in terms of attending to the various needs. There are many of our clients for whom we're really addressing some of those fundamental…. You know, the hierarchy of needs. They have no housing. They're actively psychotic. Their primary health needs are in terrible condition.
Our staff are not going to be in a position to work with those folks on their thoughts and beliefs about the role of power in a relationship before we’ve been able to attend to those very fundamental needs and get them into a place where they’re actually ready to do that other exploratory work. So my comment around the 35 percent was really…. I’m not asserting that resources are universally available to all clients at all times in all communities.
Our staff — as do all people working in the human services field, from social work to probation and everything else — are continually working together and in the communities to identify areas where we can continue to grow and foster new programs and encourage the development of those programs. We work very closely with community service providers all the time on new programs and new services.
D. Eby: So which programs are missing, then, or difficult to access for clients if it's not just sequencing — if there's an availability-of-programming issue?
To my mind, addiction treatment and mental health treatments and housing are probably the top three. But I’m curious about what you have seen in your evaluations
of what you aren’t able to get clients that need it because there’s a scarcity of a particular program in a community.
B. Merchant: Well, for some of the programs, you need groupings of people. Depending on when those groups are organized, if they're community programs, the person may be on the tail end. Their timing is not right. So do the people we supervise have access to all of the programs all of the time that they're in our custody? The answer is no, they don't.
You’re right. It’s almost like Maslow’s hierarchy of needs. You’ve got to look after housing and health and things like that before you get to a higher state where you can start dealing with, where they’re willing — the responsivity part — to start engaging in programs that address their criminal behaviour. But if you don’t have housing and your health is terrible and you have an addiction, it’s pretty hard to start thinking about: “Where am I going to get a job?”
So you have to take that into consideration when you’re dealing with people. People in community supervision are no different than others. I mean, there’s only so many of those programs around. There’s only so much that can be done with so many people. It’s difficult. Some of them get the programs right when they need it, and others don’t.
D. Eby: I guess it’s hard for me to emphasize my frustration with a missed opportunity here. You are in a position where you are able to collect data on recidivism — what drives it, what prevents it, what the actual availability of real programs like housing, mental health and addiction in the community are, sex offender treatment programs.
To come to this committee and say, “We’re doing the best we can, but there’s not enough housing” or “There’s not enough mental health treatment” or “There’s not enough addiction treatment” or “There’s not enough X….” That’s something that you should know as legislators, because that is driving recidivism.
What the Auditor General is telling you is that you are not collecting the information. What your answers to my questions are telling me is that you are not collecting that information. You don’t know. You don’t know what’s not available. You don’t know what’s driving recidivism. You don’t know why — according to your numbers, which apparently you didn’t bring — 25 percent of people are showing up again and not just committing a crime and not just committing a breach, but committing a crime, being arrested, going to trial, being convicted and being sent to jail within two years. One in four of your offenders is doing that.
There’s a whole bunch of people that don’t make it that far that are committing crimes, that are committing breaches, that don’t make it that far. And you can’t tell this committee what’s driving that or what absence of programs or what program should be improved. That’s what the Auditor General is telling you here today.
If you hear frustration in my voice, continual frustration with this particular ministry, it’s not always like this. But with this particular ministry and this particular program, that you can’t come and tell us these things and that you can’t provide that information to us as elected…. We go back to our community, and we say: “Well, there were these issues identified with community corrections, and people are being released and they’re reoffending, but nobody really knows why, and nobody really knows what programs….”
Even the experts in the prisons are watching people on probation fail and come and be convicted and brought back to jail. What kind of a system is that? How do you come here without having that information? I’m lost for words at that.
B. Merchant: Because we’re dealing with people, you’re asking a very complex question that can’t just be answered with a series of questions and answers like that.
I’ll gladly have you come, wherever you are, and I’ll bring staff and I will explain to you what drives criminal behaviour. I’ll explain to you how you address criminal behaviour. And we will show you the research on how it’s done, and we’ll talk about risk-needs-responsivity and assessments and how you have to deal with people at the right time in their development.
I’ll tell you what we’re short of and what we can’t do and all those kinds of things. But for me to go on about that would take hours here. And just having little bits of questions asked and then saying: “Well, you don’t know what you’re doing….” We do know what we’re doing. We do know what we’re doing, and we do it very well. Our staff do it very well.
If you’d like to have that in-depth conversation and that in-depth education about what we do and how we do it and how we get to these people, you’re more than welcome to do that. But it’s not just a thing where we can do it in an hour and a half with numerous questions about an audit. Criminal behaviour — there are books written about it; there are courses on it. There’s a lot in there, and we’re more than happy to go through that with you any time you wish.
B. Ralston (Chair): Well, I think it’s more than satisfying one individual member of the Legislature. It’s also public accountability. That’s what this committee is here for. You appear to be frustrated by the constraints of this committee. I’m sure we can arrange for — I’ll discuss it with the Auditor General for the next audit — a more expansive discussion of that.
I think these are legitimate questions that are posed to us. Whether it’s in Chilliwack or in Point Grey, people are concerned about recidivism, people recommitting
offences. What’s going on in their community — the fear of crime and ongoing community safety — is a real public issue.
It’s not a question of simply satisfying one individual member of the Legislature. It’s an ongoing accountability of the ministry, and you as the ADM of this particular section, to the public through this mechanism. So I don’t think the approach you’re taking is…. I think it’s a bit too narrow, if I could put it that way.
Anyway, I’m going to turn it over to Kathy.
B. Merchant: We did say that we were going out and meeting with judges and all that on this, making connections, creating better outcomes. So we are trying to….
B. Ralston (Chair): Judges aren’t elected. You know, they’re not…. I’m familiar with the court system, having practised law in the court system, and judges have a different accountability. They report maybe to the Judicial Council if there are public concerns about it, but ordinarily they’re not publicly accountable in the way that we as legislators are to our constituents, especially on issues of crime and public safety. These are important issues, and to simply say, “Oh, it’s too complicated to explain to you here; I’ll give you a private briefing,” is not a good answer, in my view.
K. Corrigan: I have two more questions. Page 11 of this progress audit, under recommendation 3, says: “In our 2011 audit, the division reported that it was under-resourced. It stated that caseloads had reached an unmanageable level and that public safety was at risk.” The recommendation was “that the division complete a comprehensive impact assessment to determine if there are any gaps between its staff capacity and caseload level currently and in the future.”
The Office of the Auditor General says that this has been partially implemented but that your division reports “that it can’t fully implement this recommendation and establish if there are gaps between its staff capacity and caseload level because valid tools are not available to complete such a detailed assessment.”
I find that a little bit troubling because so much of what government or ministries can or cannot do depends on whether they’re appropriately resourced, and you’re saying that you can’t measure whether or not you have appropriate resources in order to do the work. So I guess I’ll just ask for your observation, then. Is the division still under-resourced, do the caseloads remain at unmanageable levels, and is public safety still at risk?
B. Small: I’ll address those in sequence for you. First off, the picture that was provided back in 2011 does look different today. In the intervening period of time, we have seen caseloads drop to a dramatic level, at a dramatic rate.
At that time, we had over 24,000 clients under our supervision. We’ve dropped by about 2,000 clients provincially during that time period and have been hovering near and around 22,000. It’s a fluid number, and we track it on a regular basis. During that same time period, we’ve added 48 new probation officer positions to the complement of probation officers working for B.C. community corrections.
If we look at it as a two-part scale, we’ve seen reductions in the number of clients and the accompanying workload demands, and we have increased the complement of probation officers working for us. I think both of those have gone to the extent of addressing our concerns about our resourcing.
To the follow-up issue, which is about the tools. The challenge that we were given at the time we discussed it with the auditors was: we need to improve our overall understanding of our work but to identify, through some method of measure…. I worry about continuing to go back to the notion of metrics, but that is the framework by which we’re challenged, even in our discussions with the Auditor: to make sure that we have metrics in place to be able to demonstrate and provide evidence of impact.
For us, the challenge was…. We were to develop some sort of tool that would help us identify — to use Malcolm’s language — that our cup is running over, that we’re overfull. To that piece, we’ve done a very robust job of identifying all of our work. We have been unable to find anywhere any meaningfully replicable tool that identifies that very point. The complexity of our work is such that there is no standard unit of measurement.
Clients come to the table with varied and often fluid levels of complexity. On one day, a probation officer may have a caseload of 50 clients, and that is a very manageable workload, because the nature of where those clients are at is quite stable. The next day, two clients can simply account for a vast increase in their workload, simply because of where they're at, and things get challenging for them.
For us, the development of a standardized tool is something that we've not been able to come up with. We've worked with our research group. We've worked with others. There is no such tool in use anywhere else.
For us, our assessment is continuous and ongoing. Supervisors are meeting with staff. They're reviewing caseloads. We have very robust tools to know exactly how many clients, and what risk levels, are being supervised. For us, the important piece is about having a good understanding on the ground about what's happening for each of those probation officers on a regular and ongoing basis.
K. Corrigan: One more question. You talked about metrics again. I have one final question on recommendation 2. The report says: “Although the division committed to developing key performance indicators to track the performance of its contracted service providers, this
work has not materialized. The division informed us that reoffending is best measured through program evaluations rather than key performance indicators. We found, however, that it has not evaluated any of its contracted service providers since our report was issued in 2011.”
I’m trying to understand what the difference is, I guess, between a program evaluation…. How do you evaluate a program without having some kind of markers, which I would assume would be key performance indicators? How can you evaluate it if you don’t have some kind of way of measuring whether or not it’s doing well?
L. Greiner: I can speak to that. Our evaluations…. We do have key performance indicators. I haven’t read through…. Maybe it’s a misunderstanding of this.
Our evaluations…. Our key performance indicator is reducing recidivism, for the most part. We have other outcomes that we’re interested in: reducing the seriousness of the offence, reducing criminogenic needs or needs that are related to reoffending. If we can reduce those, at least we know we’re making those changes initially.
Again, maybe I’m taking this out of context.
Like I said, we do have key performance indicators for our program evaluations — programs that are designed to target factors that change criminal behaviour.
K. Corrigan: Well, maybe then I could just get an explanation from the Auditor General on why the report says something different.
L. Pierce: This point here is in reference to the lack of KPIs: is there any kind of proxy to measure the effectiveness of community programs and contracted service providers? We acknowledge and applaud the work that the division does to evaluate its internal programs, its core programs, its partnership programs. The oversight that we continue to see is the lack of any kind of evaluation on the effectiveness of the community programs and contracted service providers.
B. Ralston (Chair): Just for the record, for the uninitiated, KPI is a key performance indicator. Otherwise, it’s jargon for those who might not know what it means.
K. Corrigan: I never used KPI. I said it out both times.
R. Sultan: I wanted to support my colleague Marvin Hunt’s comments and questions in two respects. First of all, this is a good audit, and we should thank the Auditor General for her continuing good work in that regard. Secondly, to observe the obvious, this ministry deals with the hand that’s dealt to them in terms of resources, the increasingly complex client base and rulings of the judiciary.
If I could be philosophical for a moment, I notice no hesitation whatsoever in the judiciary of Canada to infringe upon the prerogatives and decisions of the Legislature. Could I, therefore, in a somewhat cheeky fashion, suggest that perhaps we could increase our understanding of this complex issue if we turn the formidable powers of the Auditor General upon the judiciary and find out whether their decisions make sense?
B. Ralston (Chair): Well, maybe not today.
I think that concludes it. We've gone overtime on this report, but I know there was a lot of interest in it. We'll evaluate. We'll have a look at the transcript. There were some requests that were made for further information. Subject to those responses being provided — and the opportunity for the committee to further question those responses — I think we'll consider this particular audit concluded.
D. Eby: Just a note on the recidivism rate. If the information could be very clear about the exact measurement — what is being measured — as opposed to just a list of percentages, that would be helpful.
B. Ralston (Chair): That’s been promised, and I would think probably 21 business days would be a reasonable time to have that response to us. That could go through the Clerk of Committees. If there’s further interest in that response, then we may reconvene on that particular point.
If we take a brief break, we’ll set up for the next report.
The committee recessed from 11:18 a.m. to 11:30 a.m.
[B. Ralston in the chair.]
B. Ralston (Chair): Before we begin our next topic, there are two items of business. One was to welcome Dan Ashton to the committee. By agreement between the House Leaders, Greg Kyllo is no longer a member of the committee, and Dan Ashton has replaced him. That’s an informal process. There’s a document that’s been sent, so it’s all official now.
I’d like to welcome Dan to the committee. I’m sure he’ll be a productive and valuable member of the committee. I know he has experience in legislative committees, having chaired the Finance committee.
Secondly, as an addendum to the report that we just discussed, the Auditor General had a further brief comment before we begin our next audit.
C. Bellringer: It was just that we had, actually, in our file with us here, a copy of the 2015-16 annual service plan report of the ministry. It does show that performance measure in here and the rates of non-reoffending, and it has the numbers for community corrections. I’ll just read out…. In 2013-14, it was 76.7; in 2014-15, 75.9;
and in ’15-16, 74.6. The targets are also in there. It has also got the numbers for custody and the overall rate. But those were the ones relevant to the report.
B. Ralston (Chair): The name of that document again, just so we’re clear where it’s from.
C. Bellringer: It’s the 2015-16 — there’ll be one every year, obviously — annual service plan report. It’s for the Ministry of Justice. Then you go, within that ministry, to Public Safety and Solicitor General. The documents are published together.
B. Ralston (Chair): Thank you.
We’ll now deal with the next item on our agenda, which is the Office of the Auditor General report entitled An Audit of Mid-Size Capital Procurement in Post-Secondary Institutions, a report that dates from May 2016.
Representing the Office of the Auditor General is the Auditor General herself, Carol Bellringer, and Chris Thomas, senior manager of financial audit. From the auditee, the government, we have Kevin Brewster, assistant deputy minister, Ministry of Advanced Education; David Galbraith, deputy secretary to Treasury Board, Ministry of Finance; Heather Hill, executive director of capital, Ministry of Finance; and James Postans, director of Ministry of Advanced Education.
I’ll begin by inviting the Auditor General to make opening comments.
Auditor General Report:
An Audit of Mid-Size Capital Procurement
in Post-Secondary Institutions
C. Bellringer: Thank you, Mr. Chair.
Our office has done many past reports on capital assets, and usually they look at a single very large project, many of which were procured as public-private partnerships, or P3s.
This audit was designed to look at how a sector — and we’re looking at the post-secondary institutions — was managing procurement of its significant projects below the capital threshold that would have triggered greater oversight from Treasury Board staff and a review of the project for its suitability to procure as a P3.
With that small introduction, I’ll turn it over to the team member, senior manager Chris Thomas.
C. Thomas: Good morning. Thank you for the opportunity to discuss our report on mid-size capital procurement in post-secondary institutions.
Capital assets are things like schools, hospitals, roads, bridges and, in the case of this audit, primarily buildings on university and college campuses. This audit builds on our office’s theme of examining significant capital asset projects in the public sector. In our 2015 report Monitoring Fiscal Sustainability, we talked about the significance of capital assets as a risk for government, because they’re so costly to repair or rebuild, and that it’s challenging to constantly monitor each asset to know what kind of shape it’s in.
Capital assets are essential for government organizations to provide public services, and they provide those services for a period of time beyond a single fiscal year. As of March 2015 — the figures this report was based on — government had $39 billion worth of capital assets on its books, and $40 billion as of March 2016. Government spends between $5 billion and $7 billion each year building and maintaining all of its capital assets.
When it comes to B.C.’s public post-secondary institutions, there is around $700 million of capital activity annually related to the construction and upkeep of campus buildings. We looked at major building projects constructed between 2008 and 2013 with provincial funding of between $20 million and $50 million. We went back five years in order to have a large enough population of samples to be representative.
We wanted to determine whether or not the Ministry of Advanced Education’s procurement processes for these buildings are in keeping with government’s overall capital management framework and the ministry’s Capital Asset Reference Guide.
Our findings were that overall the Ministry of Advanced Education is doing a good job planning and constructing new buildings. Campus building projects met budget and construction goals, and the ministry follows good practice for implementation, monitoring and reporting processes during construction.
However, we did find that the ministry’s planning processes did not meet government’s framework or its own capital asset guide when the projects we audited were planned. Since that time, though, the ministry has improved its planning processes. Additionally, the post-secondary institutions we looked at lacked conflict-of-interest policies around capital procurement but have now rolled out policies.
These changes will help ensure that procurement decisions are based on value for money and the business case and not on personal interest.
We also noted that institutions were not reporting back to the ministry on whether or not their new building actually met their needs beyond the basics of providing new learning space. The lessons learned from this kind of reporting could really help inform future planning decisions.
Based on our findings, we made seven recommendations. One was directed to Treasury Board staff, and the other six were made to the Ministry of Advanced Education. We are pleased that both the Ministry of Finance, on behalf of Treasury Board staff, and the
Ministry of Advanced Education accepted our recommendations and have already made considerable progress.
B. Ralston (Chair): Thank you.
Over to you, then, Mr. Galbraith, and then we’ll hear from you, Mr. Brewster.
D. Galbraith: Basically, there’s the one recommendation for Treasury Board staff. As the OAG highlighted, we’re happy to accept and move forward on this recommendation. We’ll fully address this through our update that we’re doing of our capital asset management framework. Heather, who’s with me today, does the lion’s share of the hard work and the updates that are going on right now.
Basically, we agree with the findings that it’s important that these documents are stored appropriately and also can be found easily. Therefore, like we said, we’re going to be…. As we go through the update — which we’re going to release, I believe, in spring of 2017 — of our capital asset management framework, we will incorporate some guidance and the checklist for those areas that are impacted by the recommendation.
K. Brewster: From the Ministry of Advanced Education standpoint, there are an additional six recommendations, which we reviewed and fully accept and support. We’ve made efforts to advance that work and are continuing to do so.
I’ll just go through the first recommendations, 1 and 2. We feel we’ve already implemented those and continue to work on that. We are taking steps to implement recommendations 4 and 5 this fiscal year, and the work will continue on after that. We’ll also be implementing recommendations 6 and 7 starting in ’17-18.
If I can just go through recommendation 2. The ministry has had all its facilities assessed under the facility condition index process through VFA Canada. We now have a full database, and we are having our facilities reassessed on an ongoing basis, 20 percent every year.
B. Ralston (Chair): Could you just explain what VFA is?
K. Brewster: VFA is the company name. The original name of the company was Vanderweil Facility Advisors. They’re the market leader in assessing the physical condition of capital assets.
B. Ralston (Chair): There may be those who are keenly following these proceedings who don’t know that.
K. Brewster: Right. Sorry about that.
We are using that data when we come forward with approvals. We expect the post-secondary institutions to reflect the condition of their facilities in the business plans that they are bringing forward to replace or renovate their assets. We expect to see that data. We in turn communicate that in our approval documents where we're requesting approval for funding from the Ministry of Finance.
We recently denied one post-secondary institution that wanted out of a building where the facility condition index showed that it was in good condition and had a life span left to go. They simply wanted to move to a different location. That was one of the items we used as a basis to make that decision, because there's no point in abandoning something that's still got a good life span left. So we are incorporating that data as we go forward.
The third recommendation is regarding our capital planning documentation processes. We are making sure that all our business plans that are coming from post-secondary institutions are fully analyzed by our staff. They are connected very closely to the request for approval documents that we are moving forward through our ministry and through the Ministry of Finance.
If you go back, you will now be able to see a full package of information where something has come through. It’s supported by a business plan. It received analysis, questioning, justification. It was reviewed and got approval. We have the approval documents attached, and it becomes part of a robust file of projects moving forward.
We’ve recently been recipients of funding through the federal government, through the strategic investment fund, and we’ve made doubly sure that all of our approval documents for that initiative are fully supported and backed up by a decision process. When people go back to look at that years from now, they will be able to see the trail of how projects came through, were reviewed, approved, processed and funded. We feel we’re proceeding with that, but it’s an ongoing piece of work to make sure that everything is fully documented.
Recommendation 4 is that post-secondary institutions should report on their compliance with conflict-of-interest principles. We do take this very seriously. What this requires is…. Because the actual procurement, the tendering work, is done by the institutions themselves and not by the ministry, it’s incumbent upon us to make sure that post-secondary institutions are going through a disclosure process.
Their own staff who are running their procurements need to disclose whether they have any conflict of interest with any companies that are seeking to tender on any work. So we will be communicating, by the end of this year, guidance material to the post-secondary institutions that they need to complete those disclosure forms for the projects that they’re undertaking and make sure that those are on file.
For the larger projects, in my experience in the Ministry of Health through P3 projects, those have always been done. We do now need to follow up and make sure that for projects below $50 million in value, that work is also being done and put in place at the post-secondary institutions.
The fifth recommendation is regarding a risk-based audit plan. We do have project boards. That is a process that was put in place about five years ago. Those project boards review the status of projects as they go through the procurement and implementation phases. They identify risks and deal with risk mitigations as the project evolves.
We do need to take it a step further, and we will be, by selecting, on a risk basis, projects for a greater level of scrutiny. We’ll hire an auditor to do that, to make sure that projects are following the proper procedures and procurement processes that they need to be following. That is a piece of work we’ll be doing some planning for by the end of this fiscal year, and we’ll get that work underway.
The sixth recommendation is, basically, that we have the sector exchange information about how the work is going. Learning from projects that have gone well and projects that haven’t gone well, so that we don’t repeat the same mistakes — that is a very important piece of work.
We do that internally amongst our staff. But we do need…. We do have a meeting — I believe it’s twice a year — with all the facilities directors of the post-secondary institutions, where we convey information and exchange knowledge. But that needs to be a little bit more robust and have a bit more formalization so that people are learning from the lessons that others have learned.
We do have 25 different post-secondary institutions with a range of levels of expertise, and they can all learn from the work of others. But we will be having that forum in a more robust fashion and making sure that it’s a good forum for lessons learned, because we don’t want to repeat problems that have happened before. We also want to learn from the successes of others.
This relates to what we will call post-occupancy evaluation. After a building has been done, we go back to it later and say: “Okay, did we get everything that we needed to get? Did we achieve the objectives of why we made this investment in the first place?”
I’ve had the privilege of working in capital investment in British Columbia for almost 25 years. There is a tremendous forward momentum to finish projects and go on to the next and the next and the next. One of the hardest things is to actually go back and say: “Okay. Well, how did that one go?” But that’s a very important process. We used to do that a number of years ago at the Ministry of Health.
Doing a post-occupancy evaluation — to be honest, it is quite an intensive piece of work, but it’s an important piece of work. We have to close that learning loop and make sure that everybody is aware of what happened — how did it go? — and then go forward with the lessons that we’ve learned. It’s a very valuable exercise. We will be doing that, and we’ll plan to undertake those on an ongoing basis.
Typically, though, when things don’t go well on a project, we hear about them right away. Nevertheless, that’s a rather informal way of doing it. We need to formalize that work and check to make sure that we’ve gone through and fully understand how we’ve invested and the results of that investment — if it worked, if it didn’t, if it got part of the way there — so that we can embed those lessons in going forward in the next project. We do have a continual series of projects going forward into the future.
I can only speak for the Ministry of Advanced Education, but we’d like to thank the Auditor General and Chris for the work in going through this report.
We’re happy to answer any questions you may have.
B. Ralston (Chair): We’ll begin questions. We’re going to adjourn at 12. I know members sometimes make plans over the lunch hour, and I don’t want to disrupt those. We’ll take questions and adjourn at 12 and then resume at one to continue questions.
Who wants to be first here?
K. Corrigan: I have more than ten minutes of questions. We’ll get a start on it.
The decisions that are being made on capital projects now…. I’m very glad to hear that part of what’s going to be happening in the future is a more robust evaluation of whether or not the objectives were met. I’m wondering about whether or not that will include any kind of evaluation of the determination of what projects are being built. I’m thinking specifically about the impact that the blueprint for education is having on programs that are being offered in post-secondary institutions.
There has been quite a focus — because of the evaluation of the training that’s going to be needed for the future, as you well know — on investing in trades training, some tech training, and so on. That would include the capital buildings that go along with it. My real concern is that some of that evaluation is done on the basis, for example…. The first market demand reports talked about the importance of LNG, and the decisions were being made on the basis of a very robust LNG industry.
Will the work that the ministry does take into account the predictions that were made and, therefore, the decisions that were made about capital investments and whether or not they aligned with the needs that turned out to actually be there in the future, or is that outside the scope of what we’re talking about?
K. Brewster: No, it’s not outside the scope.
Within the ministry, our capital team doesn’t build anything unless there’s a demand for it. We need to make sure that the need is there. Post-secondary institutions have to justify that need to us in their business plan. We review that against the top 100…. There is an annual report that’s produced by Jobs, Tourism and Skills Training, the Labour Market Outlook. We’re reviewing the Labour Market Outlook each year, as the projects come forward, to make sure that project requests that were made a year or two ago, as they’re working their way through the system, are still valid.
A number of the trades projects involve training people to do electrical work, welding, construction, plumbing and mechanical work — all things that may be done for LNG or for constructing a building, and many of those skills are transferable to other areas.
That is reviewed each year, as projects come forward, to make sure that it’s still current. It can take a couple of years from when an institution submits a request for a project to actually get into the plan and be ready to go ahead. So we check to make sure that it is still current.
K. Corrigan: A follow-up on that. The Labour Market Outlook is a forward-looking document, but you’ve said that no projects will go ahead unless there’s a demand — I think you used that word — or a need.
When you say that, do you mean that those institutions have demonstrated that there’s a present need for those spaces to be in place? Or are you saying that they have looked at — or somebody has taken a look at — the labour market outlook for the future and, from that, have determined that there’s a need?
K. Brewster: That’s a little bit out of my field.
Institutions are offering spaces to train. They work in conjunction with the Industry Training Authority to determine the spaces that they feel are needed for students. They look at the local market demand. Local could be the region or the immediate community. They look to see what the wait-lists are for particular training programs, and they match their training programs to meet that demand.
They look at not just what’s happening right now; they take an outward look at what the demand for specific training programs would be, and whether those training programs are going to fit jobs that are in demand, or that the government feels will be in demand in the future.
K. Corrigan: I guess my concern is whether or not fairly significant capital decisions and, flowing from that, operating decisions, as well, are being made on the basis of…. I won’t say it’s politics but, certainly, a policy direction of the government, as opposed to those institutions saying: “We really need this space.” Also, I’m concerned, I guess, that government says, “We are going to have a whole bunch of training spaces for trades,” for example, or tech and that those are, essentially, policy decisions. I’m just not sure which is coming first, the push by government or the actual need that’s on the ground.
You’ve pretty well answered my question, but I’m concerned about it.
K. Brewster: Sure. From a practical point of view, many of the trades projects we have are replacing outdated facilities, with a bit of expansion capability. A lot of those projects that we’re doing are replacing trades facilities that are from the 1940s or 1950s that really do need to be upgraded in terms of the power supply.
Some of the air quality is terrible. At BCIT alone, in one of their welding facilities, the air is thick because they can’t evacuate the air fast enough and exchange it. So we’re upgrading the air quality. We’re upgrading the welding booths. We’re providing new accommodation for what are, essentially, existing programs and providing a little bit of extra capacity to meet demand.
Some of these facilities can also switch their training from one to the other very quickly. They can adapt.
We are looking at all of that. What we don’t want to be doing is investing public funds where they shouldn’t be invested or, worse, building something that sits vacant.
B. Ralston (Chair): It’s now 12 o’clock. I’m going to ask that the committee recess, and we’ll reconvene at one. I think Kathy might have a further question or two, and then Vicki is next.
With that, we’ll recess, and we’ll be back here at one o’clock.
The committee recessed from 11:59 a.m. to 1:01 p.m.
[B. Ralston in the chair.]
B. Ralston (Chair): Good afternoon, Members. We’re continuing on our consideration of the Auditor General report An Audit of Mid-Size Capital Procurement in Post-Secondary Institutions.
Kathy, continue with your questions, please.
K. Corrigan: On page 12 of the report under “Post-secondary institutions” in the column on the right side, the second paragraph says: “In terms of government’s capital asset management program, the ministry is mainly involved with the province’s 25 publicly funded institutions.” I’m just wondering. I’m not sure if it would be the Auditor General or the ministry that would answer the question. Does that mean there are exceptions to public involvement, that there is involvement in private projects? Where does that word “mainly” come from?
C. Bellringer: If you don’t mind, I’m going to answer the question with a little bit of added information. Then I’ll pass it on to Chris to actually answer your question.
If you go to the back of the report, you’ll see that the assistant Auditor General on this audit team was Bill Gilhooly, who has retired, so, unfortunately, he’s not available today to answer any questions. But I will point out that we’ve actually filled that position since then. The new assistant Auditor General for financial statements will be starting on Monday. He actually happens to be sitting to the right of me, and we’ll need to change the title on his card. He’ll not be attending these meetings in the same capacity going forward. I just thought I’d add that in for full disclosure.
Back to your question.
B. Ralston (Chair): Maybe just for the record, because it won’t show up in the transcript who is seated to the right of you. That’s a little cryptic.
S. Newton: Stuart Newton, comptroller general.
K. Corrigan: Congratulations.
S. Newton: Thank you.
C. Bellringer: Okay, Chris, over to you.
C. Thomas: To answer your question, there is no funding that’s provided by the ministry. We couldn’t categorically say that they were only involved with the publicly funded institutions, because there can be a reporting relationship. There can be a monitoring of what facilities are available, etc., but there is no funding relationship.
K. Corrigan: Okay, great. That’s fine.
My next question. I was surprised to see on page 13, within the Background section, that $3.4 billion is going to be spent over the five years from 2013-14 to 2017-18, averaging about $680 million a year, but only a little over $1 billion of that is coming from the Ministry of Advanced Education. The other money is coming from the federal government and other sources of funding.
I guess the question that I have is: are decisions being made about capital projects — I would assume that the answer is yes — on the basis that there is lots of money coming from other sources? Why is it that so much of it, the predominance of it — almost two-thirds of the money that is going into capital projects in British Columbia post-secondary institutions — is coming from other sources? Just a general question.
K. Brewster: In the post-secondary sector, we have two types of capital projects where investments are made. One is for core facilities. Those are for traditional teaching facilities, laboratories — you know, the basic research facilities. Those receive government funding, but not all of them receive government funding. An institution may receive funding from another source and go beyond the funding that we’re providing, or they may elect to construct other facilities with funds that they get for core facilities.
The other type is non-core facilities. An example of that would be gymnasiums, student union buildings. Those, particularly for gymnasiums and running tracks and that sort of thing, are things that institutions can charge a fee for and receive revenue. So public funds don’t go toward those.
With student union buildings — those are often funded by student unions themselves by a fee that is charged to the students, and they pay for the development. So it’s a mix of investment and type of facility.
K. Corrigan: Just to follow up on that, then. That ratio of around one-third — only one-third coming from the Ministry of Advanced Education. Is that pretty typical across the country, or is the proportion that the ministry puts in higher, or do you know?
K. Brewster: I don’t have that data. I don’t believe we track and compare like that. I mean, we could make an effort at looking at that, but we do look at the capital priorities that the post-secondary institutions are putting before us and work on that basis.
K. Corrigan: And then to the Auditor General. I noticed, with regard to the projects that were chosen for this audit, that they were ones where a larger percentage of the money — not entirely but predominantly — was ministry funding. Was that because those are the types of projects you were interested in? What was the reason for that, when you see that, overall, a little less than a third of all the capital funding comes from the province?
C. Thomas: The selection criteria that we used…. We were looking for projects that fell under what was, at the time, the province’s capital threshold for a P3 review, which was $50 million, and over $20 million, which was the threshold that we set. We would expect that we would find significant construction activities involving a full range of risk management techniques being used by both the ministry and the institutions. That was the full population. What we looked at was everything that met those criteria during that period.
K. Corrigan: Oh, okay. Great.
I have more questions, but if you want to go on to somebody else….
B. Ralston (Chair): I think there’s a number of other questioners. We’ll come back to you. Vicki is next.
V. Huntington: Thank you very much, Mr. Chair.
I noticed, surrounding recommendation 5, that the Auditor found that, of the eight projects they were reviewing, only two had been audited for effectiveness by the ministry, yet both of those audits identified weaknesses in the capital procurement processes. Those audits were only conducted because they were a requirement of the federal contributions.
Did that not trigger bells in the ministry that if you are identifying weaknesses on those specific audits, you might go back and review some of the other projects? What was the thinking behind the reason that you had not audited the others?
K. Brewster: That was before my time at the ministry. We’ve had a considerable amount of staff turnover since that time. That, however, is not an excuse. We should have done that, and we will go and pursue audits of our projects as they complete, in a more aggressive manner. We will follow up on the recommendations as they arise.
V. Huntington: Is there any way in which the ministry can attach the same type of requirements for its contribution that the federal government does — a standard set of requirements for any contribution?
K. Brewster: Well, what we typically monitor when we make capital investments…. We’re looking to make sure that the scope is acceptable.
V. Huntington: I’m sorry. What does that mean?
K. Brewster: That the scope of the project is appropriate, so the institutions aren’t building things that they don’t need to build and it’s not going to generate vacant space. It has to meet the need, I think, as we’d discussed before the break, for the types of educational programs that are being offered now and in the future.
We do focus quite a lot on making sure that the project stays on schedule and within its budget and funding sources, so that it finishes on time. Typically, those are the big pressing pushes forward. Where an audit shows that there were some discrepancies that needed to be looked at and fixed, yes, we do need to pursue that, and we will pursue that.
In this case, it was the federal government that had a number of projects through the knowledge infrastructure program. They had a series of auditing requirements that they were applying to projects across Canada. I’d have to look at them in more detail to see whether any of those would be appropriate for British Columbia. But we certainly do need to take a more intensive look at the projects we are doing going forward, making sure that we audit appropriately and follow up on any findings in those audits.
V. Huntington: Well, I think that’s the nature of the recommendation. If these two audits uncovered procurement process weaknesses, then I think the government, the ministry, should be looking at their contributions to the project in exactly the same way, having requirements and seeing whether those are fulfilled at the end of the day.
The other thing I wanted to ask was of Mr. Galbraith. We were chatting about this over lunch. Over the process of looking at various reports — whether it was technology issues, primarily in the Ministry of Health but also in Justice, whether it was this and others — what I’ve noticed is that there seems to be a lack of accountability in the central agencies: the comptroller general’s office — you and I spoke about that once, Stuart — the chief information officer’s office.
The question was to Treasury. Does Treasury feel in its budgeting process or approval of budgets for these projects that it has any responsibility to ensure certain safeguards and audit procedures and reporting requirements are in place? What is the role of you as a basic central agency?
D. Galbraith: In response to your question, and I’ll talk about on the capital side…. I have Heather here in case I meander off track, because Heather has much more experience and has gone through a number of the projects.
We take a risk-based approach, generally, on items that come forward to Treasury Board. We see them on what I’ll call the business case stage, where we look at the scope, the size, the need, the complexity of the project and the cost of the project. We test all of the assumptions around those different categories to make sure that what they’re bringing forward to us is the best estimate of where we’re going to end up.
We track the history of the various entities that come to us, and we know which areas may not have as much strength or as good a track record. We can then make requirements — whether on reporting, or on returning to the full board, to the Minister of Finance or to an administrative level — and we can build in a number of safeguards. We could theoretically do some gating — what I mean by gating is staging — of the funding, depending on milestones, etc.
So we have a number of tools at our discretion, I guess, that we can respond to projects as they come forward to us. Generally, a number of projects in a number of the facilities — they have a very good track record. If there are specific things, Heather could add to this. But we do monitor how they’re doing.
We tend to rely on that governance relationship of the ministry with whatever sector it is dealing with. That’s where we believe the appropriate governance level is. However, we are there to do our best due diligence in the setting of the budget or the expectations on what kind of procurement, etc. Or, more importantly, a lot of what Heather does is work on the expectations and the requirements of the capital asset management framework, which is the guiding tool for all of the sectors on major capital projects.
That’s a quick overview of how we approach it.
V. Huntington: If I could just make a general observation. It’s my opinion, obviously, only.
Some of the ministries don’t appear to have the level of management expertise in some of these large procurement processes that the central agencies seem to think they do. There has been enough auditing going on, on some of these major — hundreds of millions of dollars — projects that have basically been boondoggles. Yet the central agencies — are they stepping in? Do they have templates that they are monitoring? Do they have any way, or even a desire, to require accountability in those ministries — to oversee the training and the ability of the ministries to carry on these massive projects? I haven’t seen any proof of that.
I understand the chief information officer’s office may be looking at something like that. I’d almost like to have them come in so that we can ask what those changes might be.
Just as a general consideration that I’ve seen is that I think there’s a failure of the central agencies to create processes that the ministries are required to follow and will be audited to ensure that those requirements are being met. That’s just my opinion after a number of years on this committee.
G. Heyman: The report states that planning activities and risk management assessments for the projects and institutions sampled met the requirements, and that procurements were generally conducted in a fair, open and transparent manner. But it also noted that: “None of the eight institutions we examined had processes in place to ensure that procurement team members and their advisers declared and addressed any real or perceived conflict of interest.” The report goes on to say: “We found no monitoring underway to watch for project-specific conflicts of interest that could arise on procurement teams.”
I look at the ministry’s response to the recommendation that arises from this, which is that “the Ministry of Advanced Education require post-secondary institutions to report on their compliance with the conflict-of-interest principles in the capital asset management framework for major projects, as part of the post-award communication of the ministry.” The ministry response is: “The ministry takes conflict of interest, either real or perceived, very seriously.”
My question is: did the ministry have any mechanisms in place that required of the institutions to monitor potential conflict, declare it, disclose it, etc.? Or was that just a gap?
K. Brewster: They were required to do that under the capital asset management framework. The way that is done is that people involved with a procurement are to declare whether they have any conflict of interest with any of the companies or individuals that are involved in a procurement. I think what we’ve seen from the audit and this report is that, at the institutional level, that did not happen. Did we follow that up? No. To be perfectly honest, no, we did not. Are we doing that now? You bet we are.
G. Heyman: Just further to that, can you report on what the level of uptake or response from the institutions in question currently is?
K. Brewster: Probably three times a year I meet directly with the vice-presidents of finance of all the post-secondary institutions. I have communicated that requirement to them already. I repeat that requirement to them each time I meet them.
My colleague James Postans meets with the facilities directors of all post-secondary institutions twice a year. It’s happening at that level, and they will all get correspondence from me by the end of this fiscal year.
We are also in the process of organizing a mandatory facilities education session for all facilities directors for post-secondary institutions, in conjunction with the B.C. Construction Association, to make sure that everybody is following proper procedures and procuring things in the way that they’re supposed to and that they’re following the capital asset management framework requirements.
G. Heyman: Just a question for the Auditor General. As a result of this audit and this finding, do you believe that, potentially, other institutions outside of government but responsible to government also don’t have mechanisms in place to comply with a capital asset management framework? If you did, did you routinely communicate this issue to other ministries — to bring it to their attention to perhaps follow up and ensure that what Advanced Education has now decided it needs to do to ensure that the post-secondary institutions comply will be done by other ministries with similar relationships?
C. Bellringer: No, we didn’t have that in mind yet. In terms of whether I think it’s happening elsewhere, it’s impossible to say without going in, for sure. We have a couple of capital projects on the three-year performance audit coverage plan, but nothing along the lines that you’re describing. But I’ll give that some thought for next year’s plan around something specific to the conflict of interest or specific to the compliance with the capital asset management rules.
G. Heyman: To be clear, I wasn’t specifically suggesting further audit. I was suggesting pre-emptive communication that would flow from your findings in this audit.
C. Bellringer: We tend not to do that outside of the audits that we’ve…. What we’ll do is use an audit and use it to communicate to others within the same sector. Certainly, anybody who was involved with this particular audit has seen the results, and we’re talking about it.
Some audits have a bit of a broader relevance. We’ll bring it up at audit committee meetings or…. For example, the one we did on school districts. We are bringing it up at any of the audit committee meetings of those school boards that we’re attending.
I hadn’t thought to do that in a broader way, and we haven’t done in the past, but we can give some thought to that.
G. Heyman: Maybe just a follow-up question. Perhaps I should have addressed the question to the ministry. As a result of what you’ve discovered following this audit, do you communicate with other ministries or deputy ministers through the Deputy Ministers Council that this is what was found in this audit? “These are the actions we’ve taken, for your information, in case you think you may have a similar situation.”
K. Brewster: My colleagues, my opposite numbers, at the Ministry of Health and the Ministry of Education are aware of the audit. I have communicated to my colleagues at the capital branches, made them aware that this audit exists and what we are doing at Advanced Ed.
Typically, what happens is that the capital staff at the post-secondary institutions have often, at some point in their careers, worked either for a consulting company, as an architect or an engineer, or for a contracting company. But it’s most likely consulting companies.
The community is small across British Columbia. They tend to know each other. They’ve changed employment over the years. They’ve often moved…. As their careers go on, they move out of the private sector and into public sector work. At times, they know of the companies that are applying for the work. Are they directing? Clearly, there’s a potential for conflict of interest where work could be directed by somebody to one company or another. I don’t actually know if that’s truly happening, but we are closing that door, that possibility.
L. Throness: I just had a question for the Auditor General, a follow-up question on this discussion.
Regarding the conflict-of-interest situation, there was no actual conflict of interest that the Auditor found that generated this recommendation. This is only a theoretical possibility that there was not adequate reporting that would reveal such conflict. Is that the case?
C. Thomas: No, we didn’t find anything, any particular conflict of interest.
L. Throness: Okay. That’s all I had.
B. Ralston (Chair): But there are examples in other provinces — I think in Ontario, in the hospital sector, for example. There was a series in the Globe and Mail about just this problem. It’s not unknown to public procurement in the country.
R. Sultan: My question builds upon some questions Kathy Corrigan had posed earlier, where she observed, on page 13 of the Auditor General’s report, that $3.4 billion in the past five years was spent on adding to or extending the life of capital asset stock in the post-secondary sector. Additional capital funding comes from the private sector, individuals, federal government and goes directly to the institutions.
First question. I was surprised, the other day, to be told that where the province does not actually provide the majority of the capital funds for a project, such a proposal goes through an entirely different decision channel in Victoria than one for a pure capital project. Namely, it has to go through the operating accounts of the government and be accounted for in the current profit-and-loss statement, which, of course, for a large capital project, could be a very significant hit to the P and L in any one year. I don’t even know if that’s true, and I’m asking for your verification.
The other point is: if we have these jointly funded capital projects, whose balance sheet does it end up on? We’re very proud of the magnitude of our capital assets, but do we even own all of them?
That’s two questions.
K. Brewster: I probably want to share the answer to this with my colleagues from the Ministry of Finance.
Post-secondary institutions own the assets, basically, that are on their land. They’re the owners.
R. Sultan: Regardless of funding.
K. Brewster: They are the owners. They may receive donations. They may receive funding from the federal government or the provincial government, but they are the owners of the properties.
Where capital investment is made, if they use their own funds, they have to amortize those funds over the life of the asset, and that shows up on their operating statements. I’m not an accountant. I’m surrounded by accountants. Generally, what I try to do is surround myself with accountants. It’s safe that way. But I’m not an accountant.
B. Ralston (Chair): You don’t have to apologize.
K. Brewster: Where institutions provide their own funds, they have to amortize that on their operating statements. Where the government’s providing funds, they are also amortizing that over the life of the asset on an annualized basis.
With the federal government, they’re getting income and amortizing that over the life of the asset at the same time.
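The amortization the witnesses describe is straight-line: a capital cost spread evenly over an asset's useful life, with each year's share expensed on the owner's operating statement regardless of who funded the asset. A minimal sketch, with purely illustrative figures:

```python
def straight_line_amortization(capital_cost: float, useful_life_years: int) -> list[float]:
    """Spread a capital cost evenly over the asset's useful life.

    Each year's amortization expense appears on the owner's
    operating statement, whatever the funding source was.
    """
    annual_expense = capital_cost / useful_life_years
    return [annual_expense] * useful_life_years


# Illustrative only: a $40M facility amortized over 40 years
schedule = straight_line_amortization(40_000_000, 40)
print(schedule[0])  # 1000000.0 per year
```

This mirrors the point made above: the expense stream on the operating statement depends on the asset's cost and life, not on whether the funds came from the institution, the province, or the federal government.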
B. Ralston (Chair): Mr. Galbraith, did you want to add something?
D. Galbraith: I’ll try. You used some language I’m not familiar with, unfortunately.
Basically, the rule of thumb is that if we own the asset, we carry the debt, regardless of whether we pay for it through capital or through a P3 arrangement. Hence, we carry that debt.
I’m not sure of the additional process you talked about on the P-and-L statement. That confused me, and I’m not familiar with that.
K. Brewster: If I could just also add to that. The institutions on land title are the owners, but they are also part of the government reporting entity, so their assets roll up into the entity.
R. Sultan: So even if we built, let’s say, a new TRIUMF accelerator and we put up 40 percent of the money and the feds put up 60 percent, all of it would be claimed by UBC. We roll up the entire amount into our consolidated financial statement, including UBC, and say thank you very much to the feds.
S. Newton: That’s the difference between building and constructing an asset that you own and the funding stream that allows you to have money to build and own the asset.
A new TRIUMF facility would be owned by UBC, consolidated in their books, amortized over its useful life. That is independent of where the source of the funds came from.
R. Sultan: Well, what I’m particularly intrigued by is the statement that was made to me about the actual amount of the government’s contribution to a major infrastructure project in Vancouver. I won’t beat around the bush. It’s the Lions Gate wastewater treatment plant.
We might be putting up, say, 40 percent of the money, but we don’t control that project. It’s owned by Metro, I guess, or the water board. Therefore, it has to go through our P-and-L statement.
D. Galbraith: That one is fairly straightforward, in that we don’t own that asset. Therefore, that would be a grant.
The Ministry of Community, Sport and Cultural Development does a number of grants to municipalities and other like entities for assets that they hold, and we expense those.
R. Sultan: So we control the universities. We don’t control the wastewater treatment plants?
D. Galbraith: We don’t control the municipalities. Sorry, let me rephrase that. Careful of what I say. You trapped me there.
We don’t consolidate them within what we call the global reporting entity.
B. Ralston (Chair): Okay, I have two more questioners first time around. I’m going to try and wrap this up fairly soon.
We have another two reports. I’m not sure we’ll get to the final one. I do want to allow sufficient time to discuss the IT one. We had allocated, in our agenda, I think two hours. I’d like to be able to not rush that one.
Without putting more than just a little bit of pressure on the questioners, I’d ask that we continue but be able to wrap up very shortly.
K. Corrigan: I wanted to ask about the facility condition assessments and the indexes.
The report, on page 23, lets us know that there are these assessments and indexes that result from these assessments. There seems to be an acknowledgement of what I’ve heard on the ground as the spokesperson for Advanced Education — that many of our buildings are crumbling. That seems to be, in some way, acknowledged by the statement in the report that the index and the evaluation done by VFA Canada — and I’m using just the letters because that’s what it’s called — provide an idea of the amount of funding “needed to catch up on deferred maintenance, something often called the infrastructure deficit.”
A few things on that. Since this work has been done, do we have an estimate or could we get an estimate of how much that deferred maintenance deficit is? What is the number? How much are we talking about?
K. Brewster: I don’t have the number, but we have to be careful always of what it is we’re looking at. Any particular campus may have, let’s say, 20 different buildings on it. Each one will be assessed for its facility condition. Some may be old; some may be newer. We can then aggregate all of that up and get an average for that campus.
There may be some older facilities on that campus that may not be critical to the operation of the campus, but that would give us a lower score. It would give a different picture of what’s actually happening. So we have to be careful what it is we look at when we look at facility condition index scores on institutions within the ministry.
Then each facility is assessed on five different categories of criticality. Whether something is…. Basically, how imminent is a system going to fail? Is it something that needs to be repaired in the next year, in the next two years, or can something wait for three or five years? That’s what the facility condition assessment does. It categorizes all of those elements of a facility and when they need to be maintained. We can roll up the whole number, or we can roll up categories 1 and 2, which is typically what we look at to see the criticality of that infrastructure need.
We also have to balance that off against a changing way of how a program may be delivered. Just to use an example of when I was in the health sector, we could have a hospital that was built in the 1950s, and we could maintain it so it’s in perfect condition. It would have a facility condition index score that would be great, but it would still recognize that health care was delivered as if it was in the 1950s.
So we have to also balance the facility condition assessment score against how modern the facility is in terms of delivering the program it needs. Yes, it is possible to roll up all these numbers. We just have to be really careful and know what it is we’re rolling up and providing.
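One way to picture the rollup described here is a facility condition index computed as repair cost over replacement value, aggregated across a campus and optionally limited to the most critical categories. The data layout, weighting, and figures below are assumptions for illustration only; they are not the VFA Canada methodology:

```python
from dataclasses import dataclass


@dataclass
class Deficiency:
    cost: float       # estimated repair cost
    criticality: int  # 1 (repair imminent) .. 5 (can wait)


@dataclass
class Building:
    replacement_value: float
    deficiencies: list


def campus_fci(buildings: list, max_criticality: int = 5) -> float:
    """Campus-level facility condition index: total repair cost
    (optionally only the most critical categories) divided by
    total replacement value. Lower is better."""
    repair = sum(d.cost for b in buildings for d in b.deficiencies
                 if d.criticality <= max_criticality)
    value = sum(b.replacement_value for b in buildings)
    return repair / value


# Illustrative campus: roll up everything vs. only categories 1 and 2
campus = [
    Building(10_000_000, [Deficiency(500_000, 1), Deficiency(1_500_000, 4)]),
    Building(30_000_000, [Deficiency(300_000, 2)]),
]
print(campus_fci(campus))                     # all categories: ~0.0575
print(campus_fci(campus, max_criticality=2))  # categories 1-2 only: ~0.02
```

The two calls illustrate the caution raised in the testimony: the campus-wide average and the critical-only rollup tell quite different stories about the same buildings.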
K. Corrigan: On a follow-up to that.
B. Ralston (Chair): One? Fine.
K. Corrigan: One follow-up, Mr. Chair.
Is it possible to get an evaluation for each of our institutions, understanding the limitations and what it means, to get a snapshot, as the report says, of the condition of the capital asset stock that we have in our…? I would appreciate it if we could get that information coming to the committee. I think it would be really useful.
K. Brewster: Yes, it is possible to do that. I would like to ask to work with someone so that we are clear on what it is that people would like to know. If we want to know absolutely everything about every single building, that’s going to be a very long report. That’s fine; that can be produced.
It’s just that what we’ve experienced in working with the facility condition data is we have to be careful ourselves as to all the information we’re taking in and making sure that we’re looking at the right things and making the right decisions on it.
B. Ralston (Chair): I think what’s being requested is more of a dashboard, as I think the jargon term is — something fairly quick but comprehensive. I don’t think it’s necessary to do a lot of make-work in terms of a massive report or anything like that.
K. Brewster: We can certainly do that. We have the data.
B. Ralston (Chair): Just to give a sense for the committee of the scope of the concern.
K. Brewster: The good thing is we have the data. We’re looking at it, and we’re using it. Yes, those reports are possible.
B. Ralston (Chair): Perhaps you can work through the Clerk’s office and our researcher, Mr. Wall.
K. Corrigan: Mr. Chair, just on that. The dollars, if possible. I’d be interested in what the estimate of the cost would be as well.
B. Ralston (Chair): Vicki.
V. Huntington: I can forgo the question.
B. Ralston (Chair): I have one last one from David Eby — a first-time questioner, apparently.
D. Eby: Recommendation 5. In the ministry’s response to recommendation 5 on page 8 of the Auditor’s report, there’s a lot of talk about project boards. Do project boards actually apply to these kinds of mid-size capital projects that were examined by the Auditor General? I’m not totally clear. I thought they were just for projects beyond the definition of this report, $50 million and up.
K. Brewster: No, we do have project boards for smaller projects. They do function. A lot of them are…. We assess the degree to which the institution needs the help. There may be a project where we feel there is a slightly higher risk to success if it were governed by the institution only. So yes, they do exist on projects below $50 million, but not for every single project below $50 million.
In some cases, we simply participate in the institution’s project steering committee. In other cases, we will actually send one of our staff to the monthly meetings that are going on at the institution. In some cases, we will provide more help if we feel they need that.
D. Eby: My follow-up is simply…. It wasn’t totally clear to me when audits were going to be rolled out by the ministry. You’ve had the final audit report for six months, and the response here in that report says there would be a plan to undertake periodic audits. When will the first audit be taking place, how many will you do this fiscal year, and what’s the schedule for next year?
K. Brewster: When we produced the action plan, it was prior to the federal government strategic investment fund being announced. So we’ve had a considerable amount of capital work added to our plate, which is great. We’ve been able to secure federal funding for British Columbia. But that has impacted our work.
We’ll have a plan for undertaking the audits by the end of this fiscal year, and we will be undertaking audits starting in ’17-18. By the end of March, we will have a plan to undertake the audits that will happen in the next fiscal year.
B. Ralston (Chair): Thank you. That’s all the questioners on this one. If we could conclude that one, subject to the information being provided and any follow-up questions that may be desirable.
We’ll move now to the next one, so if we can take a brief break to let the staff set up for the next report.
The committee recessed from 1:43 p.m. to 1:47 p.m.
[B. Ralston in the chair.]
B. Ralston (Chair): The next item that we’re considering is the report of the Office of the Auditor General on Getting IT Right — or getting it right, I suppose; it’s meant to be a pun — Achieving Value from Government Information Technology Investments, a report dating from October 2016.
Interjections.
B. Ralston (Chair): We’ll need some order over there, thank you.
From the Office of the Auditor General: Carol Bellringer, Auditor General; Sheila Dodds, assistant Auditor General; and Kevin Keates, who is the manager of performance audit.
Representing the government and the auditee: Cheryl Wenezenki-Yolland, associate deputy minister and chief records officer, Ministry of Finance — welcome back, Cheryl; I haven’t seen you in a while — David Galbraith, deputy secretary to Treasury Board, Ministry of Finance; Bette-Jo Hughes, associate deputy minister and government CIO, office of the chief information officer, Ministry of Technology, Innovation and Citizens’ Services — that’s, I think, out in front as the longest one so far — Heather Hill, executive director, capital, Ministry of Finance; and Philip Twyford, executive director, IMIT capital, office of the chief information officer, Ministry of Technology, Innovation and Citizens’ Services.
Welcome, everyone.
I’ll turn it over to the Auditor General, to her and her staff, to begin their presentation.
Auditor General Report:
Getting IT Right: Achieving Value from Government Information Technology Investments
C. Bellringer: Thank you, Mr. Chair.
Jurisdictions in both the public and private sectors, including B.C.’s public sector, face challenges in achieving value from IT projects. IT projects are not just about technology. They are IT-enabled business change, impacting an organization’s culture and work processes.
As you know, over the past few years, our office has reported on a number of government IT-enabled projects — including Panorama, ICM and, most recently, the workstation support services contract — identifying challenges with planning, consultation and governance practices.
Learning from the past challenges was one of the main drivers of this report. We wanted to understand why some IT-enabled projects have struggled to achieve value and where greater focus should be placed to improve success. We developed 20 questions to support successful oversight of IT-enabled projects.
While there have been many recent improvements in central government oversight of IT-enabled projects, there’s a need for a better central view of IT investment across government. The report contains three recommendations to improve oversight of and accountability for publicly funded IT projects.
Kevin will take you through the report.
K. Keates: Thank you, Carol, and good afternoon, Members.
Almost every aspect of the B.C. government’s business depends on IT, from delivering health care and social services to generating electricity and processing billions of dollars in transactions. In 2014-15 alone, government spent $668 million on developing new IT systems and major enhancements to existing systems, and even more to maintain and operate those systems.
Achieving success in IT projects is about more than just being on time and on budget. It’s also about achieving value. However, IT projects are complex and expensive, and achieving value is not easy. One international study found that about 19 percent of projects fail, 52 percent run into problems and only 29 percent succeed. In particular, large projects have the highest risk of failure. This trend isn’t unique to government. It applies to both the public and private sectors.
This report is not the result of an audit. It’s the result of our effort to better understand why some IT projects fail and others succeed and to provide advice on how government can improve its oversight of large IT projects.
We reviewed research studies and publicly available information on IT projects in B.C. and other jurisdictions to identify common reasons for failure and success. We combined that research with our findings from our past IT audits, such as the Panorama public health IT system and the integrated case management system.
We also requested information from ministries and other government organizations to determine how much ministries and broader public sector organizations invest in IT and the oversight of this investment. We then discussed our findings and analysis with government staff who have oversight roles and with IT experts.
While there have been many recent improvements in government oversight practices, overall we found a need for a better central view of IT investment across government. Current central oversight is focused on ministries, whereas the majority of IT investment is in the broader public sector, and while capital spending on ministry IT projects is monitored centrally by the OCIO and Treasury Board staff, ministries had not been required to report project-related operating spending during the term of these projects.
Our report made three recommendations to assist government in improving its oversight and accountability for IT projects. We recommend that central oversight of ministry IT projects include both capital and project-related operating costs; that ministries obtain IT investment information from their broader public sector entities to support central monitoring; and that the Ministry of Finance periodically review whether public reporting of IT investment meets expectations for accountability and transparency. We are pleased that the government has accepted these recommendations.
In addition to the three recommendations, the report also contains a list of 20 questions we developed to help decision-makers understand project risks and improve their own project oversight. These questions are grouped into four broad areas that organizations need to get right to achieve value in their IT projects. They were developed based on our research into common causes of failures and success factors in our discussions with ministry staff and subject matter experts.
The four main areas are: people, which is about having enough of the right expertise to apply good practice in project management, systems development, change management, procurement and vendor relations; planning, which is about the development of robust business cases with realistic expectations in alignment with organizations’ needs; consultation, which is about achieving meaningful engagement with key stakeholders, including system users, external customers, consultants, vendors and other government bodies; and governance, which includes setting strategic direction, prioritizing investment options, establishing roles and responsibilities and effective monitoring oversight.
The goal of this report, our recommendations and the 20 questions, is to help decision-makers and leaders of IT projects guide their projects to greater success.
That concludes our presentation.
B. Ralston (Chair): Thank you. Over to the government, then.
B. Hughes: Thank you, Mr. Chair and Members, for the opportunity to present today.
I will be going through the entire presentation for continuity purposes. However, when we get to the questions and answers, my colleagues will join in, depending on the applicability of the question to the different areas.
Overall, the Auditor General found that there is a need for a better central overview of IT investments across government. As noted, there were three recommendations that were made to improve the oversight and accountability.
The first one relates to the central oversight of ministry IT-enabled projects, including monitoring of total project costs, both capital and project-related operating costs, for the term of each project. That recommendation is the responsibility of my office, of the CIO.
The second recommendation requires that ministries obtain IT investment information from their broader public sector entities to support central monitoring of IT investment across the government reporting entity and that the Ministry of Finance periodically review whether public reporting of ministry and broader public sector IT investment meets legislators’, government and public expectations for accountability and transparency. Those two recommendations are the responsibility of the Ministry of Finance.
Government’s overall response is that the Ministry of Technology, Innovation and Citizens’ Services, which has the responsibility for the office of the CIO, as well as the Ministry of Finance appreciate the report provided by the Office of the Auditor General and recognize and acknowledge that IT systems are central to the efficient delivery of services to citizens and organizations throughout the province to support effective government operations.
We appreciate that the report recognizes that there are differences between the governance framework for ministries and broader public sector entities, the SUCH sector agencies and Crown corporations, and agree that reporting should meet accountability and transparency requirements within the relevant frameworks. The ministries accept the Auditor General’s recommendations pertaining to the assessment of the oversight of IT-enabled projects and welcome the acknowledgment of the good practices that are part of government’s current oversight framework.
Speaking to the governance framework, ministries must follow the Core Policy and Procedures Manual, chapter 12, which is specific to information management and information technology management. My office, and my responsibility as the government chief information officer, is responsible for those policies for ministries, working with the chief records officer within the Ministry of Finance.
Broader public sector agencies — including health authorities, school districts, colleges, post-secondary institutions and Crown corporations — are governed by boards who are responsible to a minister. These organizations have their own governance structures, legislative requirements, policies and procedures, and follow the spirit and intent of the Core Policy and Procedures Manual in compliance with the capital asset management framework.
The office of the CIO works with ministries responsible for health authorities, schools, universities and colleges on their plans to ensure that IT general controls are regularly reviewed. My office has templates that are available on the Internet and provides guidance to any organization who contacts our office or is looking for those supports. My office also works very closely with the Crown corporations and the health authorities through the council of CIOs, which I chair.
The first recommendation relates to the central oversight of ministry IT-enabled projects. In terms of a specific response to the Auditor General’s recommendation, our 2016 planning process for IT capital projects is now underway. The business case, which is handled through my office…. Those templates now include the total project costs, both capital and related operating, for each IT-enabled project that comes in for review and recommendation.
My office has also developed a new reporting framework that includes both capital and operating costs for each project. We are incorporating feedback from ministries and other organizations, such as the office of the comptroller general, to ensure that we have clear definitions and reporting procedures for project-related operating costs and that we are being consistent and accurate in our reporting, particularly around capital and operating expenses.
Some specific key actions that we’ve completed to improve the overall success of ministry IT-enabled projects.
We now have central coordination of the selection and oversight of ministry IT projects. That mandate was given to our ministry in December of 2012 by Treasury Board.
We now, instead of having very large projects, further to best practices, break down those big projects into smaller, self-contained phases.
Again, going back to the recommendations around people, we are building more capacity within ministries to ensure that we have the IT-enabled project resources to ensure success.
We have established a prequalified list of vendors, through my office, that can be used by ministry project teams to fill identified gaps. That is the group that we use to find external resources to do third-party reviews of projects when necessary.
We are in the process of developing a governance and assurance framework, working very closely with the Office of the Auditor General and the office of the comptroller general.
Additional actions that are planned or underway to continue to improve the overall success of ministry IT-enabled projects.
We continue to look at ways to build capacity within ministries to ensure that we have that capacity within government and we’re not relying specifically on external vendors. There are a total of 62 net new FTE positions that are going to be working on minor IT capital projects across 11 ministries. We’re moving that funding. That’s not new funding. That’s funding that’s being moved from contracting resources to build that expertise within government.
We’re strengthening the governance and oversight framework, currently finalizing a common IMIT governance project management and assurance framework. That incorporates recommendations from three external consultants, who we brought in to bring to us those best practices across industry in the management of IT projects, to ensure that we are looking at best practices on how to improve the general management practices of IT projects.
We have just put into production a new application condition and investment tool. That will enable us to understand all of the IT applications that we have across government. We’ll be able to understand the health of all of those applications so that we can use that information to prioritize our investments and, as the Auditor General mentioned, look at the value and benefits of each of those projects, tracking not only whether the money is being spent but also whether we are deriving the value and benefit out of those projects.
Of note, that new product that we are implementing was built by a B.C. company.
The second recommendation relates to ministries obtaining IT investment information from their broader public sector entities.
The Ministry of Finance response. With regard to the broader public sector oversight, since 2016-17, the service plan guidelines and templates from the Crown agencies resource office have required more robust major capital reporting for Crown corporations for increased transparency. This will continue for significant IT projects. The new 2017-18 mandate letters for Crown corporations, post-secondary institutions and health authorities require entities to identify significant IT projects to their responsible minister.
Recommendation 3 relates to the Ministry of Finance periodically reviewing whether public reporting of ministry and broader public sector IT investment meets legislators’, government and public expectations for accountability and transparency.
The response from the Ministry of Finance indicates that the ministry regularly reviews and adjusts public reporting requirements in quarterly reports, budget documents, mandate letters, service plans and annual service plan reports within the accountability and transparency framework. In May 2016, the Ministry of Finance issued a guidance document for reporting major capital projects to clarify expectations and ensure consistency in the information that was received and reported.
In summary, with regard to the oversight of ministry IT capital projects, my office recognizes the importance of achieving value from government IT investments, and we are committed to improving central oversight of IT-enabled projects under my office’s mandate.
We work closely with ministries to continuously assess and manage risks to the organization and to contribute to the further success of IT projects. We provide advice to other organizations in the government reporting entity when requested, and we have made our framework and templates available on our Internet so all of our best practices and the work that we have done is accessible to the broader public sector. We welcome the work of the Auditor General and her staff and the valuable information that was provided in the report.
With regard to the summary for broader public sector oversight, the Ministry of Finance appreciates the report, recognizes the difference between the governance framework for ministries and the broader public sector entities and agrees that reporting should meet accountability and transparency requirements within the relevant framework.
The Ministry of Finance will continue to work towards the integration of IT investment information into reporting documents such as service plans, and the Ministry of Finance appreciates the efforts of the Auditor General’s office in assessing the oversight of IT-enabled projects in government’s current governance framework.
That concludes our presentation.
B. Ralston (Chair): George has indicated the first question.
G. Heyman: I’ve got a couple of questions, and then I may have some more later.
While this report is primarily meant to give guidance and make suggestions for best practices rather than review particular IT projects and how well they presented value for money, I think it’s worth noting that there is a long history and a long list of IT projects over the last decade or more that were over budget, failed to deliver — failed completely to deliver what they were supposed to do — and had to be replaced. It totalled around $1 billion for, in many cases, questionable value.
I note that one of the recommendations is to break down big projects into smaller, self-contained phases, each with their own targets and goals for success. That makes sense. But it’s also worth noting, and the report notes this, that for projects that are under $50 million, there’s no reporting requirement under the Budget Transparency and Accountability Act until the total exceeds $50 million.
If my memory is correct, I think about 18 years ago, the Workers Compensation Board dealt with the issue where very large IT projects were being presented and approved by the board of directors or governors in smaller packages, so people weren’t getting the full picture of what was happening with the totality of the project. Provisions were implemented that if discrete segments were part of a whole, it had to be reported at every step along the way.
So I’m wondering if, in the implementation of this recommendation to break down big projects into smaller, self-contained phases, there will be any adjustments to the reporting requirements to ensure that where these smaller projects are part of a larger overall project expenditure, they will be properly reported and examined.
B. Hughes: I’ll begin, and then I’ll hand it over to my colleagues if there’s additional information.
A very good example of where we are implementing this new approach is with the natural resource permitting project. It’s a very large, multi-year, multiministry project. We have broken down that project into phases over multiple years. When the project does hit the $50 million threshold, it is then identified in the public reporting. That occurred this year, as the project is into its second year of funding.
As for the other projects: since our ministry became responsible for the oversight of all the IT projects, we’ve completed 36 individual projects. All the information on those projects is available. We have 55 of them currently underway. All of that information is available.
But to your point, once larger projects that have smaller pieces cumulatively turn into a bigger project, those are reported appropriately.
D. Galbraith: Maybe I can just build on that. How we structure them is that once the business case is approved, then they’re required to be reported publicly. In the case of an NRPP, they are discrete chunks, so to speak. As those chunks….
B. Ralston (Chair): An NRPP is what?
D. Galbraith: Oh, sorry — the natural resource permitting project. Sorry about that.
As the business case for the individual components is approved and the total gets to $50 million, it's disclosed within what we call the $50 million table within our budget. We have actually gone further than the legislation requires: we update that table quarterly. So you will see…. I think it's November 29 that we're coming out with the second quarter, and you'll see any changes that are occurring there.
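The cumulative-threshold logic described above can be sketched as follows. This is an illustrative model only, not the Ministry of Finance's actual process (the $50 million table is a budget document, not a system); the class and method names are hypothetical.

```python
# Hypothetical sketch: a phased project is assessed cumulatively against
# the Budget Transparency and Accountability Act disclosure threshold.
DISCLOSURE_THRESHOLD = 50_000_000  # dollars

class PhasedProject:
    def __init__(self, name):
        self.name = name
        self.phase_budgets = []  # approved budget per phase, in dollars

    def approve_phase(self, budget):
        self.phase_budgets.append(budget)

    @property
    def total_approved(self):
        return sum(self.phase_budgets)

    def must_disclose(self):
        # The project enters the $50M table once the running total of
        # its phases crosses the threshold, even though no single phase
        # exceeds $50 million on its own.
        return self.total_approved >= DISCLOSURE_THRESHOLD

project = PhasedProject("natural resource permitting project")
project.approve_phase(30_000_000)   # year 1: below threshold, no disclosure
assert not project.must_disclose()
project.approve_phase(25_000_000)   # year 2: cumulative total is now $55M
assert project.must_disclose()
```

The point of the cumulative check is the one the member raised: breaking a large project into discrete chunks does not keep it out of the public reporting once the whole exceeds the threshold.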
C. Wenezenki-Yolland: I would also add that in the guidance that has been recently provided to the broader public sector in regard to mandate letters, service plan
reporting and annual reporting, the guidance we have given actually requires the reporting of significant capital projects.
We have explicitly used that language to avoid a dollar limit cutting off projects, because a project, in regard to IT, may be very significant to the business while still being lower in dollar values. That would allow for the reporting, as the member has identified. If it is a significant project intended to be phased over multiple cycles, that would be identified as significant and would receive all of the public reporting.
There is a requirement for broader public sector entities to disclose those projects as part of their annual cycle to their ministers responsible. Then that will be followed through both in their service plan disclosure and in the subsequent annual reporting so that there is a full cycle of accountability in regard to IT projects.
G. Heyman: There’s also discussion of implementation of a new application, condition and investment tool that will enable IT investments to be prioritized based on a value framework. Are you able to describe, in general terms, what that might look like?
B. Hughes: The tool is called C55. It is a tool that was developed by a B.C. company, Copperleaf. That tool historically has been used to look at investments in the energy sector. This is the first instance of this tool being used for IT investments. We have created this tool which will become the inventory of all 1,600 business applications in government.
All of the information about each application — its age, whether it's meeting current business requirements, and how that technology is upgraded — allows us to identify risks and, therefore, prioritize our investments.
It allows us to track the number of applications and, also, the opportunities for us to consolidate investments. So it gives us a broad view across all of government — to be able to see, for example, if there is an opportunity for us to consolidate on a payment application or an authentication application. That way, government is making wise investments and able to leverage the investments that we are making.
That tool will be used on an ongoing basis by ministries to submit their requests for funding. It also provides an opportunity for us to have those requests for new applications vetted by subject matter experts across government, on everything from architecture to security to privacy to data: whether user testing and user design are being undertaken, and whether there are opportunities for us to reduce red tape. All of that evaluation is also captured within that tool.
G. Heyman: Finally, for now at least, under the people category, the report notes that contractors can supplement gaps in capacity and expertise but overreliance creates its own risks and challenges, and that there’s a risk of not retaining the business knowledge necessary to help ensure the project will continue to provide value. It goes on to say — these are the questions that need to be asked: “Where external consultants are used, does the organization have the capacity to procure, negotiate and manage the arrangement to achieve the expected benefits?”
One of the projects that springs to mind, reading this, was the contract for health records management that was let to Maximus. We’ve had a lot of discussion about that at the committee.
Among the things that were discussed were, for instance, key contract terms in the health benefits operations. There was ambiguity in those terms that made it impossible to determine whether the ministry had fully received the expected transfer of financial risk.
It noted that in the case of this contract, of three legacy systems that were supposed to be replaced, only one was replaced. The second was six years late and the third not replaced at all, at the time of the audit. So in 2013, it was reported that the Ministry of Health could not fully monitor the expected benefits of that particular project.
I recall that in discussion at this committee and review of the report, it became apparent that the ministry simply did not have the expertise in-house to begin to grapple with the deficiencies on delivery of the contract. In the result, the expected benefits were not delivered, yet the contract was retendered, re-awarded — at, I think, a 40 percent increase — without those issues being addressed.
My question, given the enormity of that one single example, is: what measures is government planning to take to ensure that when a contract is let, the capacity is in place to properly manage and oversee it, define the deliverables, monitor them and insist that they be delivered, as well as to retain in-house expertise to ensure there’s continued business knowledge within ministry capacity, not dependent on outside contractors? Any specifics you can offer would be welcome.
B. Hughes: A number of things I can speak to, with regard to your question. As I mentioned, the ministries are looking to build capacity within government. We have a very focused effort looking at how we do more hiring within government, how we build that capacity, build those skills that we need within government — not just for the work that we believe government employees should be doing but also to build those skills and capacities around the procurement and management of external sources.
We have done a considerable amount of work looking at what the right skills are that you need to have on different types of IT projects. From small to medium to large projects, how do we move employees through those different projects so that they build those skills, working with the private sector, identifying where they have skills
that it’s better for us to source externally than have inside, but also looking at the training that’s required to ensure that that government staff have the skills and ability to use those resources to get the best value?
I can’t speak specifically to the Ministry of Health example that you used. I do know that that audit, as you mentioned, was from 2013, and there’s been a significant amount of work done since then.
I can also say that within our ministry, the strategic partnerships office does provide a level of oversight to those large alternative service delivery deals. There is a very focused training program to ensure that the people managing those contracts receive training, advice and oversight. With some additional recommendations that we've received recently from the Auditor General's office, we're building more capacity within the strategic partnerships office to provide that support to those large outsourcing agreements.
G. Heyman: Just as a final supplement to the question, are there clear mechanisms in place to ensure that both in the writing of the contract, before it's signed, and in the monitoring of the contract…? If a contractor clearly fails to deliver on any aspect of the contract, but certainly on a significant one, are there commensurate financial or other penalties to protect the taxpayer, rather than simply rewriting the contract terms to reflect the contractor's failure to deliver while still receiving the same payment?
B. Hughes: There’s a significant amount of expertise and oversight in the development of those procurement documents and in the creation of those contracts. There are procurement experts within our ministry, both within the procurement branch and the strategic procurement office, to ensure that the procurement documents themselves are setting out the requirements and the obligations that we require the service providers to meet.
Legal services branch, risk management branch, Treasury Board and the comptroller general’s office are all involved — both with the development of those procurement tools as well as the negotiation of those contracts with our service providers — to ensure that there are very clear service levels and obligations that need to be met by the service providers.
If they fail to meet those requirements…. It’s very clear what those remedies are for the province to exercise if the service provider is not meeting their obligations.
B. Ralston (Chair): Has there ever been any litigation?
B. Hughes: Litigation? I can’t answer that question; I don’t know. I do know that there have been instances where our service providers have had to pay remedies, as per the contract, if they have failed to deliver.
S. Robinson: First, I want to thank the Auditor General’s office for undertaking this report in particular. In the 3½ years that I’ve been sitting on this committee, there have been a number of reports that have come forward around large IT projects that hadn’t met the test, where there’s been a failure to deliver in some aspect. We were certainly seeing a trend that started before I even got elected.
I think it’s worthwhile to have the Auditor General’s office recognize that there was this trend and that there ought to be something more systemic going on. I’m really pleased to see the end of all that work come up with some, I think, pretty strong recommendations.
I just want to ask some questions about capacity. That’s part of what we suspected as we were seeing this pattern: that there just wasn’t capacity in government to make the right decisions, right through. There were various different challenges in different projects.
In the slide where you start listing…. It’s slide No. 10, I think. In our package, it’s page 4. This is in the government response to slides that showed….
B. Ralston (Chair): It looks like they were edited down somewhat, because the numbers are….
S. Robinson: Yes, it has lost its number. There were slides 8 and 9, and then they stopped being numbered. I’m assuming it’s slide 10; I’ll call it slide 10.
The third bullet, in terms of key actions completed to improve overall success of ministry IT-enabled projects, notes: “Building IT-enabled project capacity and expertise in ministries.” Did we ever have the capacity? Was there ever a time when there was some capacity and we just stopped building it? Or did we lose it? What happened?
B. Hughes: I think there are a couple of answers to that question. I think that through retirements and through managed staffing actions, it has been challenging for ministries to retain capacity in the IT areas. As we know, it’s a growing sector, and we are competing in the market with the private sector to attract IT resources. That is a challenge for us, and it is something that has been recognized.
Within the public service, there is a key work stream specifically focused on building IT capacity across government, which I am leading with the Deputy Minister of the Public Service Agency. Yes, there has been a loss of capacity, which we are rebuilding.
The other thing I would say is that the role of technology in supporting our business across government and in all of our daily lives has grown. So the need for those technical resources in ministries within the business areas — not just in the IT shop, if you will — has grown.
While there’s a capacity challenge, I think there’s also the demand for IT resources that has grown. It’s not just people who are dealing with the servers in the basement.
It’s people who are fully engaged on business teams that are looking at how we can use technology to improve the way that we deliver services to citizens to meet their demands around the ability to access government services easily. I think there are two challenges there that we are attempting to address.
S. Robinson: Thank you very much for that answer. The bullet says we’re going to build IT capacity. Is it both of those streams? There’s some focus and energy to build the capacity in both of those places?
B. Hughes: Yes, absolutely. What we are doing is helping ministries understand as they’re going into these IT projects and, depending on the size, helping them to define the types of skills that they’re going to need. For a small project, they may not need the same type or the same number of people to work on projects. So we help them to identify what sorts of skills they will need. We help them to identify where they can resource them — whether or not it’s something within their own organization or if there are other places across government.
The other thing that we’re trying to do is to get the IT sector across government to think about this in a broader way. It’s not just who’s in my ministry but what are our IT resources across government. As we have people working on these projects and gaining this experience, we want to be able to move them on to another project so that we can inject some success into that subsequent project, too, as well as build that capacity.
We are looking at the types of skills that we need and then working with ministries to build that capacity within government. We're looking at ways that we can do a better job of drawing in IT resources from outside of government, working with the post-secondary institutions to get new graduates into government, and at how we can use those resources across government. Where we don't have a resource, we work collectively with the private sector to help them understand what sorts of resources we will be looking for, so they can start building their bench strength to be able to work with us on an ongoing basis.
S. Robinson: I have two more questions.
In the next slide, there is a note speaking to additional capacity-building in ministries for IT projects, with a very specific number of 62 net new FTE positions. That’s in addition to the earlier statement about building capacity. All told, what are we looking at in terms of how many people we need?
B. Hughes: I don’t know. I’ll tell you the difference between the two things I was talking about.
There’s an overall piece of work that’s looking at building capacity generally. Those 62 FTEs, specifically, were for capital projects. We’re working on how we ensure that we have the right people working on capital projects, but we also have a need for people working on not just the projects but the ongoing maintenance.
In terms of number of people in IT across government, I’m looking at Philip to see if he has a number.
P. Twyford: To date, out of the 62, we’ve actually hired 17. We’re finalizing competitions on 11 more. Competitions are currently underway for an additional 34.
In addition to that, the Ministry of Finance has brought another 12 net new positions. There was a question previously about some of the big outsourcing deals. We have rebalanced some of those and actually brought some of the IT skills back in. By the end of this year, we expect that we will have added 100 net new FTEs into government for IT projects that support business.
These are not people doing pure IT anymore. These are business analysts. They’re really architectural folks who are looking at: what’s the broader mechanism, as Bette-Jo said, about how we work horizontally? We’re a little bit fluid in moving them around and making sure we leverage them.
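The staffing figures quoted above can be tallied as a simple check; this is illustrative arithmetic against the numbers as spoken, not official data, and the remainder attributed to rebalancing is an inference from the testimony.

```python
# Tallying the hiring figures quoted in the testimony.
hired = 17        # competitions completed
finalizing = 11   # competitions being finalized
underway = 34     # competitions currently underway
capital_ftes = hired + finalizing + underway
assert capital_ftes == 62  # matches the 62 net new capital-project FTEs cited

finance_ftes = 12  # additional net new positions in the Ministry of Finance
accounted = capital_ftes + finance_ftes
print(accounted)  # 74 of the ~100 net new FTEs expected by year end;
# the rest come from rebalancing the large outsourcing deals mentioned.
```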
S. Robinson: One last question, Mr. Chair.
Knowing what we know now…. If you could go back — I don’t know; eight or ten years, when the attrition started — what would you do differently? What would you recommend we do differently ten years ago or 15 years ago — whenever things started to shift around this?
Things go in cycles, and I want to avoid this happening in the future. So what would your recommendation be, going forward? Technology is always going to change, and we’re always going to be chasing after it and never going to quite be on top of it.
B. Hughes: Well, I’ll build on some of the things I’ve already said. Actually, this is an area where I’m quite excited about the opportunity within government. I think we need to ensure that we continue to have very strong relationships between government and the post-secondary institutions. We need to do a better job of selling government as an employer to those young graduates.
There is a phenomenal number of very exciting projects, very exciting areas, to work in, in government. I think we need to do a better job of helping people understand the great career they can have in government. I’m excited to start doing that.
We need to work more collectively as a sector across government. Instead of hiring someone into a program area or a ministry, we need to look at hiring that individual into the public sector and look at how we can help them develop their career, how we can help them move from project to project so we start infusing that experience across government.
How do we ensure that we’re identifying people to bring in and being more planful about their careers, to help them move from those small projects to the medium projects so that when they get on to a big project, they’ve got some skills and experience behind that?
We are doing a significant amount of training in those sorts of basic skills that have been identified. How do we do a better job of planning? How do we develop better business pieces? How do we do a better job of looking at architectures, both business architecture and technical architecture, so that we’re all working with the same plan? How do we build training around…?
A big area that we’re really focusing on is how we work with our users. Whether those are internal users in government or if that’s the public, how do we engage them early on to ensure that whatever we’re building, whatever we’re going to eventually deliver, is actually going to meet their needs? There’s a great deal of work that’s going on, on that. How do we bring in new ways of developing systems, as opposed to going away and coming out four years later and saying: “Ta-dah. Here it is”? How do we build something, get the users using it, giving us feedback, continually iterating on that?
I think the other thing…. Ten years ago it would have been great if people would have had an appreciation that technology is integral to the business of government. It’s not something that’s done in noisy rooms with blinking lights. This is something that’s absolutely integral to the way that we will do our business, and they need to ensure that technology is sitting at the table with business to achieve business outcomes, as opposed to thinking about it as something that’s separate and apart from what we do.
S. Robinson: I’m grateful to hear that there’s some recognition to having good IT resources as part of government, as opposed to sitting outside of government, to deliver something for government — that it’s part of it. I look forward to seeing an improvement on being able to deliver these kinds of projects.
K. Corrigan: I appreciate your enthusiasm about the future. But I do think that there has been a policy, not just in IT but across government. For essentially 15 years, the approach has been: contract out everything. What that has meant, it appears to me, is that the capacity of government ministries, the expertise, the institutional memory and the managerial ability were lost for a long time. I think government is finally understanding that you do need to have that kind of expertise within, and you can’t just assume that contractors are going to go along and do what they’re supposed to do without any kind of oversight.
Unfortunately — government may be coming to this conclusion now — we have the Panorama health system, which was $114 million and 420 percent over budget. There was a report on that. BCeSIS. The integrated case management system — called a colossal failure. IHealth. The electronic health records initiative. Maximus. B.C. Hydro. I mean, it goes on and on. There have been many excellent reports coming out of the Auditor General’s department.
I’m glad that we are turning a corner, but the taxpayers of B.C., unfortunately, have spent a lot of money to get there.
I wanted to ask about the total IT spending. Just doing a really quick projection…. Maybe you can just confirm whether I'm correct or not. The report, on page 12, says that for the core government, for the ministries, the capital spending was $121 million in 2014-15 and that the operating cost was $392 million. That's just over 3 to 1, three times for operating compared to one for capital, $392 million to $121 million.
Then on the page before, the capital spending for 2014-15 for the whole government reporting entity, the core government plus the schools, universities, colleges, health authorities and the Crown corporations, was $668 million.
Can we project and say that if the ratio of operating costs to capital is 3 to 1 for core government, that it’s probably about right that for the whole government reporting entity, the operating costs would be $2 billion a year, plus another $600 million on capital?
I’m just trying to get a sense of the magnitude of the expenditures. If that is the case, we’re talking about a pretty significant chunk of…. Maybe 3 or 4 percent of government is spent on IT. Does that sound like a legitimate guess? It’s not a comment on anything. I’m just trying to get a sense. Is that about right?
D. Galbraith: To be honest, I wouldn’t be able to confirm how accurate that is for you. I couldn’t tell you if that is an assumption that carries through.
K. Corrigan: Well, I’ll make the assumption. I’ll just assume that it’s 3 to 1. It’s just that it’s a lot of money if it works that way for core government.
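The member's back-of-envelope projection can be worked through as follows. This simply applies the ratio as she stated it; as the witness noted, there is no confirmation that the core-government ratio carries through to the whole reporting entity, so the result is an assumption, not a figure.

```python
# The member's projection, using the figures cited from the report.
core_operating = 392_000_000  # 2014-15 core government IT operating cost
core_capital = 121_000_000    # 2014-15 core government IT capital spending
ratio = core_operating / core_capital
assert 3.0 < ratio < 3.5      # "just over 3 to 1"

# Assumption: the same ratio holds across the whole reporting entity.
entity_capital = 668_000_000  # 2014-15 capital for the full reporting entity
projected_operating = ratio * entity_capital
print(round(projected_operating / 1e9, 1))  # roughly 2.2 (billion dollars a year)
```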
Because it’s so significant, I think I understood from our chief information officer that we are going to, in the future…. Because we have the operating spending for core government but we haven’t had it for the schools, universities, colleges, the health authorities and Crown corporations, that is going to be required from now on. So we are going to understand how much is being spent on IT in all parts of government in the future, starting when?
C. Wenezenki-Yolland: The instructions to the broader public sector are going out with the ’17-18 mandate letters. We would see that reporting coming in for the ’17-18 fiscal year. That’s consistent with the current accountability cycle that we’re in. In order to provide new direction to them, that’s what we have to meet. It would be ’17-18 before we’d have a full accounting of that information.
K. Corrigan: Okay. That’s good for now. I’ll come back.
D. Eby: I can’t shake the feeling…. I read the ministry’s response and the Auditor General’s report carefully. It just feels like we’re getting fleeced on these projects — the IBM project; the eHealth project, where they got paid $72 million. And they didn’t deliver a working project. Then it gets taken away from them and handed over to this company called Cerner.
It just delivered the IHealth project on the Island that was creating the potential for overdosing patients. It was increasing the number of people walking away from the emergency room. It was tripling the amount of time that doctors spent on paperwork. Turns out, it wasn’t tested with the actual end-users, one of the issues identified in the Auditor General’s report. And that’s the company that’s been asked to take over the $800 million eHealth project. It can’t even deliver us a smaller project on the Island.
I read the questions in the Auditor General’s report about what you should be asking when delivering these projects, to avoid failure. Question 18: “Is it clear who is responsible and accountable for the success of the project?” Or question 20: “Are those tasked with oversight prepared to step in and cancel or substantially alter projects if the need arises?”
How could it be that we don’t know who is responsible and accountable for the success of a project or that there is nobody who is tasked to step in and cancel or alter the project if the need arises during the pendency of the project?
That Cochrane report into the IHealth project came out a week before the slide presentation that you prepared for us today. It was out seven days before your report responding to the Auditor General’s report. Did you incorporate the findings of the Cochrane report? I mean, to know that some of these pieces….
Okay. I’m just trying to come up with a question here. I guess the first question is — and everything seems to focus on the ministry: what is the accountability for these vendors — for this group, Cerner — that delivered this incredibly flawed project on the Island that’s now doing eHealth on the Mainland? What is the accountability for IBM? And how are these firms getting onto the approved vendors list? Are they still on the approved vendors list? If we’re not going to sue them, are they still on the list of the firms that we would deal with?
Maybe we’ll start with that. What is the accountability for our partners — and I use that in quotes — on these projects who take the money and don’t deliver working projects to the people of B.C.?
B. Hughes: I’ll start, and then my colleagues can jump in.
I can tell you that for every IT project within government, there is very clear accountability for that project. We are required to have executive sponsors. We are required to have the executive financial officer sign off. There is a very clear accountability for the oversight of these projects. I can tell you that, as I share the results of this audit and others with my colleagues, with my CIO colleagues, with deputy committees and at Treasury Board, one of the questions that is asked is around accountability. So there is very clear accountability established within government on these projects.
With regard to the accountability of our vendors, as I mentioned, there are very robust procurement practices in place that are undertaken by public servants that are going to the market, and there is a significant amount of expertise that weighs in to the development of those contracts. There is quite robust management of those contracts so that when they are not delivering, they are held to account.
I can’t speak specifically to the projects that you identified, but I do note that the Auditor General, in her report, did mention that with the project with Vancouver Coastal and IBM, there was a decision made by the governing body that they were not delivering on what was required, and they did make a change. So they were practising good governance and oversight in that respect.
D. Eby: I wonder if you can give me an example of where there has been accountability for a vendor that has taken money from the province of B.C. and not delivered a working project, where they’ve had to pay money back or where they’ve been sued or where they’ve been even just removed from the approved vendors list?
Is there an example that you can think of — maybe not the ones that I listed — any example of accountability for a vendor? I mean, we have a long list of failed IT projects, so there’s no shortage of vendors who could be subject to these kinds of sanctions. Is there an example of accountability like that that you can think of?
B. Hughes: The ones that I’m aware of…. I do know that in the agreement between Maximus and the Ministry of Health, there were remedies that were paid by the vendor for not meeting certain obligations. I know that within my own organization, in the delivery of our hosting services, our network services, there have been instances where the vendor has been obligated to provide remedies to the province for not meeting their service levels.
D. Eby: And do they remain on the approved vendors list even after these things?
B. Hughes: There isn’t an approved vendors list per se. Every time there is a procurement, there is a process that is undertaken through a public procurement process.
D. Eby: So they’re still eligible to apply, just as if they’d never done….
B. Hughes: They’re still eligible to apply. One of the things that is required is a reference. So if we do go through a procurement process and there is a service provider who meets the requirements of the procurement, part of the process is for the executive responsible for that procurement to do a reference with other accounts within government so that we are aware of the performance of that vendor.
D. Eby: And when you have a company like Cerner that fails on the Island, which results in the need for an independent audit, and doctors are going back to pen and paper and refusing to use the system and so on…. Then they’re also working on a project on the Mainland, the clinic transformation project.
Is there a process whereby you say: “Uh-oh. We’ve got a big problem on the Island. These guys are working on an even bigger project on the Mainland. Let’s have a look at that and examine whether we should be using these guys”? Is there some sort of feedback process where our vendor has multiple projects, and you find a problem in one, and then you go back to the other projects?
B. Hughes: Again, I can’t speak to that project specifically and the discussions that have taken place between the Ministry of Health and the health authorities, but I do know that within government, when we become aware of a vendor who is not meeting their obligations, there certainly are discussions between the CIO members as well as the executives responsible for those businesses.
S. Gibson: I’m somewhat comforted by the remark that our situation in B.C. is not unique. I think we realize that. As we read the business press, we see some concerns internationally.
A couple of quick queries that I have. You advance the view that there should be more central monitoring — in other words, almost like a clearinghouse of understanding of the IT in our province. But at the same time, you advance the view that it should be smaller components, kind of bite-size pieces, to paraphrase, so we can get an understanding.
I agree, for what it’s worth. But isn’t it tough to have both the big picture, which you’re advancing, and also the micromanaging to ensure that we’re doing the right thing? A brief response is all I’m looking for.
B. Hughes: The broader oversight really occurs in the planning. As we work with ministries to identify what their requirements are and start to build that, the analysis of the ability for us to consolidate, to standardize and to reuse investments that the province has already made happens through the planning process with the people in my office and all of the subject-matter experts across government.
That view of looking at how we can have a broader corporate view on the implementation of IT projects happens more at the planning phase. In terms of breaking it down into smaller projects, that really gets to the execution.
You may have an analysis of a project — again, like the natural resource permitting project — where, through the planning phase, we identified a number of opportunities for them to use assets that the province had already put into place. Then, when they get into the execution, we've broken that down into manageable chunks, both from a capacity perspective but also to ensure that the province is deriving value at every stage.
They have a requirement to go back to Treasury Board, to provide evidence that they have realized the value and benefit that they said that they were going to. Then the province has an asset. That way, we ensure that we’re not stranding assets along the way.
If there is an issue with capacity, whether that’s resources or staff or whatever, and the project needs to be paused or we slow down the implementation, the province still has those assets that have been delivered, as opposed to having very large projects that are multi-year. If there is an issue with the scope or the implementation, then you may end up with stranded assets. We’re trying to avoid that in the execution of these larger projects.
S. Gibson: Yeah, makes sense. Another quick one. I will compliment you. You have the people, planning, consultation and governance compartments. I thought that was very helpful, the way you categorized that.
Another quick query. You’re hiring 100 more IT experts. That maybe was a precursor to a question I was going to ask. Isn’t one of the things we really want to do in government, to be honest, to have the same level of smarts that the vendors have? The vendors are geniuses when it comes to this stuff, and we may not sometimes have the same level of smarts.
My sense is that those hirings must be a part of levelling out the playing field. Is my understanding…?
B. Hughes: Well, I would make the statement that the people that I work with in government are just as smart as the vendors, if not smarter. I think that that is a fallacy, that the private sector has all the smart people.
I was having a conversation with an executive of Gartner recently who has been working with my organization. I asked him: “I’d like to hear what you think. You’ve worked with my organization for six months. What do you think of my organization and how we’re functioning?” He said that they are a group of the smartest, brightest, most passionate people that he’s ever come across. And he’s worked in the public and private sector for some time.
We are competing with the private sector in terms of bringing those people in. As I said, I think that we need to do a better job of explaining what an exciting place the B.C. public service is to work in so people can see that they can have a career in IT working on a number of different things, not getting hired into an area where they feel that they don’t have the opportunity to grow and use their skills.
S. Gibson: I have one question.
Passion. I taught students all the time: passion is good. It’s a good thing.
Last point here. You have a phrase which you really don’t address anywhere else in the report. It kind of jumps out at me. It says: “Be certain that the potential rewards of new technology are worth the risk.” That’s a brilliant little phrase, and it’s hidden there. It’s only mentioned once, and you never refer to it again. I’d like somebody just to elaborate very briefly because I think that’s quite salient, in my view.
I’m speaking to the Auditor General, of course. It’s a genesis question in a way, right?
S. Dodds: It’s really about that planning piece, again, and the business case and understanding what the value proposition is. It’s that risk management. What’s the risk versus the value?
B. Hughes: I would add that the new Copperleaf C55 tool that we’re putting in place….
One of the things that has been, I think, a big challenge for us, as we’re working with the ministries…. This isn’t just about asking them how much money they need and how many servers and how many developers and that sort of thing. We’re really driving them to identify the business outcomes they are trying to achieve and to start to describe the work that they’re doing in business terms. How are we transforming the business? How is this going to improve our own business processes? How is it going to improve the delivery of a service to citizens?
That is, I think, a big part of…. What all of us are trying to do is to get people to understand that this is about IT enabling business transformation, business outcomes. It’s not just about the technology.
S. Gibson: Right on. Thank you very much. I appreciate that.
L. Throness: I have a question for the Auditor General.
I’m a bit of a skeptic on this file. In your report, you said that there were 1,600 systems in 2013, 40 percent of which were obsolete. There would be more now, maybe 500 systems, that are going to be coming down the pipeline very shortly. The track record across North America is not very good.
I look at the four recommendations: people, planning, consultation and governance. They’re quite broad and theoretical. They’re quite idealistic. They’re, perhaps, a bit vague rather than hard-nosed and specific.
I’m wondering: why these four recommendations? Why not things like mandating off-the-shelf solutions, where appropriate? Contractor penalties for lack of delivery have been talked about already. The Panorama recommendations were an independent review — good project management practices, good contract management practices.
The Auditor General is not a technical expert in the delivery of massive IT projects. Did the Auditor General conduct a scan of, say, projects around North America to find the common elements of problems and distil all the recommendations of lessons learned into these four recommendations? How did you arrive at these four, and why are they the answer?
C. Bellringer: The four areas that you’re referring to are sitting in the 20-questions section of the report. The recommendations themselves were around the oversight. So the recommendations were pointing to the fact that the project costs weren’t including both capital and the project-related operating costs, moving into the broader public sector piece and then looking to make sure that the kind of reporting remains relevant going forward.
I’m going to ask the audit team to give you a bit more background on the kind of research that they did. The people, planning, consultation and governance piece came from quite a bit of detailed research, so I’ll hand that over to them to explain it further.
S. Dodds: What we did was look at a scan on audit reports. We looked at what we had produced over the past several years. We looked at other jurisdictions, both Canada and elsewhere, looking at reports on IT projects to understand what indications of success factors were and what some of the causes of failure were. We looked at other independent reviews of B.C. IT-enabled projects.
We also looked at some of the guidance that’s out there. We looked at Val IT and COBIT and some of the international research — and we’ve referenced some of it in here — to really look broadly at what’s going on. As we note, it’s not unique to B.C., it’s not unique to the public sector, but there are common challenges.
The idea of the four categories of advice was: what are some of those things that you should be paying attention to, if you’re responsible for these large projects, to be able to increase the likelihood of success?
L. Reimer: My question is in response to an answer that was given to Member Eby with respect to the director of procurement checking references. You made reference to other accounts in government.
It seems odd to me that, when we’re talking about a problem that is so international in nature, we would check references only for other accounts in government. Are we checking beyond that?
B. Hughes: Yes, absolutely. The question was specific to: when we have a vendor who is doing work within government, do we make sure that we check?
As part of the procurement process and our reference checking, the vendors are required to provide us with a list of reference accounts. Those could be international. They could be across the broader public sector as well as within government. So yes, that checking does happen more broadly than just across government. But if I’m undertaking a procurement where there is a vendor who is currently providing a service within government, I make sure that, even if they don’t list it as one of their references, we check that as well.
We have a good understanding of whether or not they are performing. We also go to our strategic partnerships office, which has oversight of all of the deals. That will provide us with specifics for that vendor and the relationship they have with the individual ministry, as well as lessons they’ve learned over the course of doing these large contracts — making sure that we are aware of lessons learned, if you will, on different contracts, as well as things that we’ve learned from audits in other areas.
R. Sultan: Several questions.
First of all, to the chief information officer: the scope of responsibilities that you’re undertaking is so breathtaking. It’s really saying, “Run the British Columbia government better,” because that’s really what we’re almost talking about. It’s hard to find a department or a ministry that, in fact, isn’t enmeshed in this world and changing fast.
Do all these various entities respect your authority? Do they give you grudging acknowledgement, or do they come running for help in a respectful manner?
B. Hughes: All of the above, I would say, sir.
As the chief information officer, and also with my colleague the chief records officer, we have a policy responsibility that ministries are required to follow, and our broader public sector colleagues follow the spirit and intent. So there are policies in place. There are standards that we put in place. There is an acknowledgement of the mandate of our offices and their requirement to follow those. There are also a lot of guidance documents that we put out. There’s a significant amount of training that we do. There are awareness sessions and communities of practice.
There is a very good relationship with the ministries that we have developed. I would suggest that that has improved over a number of years, particularly on the capital projects, as we have consolidated that work — working with the ministries to, again, recognize what their business needs are and, collectively, how we are going to help them to be successful.
Following up on a previous question, we do recognize where technology trends are going. We know that ministries are interested in utilization of cloud services, utilization of off-the-shelf systems, so that we’re not building things that are more challenging and potentially more expensive. How can we take advantage of the technical advances that are occurring? Our architecture and standards branch works with the ministries to help them understand where technology is going and how they can safely consume it.
Security is an area where I think, again, there have been great advances. As the security threat increases, we have to work collectively to make sure that our systems are secure. That is an area where we see a broadening of interest, not only across government but across the broader public sector.
Yes, there’s a policy requirement that I have that ministries respect. There’s also a significant relationship-building that goes on between my office and the ministries so that we can work together to achieve government’s priorities and our business outcomes.
As well, yes, we very often do get calls when people are in trouble, whether that’s within core government or the broader public sector, and we do everything we can to assist them.
R. Sultan: I’m looking at the table, exhibit 3 on pages 20 and 21. I notice that Vancouver Island Health Authority has a $100 million project underway. Over there on the other page, we have Vancouver Coastal, Provincial Health Authority and Providence with a $480 million project underway. I would assume there’s some commonality between these two systems. Is it really imperative we have two of them instead of one?
B. Hughes: I can’t speak to the specific projects within the health authorities. Those are outside of my mandate.
I don’t know, Cheryl, if you have anything you can add to that.
C. Wenezenki-Yolland: No, I cannot. That would be through the oversight of the Ministry of Health, and we don’t have specifics on the individual projects presented in the table.
Interjection.
R. Sultan: Well, my colleague says: “Perhaps this is an Auditor General question to ask.”
S. Dodds: We haven’t looked at either of those systems.
C. Bellringer: We do have the clinical and systems transformation project on our performance audit coverage plan list, but we haven’t started that one yet.
R. Sultan: I notice, also, there are other entities where I happen to know some of the things of astonishing character going on — for example, the Ministry of Transportation, in terms of tracking traffic flows around the Lower Mainland. It’s quite astonishing, if not a bit creepy.
It’s not on your chart, and I assume WorkSafe B.C. also has had some humongous projects, which were referred to earlier. So this is not a comprehensive list. Even so, it adds up to $2.2 billion. Any comment? Or is this just a sampling?
B. Ralston (Chair): It’s a lot of money.
S. Dodds: The list is developed from the Ministry of Finance reporting on large capital investment. So they’re IT projects which have had approved spending of $50 million or more. That’s what’s been compiled.
There may be other systems that are being funded through operating expenses that would not be in this summary of the public reporting.
R. Sultan: Well, again, perhaps in line with my introductory exclamation, are we trying to do too much here? Or, put another way, is there any reason, realistically, to think at the end of the day that a Public Accounts Committee 20 years from now may say: “How did we do?” Well, about 20 percent of them failed, 30 percent succeeded, and half of them continue to be very challenged.
Are we so good that we’re going to do better than that historical track record, which seems to be a North American benchmark?
S. Dodds: Those statistics are actually from a global study.
R. Sultan: World. Well, even so, the point is that our ambition is facing long odds of success. McKinsey points out that 17 percent of these projects are so seriously off base that they have threatened the very existence of the organization sponsoring them. I don’t think there’s any doubt about the province of British Columbia going out of business, but the failures here can be extremely expensive.
I’m just offering a lament. I don’t offer any solutions. I commend your attempt to get your arm around this world. Your reach may not be quite long enough, but keep trying.
B. Ralston (Chair): And we’ll have that discussion in 20 years.
I think Ms. Hughes mentioned Maximus as a contractor where there had been some takeback or payback required, back to the government. Yet there was a critical Auditor General’s report of Maximus, and in 2013, the contract, which was due to expire in 2015, was extended to 2020.
What’s the lesson, do you think, for a contractor like Maximus and other contractors when, despite a critical audit, despite differences of opinion with the government such that Maximus had to cough up some of the contractual money, they get their contract extended for seven years? What’s the lesson there?
B. Hughes: I can’t speak specifically to the Maximus contract and the extension process. That would have been undertaken by the Ministry of Health.
B. Ralston (Chair): But as the office of the chief information officer, presumably you have some supervisory or oversight jurisdiction.
Is your office completely mute and speechless on that kind of an endeavour where a contract…? Let’s say a disputed contract. Certainly, there are some differences of opinion about their efficacy, and there was an audit by the Office of the Auditor General which raised some real questions about that. Is there no role for your office whatsoever in making that kind of a seven-year commitment to a vendor with that kind of a track record?
B. Hughes: My office would be engaged with the Ministry of Health and the service provider, specifically to ensure that they are complying with government standards in the way that they are delivering the services.
The oversight of the contract itself would have been the strategic partnerships office. I do know that they were engaged and did provide advice and recommendation with regard to the extension of that particular contract, but I can’t speak to the specifics on that.
K. Corrigan: I wanted to ask a couple of questions about public reporting versus reporting from the ministries or from other government entities.
Right now $50-million-plus capital projects are required to be publicly reported. Now, we’ve heard that the mandate letters are going to require that all government, essentially, has to report. So I guess the first question is: is that information going to be made public then?
C. Wenezenki-Yolland: Yes, the way the accountability structure works is that the guidelines would have gone out to the ministries and to the Crown corporations around the expectation.
The mandate letters, during that process, will clearly identify the required changes in regard to reporting for IM- and IT-related projects. When they complete their service plan reporting process, which is part of the annual budgeting cycle…. As part of that service planning reporting process, they will identify those capital projects. Those documents are public documents. That information will be available publicly.
In addition, when they do their annual reporting, it’ll follow in the annual reporting cycle. So what you will see is the benefit of the planning and the projects that are coming, and then you’ll see the annual reporting against those plans. It will be public.
K. Corrigan: One of the points made in the report, on page 17, is that if IT services are part of a larger service contract, then they become operating costs, as opposed to a capital project. Even if it’s part of a bigger service contract, are we then, as the public, going to be able to understand…? In, say, a contracted-out situation, are we going to know how much the IT costs are that are associated with that? Is that going to be publicly reported?
Sometimes IT services are provided by contractors. If that’s the case, then it’s operating. So are we going to know? Is there going to be public information about how much that costs? Because it’s just simply a different way of doing it, but the cost could be very significant. Is that reported?
C. Wenezenki-Yolland: We were just debating on that. That information would be available through contract reporting. It wouldn’t necessarily show up in capital reporting because it would not be an IT capital project. It would be an operating expense.
In the contemplation of the reporting we have right now, that wouldn’t be identified separately. You would get full disclosure of those contracts as part of the annual public accounts reporting process, but they wouldn’t be differentiated as capital projects.
K. Corrigan: It’d be hard.
I guess two questions about incentive within government. One of the concerns that several of us expressed when we were looking at the Panorama report — at least, I expressed, and I think others did as well — was a concern about whether or not there is an incentive for government, when an IT project goes terribly wrong, to keep it quiet because it can be embarrassing.
My question is: are both government and the Auditor General’s office satisfied that there are the necessary mechanisms in place to ensure that that’s not possible anymore? Panorama continued on for years. I’m sure there are denials, but it sure looks like part of it was, “Keep going with it, keep putting money into it, and maybe it won’t become a huge embarrassment,” which it did in the end. I’m just wondering what we have in place now that wasn’t in place a couple or three years ago that would ensure that that couldn’t happen anymore.
B. Hughes: There are a number of controls that we have in place now, and I’ve mentioned several. We have centralized oversight of all of the IT projects in government. My office is responsible for the minor projects. Those are under $20 million in total, or $10 million per year. We have oversight of those right from the development of the business case through to completion. We require the ministries to report quarterly on those, so we have a very good insight into how those projects are tracking.
My office also works with Treasury Board staff on the major projects. We are involved in the oversight of the execution of those projects as well. As mentioned, the projects now are being broken down into smaller phases. There is a much earlier indication if things are not going well, and there are actions taken to rectify that. There are deputy-level project boards that are required on every one of the large projects. I sit on all of them, I believe, that are underway right now. Again, we have insight into how those projects are going.
There is definite accountability to the executive sponsor of those projects to ensure that they are executing to get the value and benefit — that the service providers that we are working with are delivering on what they are delivering on. There are requirements to report back to Treasury Board on a regular basis to ensure that they are delivering on what they said they were going to do in their original business cases.
K. Corrigan: Is there public accountability? In other words, there’s accountability…. You’re talking about accountability within government, and that’s encouraging. If a project was to be going off the rails and was costing tens of millions of dollars — it’s happened repeatedly — how are the people of this province assured that they would be aware of it?
D. Galbraith: I was going to say that for those items in the $50 million club, that is possibly one of the more exciting times. At quarterlies and budgets, the media is…. That is scrutinized very closely. We disclose every single change, regardless of whether it’s just a timing change, with regard to that table. It is followed quite closely.
I don’t know, Cheryl, if you want to talk about the other pieces that we’re bringing in.
C. Wenezenki-Yolland: In regard to the broader public sector, as well, with the increased reporting that I was speaking about earlier…. The nature of the reporting will include the tracking of the project. It does include both the annual spend and year-to-date spend, with explanations for any variances. That is all publicly reported, so there is an increase in that transparency as well.
L. Throness: A question for the CIO. I’m looking on page 7, where the ministries accept the Auditor General’s recommendations and welcome the acknowledgment of the good practices that are part of the government’s current oversight framework, such as central coordination of the selection and oversight of ministry IT projects — that being the first bullet. Then we go to what my colleague pointed out, on pages 20 and 21: three projects alone on health care, totalling $841 million, a massive amount of money, and you are unaware of it.
I guess my question would be: why are you unaware of it if there is to be central coordination, which is the recommendation that was accepted by the CIO? If you are personally unaware of it, is someone on your staff aware of the close connection between these three, and could we get an explanation from you, for this committee, as to why they are separate and not coordinated?
B. Hughes: Those projects are the responsibility of the health authorities, and there are different governance frameworks that govern the health authorities.
Cheryl, perhaps you can speak to that.
C. Wenezenki-Yolland: In that context, the Ministry of Health and the Minister of Health are accountable for the coordination of the health authorities. Those health authorities have an accountability back to the Minister of Health.
If we had individuals from the Ministry of Health — there is a team within the Ministry of Health that is responsible for the accountability framework for the health authorities — they would be able to provide you with detailed answers about the nature of these projects: what they are, how they could be coordinated, if they are coordinated. Unfortunately, we do not have those people here with us today, because we had not anticipated answering specific questions on specific projects but speaking more generally to the accountability structure.
L. Throness: So there is no central coordination, then.
C. Wenezenki-Yolland: There is central coordination of the health authorities within the Ministry of Health, and then there is public reporting on the expenditures of these projects to Treasury Board staff. But as to central coordination of all IT projects across all public sector organizations and that kind of planning — because it is over 200 different entities — those entities that are external to core government have individual boards and board members who are responsible for the oversight of those Crowns and agencies. They report in to the minister, and they work with the accountability individuals within each of the ministries.
There is a team within the Ministry of Health. There is a team within Advanced Education. I know you were speaking with Advanced Education earlier about their coordination of capital projects in regard to the post-secondary sector. And there would be, in the case of the Ministry of Education, a team within the Ministry of Education that coordinates the education sector. That information, at a high level, comes in centrally.
At the centre, from a budgetary perspective, Treasury Board makes the decision, based on risk, which of those projects they’ll monitor on a regular basis, which are some of the projects you’re seeing here and that show up in the $50 million table, as Dave Galbraith has identified.
L. Throness: So then the central coordination, from the centre of government, would be the variance, not the rule. The rule is that it’s not centrally coordinated, that each reporting entity coordinates it individually and that only some things are looked at centrally. Is that the case?
C. Wenezenki-Yolland: In the case of ministries, it is all coordinated centrally, as Bette-Jo has identified. In addition, there is work that Bette-Jo, as the CIO, does with the CIOs from the broader public sector entities to ensure that there are consistent tools, disciplined approaches, applied to these projects.
From a financial perspective, that is monitored through Treasury Board. Any projects over $50 million would be monitored there.
In addition, the broader public sector is required to follow the spirit and intent of all government policies. So any policy direction that would come out from the CIO’s office for government, they would also be required to follow.
L. Throness: Okay. I’ve made my point.
V. Huntington: Part of what I was going to say has been asked by Laurie. I think there is a fundamental Achilles heel here — we’ve seen it in other discussions with the Ministry of Health, when we were looking at two full days of discussion in that case — and that is the failure of a centralized coordinating body that’s overseeing the IT projects.
If I think of a health authority board being responsible for the oversight and the accountability of these major IT projects, I just absolutely shiver. I mean, it’s impossible to comprehend that a board would be able to even pretend to be accountable and to report through.
I think there’s a problem here that if the central organizations like Treasury Board, Finance, the CIO don’t get a handle on this and don’t…. Basically, I think you’re hiding behind a governance structure that is incapable of monitoring these large projects at this point. I think you have to get on top of this.
Incredible changes are taking place here, and I’m really happy to see the difference in attitude and the difference in coordination and work that’s going on compared to our last discussion with the previous CIO. But I just do not think you can continue to ignore ministries like Health. That’s where the bulk of the problem, the bulk of the billion-dollar losses, is occurring.
Yet we say: “Oh well, we’re not responsible for the accountability of those IT projects. The boards are, and the boards answer to the minister.” That’s a complete gap in accountability, because the boards are incapable of it. We’re seeing that over and over again in these huge projects.
I just think this is a fundamental policy issue that government has got to get its arms around. You’re trying to, but there’s a huge gap in the capacity to monitor these projects. And you’re going to continue to see….
As Laurie says and as Ralph said, you’ve got three projects going here with two or three different health authorities. Why? And who’s monitoring them? A board? It just isn’t working, and I think we all see that. I think….
B. Ralston (Chair): Were you looking for a comment, then?
V. Huntington: Yes, I’m commenting rather than questioning here.
B. Ralston (Chair): No, no. I mean it looked like Ms. Hughes might want to respond.
V. Huntington: Oh, I beg your pardon. I’m sorry.
B. Hughes: No, that’s okay. I don’t want to leave the impression with the committee that there is no coordination or cooperation between my office and my colleagues in the broader public sector. As I mentioned, I work very closely with the CIO in the Ministry of Health, with the CIOs from the health authorities. We share significant amounts of information, and I know that they have discussions between them about how they can work more effectively together.
We collectively share best practices and our learnings. The Auditor General recommendations — I have a meeting next week with the council of CIOs. This is an agenda topic for our discussion. So absolutely, we do work together. We share information. We work collectively in procuring services, in consuming services. We do work together. But from a governance mandate, I do not have responsibility for the health authorities. As Cheryl mentioned, that is the responsibility of the Minister of Health and their individual boards.
V. Huntington: And that is what I say is the fundamental problem here. I think that as a policy issue, the deputies and ADMs and ministers and caucus members speaking to your policy committees…. It’s an issue that has to be dealt with. It’s not working. You can cooperate all you want, but if you have no oversight, if you have no capacity to audit, if you have no capacity to enforce accountability, then the structure isn’t working.
B. Hughes: To clarify, even in government, these projects are the responsibility of the individual ministries. I do not have an audit function. We work with them in the planning, the management of the capital funds, the oversight. But the responsibility for the individual projects rests with the deputy minister responsible.
V. Huntington: I understand that’s what you’re saying. I’m saying that it’s a fundamental policy problem that government has to come to grips with, because it’s not working. There is no oversight that is enforcing the accountability on a major IT project.
There’s a gap even in what we’re saying and what you’re replying. There’s a problem here, and it’s not being dealt with.
B. Ralston (Chair): Thank you. You’ve made that point fairly clearly and repeatedly. I think it’s clear on the record, your point of view. We’ll see whether anyone acts on it.
D. Eby: It was in February, I think, that Mr. Twyford was here. During that appearance, Mr. Twyford outlined two new major IT projects that he described as currently in flight, in which the office of the CIO was taking a different approach to oversight and managing risk.
There were two projects. For the first, PricewaterhouseCoopers was brought in to apply something called the PRINCE2 framework for project management. Then, for the second, KPMG and Ernst and Young were in place. They appear to have developed, at least from my understanding of the comments…. There was a new risk framework developed for that project, and another company, Gartner Consulting, was brought in as well.
At that time, Mr. Twyford was not able to tell us about what the projects were. They were going to Treasury Board.
Certainly, I think it’s fair to describe that you were optimistic about these new structures that you’d put in place to manage risk. Can you give the committee an update about whether those structures worked, where those projects are at now and what you’ve learned from this new approach that you told us about last time you were here?
P. Twyford: Certainly. One of the projects that we were talking about was road safety in the Ministry of Justice. That project has been approved. It is now underway. That’s one of the specific projects.
Backing up a little bit, what we have done consistently over the past few years is look at best practice. Gartner is an internationally respected IT consulting firm, along with Forrester. We’ve really looked at why projects go wrong. Again, our focus is on those smaller projects, the under $20 million projects with the ministries. We’ve looked at what the components are. The Auditor General has outlined four of them. And we’ve started putting things in place.
We looked at capacity. We’ve worked with ministries to build capacity. As we talked about, 100 net new FTEs are coming in. But we’ve also talked about much more robust planning frameworks. That’s the work that we did with PricewaterhouseCoopers. It’s more the process that people go through, working with the business areas. They’re really IT-enabled business projects. They’re not IT projects.
We’ve put those new frameworks in place, and that’s part of what the Auditor General has commented on, that new robustness. We’re also about to roll out a new governance and audit assurance framework. So working with the consultants that you just talked about, it’s really about improving the planning, improving how we then look at benefits and benefits realization, improving capacity and skills.
The PRINCE2 methodology is one of two project management frameworks. There’s PRINCE2, and there’s PMBOK — project management body of knowledge. They’re really the same. We’ve picked PRINCE2. We’ve been running training courses. We’ve also been working with the PRINCE2 Foundation to tailor those projects.
We’ve run a number of courses. We’ve improved training. We’ve improved the oversight. We’ve improved how we generally look at the planning, the risk management. So every project now gets scored against cost, benefit and risks. That’s part of what Bette-Jo was talking about on the C55 project. It’s applying due diligence at every point in that value stream in those IT projects. I’m happy to say that we’ve taken the lessons from the natural resources permitting project. We’ve applied those to the road safety project as well.
We’re seeing, again, those benefits of breaking projects down, having that larger plan, but delivering them in phases, in asset-specific deliverables, so that we’re able to capitalize. If we stop a project at a certain point, then we do not have stranded assets, we don’t have stranded costs, and we’ve maximized the benefit for most projects.
D. Eby: You have the road safety project and another project which still can’t be identified. Are those projects…? Are you seeing results from this — that the projects are on time, on budget and, most importantly, that they actually work?
P. Twyford: Yes. The natural resource permitting project…. One of the things that we didn’t talk about at that time is that, using our new governance framework, we’ve brought in third-party reviews.
PricewaterhouseCoopers was brought in to do an independent review of that project in the planning phase and also in early implementation. They identified 90 areas where the project could improve and strengthen, what we would call a cure rate. Every one of those 90 issues has been addressed and cured in subsequent phases.
We’re about to start the next phase of that third-party review. We’re actually writing the statement of work to go back out to the market to again bring another independent organization — it may be the same one; it may be a different one — and do that work.
The other project that we’ve done…. We’ve done this with road safety, and we’ve done this with the natural resource permitting project. We’ve also done it with the tribunal transformation project, which you may have heard about in Justice. We’ve done independent reviews. We’ve identified ways that we could make improvements.
I think you’ve probably heard about some of the successes in the tribunal project and its implementation. We know we’re not perfect, but we do bring in consultants. We do make the best use across government, as Bette-Jo says, of some very passionate and intelligent people. We learn those lessons and continue to apply them as we move forward.
B. Ralston (Chair): I don’t have any other questioners, so I think that would be the end of this report. Thank you very much.
If we could take a brief break. I think we’ll begin the last report. I’m not sure that we’ll complete it, but we do have people here to talk about it, so we can begin it. We’ll conclude promptly at four, because I know members make plans for flights and things like that, and so do staff. So if we could just take a brief moment, and then we’ll go from there.
Auditor General Report:
Management of Mobile Devices:
Assessing the Moving Target in B.C.
B. Ralston (Chair): The next topic — we may not finish, and I want to thank the presenters for waiting patiently while we ran somewhat overtime on our previous reports — is the Auditor General report Management of Mobile Devices: Assessing the Moving Target in B.C., a report that dates from October 2016. Representing the Office of the Auditor General are Carol Bellringer, Auditor General; Sheila Dodds, assistant Auditor General; David Lau….
C. Bellringer: David’s not here.
B. Ralston (Chair): David is not here. John Bullock is down there — John Bullock, senior IT audit specialist.
Representing the auditee and the government: Bette-Jo Hughes is the associate deputy minister and government CIO, office of chief information officer, Ministry of Technology, Innovation and Citizens’ Services; Ian Bailey, assistant deputy minister, technology solutions, Ministry of Technology, Innovation and Citizens’ Services; Cheryl Wenezenki-Yolland, associate deputy minister and chief government records officer, Ministry of Finance; David Curtis, assistant deputy minister, corporate information and records management, Ministry of Finance; Sharon Plater, executive director, privacy compliance and training branch, Ministry of Finance.
Over to the Auditor General for a presentation.
C. Bellringer: We chose this topic because mobile device technologies evolve so rapidly, and it’s now sophisticated enough that most if not all of our personal transactions and government’s business could be done using a mobile device. When we were looking at doing this project, I had been talking to the Privacy Commissioner at the time. They were also looking at doing a project in this area, so we decided to do our work concurrently.
We ended up issuing separate reports. We did the work separately. We used the same sample. We made sure that some of the meetings were organized at the same time to try to minimize the time that the ministries would have to spend talking to us. But our reports are issued separately. Ours is looking at security issues. The Privacy Commissioner’s report on privacy issues is not ours — in case there may be some confusion at some point around that report, if you’ve read it.
We looked at whether government is managing the use of mobile devices in a manner that maintains the security of sensitive government information. Our audit focused on the role of the office of the chief information officer, and we sampled five ministries. Our report has seven recommendations. John will go through it and also explain to you what we did in terms of a tips sheet.
J. Bullock: Smartphones and tablets are now an essential part of our lives. These mobile devices are regularly used for business, and they can access sensitive information. Our audit looked at whether government is managing mobile devices in a way that maintains the security of the sensitive information under its control. Any loss, theft or exposure of this information could have serious implications for government and the people of B.C. We examined government’s mobile device management practices, focusing on the role of the office of the CIO, the chief information officer, and a sample of five ministries.
Mobile device security is possible, but there are a number of challenges. Convenience and security are often at odds. This is especially true for mobile devices, which we use almost continuously and in almost every setting. The devices are tiny, creating a high risk of loss or theft. The always-on, always-connected nature of mobile devices maximizes their exposure to network-based threats. Finally, while PC laptop security basics are well known, awareness about how to secure mobile devices is low. All of this goes towards explaining why our report focused on the importance of securing mobile devices.
We found that while government has been proactive in many areas, they can do more to secure mobile devices and, by extension, the sensitive information to which the devices have access. For example, government does not maintain a central record of mobile devices with access to sensitive government information. Such a record is the most critical IT control. You can’t protect what you don’t know about.
We also found that appropriate security settings may not always be in place. Some key settings are left to employees to implement, and given the choice between security and convenience, many of us will choose convenience.
Our report’s seven recommendations focus on how government can improve mobile device security by better documenting decisions and assessments surrounding risks and plans to address risk; updating policy to make it more clearly applicable to mobile devices; implementing new systems and procedures, such as a new mobile device management tool and a detailed inventory of mobile devices; and establishing policies for key security settings and ensuring the key initial settings are applied before the devices go into service.
We’re encouraged that government recognizes the risks posed by the rapidly changing nature of mobile devices. Even before we completed our audit, the office of the chief information officer was implementing some of our recommendations. For example, a new mobile device management tool has been adopted. That’s an important step towards automating the installation and maintenance of security measures.
Security and privacy are strongly linked. At the same time that we conducted this audit, the Office of the Information and Privacy Commissioner conducted its own investigation of mobile device management in government. While our audit focused on the security aspects of managing mobile devices, the Privacy Commissioner’s investigation focused on the privacy aspects. With all of the useful information we collected to inform our respective reports, it was natural that our offices work together to develop guidance on maintaining security and privacy of mobile devices.
The result is our top 15 tips for security and privacy when using mobile devices. It covers topics like choosing a strong password and securely disposing of our devices. The tips are in priority order and cover both work and personal devices. The guide can be used by anyone to secure any mobile device.
This concludes our presentation.
B. Ralston (Chair): Thank you.
Still to come, the ministry.
B. Hughes: Thank you, Mr. Chair and Members. As mentioned, the management of mobile devices audit was looking at managing the use of mobile devices in a manner that maintains the security of sensitive government information. The Auditor General identified seven recommendations, and the province is in the process of implementing all of the recommendations.
As mentioned, there was a companion report put out by the Office of the Information and Privacy Commissioner in recognition of the fact that many policies around privacy and security aren’t…. It isn’t easy to clearly separate those things. We have invited our colleagues from the Ministry of Finance to speak to any questions that may come up with regard to the privacy of information on mobile devices.
Government recognizes that the protection of government data, networks and information is of primary importance, and we take this responsibility very seriously. We know that mobile devices represent an area of increasing risk, as the Auditor General’s Office has identified, and we recognize that there is more that we can do to improve our security controls.
The Auditor General did recognize that government has been proactive in developing strategies for mobile device management, and there is more to be done. Our current controls that are in place include device password protection, device encryption, inactivity until lock time and the ability to wipe a device if it is lost or stolen.
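The four controls named above can be pictured as a simple compliance check. This is a minimal sketch only; the dict-based device representation, field names, and thresholds are illustrative assumptions, not the province's actual MDM schema:

```python
# A minimal sketch of checking one device against the four baseline
# controls mentioned in the hearing: password protection, storage
# encryption, an inactivity-until-lock limit, and remote wipe.
# All names here are illustrative assumptions.

MAX_INACTIVITY_LOCK_MINUTES = 15  # policy maximum cited in the hearing

def compliant(device: dict) -> list:
    """Return the names of the baseline controls this device fails."""
    failures = []
    if not device.get("password_protected"):
        failures.append("password_protected")
    if not device.get("storage_encrypted"):
        failures.append("storage_encrypted")
    # A device that never reports a lock time is treated as failing.
    if device.get("inactivity_lock_minutes", float("inf")) > MAX_INACTIVITY_LOCK_MINUTES:
        failures.append("inactivity_lock")
    if not device.get("remote_wipe_enabled"):
        failures.append("remote_wipe_enabled")
    return failures
```

An MDM service effectively runs a check like this continuously and pushes corrected settings rather than merely reporting failures.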
The new mobile device management service that was mentioned will address four of the seven recommendations from the Auditor General, and there will be policy and process updates that will address the remaining concerns.
We began implementing our new mobile device management service in July of this year, and we have on-boarded 8,500 of government’s 12,000 devices to date. That on-boarding will be complete by the end of December 2016. The work that we’re doing with the government chief records officer on the policy and process development will be complete by the end of March 2017 and will address the remaining recommendations.
The first recommendation required that…. The Auditor General recommended that the office of the chief information officer establish requirements to document assessments of the risks associated with new mobile device features and services, approvals of risk-mitigation plans and acceptance of residual risk. My office is developing the mobile device security standard, which will be complete by the end of March. That will include the risk assessment for all mobile devices that are in use within government.
The second recommendation is that my office update the policy framework to clearly identify applicability to mobile devices. We are looking at updating that policy framework to ensure that all of the policies are very clear in their applicability to mobile devices and to ensure that there is better clarity and better understandability of all of our policies.
The third recommendation was to help ministries develop a solution to maintain a detailed inventory of all mobile devices, with or without data plans, including key information such as assignee, manufacturer model, operating system level and relevant dates. I’m happy to say that the mobile device management service will fulfil this recommendation, and as mentioned, all devices within government will be on-boarded by the end of December of this year.
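The key inventory fields named in that recommendation (assignee, manufacturer, model, operating system level, data plan, relevant dates) could be modelled as a simple record. This is a hedged sketch under assumed field names, not the actual schema of the mobile device management service:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class MobileDeviceRecord:
    """One entry in a central mobile device inventory, covering the key
    information named in the recommendation. Illustrative only."""
    assignee: str            # employee the device is assigned to
    manufacturer: str
    model: str
    os_level: str            # operating system version
    has_data_plan: bool      # devices with or without data plans are both tracked
    issued_on: date          # a relevant date, e.g. when put into service
    last_seen: Optional[date] = None  # e.g. last MDM check-in

# A central inventory is then simply a collection of such records.
inventory = [
    MobileDeviceRecord("J. Doe", "Apple", "iPhone 6s", "iOS 10.1",
                       True, date(2016, 7, 15)),
]
```

The point of the recommendation ("you can't protect what you don't know about") is that every device with access to sensitive information appears in this collection.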
Recommendation 4 is that our office ensure that all key initial security settings are applied before a mobile device goes into service, and the mobile device management service will fulfil this recommendation.
Recommendation 5 requires that we establish a policy on maximum inactivity until locked time based on an assessment of the risks to the security of the sensitive government information and enforce this policy through technical means. That maximum inactivity until locked time has been changed to 15 minutes to comply with policy, and the mobile device management service will technically enforce this recommendation.
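Recommendation 5 distinguishes a policy value from its technical enforcement: the MDM service pushes the setting rather than trusting users to apply it. A minimal sketch of that capping logic follows; the function name and the one-minute floor are illustrative assumptions:

```python
# Sketch of how an MDM profile might enforce the inactivity-lock policy:
# users may choose a shorter lock time, never a longer one.
# Names and the lower bound are illustrative assumptions.

POLICY_MAX_LOCK_MINUTES = 15  # the maximum set by policy

def enforce_lock_time(requested_minutes: int) -> int:
    """Return the inactivity-lock value the MDM profile would actually
    push to the device: the user's preference, capped at the policy
    maximum and floored at one minute."""
    if requested_minutes < 1:
        requested_minutes = 1
    return min(requested_minutes, POLICY_MAX_LOCK_MINUTES)
```

Pushing the capped value through a managed profile is what makes the control "technically enforced" instead of a setting left to each employee.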
Recommendation 6 was that our office replace the existing mobile device management tool with one capable of installing and maintaining anti-malware software, preventing high-risk devices from connecting, and monitoring and logging mobile device security incidents. The mobile device management service will fulfil this recommendation.
The final recommendation was that my office analyze lost and stolen device reports for potential enhancements to security awareness programs. Our security awareness team is reviewing lost and stolen device reports to see opportunities where we can improve security awareness.
The Auditor General did recognize in her report that our centralized incident reporting, identification and recording processes were well done and that our office and ministries respond to reported cases of mobile device loss and theft in a way that protects sensitive information.
In summary, we have launched our mobile device management service. All 12,000 devices within government will be protected by the end of December. That service will address four of the seven Auditor General recommendations by the end of December. Our policy and process improvements will address the remaining three recommendations by the end of March 2017.
B. Ralston (Chair): Before I recognize Marvin, I just had…. Given the shortage of time, we’re not going to be able to have many questions.
I just wanted a response from the Auditor General. This does seem to be a fairly quick action on the recommendations that were made in the audit, including all devices by the end of the year and all policy by the end of March 2017.
I’m wondering, just as a general comment, what your sense of the follow-through is on the recommendations?
C. Bellringer: There is no question that progress was being made towards addressing everything while we were doing the audit. If you look at the dates on when we actually did the work and when we actually released
the report, there’s a gap. It’s something we commonly do with IT-related reports. We don’t like to, as I say, provide a road map for hackers. So we would not have released any of the detail if we weren’t sure that the security changes had already been implemented.
M. Hunt: Well, we had another review happening south of us. So my question is a real simple one: what did we learn from Hillary, and what are we doing about it?
B. Hughes: I’m not sure how to answer that question.
M. Hunt: I’m not sure either, but it’s yours to answer. Obviously, we have an interesting situation to the south of us in government. Can that be replicated here, and what are we doing about it?
B. Hughes: If you are referring to the security of government information and where it’s held, perhaps I can ask Ian and Cheryl to speak, technically, to the security of our data centres and how our information is held.
I. Bailey: I think you’re referring to the issue of her personal email system.
M. Hunt: Correct. And that data going to her system instead of staying within a government system.
I. Bailey: I think you might know that government has its own email system that runs within our Kamloops and Calgary data centres. All government employees are required to do all government business on those email servers. That would be completely against policy to be using a personal email system for government business.
C. Wenezenki-Yolland: I can add to that.
We have been out training across government in regard to what are good information management practices and the management of your records, which includes email and government policies. That training has included all ministers, all ministers’ office staff and senior executive. There is a comprehensive training package, as well, that will have been completed by all public servants by the end of March. Specifically, it reinforces the policy and brings to their attention that those are government records and that they are to be retained within the government system.
As Ian was mentioning, should employees not follow that, we do have ways of detecting that within government. There is another aspect of our program, which also includes compliance reviews and audits, to make sure that there is compliance with that policy.
K. Corrigan: I’d be interested in hearing a little more about what the vulnerabilities are through your phone, because I’m certainly no expert at all.
Email. We know we hear about systems being hacked and the example of Hillary Clinton’s emails being hacked — thousands and thousands of them. But two things. First of all, how much information and what capability would there be for somebody to access government information through a telephone? What are the limits?
I guess the second thing is: is government looking at particular types of positions in government that are really, really vulnerable, that we need to be really concerned about? I think back to some of the reports that the Auditor General did about JUSTIN, using computers. There was information that was very, very sensitive that lots of people had access to. So just a little sense of that, because I’m really quite new in understanding the vulnerabilities.
B. Hughes: I think that there are a couple of different areas that we can speak to. I think, from a policy perspective, our greatest vulnerability is probably the people that use these devices, so the tip sheet that was developed was very useful. Our staff contributed to the development of that in the training that we are doing and the awareness of staff so that they ensure that they understand the devices, they understand their responsibilities and accountabilities and what they can do to protect the personal information they may be using in the conduct of government business.
I think, from a human awareness and understanding, it’s ensuring that we do our best to make people aware of what they should be doing from a policy and practice perspective and also, from a technical security perspective, of what the device can do and of the requirement for them to on-board onto this mobile device management service. It’s mandatory. They will not be able to use their phone to connect to the government network unless they are on the system.
To the question about what are the vulnerabilities of the devices themselves, Ian, if I can ask you to speak to that.
I. Bailey: I think it depends on the age of your phone. BlackBerry phones have always been very secure devices, but the original Android phones that many of us have did have vulnerabilities in the software — iPhones, not so much. But the modern phones that we have today are very, very secure devices — much more secure than our laptops, for example.
I actually met with Samsung last week on this very topic. The vulnerabilities really now exist within the applications that we all use on our phones. Of course, there can be vulnerabilities within PDF files or Word documents. That’s how the vulnerability is introduced. There may be weaknesses in the application software, and that can occur.
The other vulnerability could be where a user might expose their password inadvertently, and then that can be used to do an attack against a government system. So it’s a combination. The devices themselves are very secure. Depending on where we download our applications from, and now with the MDM service, we have visibility into all the applications that are installed on our phones. When we become aware of an issue with an application, we can resolve that.
It also, as Bette-Jo said, is very dependent on the behaviour of the user and what they’re using the phone for.
B. Hughes: For every device that’s introduced into the system, our staff do ensure that the device itself is secure. In terms of applications, we do have an applications store where those applications have been vetted, and there have been security assessments done on them. We have policy in place and, again, through training, ensure that staff understand that before they download an application, they go through the appropriate steps to ensure that the application is secure and that the information is being stored within Canada, for example.
B. Ralston (Chair): Simon, last question. It’s two minutes to four.
S. Gibson: What about individual accountability? I know a company where they make their employees super accountable for their mobile phones. I don’t see that as a recommendation. It seems like a good one to me, where: “Okay, Simon, this is yours. You guard it with your life, because if you let this thing go, it’s going to be problematic for government.” That’s my first quick question. The personal accountability, right?
B. Hughes: The accountability. As a public servant, I think there are a number of ways that is enforced — through our standards of conduct, through our code of conduct, through reviews that are done on an annual basis for the use of not only devices but of technology, of the Internet.
There are a number of places where — whether it’s our security policies, our information management policies, our HR policies, our financial policies…. All of those things provide a framework for public servants to ensure that they are operating in an ethical way and with a mode of integrity in how we manage, whether it’s the physical asset or the use of that asset, particularly around the protection of personal and sensitive information.
S. Gibson: Okay, super quick supplementary to this gentleman here, sir. My question is…. There’s a company in my town, my riding. They say: “We will unlock your phone for you.” I know somebody that was locked out of their phone. They were baffled. They went in there, and they got them into their phone.
Now, tell me about that. Passwords — they just get right in there. They must know more than we do. That’s a pretty scary thing.
I. Bailey: Well, I don’t know the details of what they’re doing, but that’s not possible on our phones that are controlled with the MDM system.
J. Bullock: They haven’t followed those top 15 tips.
S. Gibson: Okay. You’re very reassuring. I just wanted to scare you just a little bit here.
I. Bailey: I would say the most important part of a phone is that you encrypt the storage on the phone, and then that you have password protection or biometric protection to unlock your phone — that’s really the most important thing that you do — and then that you’re very careful about the apps that you install. From a privacy perspective, employees have to be very aware that some of the apps that they might be using could be storing information in the United States, as an example. They have to not do that.
Some of the things — for example, just using Siri on an iPhone, the artificial intelligence voice — can result in information going outside of Canada. So you have to be wary of those things. We provide that kind of information.
B. Ralston (Chair): I’m sure there are many more questions on this. We’re going to adjourn now.
S. Gibson: Thank you. That’s all I have.
B. Ralston (Chair): The Deputy Chair and I will discuss whether we can arrange a meeting in January or February. I’m not sure. I know other committees have been meeting in January. But that will be a decision that we’ll make, and you’ll all be consulted.
Thank you very much. I apologize to those people speaking on the last report that we didn’t finish. You’ll have the pleasant experience of returning to the committee.
We’re adjourned.
The committee adjourned at 4:02 p.m.
Copyright © 2016: British Columbia Hansard Services, Victoria, British Columbia, Canada