2015 Legislative Session: Fourth Session, 40th Parliament
SELECT STANDING COMMITTEE ON CHILDREN AND YOUTH
Wednesday, March 4, 2015
8:00 a.m.
Douglas Fir Committee Room
Parliament Buildings, Victoria, B.C.
Present: Jane Thornthwaite, MLA (Chair); Doug Donaldson, MLA (Deputy Chair); Donna Barnett, MLA; Mike Bernier, MLA; Carole James, MLA; Maurine Karagianis, MLA; John Martin, MLA; Dr. Darryl Plecas, MLA; Jennifer Rice, MLA; Dr. Moira Stilwell, MLA
1. The Chair called the Committee to order at 8:05 a.m.
2. The following witnesses appeared before the Committee and answered questions regarding the February 6, 2015 correspondence to the Select Standing Committee on Children and Youth from the Representative for Children and Youth and Deputy Minister, Ministry of Children and Family Development regarding the statutory review of the monitoring function pursuant to the Representative for Children and Youth Act, Section 6(1)(b):
Ministry of Children and Family Development:
• Mark Sieben, Deputy Minister
3. The following witnesses appeared before the Committee and presented an update on the status of implementing recommendations from the Office of the Representative for Children and Youth:
Ministry of Children and Family Development:
• Mark Sieben, Deputy Minister
• Chris Welch, Manager, Interface and Provincial Partnerships, Office of the Provincial Director of Child Welfare
• Janice Chow, Director of Quality Assurance
4. The Committee recessed from 9:22 a.m. to 9:30 a.m.
5. The following witnesses appeared before the Committee and answered questions regarding the Ministry of Children and Family Development Quality Assurance Program:
Ministry of Children and Family Development:
• Mark Sieben, Deputy Minister
• Alex Scheiber, Deputy Director of Child Welfare, Office of the Provincial Director of Child Welfare
• Chris Welch, Manager, Interface and Provincial Partnerships, Office of the Provincial Director of Child Welfare
• Janice Chow, Director of Quality Assurance
6. The Committee reviewed and amended the preliminary draft questions related to its ongoing youth mental health project.
7. The Committee met in camera from 10:39 a.m. to 10:54 a.m. to deliberate on the statutory review process and prepare the Committee’s draft report.
8. The Committee continued in public session at 10:54 a.m.
9. The Chair circulated a presentation titled “A ‘Collaborative’ Visit, January 22, 2015” by Mountainside Secondary School.
10. The Committee adjourned to the call of the Chair at 10:57 a.m.
Jane Thornthwaite, MLA, Chair
Kate Ryan-Lloyd, Clerk
The following electronic version is for informational purposes only.
The printed version remains the official version.
WEDNESDAY, MARCH 4, 2015
Issue No. 16
ISSN 1911-1932 (Print)
ISSN 1911-1940 (Online)
CONTENTS

Statutory Review: Representative for Children and Youth Act (p. 375)
M. Sieben

Ministry of Children and Family Development: Status Update on Implementation of Recommendations of Representative for Children and Youth (p. 377)
M. Sieben
J. Chow
C. Welch
A. Scheiber

Youth Mental Health Project: Consultation Process (p. 392)

Other Business (p. 395)
Chair: Jane Thornthwaite (North Vancouver–Seymour, BC Liberal)
Deputy Chair: Doug Donaldson (Stikine, NDP)
Members: Donna Barnett (Cariboo-Chilcotin, BC Liberal)
Mike Bernier (Peace River South, BC Liberal)
Carole James (Victoria–Beacon Hill, NDP)
Maurine Karagianis (Esquimalt–Royal Roads, NDP)
John Martin (Chilliwack, BC Liberal)
Dr. Darryl Plecas (Abbotsford South, BC Liberal)
Jennifer Rice (North Coast, NDP)
Dr. Moira Stilwell (Vancouver-Langara, BC Liberal)
Clerk: Kate Ryan-Lloyd
WEDNESDAY, MARCH 4, 2015
The committee met at 8:05 a.m.
[J. Thornthwaite in the chair.]
J. Thornthwaite (Chair): Good morning, everyone. Thank you very much for getting here this morning. Today we are starting off with a presentation from the ministry.
Mark, welcome.
He’s going to talk about the correspondence regarding the statutory review. As you remember, we had the representative here last week. Then you’ll go on to a general update from the ministry on the status of the implementation of the recommendations from the representative as well as quality assurance initiatives. We’re getting handouts, by the way, handed out to all of the members as we speak.
Go ahead, Mark, and welcome.
Statutory Review:
Representative for Children and Youth Act
M. Sieben: Good morning, Chair, and good morning, Deputy Chair and committee members. I’m pleased to be with you this morning to speak a little bit about some of the ministry’s business relating to quality assurance and also our work with and around the representative’s report recommendations. First off, I’ll respond to the invitation to speak to the item pertaining to the review of the representative’s legislation.
I don’t really have too much to add to the correspondence sent from myself and the representative, Mary Ellen Turpel-Lafond, on February 6. What I would note to begin with, having reviewed the Hansard from the representative’s time with you last week….
I would first, on behalf of MCFD, offer my congratulations to Dawn Thomas-Wightman, who has moved into the role as deputy representative there. Up until November Dawn was the executive director for aboriginal services, reporting up to Cory Heavener, our provincial director of child welfare.
We’re really pleased and, frankly, slightly proud of Dawn being able to move back and take that leadership role at the representative’s office, while at the same time we continue to have the benefit of working with John Greschner, who’s here with us today, as the executive lead for stakeholder relations.
I get the pleasure of working with both Dawn and John in these new roles. John has a couple of specific files that he and I are working on together, and then Dawn ends up, in this new role, as being my primary contact in the representative’s office. I wanted to make a point of saying, on behalf of MCFD, congratulations to Dawn.
In regard to the issue of the required review around section 6(1)(b) of the representative’s act, I only have a couple of points to make, really. I would first note that I appreciate, both reflected in the correspondence as well as in Mary Ellen’s statements last week, the recognition of some improvement on the ministry’s behalf relating to outcome measurement and of reviewing our own business. With that being said, we’d be the first to identify that there is certainly more work to do.
I would preface that in addition to any statement I might make as the Deputy Minister for MCFD at this point, I was part of the team of MCFD folks that had the opportunity to present to Mr. Hughes in conjunction with his review back in, I guess it was, 2005 or thereabouts. I also had a lead role with the drafting team that developed the enabling legislation for the representative’s office.
I speak on this issue of the review somewhat reflective of that past experience as well as our current contacts working with the representative’s office and MCFD’s strong desire to continue the work, which it really began and has started to accelerate over the course of this last couple of years, pertaining to performance outcomes review and monitoring our six service lines of business in a way that is consistent, really, with responsibility.
It would strike me at this point as a bit of a distraction for MCFD to be sort of misguided or lost in contemplating some restriction or constraint of the representative’s function in this area. I don’t necessarily think that we’re ready for it yet. My bet is that the public would agree, although I haven’t, certainly, done any survey to that end.
I would also note, too, that from MCFD’s perspective we’re very accepting of the context that we’re working in as inclusive of the representative’s office. While not always comfortable, naturally, for us at times, essentially, they make up part of our sector at this point.
As we’ve been able to sort of demonstrate — either with Dawn, having served in both organizations, or Cory Heavener, who’s done the same, or Janice Chow, who’s going to join us shortly — it ends up being a common talent pool, to a certain extent, and it ends up being a common work experience within a relatively small sector. My view is that that’s fairly consistent with what Mr. Hughes was trying to emphasize in his report — that we find some means by which to work collectively together and to park some of the distractions off to the side.
What I’m really pleased to see happening and anticipate being able to pursue even more so is, between MCFD and the representative’s office, being able to find some common interests, or some common files even, from time to time — such as where we’ve been able to work to a certain degree together, relating to adoptions and permanency, or even a couple of the matters that John and I are working on, relating to establishment of funds for post-secondary opportunities for former children in care.
[ Page 376 ]
At the same time, while we can stand somewhat in the same place for short periods of time on those files, I have every expectation that, when the representative’s office deems appropriate, they’ll let us know when they think that there is room for improvement. That is sort of the natural context which we’re working in. Certainly, in my role as the Deputy Minister for MCFD, I not only accept that, but I wholly endorse it. I’m certainly not looking for any short-term restraint or constriction of the representative’s powers generally, but specifically relating to section 6(1)(b).
I think I’d just as soon leave it at that. I don’t really have any further statement to make, other than sort of referencing committee members back to the correspondence from myself and Mary Ellen, back from February 6. I look forward to responding to any question that a member might have, relating to this first agenda item, before I welcome my colleagues up here to speak about how we respond to the representative’s recommendations.
J. Thornthwaite (Chair): Thanks, Mark.
Any comments or questions with regard to the correspondence — the February 6, 2015, correspondence? We did have the representative here last week.
D. Donaldson (Deputy Chair): Thank you for the joint letter that you and the representative signed regarding the review. My question…. I posed the same question to the representative, and I said I’d be posing it to you as well. So I’ll follow through on that.
On the second page you both said that you’re of the view that some progress has been made, but “considerable work is required to achieve the performance and outcomes reporting envisioned” by Ted Hughes in April 2006. That’s nine years ago now. Why do you think it’s taken so long to come up with the performance measures which are at the basis of the reporting out?
M. Sieben: My response is twofold, I think. First of all, within the general context of the evolution of, broadly defined, child welfare, nine years, in my mind, really isn’t all that long of a period of time. The issue is that while always, at the end of the day, allocation of resources is something that a deputy minister benefits from, there is also almost the need for growth and maturity in any organization.
Keeping in mind that MCFD, in and of itself, was established — what would it have been? — some 20 years ago, we continue to still grow. This isn’t unique to B.C. It’s consistent with other jurisdictions too. Played against that backdrop, from 1997 till now, we’re continuing to mature.
With that being said, part of the reason why I referenced the opportunity to have participated in a couple of the meetings with Mr. Hughes is…. My view is — and this is just me at this point, frankly — that he was commenting on a system based on his inquiries and presentations from MCFD at the time and others, as it existed then, in 2005 and 2004.
I had the opportunity to be the provincial director of child welfare towards the back end of 2005 and then up to around 2007. It was a different type of system then. Cory isn’t able to join us here this morning for a couple of reasons. She worked in…. We called her the associate provincial director of child welfare. That’s a position that’s somewhat akin to what Alex Scheiber currently occupies. He’ll be joining us a little bit later.
At that time we had what we viewed as a relatively robust quality assurance function within that branch. In some ways, we’ve been looking to accelerate the re-establishment of some of those functions over the course of the last couple of years with myself and then inclusive of my predecessor, Stephen Brown.
What I would note is I’d consider that there was a focus on priorities and issues other than a strong central approach to quality assurance in the intervening years, a lot more focus on establishment of practice and quality assurance within the regions for that intervening time, let’s say, between 2007 and 2010.
In my mind, we’re beginning to sort of retrace some of our routes to some of the capacity that we had in the organization that Mr. Hughes commented on, on the quality assurance side. I think that’s some of what the representative’s office is seeing.
Both Cory and myself believe in a strong central view of quality assurance and having the capacity to be able to not only take into account what’s occurring across the province but then to engage and intervene through the provincial director’s office as the provincial director sort of requires and identifies.
I would also note that having the benefit of some knowledge cross-jurisdictionally around what happens in other provinces particularly but also other states and countries, too, regarding the establishment of performance measures in and of themselves….
It sounds easy, but it ends up being quite difficult, as is the marrying of not only what you want to be able to measure but what capacity you have within the organization to capture that data and then what the data actually allows you to measure, and then being able to have some form of agreement relating to how best to define what those different measures might be.
We’ll later on today…. In fact, one of our last slides in one of the presentations will outline the progress that has been made through MCFD’s performance management report. We’ve got the fourth cycle, the fourth version, of the report that’s recently been posted. Martin Wright, who has previously appeared before the committee, would welcome an opportunity to come and update relating to the most recent version of the report. He
[ Page 377 ]
has done a tremendous job in leading its development and its growth.
Again, there is much more left to do, and the representative has, rightfully, pointed out that there is room for improvement in that growing report on data and indicators across all six lines of MCFD’s business. It is, in my mind, the most comprehensive outcome- and output-based report relating to child welfare data within a jurisdiction — at least across Canada, if not in other places, too.
We have seen progress. We’ve seen progress in some of our core areas on the quality assurance side that we’ll share a little bit more about with you later in the morning. We have seen progress, too, relating to the establishment of performance measures. Again, in my mind, while more is left to do, it’s some of the best work in this country, in any event.
Again, I would see that as work — and that includes improvement of that work — that isn’t necessarily tied to any requirement to sort of absolve the representative’s office from that monitoring function. In some ways, how we improve and how best we improve are linked to the sort of ongoing connection and comment that’s available to us through the representative’s office.
J. Thornthwaite (Chair): Thank you, Mark. I’ll note on item 3 in the agenda that the committee will have an in-camera deliberation on the statutory review and will be having a report out on that, as well, at a later date.
I’m seeing no questions or comments coming up. We’ll move on to item 2, which is the general update from the Ministry of Children and Family Development. You have a slide presentation.
Ministry of Children and
Family Development: Status Update on
Implementation of Recommendations of
Representative for Children and Youth
M. Sieben: We do, Chair.
I’ll welcome Janice Chow and Chris Welch to the front area with me.
To my far left is Janice Chow. What do we call you now, Janice?
J. Chow: Director of quality assurance.
M. Sieben: Janice is the director of quality assurance in the office of the provincial director in MCFD. I hope I’m forgiven the memory blank on her title, because Janice has only been with us in that capacity since this past January.
J. Thornthwaite (Chair): So, Mark, Janice will have to move to a microphone if and when she’s speaking.
M. Sieben: Chris has got the deck.
J. Thornthwaite (Chair): And then you’ll switch.
M. Sieben: Janice is likely to have some input on some of the questions that might arise from the committee members.
J. Thornthwaite (Chair): Okay. Carry on. Thank you.
M. Sieben: Prior to assuming this role, Janice, as I noted earlier, was working at the representative’s office. She was there from 2007 to 2015, where she was the director of research in the monitoring, research, evaluation and audit program. Prior to her appointment at the RCY’s office, Janice was a manager with MCFD in our former decision support branch.
To my immediate left is Chris Welch. Chris Welch is a long-time employee of MCFD. He started in the late 1980s… Is that correct?
C. Welch: In ‘87. Correct.
M. Sieben: …as a child protection worker and then worked through staff training. Then about the same time I came to the provincial office in 1998, Chris came too.
Chris has functioned in a number of roles in terms of policy development and the managing of the delegation system for child welfare staff across the province and is currently the manager of the interface and provincial partnerships office in the office of the provincial director.
A part of Chris’s responsibilities is to assist Cory and her team collect and track the various recommendations that come from all of the representative’s reports — with a specific focus for those, of course, that pertain to MCFD — and assist in organizing appropriate response, and also tracking recommendations that also apply to other parts of government.
Chris is going to lead us through a short presentation relating to what the current standing of that is, having had the benefit of a recent third-quarter meeting and report with the representative’s office relating to her recommendations.
A quick tidbit of information relating to Chris. This is his eighth year as general manager for the Victoria Shamrocks, so we somewhat get Chris almost like half-time, because that’s a full-time gig in itself.
C. Welch: Thank you very much, Mark.
Good morning, everybody. This is, I believe, the third time the ministry has had the opportunity to come before the committee to present on our work in responding to RCY recommendations and to provide a status update.
We were here previously, as this slide shows, in November 2013 and last year, in February of 2014. I think what the committee over time will recognize is a somewhat evolving process in the ministry to respond to the RCY, to work with the recommendations.
[ Page 378 ]
Initially, between the inception of the RCY and about 2011, the ministry relied heavily on the Strong, Safe and Supported action plan as the common response to RCY recommendations. Then in 2011 our former deputy minister, Stephen Brown, implemented a more collaborative working relationship with the RCY and standardized a response process, which in some ways is still in effect today. It’s evolving over time.
We spoke about that process in those two previous presentations. Today we can update — if we can just quickly go to the next slide — that back in October of 2014 we revised the process that Stephen Brown had arrived at with the representative’s office to reflect a new approach in the ministry of how we accept recommendations internally, weaving it into our corporate and strategic planning and reporting processes, which better positions us to assess and prioritize the recommendations and to put that against all other ongoing commitments and other priorities.
It’s helped smooth the transition between the recommendation coming in and a lead getting assigned and then being able to report out on progress.
Starting in this fiscal year, quarter 1, back in July, I believe it was, we issued our first quarterly progress report to the RCY on the status of the deliverables in the action plans that have been developed.
The new process looks very similar to the 2011 process from a high level. It’s at the working level where some of that deeper integration and incorporation of the recommendations take place.
But just to quickly orient you again to our process. Prior to the release of a report the RCY will, hopefully, meet with us and discuss the upcoming report and recommendations. We’ll get an embargoed copy of the draft report for administrative fairness purposes and have an opportunity to respond to that within a timeline, which we do.
Once the report is released — this is where our standardized process kicks in — our minister will respond to the report with a letter that will confirm our acceptance, consideration or rejection of recommendations. I’m not aware of too many that have been rejected. I’m aware of a few that are being considered, and the vast majority are always accepted.
Sometime after this acceptance the ministry executive has an opportunity to be oriented to the report and the recommendations, and they have the opportunity to assign an executive lead for each accepted recommendation. Then we meet with the RCY, if necessary, to discuss the report, clarify assumptions and discuss proposed deliverables for what will go into our action plan. A lot of work is going on internally, behind the scenes, in the ministry at this point to coordinate all of this with a lot of the other work that we’re already doing.
Once we finalize our action plan, it is sent to the representative. Then, as I’ve said, starting this fiscal year, we provide quarterly status update reports, the most recent of which — our quarter three report — was just sent over, I believe, a week ago. Then when all deliverables in the report are complete, the ministry sends a close-out report to the RCY.
Some of these steps have been in place since 2011 — in fact, most of them. As I said before, it’s at the sort of deeper level where we see the changes of incorporating the recommendations and acceptance of recommendations into our other corporate planning and strategic processes.
As everyone knows, the representative released a report back in October called Not Fully Invested. It was their first comprehensive review of the status of recommendations that they’ve made in 22 reports. It wasn’t all reports, but it was 22 reports between January ‘08 and December of 2013.
According to our analysis, and this is the MCFD analysis, 80 percent of the recommendations in that report that were directed to MCFD have been completed, and the remainder are underway. That’s not the end of my status update — I’ll get into more detail on that — but because I’m going to update on the recommendations as a whole, this is just pertaining to the recommendations covered in the Not Fully Invested report.
It presented, frankly, a few challenges for us — the report. One of those challenges is in how recommendations are counted. There’s a handout that you should have, a long piece of paper with a list of all the reports on it. I would just turn your attention to that now. It looks like this.
What that does is table all of the 27 reports that the RCY has issued to us since their inception in 2007. It’ll show the release date, the title of the report, the total number of recommendations in the report, the ministry recommendations — those that have been directed to us — the additional details that each recommendation contains, the number of recommendations in a given report that are completed, the number that are underway, the number that are not underway and the status of the report. This is our status.
You’ll see there that a number of the reports are closed. We have sent, over the years, a total of 17 close-out reports to the RCY. They’ve confirmed closure on five of them. On the other 12, we’re left to assess, kind of for ourselves a little bit, as to where we stand in relation to those recommendations. The Not Fully Invested report gave us some help in doing that, but it presented some other challenges. Fortunately, we were able to work with the representative’s office to get behind the data in the report a little bit, to look at the specific recommendations that they were talking about that had no progress or some progress and that we felt maybe were a little further along. They gave us an opportunity to enter into that dialogue before the report was released. But our count of recommendations is a little different than their count.
[ Page 379 ]
Our approach is that when a report is released and we go to the recommendations, when it’s directed to the ministry, we count that as one. Even though each recommendation will contain different parts and different details, we still count it as one for the purpose of the count. The representative had been doing that, we think, as well, as reflected in their annual reports. But with Not Fully Invested, there was a change in that approach to where some of the details were now being counted as stand-alone recommendations.
We saw a jump in their count, for example, in the 2008 report. We call it the northern report. It’s known as the Amanda, Savannah, Rowen and Serena: From Loss to Learning report. We saw that as containing 12 total recommendations, 11 to the ministry. But in Not Fully Invested it says it contains 32. Some of the details have now been lifted, so there are those differences.
I just wanted to highlight that for you in case you contrast what we’ve produced here today and the Not Fully Invested report. You’ll see that we’re not trying to play any games. We’re just trying to explain what our counting approach is.
Back to the slide deck. In terms of the overall recommendations, as the table will illustrate, in the period of 2007 to the present, the RCY has issued us 27 reports with 120 recommendations directed to MCFD, along with 545 additional details, some of which may stand now as recommendations. Be that as it may, by our analysis, 75 of those 120 recommendations are completed — 62.5 percent; 33, or 27.5 percent, are underway; and 12 of the recommendations, 10 percent, remain outstanding.
In terms of some activity, over the last 12 months, since we were last here, the RCY has released five reports that contain 18 recommendations, along with 95 additional details for the ministry. We have those reports tabled there: Lost in the Shadows, On Their Own, Finding Forever Families, Children at Risk and Who Cares? This is over and above the Not Fully Invested report.
These are just reports that contain recommendations directed to MCFD. You’ll see there that we have action plans underway in three of those reports, and for two of those we’re still developing action plans. I should note that we’re still developing an action plan for an earlier report. This speaks to some of the complexity that we have in dealing with recommendations and the implications that they contain.
You may recall the Still Waiting report, which directed recommendations to both the Ministry of Health and MCFD. In January of this year a letter went over to the representative, outlining that the ministry has a draft action plan, but we’re working in support of the Ministry of Health’s lead in responding to those recommendations. Their work plan for Healthy Minds, Healthy People will constitute their action plan. Then, in conjunction with our action plan, supporting, we’ll have, I guess, the first sort of hybrid action plan in response to an RCY report.
That’s a quick glance of where we are and where we’ve come. I’m happy to take any questions.
C. James: Thank you for the report. I appreciate the description of the differences between the reporting out that comes often from MCFD and the rep’s office. I also recognize — I raised this with the rep as well a couple of weeks ago, when we had her here — that there isn’t always going to be agreement around the recommendations as well, that there are times when MCFD may disagree with the recommendations or may feel there is a different way to be able to meet the recommendation that’s in the report.
I’ll speak for myself, not other committee members, but I think that the difference in reporting creates challenges. It creates challenges for the committee in being able to take a look and determine whether recommendations are met or not. But I think it also creates challenges for the public, when the public is looking to see whether there is accountability around recommendations and whether the intent of the Hughes report really is occurring, whether those things are happening.
I wondered if there has been any progress or any discussion on, perhaps, reaching agreement around the number of recommendations that are there when reports come out. I know you get the reports embargoed. I know there’s a back-and-forth that goes on while the report is being written.
I wonder whether there’s any discussion about having that kind of conversation before the report comes out. Then there can be a common reporting back from both the rep’s office and MCFD around the recommendations in the report. I think, from the committee’s point of view but also from the public’s point of view, it certainly would be easier to build that kind of support, even if there’s disagreement around the recommendations, if we had a common set of documents to be able to look at.
M. Sieben: I would note that — in conjunction with the development of a report and around the same period of time in which MCFD has the opportunity to review, for administrative review purposes, an embargoed copy of a report — there are sometimes requests. Often there is a request from the representative’s office to participate in a meeting relating to the development of recommendations.
Staff, often Cory and then others, subject to the nature of what the report is about, would participate with senior staff from the representative’s office in order to basically provide input. I’m sure there are additional tables that the representative’s office would consult with. They have a multidisciplinary team that often informs development of recommendations in their report as well as the internal discussions themselves.
[ Page 380 ]
That discussion goes into the mix. Then, usually, the executive summary and the recommendations come, in the final version of the report, to MCFD a week or so in advance of the release so that we have the opportunity to prepare some form of a response. There is what I think is a fair opportunity for us to have engagement, and I’m very appreciative of that opportunity that’s extended to us.
I certainly agree. Where we can find commonality or give indication of where we think we can sort of do more work in order to improve in an area, then we like to be able to identify that. At the same time, the representative’s office may still identify new or different things or choose the wording most appropriate to make the point that the representative’s office is looking to make.
After the report comes out, as Chris has noted, there really is quite a process that’s in play. It begins with us trying to get clarity relating to what lies beneath any recommendation, notwithstanding having the opportunity to participate in advance. Often the meaning of the recommendation is really in those detail boxes — which, in the initial reports, were quite substantive and, perhaps appropriately, almost ended up being counted as individual recommendations.
The main means by which we try to seek that resolution that the member is speaking to, relating to the meaning of a recommendation, is simply by talking it out — which we seek to do a lot, not only inclusive of what the nature of the recommendation is and how best we might approach it from the representative’s office view but the extent to which we actually have demonstrated progress or the extent to which it might be completed or not.
There have been a few occasions on which we think we’ve come to agreement relating to completion of a recommendation and then discover, from reference in a future report, that there may be more work to do, from the representative’s office. That certainly is our desire too. Part of what Chris referenced in terms of the third-quarter report that was sent to the representative’s office to report on our progress is certainly an intent to continue that dialogue and to try to get to that common language. It might be a point of interest with the committee as well.
D. Donaldson (Deputy Chair): Thanks for the presentation. I appreciate that we’re talking about, overall, how your ministry responds to the recommendations by the representative.
You brought up Not Fully Invested, so I’m going to ask specifically about that. I’m going to ask about it in relation to how MCFD handles some of the points that the representative made in that document — really, the cross-ministry recommendations, the cross-ministry issues that she pointed out.
I’ll quote from the report. “Of the nine recommendations made to the B.C. government as a whole, the ones that require the greatest cross-ministry involvement and organization, seven have been largely disregarded.” These include several significant recommendations that are central to improving the lives of B.C.’s vulnerable children and youth. The representative pointed to organizational leadership and the adequate deployment of resources that have too often been lacking.
There are some recommendations that pertain directly to MCFD from this report, but there are a number of those seven that pertain to cross-ministry. I’m curious. When you are faced with that kind of situation, what’s the process that you take to bring these kinds of cross-ministry deficits that are pointed out by the representative to a larger discussion amongst your deputy minister colleagues? What kind of traction do you get on that, and where does it go? It doesn’t appear to have gone very far.
M. Sieben: That’s a really good question. I’ll start, and Chris — and perhaps Janice, even, from her viewpoint — may have something to add.
There’s certainly more clarity when the recommendation is something within MCFD’s own wheelhouse. It’s certainly easier for us to approach, regardless of whether we agree completely, partially or, in very few instances, not at all. I can’t think of a recommendation where the latter category actually applies.
What MCFD has contemplated and what we’ve instituted is…. At the end of the day, notwithstanding where the recommendation is directed, when there is a representative’s report and a recommendation, people are going to look here first. So we make an effort, as the chart indicates, to track not only those recommendations that pertain to us but those that pertain to others too.
As Chris noted, part of our process involves discussion of each of the individual reports at the ministry executive team. When we get clarity relating to what we think the individual recommendations really mean for us, then leads are established according to each of the individual recommendations or some of the detail that goes behind that.
At the same time, if there are recommendations that we think have cross-ministry application or clearly are more the responsibility of another ministry, then we look to identify that and make sure that some of our colleague ministries might share that view or make sure they’re not waiting for us to do something. Sometimes there’s a place in the middle, and sometimes it’s quite clear from the nature and the direction of the recommendation that it’s the Ministry of Health or it’s the coroner’s office or it’s the government of B.C.
When that’s the case, staff in Cory’s office follow up directly through those ministries to identify where those linkages are. There is a deputy ministers committee sort of more on the social services side that meets up on occasion as well. When that’s the case, we use that forum quite often to identify recommendations that might belong elsewhere or might at least be shared. It’s been an
[ Page 381 ]
evolution. It probably is an area of business that we can do better.
At the same time, I can say with complete candour there that those cross-ministry recommendations — or the bigger, broader ones — are often most difficult to approach. They tend to be dense. They’re multipartied and usually involve levels of government beyond ministries or even the province, often involving First Nations leadership and governance and often involving engagement with the federal government.
Those recommendations, in my mind, tend to be those that we may share an interest in pursuing and move in a means by which we engage with other levels of government in order to approach the recommendation. But they often take some more time and more steps than one might contemplate in reviewing the original wording in the recommendation. They’re simply harder to do.
Do you have anything else, Chris, to add?
C. Welch: I would just add that, as Mark has clearly pointed out, some of these require an extensive amount of collaboration across large ministries. Sometimes that collaboration is a little easier to find when folks can see what’s being recommended relating to their current and ongoing work. It’s about finding those linkages and connections.
As I talked about before, Still Waiting is a good example. Despite all the time it’s taken to get to this point, Health has been able to look at their Healthy Minds, Healthy People work plan going forward and take the recommendations in Still Waiting and help them augment the work plan that they’re working on for that initiative. Similarly for MCFD, we’re then able to come in and, through some of our initiatives, support their initiatives.
That kind of collaboration is a little complicated. When it takes place earlier in the process, like when recommendations are being contemplated, it tends to work a little better, in my experience. An example of that is the most recent report that was released in December, the Who Cares? report that was on our residential care system for kids with complex needs.
While not an extensive external collaboration, certainly with the RCY, they invited us to come in and talk about the recommendations. We said: “Look, we’ve got a number of things going on around complex care, and can we weave those in to your recommendations? We’ve got some interest in moving forward on quality assurance in the area of our residential resources. Is there a way we can weave some of that into your recommendations?”
Of the three recommendations in that report directed to MCFD, two of them are basically our own in there, and that’s an example of the collaboration that we’re now seeing. I think if we can continue to develop recommendations in that vein, it will be very productive for all of us, and things will move forward. But when recommendations come out of the blue and require extensive work, it’s going to be a lot more challenging, slower and more arduous.
Of course, as MCFD, the RCY for sure is our oversight, and we’re motivated to work very collaboratively and to try and respond to these as effectively as we can, but even we have a challenge keeping step.
M. Stilwell: Thanks, Chris, for those last comments. I think they were very helpful. I was just going to say that of course we appreciate the oversight function and the importance of, first of all, the reports for describing systemic issues within the system. At the same time, tracking these recommendations and then having a column with “closed” takes up, it sounds like, a huge, huge amount of time, but what we really want to hear is that the outcome has changed.
Obviously, as an outsider and not hugely informed, it just sounds like there’s a huge amount of time and effort going into tracking recommendations, which is really just a framework to report that fewer children are dying in care and that children in care are completing high school at the same rate as the rest of the population. It’s really only a beginning.
I just wanted to comment on that and on seeing that some status is closed, when really it can only ever be closed once the results have improved and there is a robust quality assurance program to maintain that. I will be interested, later in the presentation, in your strategic plan for the quality assurance improvement program and how this status — whether it’s underway, TBD or closed — relates to the reality of the outcome.
M. Sieben: I might respond quickly that on the whole we would agree. That’s part of the reason why I have a fair amount of focus on the continuing evolution of our performance management report.
It’s important for MCFD, I think, to invest the energy and the resources to do justice to the recommendations and to be able to track them, report back on what our performance has been and then compare notes with the representative’s office as to the extent to which they share that view or whether it’s a little bit short of what we feel. Then we discuss those gap areas. At the end of the day, that’s the summary of the presentation that Chris has given.
The real proof, the real work is on our continued pursuit of being able to provide output data and performance measurement data across all six lines of our business and do that in a way that makes it accessible for the representative, as well as for many others — to be able to take stock relating to our performance in providing services to children and families in B.C. I think we have to do both.
D. Plecas: Mark, I wanted to congratulate you on what I think is an excellent system for responding and monitoring. I think it’s awesome.
[ Page 382 ]
One of the things I note, just going through the numbers here and doing some quick math, is that it looks like you have basically completed 75 percent of the recommendations. And if you include those underway, we’re pushing 90 percent of the recommendations. If they’re not completed, they’re en route to being completed. That’s if you take it to the end of 2013, which is understandable because they just came out last year. The reports that came out in ‘14, obviously, you haven’t had a chance to fully respond to yet.
How long does it take to get to the place where you can say, “We’re completing on the recommendations of a report,” generally?
M. Sieben: I’ll get Chris to respond to the question in the aggregate. I would say in regard to the individual recommendations, it varies quite a bit. There’s the odd occasion where MCFD, as Chris has noted, already had instigated some initiative that was related to the matter.
Keep in mind that by design, through the enabling legislation for the representative’s office, their opportunity to engage, particularly in the case-related reviews, happens sometimes after the actual events and almost always after the director has at least considered, if not developed, their own review and recommendations.
We would often have already started some work in an area that requires some prominent action, so on a recommendation-by-recommendation basis, it varies. They can be done already, or we could get to it relatively quickly sometimes. Others take more time. It ends up being the collective of the recommendation that we work through the process.
It takes a few weeks after the original release of a report for us to come together with the representative’s office to discuss — so that period between when we might have been involved in the initial recommendation development meeting to what the actual recommendation says — and to make sure we understand the detail, where much of the meaning actually resides. And then work ensues.
I’ll look to Chris to inform us if there’s a sort of median or normal period of time between when the report comes out and when we end up filing what we believe are the action plans where we’ve done most of the work — keeping in mind, as Chris has said, that this process was originally agreed to just in 2011. We’ve looked to tweak it a little bit over the course of this past year.
C. Welch: I think when we come back the next time, we’ll probably be better positioned to answer the question of how long it takes. We’re still fairly fresh and coming off of Not Fully Invested. As I said, most of our sense of closure has come out of our read of that report.
In the process that we have with the RCY, we send a close-out report. As I said, we’ve sent 17 of those, and five have been confirmed closed. Most of those were fairly quick closures. For example, there was a special report that they did in 2010 which related to reporting of critical injuries and deaths.
There was one very straightforward recommendation, and it was completely in our wheelhouse, as Mark said. It was something we had immediate control over, and we were able to close it very quickly. Unfortunately, that’s not always the case. Oftentimes the reports contain more complicated recommendations.
We don’t have a great sense of the time. I think when the initial process was born, the 2011 standardized agreement that Stephen Brown reached with the representative…. I think it was anticipated at that time that a report would take anywhere from one to three years — 12 to 36 months — to reach closure. I think we need to maybe revisit that sense a little bit, as we continue to develop the recommendation development and response processes.
D. Plecas: One other question. Again, when I go through the numbers here and look at what I would say is how well you’ve done in terms of responding to the recommendations, there’s one report which stands out back from 2008 on addressing special needs of children. It’s interesting that, in that one, of the 12 recommendations, eight are not underway yet. It seems to be out of sync with a later report out of 2011 addressing special needs again, where you basically addressed almost all of the recommendations.
What am I missing there?
C. Welch: We had sent a close-out report on the 2008 report to the RCY, thinking we were done. We were awaiting response to that close-out report when the Not Fully Invested report came out. Through that, we saw what they considered to be incomplete work — some very specific things, some of it around performance measurement and a lot of tie-ins to other related initiatives. So we’ve taken a second look, and that report, if you will, is reopened in our mind. We had considered it for some time to be closed but, in light of the Not Fully Invested report, now consider it reopened.
Now we’re looking to…. We have a strategic initiative in our strategic plan around children and youth with special needs. We think that some of the pieces that remain outstanding from that 2008 report are covered off, but we’re not fully confident in the firmness of those linkages yet. So we have some work to do around those outstanding 12.
I hope that helps illustrate a little bit how that one has popped open again.
M. Sieben: I might add just quickly that part of the challenge with CYSN is that…. Some members on both sides — given their experience with us, either in estimates or as ministers — might recall that there are data
[ Page 383 ]
challenges on CYSN. Most of the services come through community agencies, and we do our best to work with them to get the information we need in order to let people know how we’re doing. Part of, I think, the challenge with meeting some of the appropriate guidance that might come to the representative’s office is the data challenges associated with that area.
M. Karagianis: I’m just curious. While it’s interesting to kind of see a checklist of items that have been either addressed or not addressed, can you just briefly, perhaps, address the repercussions that these changes or addressing some of these recommendations are causing throughout the ministry? Are there costs associated with this? Are there personnel changes or additions? Are there staffing issues that have changed or had to be augmented as a result of these?
Certainly, for every recommendation that you actually carry forward or deal with, it’s not just the children’s rep’s report. There are actually repercussions within the ministry. I’d be interested to know what those are and whether that’s part of the reporting process — to say, you know, here are the actual fiscal repercussions, or the staffing or personnel repercussions, that this has involved as well. I think that would be interesting to know.
M. Sieben: The member beside you noted earlier in the session that there was almost an expectation that MCFD would probably have a mix of responses to the recommendations — whether we completely agree, partially agree or choose not to agree and want to do something else. My belief is that it isn’t necessarily MCFD’s responsibility to respond fully to all the recommendations in the absolute — in their individual, absolute existence.
Part of my challenge as deputy minister, supported by staff and, certainly, with direction from Minister Cadieux, is how best to approach those recommendations within the direction that comes to us from government, within the strategic priorities that we might set within our strategic plan, as outlined in our service plan, and certainly within the budget allocation that’s available to us to, first of all, provide services to children and families and then make improvements on that existing system.
I make a point of never underestimating the amount of energy and the amount of focus that necessarily has to go on just keeping the wheels running in this place at an office level. The opportunity for improvement, which is constant and needs to be pursued with vigour, in my mind still needs to be focused and approached in a way that allows the greatest deal of success.
I would rather do fewer things well than try to do more things less successfully. Sometimes that means choices. More often it means choices just in terms of time frame or how best we might link a recommendation, as Chris has said, to some existing initiative. Again, as I think we’ve tried to demonstrate, we likely have some awareness of where greater improvement could be and have already instigated some work, and then we look to sort of overlay and inform our work with that recommendation.
There are times when there’s been a relatively clear-cut response that might result in a specific allocation of resources or staffing. For example, in one of the recommendations from the representative’s report that spoke to the need to ensure adequate capacity to provide a set of resources for behavioural-challenged youth, the ministry’s response was to develop a complex care unit.
It took longer than what was recommended in the representative’s report for us to be able to do that, but we’ve done it. We’ve got kids now receiving care through that unit. We could likely identify what the exact resource allocation was, what the cost was and what staffing complement that is. In many of the areas it’s more subtle. It’s how we simply begin to shape what we’re doing already or reemphasize the direction that we’ve given to staff in a way that is consistent with the recommendation.
Again, in my mind, I do not think that it would be a successful child welfare system if it was based solely on the implementation of recommendations in and of themselves.
There is a need to lay those recommendations against the foundation of, certainly, our six service lines, some sense of strategic direction from the government of the day and then some development of a strategic approach through internal MCFD planning and as outlined in our service plan.
J. Thornthwaite (Chair): I have a question — actually, a couple. One of them is that last week, when the rep was here, she was talking about the budget item, through the ministry, of an increase in staff. There were 500-some FTEs or whatever. I was just wondering if you could clarify that and where those people are going — if that decision has been made.
Then the other question I have. I think it was Chris that was talking about this response to Still Waiting, which encompassed a recommendation for Health and MCFD. You’re in the process right now of working together with an action plan. I think the term that you used was “hybrid” action plan. I was wondering if you could give us a little bit more information on that and when we would see something on that.
M. Sieben: I think the number Mary Ellen was referencing relates to some combination of staffing figures from a number of years ago compared to now, but it’s one of the inquiries that we’re going to have to make. I saw the same number in Hansard.
What I can tell you, certainly, is what we’re doing on the child welfare side. As Minister Cadieux indicated back in November, our commitment has been to bring
[ Page 384 ]
on what amounts to another 150-ish staff, in addition to the 60 to 70 staff that were engaged in providing support for front-line people, providing input into ICM, for a total of around 200 — and that being done in conjunction with a service redesign. So it’s not simply adding to what we have. It’s growing in a way that is consistent with re-identifying some of the common paths by which we provide services.
Many decades ago I, too, was a child protection social worker, and some of our business processes, frankly, would not be unfamiliar to me if I stepped back and had the courage to go back and try to do what I did when I was much younger. There is work that we can do in order to improve some of our processes and assist some of our staff to focus on what they want to do, which is spend more time with children and families. Our hiring initiative is geared towards also supporting that — sort of recrafting some of the business processes. That amounts to a couple of hundred staff.
I’d foresee us being able to grow a little bit at the provincial office in order to support some of the initiatives that MCFD has been asked to shoulder, particularly in the early-years area, where Minister Cadieux has a healthy set of accountabilities and where MCFD has received a lift in its budget.
That’s what we’re planning. I can also inform the committee that we hire constantly. It’s not a matter of us not hiring. We’re currently doing work with PSA in order to find ways for us to utilize their centralized hiring system, which all ministries necessarily have to use, so that we can mitigate the recruitment lag between when we have a vacancy and when we can get a posting up and when we can actually recruit and hire a qualified candidate into that role. Right now, in our mind, it takes too long, so we’re looking to minimize that.
From MCFD’s perspective, that’s a quick outline of where we are on that.
C. Welch: To respond to the specific question regarding the response to the Still Waiting report. That report came out in April of 2009. The Ministry of Health is the lead in the response to the recommendations, with the Ministry of Children and Family Development supporting.
As I said before, there had been some recent correspondence from the assistant deputy minister of health services policy and quality assurance in the Ministry of Health, as well as the provincial director of child welfare in MCFD, to the representative, indicating that in early 2015 the representative would be receiving two documents that, together, would outline the approach to responding to the recommendations in the Still Waiting report.
One of those documents will be an MCFD action plan that will outline the deliverables that MCFD is undertaking in response. The Ministry of Health will be attaching to that their workplan for Healthy Minds, Healthy People, the 2014-2017 update.
The collaboration that’s been required as a result of the implications in the Still Waiting report and recommendations has been extensive, and a lot of work has been underway. We informed the representative that recently the two ministries facilitated an extensive engagement process in which five engagement workshops were held, with approximately 120 participants from a wide range of sectors, systems and interests, including staff from the representative. The workshops were designed to identify approaches and strategies utilized in Healthy Minds, Healthy People that have been most effective, in order to inform the go-forward.
As they work on finalizing their updated workplan, we’ll be attaching our action plan, which contains a number of initiatives that you may already be familiar with. Mark spoke to the complex care initiative, expanding the new child and youth mental health intake clinic model to all child and youth mental health offices, a clinical resource toolkit for CYMH, an e-services map.
There are a number of initiatives underway there that will be outlined in that workplan that dovetail with the work going on in Healthy Minds, Healthy People, hopefully presenting — perhaps a better word than “hybrid” would be “joint” — a joint response on the part of the two ministries to that report and its recommendations.
D. Barnett: I just have a question. I keep hearing you talking about centralization. Could you explain to me how this centralization is of benefit to our small rural communities that need service delivered, sometimes, in a different manner? I don’t hear anything about regionalization or rural communities. I just hear about everything as centralized.
M. Sieben: MCFD is a particularly decentralized organization in comparison to other ministries, parts of government. Certainly, my experience is that the deputy minister at what is now the Ministry of Social Development and Social Innovation sort of reinforces that context.
There is a great amount of discretion in decision-making at the local or regional level — what are now our service delivery areas. MCFD allocates its budgets so that almost all of its budget, other than in a few provincial service areas such as child care subsidy and autism, goes out to the individual SDAs, and the supervision and direction for all of that work occurs at a local level.
Fundamentally, if you’re talking about any of the community-based service delivery programs — that’s child and youth mental health, youth justice and child welfare — the critical relationship in terms of response in almost all cases is that relationship between individual social
[ Page 385 ]
worker, clinician or probation officer and team leader. That’s where the action is. That’s what we look to support.
At the same time, I think it is incumbent and important for MCFD as a provincial ministry to have a strong capacity to be able to read almost everything that happens in a local area and be able to roll that up and give advice where we think there is advice to be given.
The reference specifically to the hiring component is something that I’d looked to inform MCFD staff about because, particularly at a community-based level, appropriately, their attachment is to their communities. MCFD isn’t a vacuum in and of itself. We’re part of a larger corporate entity of the public service.
All ministries hire through the Public Service Agency. Therefore, we’re obliged to work with the PSA, as all other ministries are, in terms of recruiting the staff that we need, through all of our program areas, in order to provide services. The reference that I made to the centralized hiring was simply that the hiring has to occur through PSA, not only for ourselves but for all ministries.
J. Thornthwaite (Chair): Seeing no more questions or comments, thank you, Mark. I think that’s a good segue right into your next presentation on quality assurance.
M. Sieben: Great. We’ll get set up here. I may request the assistance of the Clerk in order to set up the next presentation.
J. Thornthwaite (Chair): Why don’t we have a five-minute break for coffee.
The committee recessed from 9:22 a.m. to 9:30 a.m.
[J. Thornthwaite in the chair.]
J. Thornthwaite (Chair): Alex and Janice, would you like to begin, then?
A. Scheiber: Thank you, members, for the opportunity to present to you on quality assurance today. My name is Alex Scheiber. I’m deputy director of child welfare. This is Janice Chow. Janice was introduced before as director of quality assurance.
It was a real scoop for me to have been able to recruit Janice from the Representative for Children and Youth. I’m still working on getting more of their staff over to join me, but we’ll get there.
I’m going to present to you a number of slides on quality assurance. Janice is going to talk to you about a new and exciting project that we have, which is the next level of quality assurance — aggregate analysis.
When I started with the provincial director of child welfare back in 2011 — that was when Doug Hughes had just started — we were just on the threshold of reintroducing our quality assurance program. At that time, we had completed very few case reviews and not a lot of audits. We really had a lot of work to do on the whole area of quality assurance.
Three years later I’m proud to say we have — and I’ll talk about this in a few minutes — a group of about 56 practice analysts and complaint specialists working across the province in about 13 communities and every service delivery area, working hard every day on analyzing the work that we do in child welfare and across all six service streams.
The quality assurance program gets its authority through the Child, Family and Community Service Act as well as the administrative authority through our ministry to conduct quality assurance activities in every program and every service that we provide in the ministry — all the way from child safety, early childhood development and child care, child and youth mental health, children and youth with special needs, and youth justice to adoption.
Because child safety is arguably one of our most vulnerable areas, we concentrated for the first couple of years, from 2011, on getting those quality assurance systems up and running in the area of child welfare. Latterly, we’ve been concentrating on expanding our quality assurance program to the other service streams. We’re now getting into adoption, and we’re poised to expand our quality assurance program into youth justice and CYMH.
We’ve worked hard to build a foundation, but we know we have a lot of work to do to get there.
There are essentially five programs within the quality assurance program. There are reviews of reportable circumstances. In the ministry we get a little over 1,000 reportable circumstance reports every year. These are reports where children known to the ministry or in care have died, have sustained critical injuries, or there are serious incidents, like reports or allegations of children being abused in foster homes, and so forth.
This is a mechanism for notifying the director as well as other senior management in the ministry about these serious incidents, so that we can…. Not only is it the first step for conducting a case review, but it also gives us the opportunity to insert ourselves into that process and provide some practice direction and support if that’s required.
The second process is case reviews. We conduct, on average, about 20 to 25 case reviews a year. There are two types. There are file reviews and comprehensive reviews. These are an in-depth analysis of a case — the key decisions, actions and assessments that have been made by social workers and other practitioners in the ministry regarding a tragic event or a serious incident that has happened to a child.
When we get a reportable circumstance on a fatality, for example — a child who is known to the ministry or in care has died — then we look at the initial circumstances and we determine whether a case review is required.
If so, we conduct one of those types of case reviews. Those take about three to eight months, depending on the type of review that we’re doing. Some of them are looking at files only. Some of them involve interviews of staff and others involved in the case.
They result in an action plan. We used to call them recommendations. Now we call them action plans. Each case review has an action plan that we track and monitor through the implementation process.
Practice audits are an area that we’ve worked hard on in the last few years to have a much more robust system. We conduct practice audits at the service delivery area level. We look at files, again, from practitioners to assess the degree to which social workers and others have met the practice standards that we have in place.
Each audit report includes compliance rates along a number of critical measures as well as an action plan. We can get a very good picture of the degree to which our staff in the ministry are meeting our standards and where we have room for improvement. There are always improvements identified in practice audits. That’s why we have these action plans — to address the concerns in the audits.
The complaints process is another area that we’ve expanded since 2013. There are two streams in the complaints process.
There’s the complaints resolution stream. When a client has a complaint about the service they’ve received, then they go to a complaints specialist. We have 11 of these across the province. There are two options that the client has. They can go through a complaints resolution process, which is more of a mediated, facilitated process for trying to reach resolution.
The other is a process that we introduced in 2013 called an administrative review. That’s where an uninvolved senior manager reviews the case and makes a report based on the concerns expressed by the client. So the issues are sorted out, and there are findings about whether the complaint is founded or not. These reviews also result in recommendations, which we track and monitor as well. That’s a process that we now have up and running — the administrative review process.
Finally, there’s the accreditation process. I have a small team of accreditation analysts in my branch who essentially work with all of the service providers that we fund over $500,000 to ensure that they go through the accreditation process through either the Council on Accreditation or CARF.
We have 200-plus agencies and organizations across B.C. that we have contracts with of over $500,000. We work with those agencies through the accreditation cycle. Again, that’s a quality assurance process because those two accreditation bodies look at these agencies, which do a lot of our work in the ministry and provide a lot of service, to ensure that they are meeting basic standards as well.
As I said earlier, in 2012 and ’13 we reinstated the practice audit program. I’m happy to say, proud to say, that while we’re still working on it and we still have a way to go, we’ve actually got a really robust methodology now. We have sampling strategies that are quite scientific, to make sure that the data we’re getting is a true picture of practice out there. We’ve also expanded the types of audits we do. Traditionally, the ministry has conducted audits — you know, child safety audits and child service audits, which are guardianship audits.
Now we’re expanding to include our resource workers and the work that they do with foster parents, and the adoption program as well. We have our sights set on auditing, for the first time, CYMH and youth justice as well.
We have practice analysts that specialize in all of those different areas. We’re, as I said before, poised to expand this, but we want to take it one step at a time and make sure that we’ve got really sound, robust methodologies before we expand this.
I mentioned before the complaints program has also been expanded. Our reportable circumstances policy has also been recently updated and clarified. I mentioned before those 1,000 or so reportable circumstances reports we get every year that are generated by our front-line staff.
It hasn’t always been clear in the past what all of the circumstances are that require a reportable circumstance report. We’re just in the process of implementing new policies, and we’re actually out there orienting staff this month on them. The guidelines we have clarify the circumstances that require a report. As well, we’ve included those templates in our ICM system, so we have an easier procedure for staff to generate these reports.
The last area is improved public reporting of quality assurance activities. I’m going to hand this over to Janice, but I’d just like to say that although we’ve done a lot of work in quality assurance, we still have streams of programs. We have our audits that generate those action plans, the case reviews and so forth. What we need to do now is take it to the next level, which is to aggregate all of that information and really look at the big picture and what it’s saying about our practice on the front line, our policies, our procurement procedures, our operations and so forth.
I’ve brought in Janice from the RCY, as I said before, on a 12-month temporary assignment — it may be longer than that; we’ll see how things go — to help us take the quality assurance program to the next level, which is to really aggregate our analysis of our quality assurance activities. Really, until we close that loop, we’re just producing action plans that are in isolation, and we really don’t know what they say about the big picture for the ministry. So it’s really important that we close that quality assurance loop.
J. Chow: Good morning. Just to further build and strengthen the ministry’s quality assurance program, the ministry has identified three areas of work as part of its three-year strategic plan. Those are — again, as Alex mentioned — one, to complete an aggregate analysis of findings from quality assurance activities and report on themes and trends; second, to develop a quality improvement working group to inform practice in the ministry; and, third, to collate and support the implementation of action plans and recommendations from quality assurance activities.
In order to really meet the goals of the ministry’s strategic plan, it is really critical to implement a structure within our quality assurance branch to conduct and carry out a quality assurance monitoring and improvement function. This function will further build and strengthen the ministry’s QA programs. It will enable the ministry to aggregate, analyze, evaluate and monitor quality assurance data and information from the work that we do and to identify key practice strengths, issues and concerns and areas of improvement for policy, practice and standards.
I believe this approach is very well aligned with the Hughes review and comments and reports from the representative’s office. This aggregate analysis work will require an extensive examination across all of our service lines.
I think what’s really critical to being able to do this work is to take a look at what consistent practice issues we are actually seeing across case reviews, practice audits, our admin reviews — being able to take a look at those reports and say: “What observations or learnings are we seeing, and how do we use that information to inform our policies, our standards and our service delivery?”
I think this work is also critical for seeing whether there are any improvements in how we deliver services to children and families in the province and for identifying the areas of practice that the ministry will need to prioritize.
This work and this plan are really necessary to build and strengthen our capacity within quality assurance to publicly report on them and to use that information for improvement in practice and outcomes. I think that for a lot of the aggregate work, we’re trying to use that information to inform a quality assurance and improvement group.
We’re excited to develop this working group, hopefully by early summer. It will be responsible for leading an organization-wide culture of continuous quality improvement. The aggregate analysis work and the reporting work will inform this committee so that it’s able to advise executive on which areas to prioritize and what areas we need to improve.
I’m very excited to support this work for the ministry, to be able to take a look at all of the quality assurance work — so from case review, from audit, from ministry reviews — and really say: “What are we learning from this work? How do we take a look across this work and identify some key practice themes that we’re seeing? How do we sort of strategize on some of the practice improvements in the province?”
I think that this work will be well aligned with some of the performance measurement and outcomes work that the ministry has already set that Mark had mentioned earlier.
M. Sieben: I get the last line.
We referenced the MCFD performance management report a number of times over the course of the morning. Since 2012-13 the ministry has publicly reported on its performance four times through this format. Each time the report has evolved, gotten a little bit deeper and been taken to the next step.
These semi-annual reports are designed to foster improvements in our approach to monitoring and client outcomes and service delivery while also being accountable to the public for the resources that MCFD spends. These reports are posted on the MCFD websites so that they’re accessible to everybody.
A couple of weeks ago it was commented on by one of the columnists, who was able to discover it and spent his column referencing the extent of the data. We were pleased to see that it was, in fact, being found and beginning to be talked about.
Over the past two years these reports have been broadened to reflect the ministry’s overall mandate, not just focusing on the child welfare area but reporting across all six of our service lines. For example, for many performance indicators, data for aboriginal children, youth and families is now explicitly set out across the service lines. The representative had made this recommendation in her report When Talk Trumped Service. It was something that we agreed necessarily needed to be pursued.
Our most recent report includes much greater discussion on trends and performance and the well-being of children and youth in care. We’re wholly open to looking to be increasingly accountable relating to the experience of the young people who are in the care of the director in MCFD.
Martin Wright — who, as I’ve noted, is our executive director and chief information officer — is the lead for this continuing evolution of work. Given the recent posting of this fourth iteration of the report, should the committee be interested, I’m sure Martin would be pleased to come and spend a chunk of time with you to run through how it’s set up, what’s changed from past reports to this one and what the plans are for the future.
D. Plecas: Thanks very much for that.
Once again, this sounds very impressive. When I think of evaluation, I think of the first part of that being “provide evidence that you’re doing what you say you’re doing.” It sounds like this is what this is all about.
Secondly, it’s about…. Once we’ve established that, does it somehow result in moving people from point A to point B? Are you seeing some improvement in individuals?
Thirdly is the question of: “So what?” What is the outcome of all of that? What difference does it make?
I just wonder to what extent the work that you’re doing is geared to something more than an assessment and consideration of processing, individual processes. How much of it is focused on the whole matter of the efficacy of what you’re doing with clients? Does that make sense?
A. Scheiber: Thank you for the question. It’s a good question. One of the things that we’ve been focusing on in our quality assurance program at this point, as I’ve said before, is compliance with standards. That’s one measure. It’s a proxy for practice. It doesn’t necessarily tell us — and I think this is partly what your question is — about the quality of service that we’re providing or the outcomes or whether our outcomes ultimately are good or not.
That’s part of the reason why we’re taking this quality assurance program to this next level, because we have a lot of data right now that tells us about how well practitioners are meeting standards. We have a lot of data that tells us about their assessments and decisions and plans in the wake of tragic events. But we don’t have a lot of data yet, at an aggregate level, that informs the ministry about changing policies, changing operations, changing our staffing, all of that.
That’s what we need to do now. We need to take all that data, pull it together, look at the themes. What’s this telling us about front-line practitioners? What is it telling us about staffing levels? What’s it telling us about our policies? We think we have a number of hypotheses about why we’re seeing certain areas of concern in the ministry, but until we test those hypotheses, then we don’t really know for sure.
I don’t know, Janice, if you had any further comments about the sort of aggregate approach that we’re taking.
J. Chow: Just to add to Alex’s comments, I think, certainly, what we want to be able to do is take a look at policies that were implemented or practice directives. A lot of this work is, again, continued out of the representative’s reports and is really taking a look at: how effective have they been as we implement them? Did we actually meet the intent of that particular piece of work?
Doing the aggregate is being able to take a look at that work. I think what we’re probably seeing is some consistent themes in some of the findings across all of our quality assurance work. Now it’s about taking that information and how we connect that to the other areas within the ministry and other service lines, whether it’s connected to finance or with corporate or service delivery, practice, etc. It’s about being able to connect and share that information in a way that people understand the relevance of quality assurance activities, how that informs the business and strategic direction of the ministry.
D. Plecas: Just a follow-up. I just want to commend you again on that, because I think you could search the world over, looking at organizations, and the one thing they fail to do in evaluation is exactly what you’re doing as a first step. But again, of course, it doesn’t mean a whole lot if we don’t have the other steps. Presumably that is, as you indicated, where you’re going next, right?
M. Sieben: I’d add, just as a closer on the strong point that Janice made, that we absolutely agree with it and are looking to bring the program-based quality assurance reporting up to a deeper and more aggregate level.
People, of course, don’t live their lives in ministry silos, and that’s certainly the case with those who receive service or children who are in the care of the ministry. It’s difficult to bring the cross-ministry aggregate data together. We have some capacity to do that through information-sharing agreements with other ministries.
But also, the representative’s office has a provision in their statute which is broad and doesn’t really exist in any other jurisdiction in the country, which allows the representative to seek information from the Ministry of Health and the Ministry of Education or other public bodies and to data-match that to information that MCFD has. As a result, we have the benefit of some pieces of work over the course of the last nine years or thereabouts relating to the outcomes for kids in care and kids generally in B.C. that doesn’t really exist elsewhere.
Our work necessarily takes into account the learnings that come from those descriptions of outcomes of not only MCFD programs but other ministry programs and young people’s experience here in B.C.
M. Stilwell: I wanted to follow up on Darryl’s questions and comments.
I wonder, Janice, can you familiarize us with what the quality assurance and improvement working group will look like, based on your plans now? For example, talking about a culture, I am assuming that people from Education and Health will be on that committee.
The second question relates to external professional bodies. I assume they exist and have standards and recommendations. I’m interested in the subdivision of them, kind of along what Darryl was talking about, in that they’re obviously business-process quality assurance standards, which is important to support the real work of what you’re doing.
At the same time, I’m interested if there are content specialists who work in the quality area, specifically around vulnerable children, children in care. How will that inform your group, your work?
J. Chow: I think for the quality assurance quality improvement working group committee, we’re still in the early stages of really defining what we see as the role and function of that committee.
Certainly, I think one of the things that was really important was ensuring that we have broad representation on that committee — those who have expertise in informing the ministry about what is meant by continuous quality improvement, in understanding the information, in combing through the aggregate work, and in seeing how the application of that is actually working in the system.
I do think that, potentially, there may be some external advisers who may be invited to inform the working group. I think we’ve also talked about perhaps including some expertise from the front line to get the practice perspective lens as well.
Certainly, with a lot of the quality assurance activity in the reports and information we produce, we need to ensure that it is meaningful and understandable for folks who are part of the system, so they understand what we mean by prioritizing practice improvement.
In terms of experts in the quality assurance branch, I think Alex alluded to that too: all the different practice analysts and our directors of practice and executives, who have a lot of practice and service delivery expertise, inform the work of the QA group as well. I think that’s most likely a requirement to ensure the functionality of the working group committee.
M. Stilwell: I have a second question. Do you want me to wait till the end or ask it now?
J. Thornthwaite (Chair): Is it related to this one?
M. Stilwell: Yes, I can do that.
What’s interesting about all of this so far, including the reports from the representative, is its retrospective probing of data, which, you know, can’t really get you where you want to go.
I was interested, Alex, in your comments. I think you’re alluding to more prospective data-gathering, in that you have hypotheses that you want to test.
That’s interesting, obviously, because to get the real information out of it, you need controlled double-blind studies, which you can’t really do with children in care over long terms. I’m interested in what you said about hypotheses and aggregate data and the quality assurance not always being retrospective, which doesn’t really tell you where your system is at or going. If that makes sense.
A. Scheiber: I’m on thin ice. I’m talking with a doctor about this, but I’ll give this a try.
Yes, you’re right. Most of our analysis or data is retrospective — all of our case reviews, our audits and so forth.
What we need to do now is to gather that information and look at our policies and practices going forward. To this point, what we’ve done, more or less, is we’ve developed action plans that will address specific cases and specific audits. We haven’t really looked at systemic or organizational, big-picture plans going forward. Really, the purpose of this is to look…. It’s very forward-looking.
We really haven’t had, I don’t think — for many years, anyway — an organized way in the ministry of taking that data and working with people across the ministry, as well as some of our service providers. I’ll also include the delegated agencies in that, because they’re important. Our quality assurance programs cover those agencies as well. But an organized way of getting those folks together and looking at this data and making some good, forward-looking plans….
I don’t know, Mark, if you had any….
M. Sieben: This is an area that, again, Martin would really welcome an opportunity to speak to the committee about.
To sum it up quickly, it’s one thing to produce the reports and come up with the data, but then what do you do with it? A part of the challenge is making the data accessible and meaningful both to decision-makers at a policy level as well as practitioners at a field or case-related level too. That’s a part of what Martin is looking to continue to develop, in addition to the report — how to put it in the hands of our leaders at each level of the organization to be able to make, then, informed decisions and choices and how to move a variable or two.
For example, in conjunction with the report relating to educational outcomes for kids in care, work was done cross-ministry in order to develop a greater engagement at a school level for schools that had a higher number of kids in care and, basically, that there’d be some amount of monitoring and paying attention beyond the individual IEPs. We’ve seen, as a result, not great leaps and bounds but some marked improvement for educational outcomes, particularly for aboriginal kids in care, which Martin could speak to in the report.
That’s a small example of the type of thing that we’re trying to get to, where we can take the information, make it accessible to our staff and have them consider how best to make a change either in the practice of their office, or they might want to re-utilize a contract.
Something that is an interesting exercise to go through at a leader-of-service-delivery-area level or within CSMs is bringing a set of experiences or data about a particular type of business — something like rates of response to child welfare concerns through family development response or rates of removal. As you’ll see in the data tables, they vary somewhat from SDA to SDA.
That isn’t necessarily wrong, but it certainly is useful to be able to bring the leads in SDAs together and to point out the variances that, to some extent, exist from neighbouring SDA to neighbouring SDA and say: “You know, that’s interesting. Let’s talk about that.” There could be very good reasons that are going on demographically or in terms of specific economic or social challenges in one community or the other. Or it could simply be practice and direction that happens at a team-leader level.
The challenge is how to get that information in the hands of the actual doers in the system in a way that is accessible and in a way that allows them to incorporate it into all of the other different things that we ask them to do for us each day.
D. Donaldson (Deputy Chair): I have a couple of questions, but I’ll just pose one and then come around to the others.
First of all, congratulations on the performance management report, at least the newest one that was released in January. I congratulate you on adding more data and more breakdown of data. I also hope that as you continue to do that, we won’t lose the continuity of how the data is analyzed and presented between reports, just so that we can actually make a valid comparison between apples and apples and oranges and oranges over the years. The long-form census comes to mind, where we aren’t able to make comparisons anymore.
There’s something that’s niggling me here. It’s around the quality assurance monitoring improvement. I’d like your comments on it. I believe that the experts are the families and the children. They are the clients. I don’t like the word “clients,” because it implies a one-way transaction. They are the people who provide feedback as well as receive the services. I don’t know who could be the better experts at determining quality than the people that we’re supposed to serve — not experts from within the ministry, not experts from outside the ministry but the actual families and children who are the experts. They’re living the day-to-day experience.
I’m interested in knowing how the actual families — not organizations that represent them but the actual families who’ve had the experiences — are going to be invited into a working group or invited into determining quality assurance and monitoring and improvement.
A. Scheiber: Well, I’ll just say this.
And maybe you want to add to this, Mark.
First of all, I think it’s important to realize that a lot of the data that we have through our quality assurance programs is actually the voice of the clients. For better or for worse, we receive over a thousand complaints every year. Every one of those clients has something to say about the service that they’ve received from the ministry.
That information is organized and reproduced in our data: what they say about the service, their experiences and whether they go on to an administrative review. There’s quite a lot of information gathered about that, and we report out on it. What we haven’t done yet, as we said before, is look at the big picture.
We sometimes interview clients, children and families, when we conduct case reviews as well. So there is that voice.
I take your point, though, that the quality assurance and improvement working group — although it’s still in the developmental stages, and we haven’t yet defined the terms of reference — doesn’t have direct involvement from children and families. I think that’s something we plan to look at over the long term once we get up and running. We do need the people who receive services from us to be able to look at this data as well and comment on it at an aggregate level. I think it’s a good point.
M. Sieben: I would add just a little bit to that. What my mind goes to is my experience at my previous ministry, Social Development. Our experience was that there wasn’t a whole lot of feedback or engagement directly with the service recipients, and there was almost an assumption about what their response was going to be.
Instead, we chose to actually ask them about what their experience was like in those offices and then also asked them to respond to inquiries relating to a change in approach. It didn’t nearly measure up with what people would assume were going to be some of the negative comments back. I’d acknowledge that.
I think there is more that we can do in addition to what Alex has noted. I’m not averse at all to finding some means by which to sample specifically some of the experiences of recipients of MCFD services.
I would note that — in addition to a couple of the points Alex noted — we have a longstanding relationship with the Youth in Care Network. And Cory Heavener has plans for developing a youth advisory committee that we hope to get off the ground over the course of this next fiscal year.
Much of our funding and services, beyond the three dominant services MCFD directly provides — child and youth mental health, child welfare and youth justice — go through community agencies. Most of those community agencies have some feedback loop at a community or agency-specific level. We do look to accumulate the collective feedback that comes back from that through groups such as BCACDI.
But I would acknowledge that there’s more we can do for direct feedback from our service base.
C. James: Just a couple of questions. I think one, Mark, you’ve spoken to. That is the issue of what you do with the information once it’s gathered, what happens with the data. I would agree, I appreciate the gathering of more data. I think it’s helpful. I think it will, again, help with transparency and hopefully help with trust and help with building relationships with families, as my colleague has mentioned.
I think it would be very helpful to have Martin come and give a description and go through the process. That’s certainly one of my questions. Once the data is up there, what happens with it? How does it get back to the practitioners? How do offices and aboriginal agencies and others actually utilize that data, rather than just saying, “Oh, this is interesting; we’ve now seen this information”? How does it become practice? I think that’s a key issue that needs to be looked at.
I also wondered about how you determine what gets measured. Will that be the working group? Will the working group take a look and say: “This needs to be expanded,” or “This is an area we feel in quality assurance would be an interesting piece of data for us to be able to gather”? How is that determination going to be made? I think that’s a critical piece in all of this.
Then just the last place…. Again, you may want to wait until we get a further presentation on this. It’s interesting to me to read the comment that the strategic plan has identified quality assurance as an important piece, as an important part of the plan.
Maybe it’s just semantics. To me, the quality assurance is a way to implement the strategic plan, and there should be a link somehow between the goals and objectives, the priorities, the performance measures in the ministry. Quality assurance then becomes a tool to help you carry out the strategic plan, to help you make a determination. I just wondered where the link is happening there.
I’ll just look at discussions that we’ve had in estimates previously. I’ll use the example of youth mental health, where telehealth calls were one of the priorities, or performance measures. Well, to me that’s not a great performance measure around youth mental health.
Is there a discussion, then, about how quality assurance will help you carry out the kinds of objectives that are there within the strategic plan?
M. Sieben: That’s a really good example. I would agree that’s more of an output, really, than it is an outcome measure — what amounts to more of a frequency count, almost.
Many of the points that were made, I think, would best be responded to and likely agreed with and then expanded on by Martin in his presentation. I look forward to him becoming available for that purpose.
I would note that in response to the question about how you decide what gets measured, part of it is taking the time to bring the right people together, as Janice is now lead for and is going to be doing, in order to consider how best to determine what can be measured, based on what experience in other jurisdictions and what other measures are being used. But a big part of it, frankly, is what information we can gather that allows us to measure it.
In some of our program areas we have the benefit of a longer history and can move the data frontways, sideways and backwards, as we have trend data going back literally decades.
ICM provides a richness of data that wasn’t available in the past too. So part of what committee members and others are able to see in the performance management report is a result of that. We look forward to that evolving in the future.
In an area such as child and youth mental health, there are probably some cultural issues, let’s call them, with the community relating to providing some consistency in approach, which we’re looking to further develop across the province. Enhanced use of the data system CARIS is behind that, and then consideration of the types of measures and outcomes that we can agree upon, probably with the Ministry of Health, given the synergies and the connections between the matters.
I’d lastly say that I agree with the members’ position that it would make sense to have quality assurance lead the strategic plan as opposed to having the strategic plan lead the other way. That’s where we’re looking to get to. At this point, however, consistent with the initial discussion point from the day around the ongoing monitoring function in the representative’s office, we’re not mature enough yet.
In order to allow that to happen…. We could probably identify in a program stream or two that we have that level of maturity and sophistication. We don’t have it across all six of our service lines. That’s why it’s important for me, as deputy minister, to identify that that’s a commitment that we intend to make — a continued emphasis on recreating the quality assurance system within MCFD.
At different times over the course of the last decade or 15 years or so, whether more focus on quality assurance should happen at a local level as opposed to being done centrally has been a discussion point and a decision point. The reference to the emphasis on quality assurance within the strategic plan and the service plan signals that it is definitely an important area for us going forward for the next few years. I want not only people outside but people within the ministry to wholly understand that.
J. Thornthwaite (Chair): Thank you very much, Mark.
I’ve got a couple of comments, just to finish off. It goes along with the next steps, actually, with the special project that we’re doing on child and youth mental health, because we will be having ministries come in and give us their perspectives. You’ve kind of alluded to that in your remarks previously.
Again, for Janice, I’d like to just reiterate what some of my colleagues have already mentioned. It is really, really important to get the perspectives of the people that are in the system. You mentioned, I think, Mark, the Youth in Care Network but also groups like the FORCE — Keli Anderson and the FORCE — and the Institute of Families, because those are the ones that are actually down on the ground dealing with the issues with regards to child and youth mental health.
Just making a comment about what’s measured…. This, of course, will come up in phase 2 of the special project with child and youth mental health. It’s data and what to do with it. If we have an increase, for instance…. I don’t even know whether or not we have this data.
If we had an increase in admissions to hospital emergency systems with child and youth mental health issues, that might be a bad thing. But if we had an increase in intake of child and youth services at, for instance, the operators or the service providers, that means we’re getting to those kids sooner and they’re getting help sooner, so that might be a good thing.
These are the types of questions that the committee is grappling with, with regards to whether or not in fact this data is available and whether or not it’s good or bad and which direction we should be going.
I thought I’d just put that in. It’s really, really apropos and timely with regards to your really good work that you’re working on with quality assurance, and we thank you for that.
I’m not seeing any other requests for comment or questions, so thank you very much for coming in.
Youth Mental Health Project:
Consultation Process
J. Thornthwaite (Chair): Our committee…. We’ve got just a few more items that we would like to discuss, particularly because I know for a fact Donna has to leave. There might be others that have to leave a little bit early. I’d like to get your input on the next stage, Donna, on the child and youth mental health project. So we’ll just take a moment to pass this stuff out, and then we’ll resume in one minute or two minutes, max.
Kate is just handing out this very draft document — Phase 2: Special Project on Youth Mental Health. This document was kindly put together by Byron. Thank you. He took recommendations from Doug. Doug submitted sample questions. I submitted sample questions. We took ideas and suggestions of possible witnesses based on the discussions we’ve been having the last few meetings. Obviously, our questions have to be streamlined and targeted, depending on who we’re asking to come in.
I encourage you to take a look at this document. We can discuss it briefly. Again, I just want to stress that it’s a draft, and we really do need to streamline it if we’re going to be able to get the information that we want.
M. Karagianis: Just a clarification. Are we actually switching items 3 and 4 on our agenda?
J. Thornthwaite (Chair): You’re absolutely correct. Sorry. I didn’t mention that. The reason why is because I noticed, with Kate, that item 3 is in camera, so we thought we’d continue on with this and then go in camera.
M. Karagianis: Okay. I’m just thinking for Hansard purposes and for anybody following this, it would be nice for them to know what’s going on.
J. Thornthwaite (Chair): They would be very confused, yes. Thank you for bringing that up and clarifying.
Could we just take a brief moment to look at this document? Then we’ll continue on with actual item 3 in an in-camera session.
I actually did get a chance to go through it. I thought a lot of the information that the folks that just presented, to do with quality assurance, was very informative to this document. A lot of it is to do with data collection and what’s important and where that data is coming from.
What we’ve done here is just kind of divided it up into categories. We’ve got the consultation methods — obviously, the public meetings that people will be coming in to — and the key stakeholders, including ministries, health authorities, service providers, academics and other expert witnesses.
I would like to suggest — and it might be a bit provocative here or impossible — getting a lot of the ministries to come in together. I’m not sure whether or not we could do that, but it might be interesting. If a question was asked of a ministry and they did not have a complete answer, then maybe another ministry might be able to augment that. We know that the services to do with child and youth mental health are cross-jurisdictional and cross-ministerial. So I just kind of throw that out there to think about.
The other thing is that we could add something to do with the so-called hybrid action plans or hybrid model that was mentioned by the folks that just presented.
The questions are general, health, education, community service providers, First Nations and some other possible topics — and then, on the last page, the draft questions for public written submissions.
Before I throw it out to the group, I just wanted to make it really clear that we do want to look for solutions here. We’ve delved into a lot of the problems, and I think we’ve kind of got that all squared away as to where the problems are. The emphasis on the phase 2 is to find and to make recommendations for solutions.
D. Barnett: I’m sorry. I have to go, and I’ve just sort of scanned this.
The early-years support. The only place I see it in here is “Other possible topics.” To me, that’s a topic that should be No. 1 on our question list, because that’s where it starts.
J. Thornthwaite (Chair): We have in the last part 3, draft questions for public written submissions…. There is actually “early intervention” mentioned. “How could services for early intervention and assessment, treatment and prevention of mental health issues in youth be better integrated?”
But we can do more. We can also stick it in the health and the education.
C. James: The other group I would add under the draft questions for stakeholders would be families and youth themselves. I think it was one of the real strengths of our first process, that we actually asked the families and we asked the youth. I think we need to have a section in there for them as well.
Just to your point around successes. I think we’ll get a lot of information here, again, around the challenges. I think that’s just the reality of youth mental health. People will want to share that, but I think we need to be even a bit more specific around the successes and ask people: what has worked? What have you seen that has worked in your community for you?
Whether we’re talking about the service providers, the families, the youth, I think we want a very specific question — what have you seen that’s been successful, why has it been successful, and what have been the strengths of that success? — so that we get some very specific examples of what’s working in the province. I think we need to be very specific in that area.
As I said, I would suggest we also add that in with the written submissions — but add that in with another group of stakeholders, which would be the families themselves, and the youth.
D. Plecas: I noticed that under “written submissions” we’re asking people to consider what actions could be taken to reduce stigma around youth mental health, but we’re not asking the same question of others — so, stakeholders. If we could include that.
J. Thornthwaite (Chair): Yeah, I think that’s good. One of our presenters did mention, at one point, that stigma is everywhere, and it’s also in the systems as well. So that’s a good point.
D. Plecas: I was also, through these questions, thinking: I hope people don’t give us loosey-goosey responses — responses that don’t really point to things we can do. I think there’s always a risk of people coming to us and really just giving us more of the same. “Let me tell you what the problem is, what’s making it difficult” as opposed to, “Here are some concrete ideas of what, for example, I would do if I was in the command module” — a “this is what needs to happen” kind of a thing.
If there could be some kind of emphasis on that, which goes back to our original point that we were making in the beginning.
J. Thornthwaite (Chair): I agree. It has to be solutions-based — what would make a huge difference, and what would have made a difference in your life? — and, also, to support the successes that are already going on and to, hopefully, enhance or expand those successes.
D. Plecas: If I could just make one more point on that. I just had the opportunity to mention to Maurine on this yesterday. I just came across a rather significant article on the state of affairs with regard to mental health in general. It’s not clear to me that people get the extent to which this is coming at us like a freight train.
This is, some people would say, one of the most significant problems facing the world. The last thing I read is it’s something like one in five people worldwide — this is worldwide — who are facing mental health issues, and it’s climbing. Our responses are so out of sync with that reality when you consider the huge volume of people.
It’s almost like we risk tinkering with something that demands a much more powerful response, that demands some urgency. So I’m saying: if we could somehow get people to be in consideration of that.
D. Donaldson (Deputy Chair): I think that, yeah, the witnesses who come better be the ones who have a working solution already that we want to analyze further, find out why it’s successful and how it can be expanded. I agree. I hope we don’t have presenters coming in and saying…. Yeah, I don’t think we need to focus on…. My focus would be on what’s happening that’s successful now.
The general questions — I recognize they’re drafts, and they need a lot of whittling down. The questions I had thought of are: what’s your model approach solution, and what makes it successful? Then we can get to the specifics. What would it take to expand or replicate your approach on a wider basis — regionally or provincially, for example? So trying to find those specific examples that we know exist. Is there expandability? What kind of additional resources are required to maximize the impact of the approach you’re taking? So somehow a mixture between those really succinct things and some of the directive ones here.
I think maybe we’re getting a little too directive in the general questions. I think if we have those four — and then our questioning of the witnesses could draw out some of the directive stuff. That’s just under the general points. The health, education — those are much more specific, for sure.
C. James: I think the one other thing that we have to be careful of — it’s clear, particularly in the area of mental health, but we’ve heard it from MCFD as well — is looking for the one solution. I think everyone will acknowledge that there isn’t one fix. There isn’t one program that’s going to address…. Mental health is not a uniform challenge, and it has to involve Education. It has to involve Health. It has to involve MCFD sometimes.
I like the idea of the general questions. I think the general questions give us an opportunity to do that. I think adding in a simple question — just around: what other groups and organizations are you working with? — so we get that integration piece there is important.
But I think it’s just a little bit of a caution that we have to be careful not to be looking for the one fix, because there isn’t one fix for this issue. We’re talking about integrated approaches, we’re talking about community approaches, we’re talking about a range of areas, and we want to hear those successes that have track records right now that we can look at.
J. Thornthwaite (Chair): Well, thank you very much, everybody, for that. I think we’re pretty much….
Maurine.
M. Karagianis: Are we going to move on to the fourth point? I wanted to address some comments to the fourth point here, the draft of potential witnesses.
J. Thornthwaite (Chair): Okay. I think that….
Byron, you got some feedback on that. We just have to be a little bit more succinct.
Maurine, did you have specific examples or more…?
M. Karagianis: It’s just that we’ve got a great draft list of potential witnesses, but there are a couple of areas that I also think we should include, one being northern British Columbia. I see that when we look at community service providers — although Williams Lake, I appreciate, is central, but it’s rural…. I think there needs to be some representation from northern British Columbia, because I think the challenges are different in urban centres than they are the further you go north.
I also would like us to think about including someone representative of the homelessness service sector, because youth mental health issues are very closely linked to youth homelessness. I know that when you look at the statistics here in Victoria for the Coalition to End Homelessness, the youth statistics and the youth information is kind of shocking and disturbing. I think they might also have some observations that would be very useful and helpful to us.
Those were just a couple of pieces that I thought I’d like to see included.
J. Thornthwaite (Chair): Okay, thank you very much, everyone. We’ll go back and kind of streamline. Depending on the actual questions for the actual people that are coming, I think our intent is to whittle down and prevent the possibility that we’re just going to get, as you say, more of the same. We want to get to the solutions.
M. Karagianis: Sorry, Chair, to maybe just add to that. I think the reality, as we know from having sat through these sessions, is that it’s an enormous amount of information to absorb. So I don’t want it to just become an endless stream of long answers and long, complicated discussions. I think if we’re going to do serious work, succinct questions will get us more succinct answers and responses.
I can see that, given the list of witnesses, that’s a lot of work. The idea that we would bring all the ministries in is like having the Supreme Court sitting in front of you, all the judges, and you’ve got to kind of address them. That’s a lot of bodies to try to get information to and from. So I just think, in practical terms, we do have to make it also easy for our presenters to come in and not have to do a 15-page essay in order to feel that they’ve responded appropriately to our call.
J. Thornthwaite (Chair): So when we send out the questions, then when they come, they just deal with the questions. Our entire decision of what those questions are will shape what information we’re going to get.
I personally like the idea of getting…. Your comment about getting the Supreme Court in one room — I totally understand that. If I was to be provocative, it would be nice to get…. Let’s face it, the majority of the stuff that we hear is a fragmentation, uncoordinated — all of these separate little fiefdoms that are occurring in different ministries, different jurisdictions. Obviously, we’re not going to get the Supreme Court of Canada or the federal government in here. But at least amongst, say, the Ministry of Health, the Ministry of Children and Families, the Ministry of Education — let’s not forget Justice, and there are others — those main ministries have services that are feeding this whole issue.
Obviously, we’ve seen efforts today, and in the past, of a better collaboration and understanding that they have to break down those silos. But we’ve also heard today that it’s a huge, long process. So for us, the committee, to get a better appreciation of what’s out there will also help inform others that there are services that are out there, but there could be redundancy, and people don’t know about it.
I think we should think about that as far as how we can get them together. It’s been suggested to me by others that the only way you find out and get the true answers is when everybody’s in the same room together, because then you can’t do the blame game — right? Everybody’s accountable because they’re there. But I don’t know if we’ll be able to do that.
Thank you very much for that brief discussion. We’ll send it off to Byron again to adjust.
For the next one, do we have to make a motion to go in camera?
K. Ryan-Lloyd (Deputy Clerk and Clerk of Committees): Yes, I think that would be helpful.
J. Thornthwaite (Chair): So for the deliberations on the statutory review, the Representative for Children and Youth Act, can I have a motion to go in camera?
M. Karagianis: So moved.
The committee continued in camera from 10:39 a.m. to 10:54 a.m.
[J. Thornthwaite in the chair.]
J. Thornthwaite (Chair): Okay, we’re now on air.
M. Karagianis: Are we required to write a report? Is that the appropriate step on this?
K. Ryan-Lloyd (Clerk of Committees): On the draft report?
M. Karagianis: Well, coming out of the in-camera meeting, do we need to rise and report in order to get the report released? Or is it just an assumption, because we’re no longer in camera, that our report from that in-camera session is now going to go public?
K. Ryan-Lloyd (Clerk of Committees): We have not, of course, prepared a draft report. Subject to the advice and input that we have just received from members, we’ll begin that process now. We will be working closely with the Chair and the Deputy Chair. But there is not yet a report to be adopted by the committee until we have a sense that we’ve been able to capture the interests of the committee succinctly and accurately.
Our intention would be to bring you a draft report at your next meeting. If there is an opportunity before March 25, we’d be pleased to share it with you at that time. If not, then I suspect at the March 25 meeting we’ll be looking for a motion to either amend or approve that report.
M. Karagianis: Great.
Other Business
J. Thornthwaite (Chair): The last item of business, then, is any other business, but everybody should have gotten this in their e-mails. This is just a presentation of a tour, actually, that myself and Keli Anderson and Val Tregillus from the FORCE and others from the Ministry of Education went to this alternative secondary school in North Vancouver, and I thought you might want to take a look at it.
Just take a look at it. It’s quite good. It sounds like they’re on the right track, and they have kindly agreed to come and present to the committee, as one of our presenters, on the collaborative approach that they do in schools.
Anything else? I’ve got one more item about April 13. Apparently, April 13 works. It was the only day that we could get Dr. Morrison and Dr. Peterson from New Brunswick to come and present.
A Voice: What day of the week is that?
J. Thornthwaite (Chair): It’s a Monday.
K. Ryan-Lloyd (Clerk of Committees): Just to clarify, we did send a notice out to members proposing Monday, April 13, from 8 a.m. to 9:45 a.m. as a new meeting opportunity to hear from Drs. Peterson and Morrison from the University of New Brunswick. That appears to work for all committee members, so we’ll be sending out confirmation of that meeting shortly.
In addition to that discussion, the proposal also included an opportunity for an informal lunch discussion with interested members of the committee. I know that not everyone was available for the luncheon portion, but for those that are available and interested, we will be confirming details with respect to that follow-up discussion as soon as possible.
J. Thornthwaite (Chair): Any other business?
We don’t have Donna here to put the motion to adjourn, so somebody else is going to have to do that.
Carole and then Mike.
The committee adjourned at 10:57 a.m.
Copyright © 2015: British Columbia Hansard Services, Victoria, British Columbia, Canada