2014 Legislative Session: Second Session, 40th Parliament

SELECT STANDING COMMITTEE ON CHILDREN AND YOUTH

MINUTES AND HANSARD


MINUTES

SELECT STANDING COMMITTEE ON CHILDREN AND YOUTH

Wednesday, February 26, 2014

9:00 a.m.

Douglas Fir Committee Room
Parliament Buildings, Victoria, B.C.

Present: Jane Thornthwaite, MLA (Chair); Carole James, MLA (Deputy Chair); Donna Barnett, MLA; Mike Bernier, MLA; Doug Donaldson, MLA; Maurine Karagianis, MLA; John Martin, MLA; Darryl Plecas, MLA

Unavoidably Absent: Jennifer Rice, MLA; Dr. Moira Stilwell, MLA

1. There not yet being a Chair elected to serve the Committee, the meeting was called to order at 9:07 a.m. by the Deputy Clerk and Clerk of Committees.

2. Resolved, that Jane Thornthwaite, MLA, be elected Chair of the Select Standing Committee on Children and Youth. (Donna Barnett, MLA)

3. Resolved, that Carole James, MLA, be elected Deputy Chair of the Select Standing Committee on Children and Youth. (Maurine Karagianis, MLA)

4. The following witnesses appeared before the Committee regarding the status of implementation of the recommendations of the Office of the Representative for Children and Youth.

Witnesses:

Ministry of Children and Family Development

• Mark Sieben, Deputy Minister

• Cory Heavener, Provincial Director of Child Welfare

5. The Committee recessed from 10:21 a.m. to 10:31 a.m.

6. The following witnesses appeared before the Committee and answered questions regarding the report entitled Ministry of Children and Family Development Operational and Strategic Directional Plan, Volume 2, September 2013.

Witnesses:

Ministry of Children and Family Development

• Mark Sieben, Deputy Minister

• Martin Wright, Executive Director and Chief Information Officer

7. The Committee recessed from 11:27 a.m. to 11:31 a.m.

8. The Chair, Deputy Chair and Clerk to the Committee provided the Committee with an update on the proposal for the special project on Youth Mental Health.

9. The Committee adjourned to the call of the Chair at 11:42 a.m.

Jane Thornthwaite, MLA 
Chair

Kate Ryan-Lloyd
Deputy Clerk and
Clerk of Committees


The following electronic version is for informational purposes only.
The printed version remains the official version.

REPORT OF PROCEEDINGS
(Hansard)

SELECT STANDING COMMITTEE ON
CHILDREN AND YOUTH

WEDNESDAY, FEBRUARY 26, 2014

Issue No. 6

ISSN 1911-1932 (Print)
ISSN 1911-1940 (Online)


CONTENTS

Election of Chair and Deputy Chair 153

Ministry of Children and Family Development: Status Update on Implementation of Recommendations of Representative for Children and Youth 153

M. Sieben
C. Heavener

MCFD Operational and Strategic Directional Plan 165

M. Wright
M. Sieben

Update on Proposal for Youth Mental Health Project 172

Other Business 174


Chair:

* Jane Thornthwaite (North Vancouver–Seymour BC Liberal)

Deputy Chair:

* Carole James (Victoria–Beacon Hill NDP)

Members:

* Donna Barnett (Cariboo-Chilcotin BC Liberal)


* Mike Bernier (Peace River South BC Liberal)


* Doug Donaldson (Stikine NDP)


* Maurine Karagianis (Esquimalt–Royal Roads NDP)


* John Martin (Chilliwack BC Liberal)


* Darryl Plecas (Abbotsford South BC Liberal)


Jennifer Rice (North Coast NDP)


Dr. Moira Stilwell (Vancouver-Langara BC Liberal)


* denotes member present

Clerk:

Kate Ryan-Lloyd

Committee Staff:

Aaron Ellingsen (Committee Researcher)

Byron Plant (Committee Research Analyst)


Witnesses:

Cory Heavener (Ministry of Children and Family Development)

Mark Sieben (Deputy Minister, Ministry of Children and Family Development)

Martin Wright (Ministry of Children and Family Development)



[ Page 153 ]

WEDNESDAY, FEBRUARY 26, 2014

The committee met at 9:07 a.m.

Election of Chair and Deputy Chair

K. Ryan-Lloyd (Deputy Clerk and Clerk of Committees): Good morning, everyone. As this is the first meeting of the Select Standing Committee on Children and Youth in the new second session of this parliament, and the committee has not yet met to elect a Chair, the first item of business is the election of Chair.

I would like to open the call for nominations to that position.

D. Barnett: I would like to nominate MLA Jane Thornthwaite.

K. Ryan-Lloyd (Clerk of Committees): Thank you. I'll make a note of that. Jane Thornthwaite has been nominated for the position of Chair. Are there any further nominations? Any further nominations? Any further nominations?

Seeing none, I will put the question.

Motion approved.

[J. Thornthwaite in the chair.]

K. Ryan-Lloyd (Clerk of Committees): Congratulations, Madam Chair.

J. Thornthwaite (Chair): Thank you very much. It's always a bit of stress there.

We have to do the vote for the vice-Chair as well. Are there any nominations for vice-Chair?

M. Karagianis: I'd like to nominate Carole James for Deputy Chair of the committee.

J. Thornthwaite (Chair): Are there any other volunteers to the vice-Chair? Going once? Going twice?

Motion approved.

J. Thornthwaite (Chair): Congratulations, Carole.

All right, we have a few items today. I just wanted to make a brief correction for the end time for the committee. Many of us are going to be wanting to go out in the front of the Legislature for Pink Shirt Day. We're going to probably end the meeting at 11:45, so we're going to be a little bit tighter than the original 12 o'clock.

In either case, we have, as our guests, Mark Sieben and Cory Heavener from the ministry, who are going to be presenting two items. The first one is the update on the status of implementation of recommendations from the Representative for Children and Youth. Then the second item is the MCFD Operational and Strategic Directional Plan. We'll have opportunities for questions and comments after those two items.

[0910]

Then the last item. We just want to give an update of where we're at with our special project on youth mental health, get feedback from the committee members and move forward for an action plan on that.

Is everybody okay with the agenda items as put forward? I'm seeing nods, so that's good.

Maybe we'll just throw it open right away to Mark and Cory on item 3: update on the status of implementation of recommendations from the rep.

Ministry of Children and
Family Development: Status Update on
Implementation of Recommendations of
Representative for Children and Youth

M. Sieben: Congratulations, Chair and vice-Chair, on your reappointments, and on behalf of MCFD our appreciation to all committee members for your work and interest in our work. We're very appreciative of the opportunity to come back and visit you so recently after our last visit in November.

What we have for you this morning is some further discussion on a couple of topics that we touched on and that generated some further interest amongst committee members. The first pertains to our work with our colleagues and our oversight body, the Representative for Children and Youth office, specifically the work that is done in MCFD in receiving, responding to and then tracking the recommendations that come from the various reports emanating from the representative's office.

Cory is going to lead that discussion with you, picking up pretty much where she left off in November. We're looking to get into some level of detail regarding at least a couple of the reports, so you get a real sense of what we do with individual recommendations from individual reports.

Following that, we also have with us, as we did back in November, Martin Wright, our chief information officer. We spoke back in November regarding a body of work that the ministry had devoted some amount of resources to in order to facilitate some progress regarding how we monitored outcome-related measurement in MCFD across all of our service lines.

Martin will present the second iteration of our performance measurement document. This is an evolving piece of work. We recognize there's always more to do, but we see it as something that is unique across child welfare–related organizations and ministries across Canada. While there is more good work to do, it's about as comprehensive an approach as exists right now. That's the second half of our show.
[ Page 154 ]

We'll start off now with Cory.

C. Heavener: Good morning, everybody, and thank you for the opportunity to attend today to give you a further update and start, basically, from where I left off last time on the ministry's response to the representative's recommendations.

Before you, you should have two documents. One will be an actual small report, which looks something like this, that speaks to MCFD's status and our analysis of where we stand with the recommendations. Then there's also a brief PowerPoint presentation, which will likely look quite similar to the report in and of itself. It's actually overhead on the screen as well.

[Audiovisual presentation.]

I thought what I would do this morning is walk you through not the report, page by page, at this point but the PowerPoint presentation and highlight some key areas of the report.

The last time we met in November, I spoke briefly about the ministry's interface and partnerships team and started to have a brief discussion around the ministry actions, pre and post, when an RCY report is received. Today I'm going to highlight the ministry's work in responding to the report recommendations.

I'm going to provide an overview of the ministry's analysis of the current implementation status of 24 reports that we've been following up on. I'm going to profile two of those reports to show you what a closed report looks like, where we've actually completed the implementation of the recommendations. In addition, I will show you a relatively recent report and some of the actions that are underway in that respect. Then I'll provide a little more information around the go-forward approach and how we're addressing the recommendations internally in the ministry, in and of itself, and also with the representative's office.

[0915]

The next slide is actually appendix 2 in the update report. You may remember this from our last presentation. There are a couple more bubbles on it this time. This is really to give you some idea of our interconnections and the work that we do at MCFD, the work that the interface and partnerships team does.

What I wanted to emphasize is that this office is situated in the provincial director of child welfare's office, but we're responsible for the coordination of the recommendation implementation responses across our ministry. At times we also reach out across to other government ministries to support them in their response as well.

There are a number of bubbles there. The orange bubbles are the advocacy work that we do with the representative. The purple bubbles are some of the work that I and the provincial director's office do with the representative's office. We're a member of the Children's Forum, and we're also a member of the RCY multidisciplinary team.

The green bubbles illustrate the number of information-sharing processes and work that we do across the various divisions of our ministry, including with the interface team.

Today I'm really going to focus on most of the blue bubbles and all the work we do right from the time that we're informed that an RCY investigation or review is being undertaken until the actual report comes out and until we actually complete our action plans and have them signed off. The purpose of this diagram is to give you some idea of the work that the interface team does.

Just to build on that very quickly — and most of this you would have seen before — we respond to information and data requests from the RCY. We organize training requests and systems access for the RCY staff. We coordinate pre- and post-RCY report activities. We facilitate the development of the MCFD action plans in response to the reports within this team.

We coordinate information briefings with the RCY on the ministry deliverables in the action plan. We track and monitor the implementation of the action plans. And we prepare reports to share with the RCY on the completion of our action plans. This is the interface team's specific work with respect to the RCY recommendations.

Just before I go on to the next slide and we start to go into the actual number of recommendations and the ministry's analysis of the status of those recommendations, I did want to comment that when a recommendation does come to our team, quite often it will span across a number of different service delivery areas within our ministry.

A recommendation may come to the Ministry of Children and Family Development, but it may in fact be a recommendation that our service delivery area, our policy, our legislation and our training folks all need to have a role to play in. There's a fair amount of work that happens when a recommendation comes forth to the ministry around the coordination and, in fact, establishing who will be the lead on the recommendation and developing project plans around that.

Later on in the presentation this morning I'll give you an example of two recommendations that came directly to me recently and the work that we've done around that. But I did want to emphasize that many times the recommendations may come across the ministry, and in fact they may touch on other ministries as well.

Currently we have 24 RCY reports that contain 130 recommendations since 2007. If you look on page 6 of the actual report, it will show you all of the various reports since 2007 and the number of recommendations in each of those reports. We've also included the number of details.

When a recommendation comes out, there's quite
[ Page 155 ]
often detail that helps us to understand the purpose and the intent of the recommendation. It's detail that we take very seriously as we're looking at implementing the recommendation. Then it also goes through our process around whether we've sent our letter out, if an action plan is developed, our close-out reports and the current status. I will come back to this.

Of the 24 reports, as I mentioned, there have been 131 recommendations directed to government and others since 2007. Of those recommendations, 107 have been directed to the Ministry of Children and Family Development. Amongst those 107 recommendations there are approximately 473 details that are attached to those various recommendations.

The analysis that we've done to date is that six of the reports, which contain 25 recommendations, have been assessed by ourselves and the representative's office as being complete.

[0920]

If you go to page 6 of the report, where I just mentioned, you can see that it says "closed" in green in the final column. For ten of the reports, containing 50-plus recommendations, we've submitted close-out reports. So we've done a fair bit of work. We may not have finished all the work yet, or we've gone back, and I'll talk a bit more about close-out reports shortly. But they've been submitted to the RCY, and we're awaiting response.

We currently have eight reports, containing approximately 29 recommendations, that are in the active stage. If you go to page 6, you'll see from No. 17 down on the left, starting with Honouring Kaitlynne, Max and Cordon, that those are the reports where there are currently actions underway.

J. Thornthwaite (Chair): Can I just ask a quick question? With the "awaiting response," that means awaiting response from the RCY?

C. Heavener: Yes. I should say that awaiting response is a process. I have a slide coming up — more information in a few moments. When we get to a place with our work and our action plans where we feel that we've accomplished and implemented the recommendations, or we may be looking at a recommendation that we had started within our own work and we're doing something a little different than exactly what the recommendation states, we put forward a report. That doesn't mean that we put forward a report and it's closed.

We continue to have a conversation with the representative's office around those recommendations. At times, and you'll see coming up, we do need to come back and do a bit more work, because we haven't exactly landed on where that's at.

D. Plecas: I'm just wondering if it is possible at some point for you to walk us through page 6, paying particular attention to…. Maybe it's just not clear to me whether or not recommendations which came out of 2007, for example…. What has not been addressed? I know the numbers are here, but it would be helpful to know: what is it out of 2007-2008 that still is outstanding?

C. Heavener: I can touch on that. I don't have all of that detail with me, but I can give you some idea of what that looks like.

D. Donaldson: Just a quick question, I suppose related to page 4 and the presentation. On the recommendations from the RCY that pertain to MCFD, who makes the decision which ones are acted upon, and how is that decision made?

C. Heavener: The decision is made…. It's shifted recently, so I should mention that. Up until 2011 there was a process in place where the representative's reports came forward, and the response from the ministry was…. There was a strategic plan within the ministry that would respond to those recommendations. That was in place until about January 1, 2011.

After January 1, 2011, there was an agreed-upon process with the ministry and the representative's office around what the process would look like and how we would respond. We would respond to each report, and the process is outlined in the next slide, actually, that's coming up.

Part of that process was looking within the ministry's own planning and policy cycles and budget cycles. Where did the recommendations fit in? Generally, the recommendations were accepted, and they were built into that process.

Currently we've actually shifted that process. To specifically answer your question directly, all RCY reports are brought to our executive team. We review the report. We review the findings. We review each of the recommendations, one at a time, and we make decisions around who will be the lead on that recommendation and where they fit into the current work that we have underway around our own strategic planning.

That's how that decision is made, and then folks go away. They do the work amongst groups and, as I mentioned earlier, at times that means bringing in folks from the service delivery area, the policy area, the practice area, and they come up with plans of how we're going to implement the recommendations.

It may be that we're already down a path on a certain recommendation, so we need to go back and look at…. For example, in quality assurance, we may be looking at, or we're already starting a certain audit program. How can we beef that up based on what the representative's recommendation is?

[0925]

If there are recommendations that we're not going to
[ Page 156 ]
do, but we have a different way or a different approach or something else is underway, then we will look at delivering that message to the representative as well. What I can tell you, as well, is that we're looking at tightening up that process at this time around our response at the executive level and building it not only into our planning cycle but also into our reporting cycle with the representative's office.

C. James (Deputy Chair): Just a quick question on the interface teams. You mentioned the interface team deals with a whole number of areas, including the representative's reports, and you said you have a very specific interface team to deal with those recommendations. How many people are on that interface team within the ministry?

C. Heavener: There are two full-time people on the interface team, and then there are a number of folks in the deputy minister's office that work directly with the interface team on ensuring that these recommendations are built into our operational and strategic directional plan as well as any of our divisional plans.

Mark just mentioned to me that the most recent report, Lost in the Shadows, was actually discussed at our executive meeting on Monday and leads were assigned to various components of those recommendations.

I probably should say — Mark, feel free to jump in here — that it's not as simple as saying: "Here's rec 1. What are we going to do?" We have a very fulsome discussion. With the last report, we are looking at it in conjunction with When Talk Trumped Service, the report that was released in November.

We break down those recommendations and look at: where do those fit in, and are there certain components to them? It may not be that one recommendation is assigned to one person as a lead. It may be that components of it are assigned to three or four different people.

M. Karagianis: I had some questions about the relationship and the discussions back and forth between the ministry and the representative. In cases where you may not agree on some of the recommendations, how do you manage that? And what is the process if the children's representative then rejects or doesn't agree with you on completion of any of these? What is the process around any of that?

C. Heavener: We do have an agreed-upon post-release process — how often we meet. Usually what happens is that a report will come out and we will meet with the representative's office to talk about the recommendations — we call that an assumptions meeting — so that we can learn exactly what was behind the recommendations and what we were looking at.

We go away, and then we come back to the representative's office and verbally talk to them about: "This is how we're going to approach this recommendation. This is what we're thinking." We get their feedback, and at that point we go away and finalize an action plan that we then present to them.

At times when we don't agree or we may be going a different route, we have that conversation. We make decisions around what we're able to do within the scope of our planning and priorities cycle in our budget as well.

M. Karagianis: Just a follow-up to that. What happens, Cory, in the case where budget constraints may prohibit you from following through as well? How do you manage that, and is that a recurring problem?

M. Sieben: I suppose the quick answer is yes and no. The budget is always a limitation on what the ministry can do. But in terms of the platform from which our executive can work, the budget is the ground floor in approaching any of the recommendations.

Where we start is what we have and what we anticipate we're going to be able to receive otherwise, either in conjunction with other ministries or through the ministry itself. That ends up really informing the conversation that Cory and her staff have with the representative's office regarding how we're going to approach a recommendation.

J. Thornthwaite (Chair): Carry on, Cory.

C. Heavener: I won't go into the next slide in detail. Responding to the recommendations — I think I've covered most of that process.

[0930]

The next slide, which is actually page 2 in the update report, goes through the current status of the 107 recommendations. We have assessed with RCY that 25 of those recommendations are fully completed, as I mentioned. We have assessed that 37.5 percent are substantially completed and 15.5 percent are partially completed.

The 37.5 percent and the 15.5 percent add up to the ones that are before the representative right now, where we're awaiting their response on the additional actions that we've taken. And 29 of the recommendations would be in progress and would be related to the reports that I mentioned that are underway with our office at this time.

The next slide just gives a little bit more information. Again, I won't speak to it in detail because I have spoken to it around the close-out process with the RCY. We will send them a report — I'll give you an example shortly — saying that we believe we have met these recommendations. They will then review it, and they will confirm that status with us or not. This is where I mentioned earlier on there may be some discussions and further briefings about what additional work needs to be done around those reports.

We will then go away, assess that information that we've been given at that time and look at how we can address the outstanding work. Then we can come back and do a second round of a close-out report with the representative. Again, this is the process that we put in place in 2011, which I mentioned.

[ Page 157 ]

The next slide gives you a quick overview of the actual process in play with one of the reports. This report came out in April 2011 from the representative. This was an issue report regarding phallometric testing in the B.C. youth justice system. This gives you an example from when the report came out and the post-release work that was done — the letter going out, meeting with RCY, an action plan being completed — through the fall of 2011.

The action plan was underway between the fall and the spring, 2011-2012. The ministry sent a close-out report to the RCY in the fall of 2012, and then the actual report and the implementation of the actions were completed in March of 2013. This would be one of the reports that, since our new process has been put in place, is seen as closed. No. 14 on page 6.

If you turn over the page…. I just wanted to highlight: what does that actually mean? What actually did we do? This is the youth justice side of the ministry. This report contained three recommendations around PPG testing in British Columbia. Prior to the report coming out this had been an issue in 2010 within the ministry. At that time the ministry made a decision to suspend this type of testing. The report recommended that it not be reinstated, that there be a comprehensive review of the youth sex offender treatment program and that we review our research approval process to ensure that the ethical standards are met.

The ministry did do a comprehensive review. It was done by an independent contractor. It did result in some recommendations and actions for the ministry to take. They developed a plan, and they followed up on that. The ministry reviewed and revised the research approval process to ensure it met ethical standards and made a decision that the use of PPG testing in B.C. would be discontinued. That happened over a 17-month period.

During that time — not to simplify what the recommendations were and what the ministry did — that process was put in place. The ministry would have met with the RCY, would have discussed the findings and the assumptions around the recommendations, drafted an action plan, presented it to the representative and, once that was completed, would have sent to the representative: "Here's our close-out report with all the actions that we've taken to implement these recommendations." That gives you one example of a close-out report.

I was going to give you an example next of a report that came out this September. This is the report I mentioned earlier where there were two recommendations that were directed to the provincial director of child welfare. This is a report that came out from the representative's office on September 17, 2013. It was a special report called Out of Sight: How One Aboriginal Child’s Best Interests Were Lost Between Two Provinces. This will give you some idea of the actual process that we're in now.

The report came out, as I mentioned, September 17. We sent a letter to the representative at that time accepting the recommendations. We've since had a meeting with the representative to discuss the recommendations that are in that report, to talk about some of the assumptions so that we're on track with what the representative was looking for. At that meeting we discussed with the representative some of the actions we had underway or were planning with respect to those two recommendations.

[0935]

We're in the process now of putting that into a specific action plan that we will share with the representative in the coming weeks. Our plan is that the actions will be underway for the next couple of months, and we're hoping that by sometime this summer we'll be able to report out that we're well on our way to implementing those recommendations.

The next slide will tell you exactly what we're doing.

This report contained two recommendations, both directed to the provincial director of child welfare. The first recommendation was that we review all of our policies and standards for out-of-province placement for children in the care or guardianship of the province, including cases in which guardianship is transferred to another individual under the Family Law Act.

The second recommendation was that the provincial director of child welfare recommend to the other provincial and territorial directors across Canada and the territories that there be a review of the protocol. There is a protocol that exists that talks about information sharing, child welfare alerts, how we work together when a child is either transferred out of our province or into another province. The recommendation was around actually having a look at that protocol and reviewing it.

The other piece was around issuing a practice directive — that the provincial director of child welfare issue a practice directive to all social workers in this province, around out-of-province placements, what the requirements are, what the responsibilities and the roles are. That is in our standards, and we wanted to look at that and how that lines up with the protocol.

I just wanted to give you some idea, with a report of that nature, with two recommendations…. There's been quite a bit of work underway, both within our ministry and within the province but also within our partners across the various provinces and territories.

As far as the first recommendation goes, we are looking at our policies and our standards to ensure that they're rigorous and that they're linked with the protocol so that social workers fully understand what is required when a child moves out of province or what our responsibilities are when a child moves into our province and is from another province.
[ Page 158 ]

We are going to issue a practice directive within the coming months. We have a working group made up of staff across the various regions of the ministry, including staff from the provincial director's office, looking at this issue and what exactly we need to ensure that social workers know and how we can help to inform them and support them to follow all of the required steps when this happens.

The other piece I wanted to mention is the tracking system for out-of-province placements. All of the service delivery areas now have tracking systems. They track all of the children in care that are moving out of the province. They also track the children, as I mentioned, that move into the province from other jurisdictions.

We've been working with that group of people to ensure that we develop a provincial tracking system. That tracking system will be monitored and reported out through our office. The provincial director of child welfare is taking oversight of the tracking of all out-of-province placements. That would also include out-of-country placements.

What we're doing now is reviewing all of the current tracking systems that are in place, and we're also trying to go one step further. We don't want it to be just a tracking tool. It will be somewhat of an audit tool. Have all the appropriate steps been followed when a child has been placed? Is there an appropriate plan of care? Were all the appropriate contacts made? Who were they made with? What is the follow-up? All of that type of information. We'll be doing something similar for the children that come into our province.

Do we know the children that we place out of province? Yes, we do. We track that. We want to be more rigorous around tracking that, and it's something that we're going to take oversight for at our office.

As far as the review of the interprovincial protocol, the report, as I mentioned, made a recommendation that we recommend that that be followed up on. That's something that we've done on various levels. We have made that recommendation to our partners. We made that in November. The minister also contacted her colleagues across the provincial governments and sent a letter making that same recommendation.

The work has begun. B.C. is taking the lead. We've decided that we want to lead this process, and we have a project plan around the process. We have monthly meetings, and we have a contractor that's helping to support this work to be done in a timely manner.

[0940]

As you can imagine, it's across all of the provinces and the territories, so it will take us some time to actually get the protocol to be finalized. I think we've gotten off to a really good start, and there's been some really good work done.

What we've asked that table to consider is all of the details that the representative's report recommends as well. It doesn't just mention to review the protocol. It also has a number of specific details that it asks us to look at when we are reviewing the protocol. That gives you some idea of one of the reports that is a work in progress.

J. Thornthwaite (Chair): I have a question, if I may interrupt you. It's related to exactly what you just said.

You just answered the question that I had, which was the definition of review. What happens if you don't get the response that you expect or want from either other provinces or territories or even the federal government?

C. Heavener: Well, that does happen.

J. Thornthwaite (Chair): I'm sure it does. That's why I asked the question. Where do we go? We can put recommendations, but we can't make other jurisdictions do what we think they should be doing.

C. Heavener: I can tell you that with respect to the protocol, so far we haven't had that occur.

J. Thornthwaite (Chair): Okay.

C. Heavener: That's been good. If something like that was to occur, there are a number of different avenues. One of them would be…. Because it is the provincial directors — my colleagues across Canada — who are leading this work, I would go to my deputy minister, who has contacts with his colleagues, and see if it would be something that could be reviewed and discussed and dealt with at that level.

Then the deputy minister would also have a number of different options for how we would look at that.

I think, as well, it would depend on exactly what the issue is that's being put forward and if there was another way to mitigate that or to address that.

J. Thornthwaite (Chair): You mentioned that B.C. is taking the lead — and I think that this committee would probably be very happy to hear that — with regards to moving forward on this interprovincial protocol. But my point is…. Are they also dealing with the same issues? Or are things different province by province? Maybe they don't have the calibre of responses that our ministry has with our reps? How does that work?

M. Sieben: It's an extremely interesting dynamic. I'll try to remember, and if the Chair reminds me at the end of Cory's presentation, I can give an example in conjunction with one of the most recent recommendations we've received from the representative's office, from her last report.

J. Thornthwaite (Chair): Okay. I think that would be very helpful. Does anybody else have any comments before she moves forward? Okay.

[ Page 159 ]

Sorry to interrupt you, Cory. Go ahead.

C. Heavener: No, that's fine. I'm close to the end here — sort of circling back now to where I started at the beginning, just around responding to the recommendations.

Again, I just want to highlight that our current approach has been in place since 2011. We have a really collaborative, strong working relationship with the representative's office around the sharing of information and where we're moving on the recommendations.

Our go-forward approach is that we want to ensure that we clarify our process and that we provide the representative's office with what they need in response to the recommendations, so we're hoping to do some work shortly around that — I know I mentioned that before — just to ensure that the process we put in…. It's probably timely, since it's been in place since 2011, to have a look at it. How is it working? Do we want to make any adjustments to ensure that it meets everybody's needs?

Then the last slide is something that I've actually spoken to a lot already today. It's something else that I've already spoken to and that we had a question on earlier. We are working extremely hard in the ministry to integrate the recommendations into our strategic agenda.

As I mentioned, whether that's a strategic operational plan, whether it's our divisional plans, whether it's some of our branch planning, we now work closely with the deputy minister's office, and we're building it into our reporting.

Our goal is that when the time comes where we need to report to the representative on the status of recommendations, it's routinely built into our reporting cycle, because it's part of our day-to-day work that we're doing around our planning.

That was the PowerPoint presentation for this morning. I could go back to one of the questions around page 6 and the status of the report and some of the things that would be seen as outstanding and talk a little bit about that. But I take your advice on where to go.

[0945]

J. Thornthwaite (Chair): Yeah, that would be great. Thank you very much.

C. Heavener: On page 6 are the 24 reports. Maybe I'll just go through a couple here. If I go to, actually, Honouring Christian Lee as well as Honouring Kaitlynne, Max and Cordon — the Schoenborns — those reports regarding circumstances where children lived in domestic violence — one of the pieces of work that's still outstanding but that would be underway is that there was an amendment made to the CF&CSA legislation. That was last March. It hasn't been officially brought into effect because we made a commitment that that work would be brought into effect once we did sufficient training in partnership with the anti-violence sector. That work is about to take place. Then, that recommendation will be in effect at a later date.

I don't have the information with me to go through recommendation by recommendation per report, as per your question, to say what's outstanding. I have no problem gathering that and bringing that back. We do have that information. But that would be the type of information that would be outstanding.

The other piece would be…. Sometimes there are recommendations where we are planning on going a different route as a government, not just as a ministry, than the actual recommendation. So we will go and talk to the representative about where we're at with a certain recommendation and how we're proposing to address that recommendation, which may be a little different.

There are also recommendations outstanding from a generic standpoint. Martin will go through with you some of the performance measures around some of the reporting mechanisms that have been suggested in other reports where we haven't quite done that work yet.

J. Thornthwaite (Chair): That was in answer to your question, Darryl. Did you have any other…?

D. Plecas: I'm thinking that it would be great, Chair, if we could get that information, at some point — a listing of which ones haven't been attended to.

J. Thornthwaite (Chair): Okay. Thanks.

C. James (Deputy Chair): I would agree with the member across the way on the recommendations. I think it would be very helpful. Because this is a new process for the committee to be looking at the recommendations and which ones have been completed and which haven't, I think there's a bit of catch-up to do. My hope is that because this will be a regular routine around the committee, eventually it won't be that difficult for each report to come forward and have specific recommendations spoken to. It'll be part of our routine, and we won't be playing catch-up with all of these reports.

I recognize that right now it's a bit of extra work to look at the specific recommendations, but I think it will be helpful for the committee. There's a big difference between very straightforward recommendations and changing practice on very specific recommendations that may have budget implications or other things that will be helpful for the committee to know. I think that kind of information would be very helpful.

Just a question around page 5 in the report, where you talk about the pre-release, post-release and the action plan and developing the action plan. I just wonder: when you develop the action plans, how specific do they get? I
[ Page 160 ]
think it's pretty clear if you look at the post-release. You meet, you talk about the report, and you agree on which recommendations are going to be looked at and then put an action plan together and finalize it. But I think the key really is: how long does it take between those action plans?

I recognize not all of them can be actioned quickly. I think you give a perfect example of protocols with other provinces that are not in your control, and it's not something that you would expect the ministry to be able to develop on a particular timeline because of other jurisdictions and other work. But some of them are within the ministry guideline. So I wondered what kinds of specifics are in those action plans around timelines and reporting back to the representative on those timelines.

Then, I think the other piece, just to touch on your last point, is around integrating it into the ministry strategic plan. I would guess, again, that's going to help with timelines, because it'll become part of the regular ministry work rather than something extra on the outside that needs to be developed. That certainly, to me, is a better way of approaching it. It becomes part of the ministry practice because it's something that needs to be done to improve things in the ministry.

C. Heavener: The first question. There is a template, and it's fair to say it's been a work in progress, the template for the action plans. It's gone anywhere from being: here's a recommendation, and here's everything we're doing, plus…. So it lists a number of different things and the timeline. Some things are directly related and others are linked to our work.

[0950]

We're trying to tighten that action plan up and be really clear about: "Here's the recommendation. What are you doing? Tell us what you're doing." And we don't need to know, like, 50 other things that may be linked to it. And where we want to know more information, or you suggest that, then we can provide that additional information. That's the process we're working through right now.

What we are providing to the representative's office, as per the 2011 approach, would be each of the recommendations, then a bullet type of response — some are obviously longer than others — and then some links to various policy or training events or things that we've done in that respect. We are, though — as part of the last slide and the work we're doing with the ministry — trying to tighten that up to ensure that we're providing what would be helpful and what the representative would see to be necessary to achieve that recommendation.

That is a work in progress. I think we're getting to a better place, especially around the reporting side. The templates also do have a time frame when we're planning to have this work done, and we'll be building that into our planning cycle as well. So that reporting, as you noted, should become easier for us to report out on.

D. Barnett: Thank you for all the great work you're doing. I just have one question that relates to what my colleague here, MLA Plecas, asked for.

If the reasoning on why we can't implement a lot of these issues here that we're looking for answers to…. Is there any simplified way that you can tell us why we can't implement them? I know there are reasons why. I know some of it doesn't work within legislation or policy or whatever. But it'd make it easier for me to understand some of these issues — a simplified why not.

M. Sieben: The ministry's view is that, on the whole, we do. We do implement the recommendations. We may be compromising in terms of meeting them letter by letter, but our focus is on doing the very best by the recommendations that we can. And the table in the report that Cory has provided, in our view, reflects that.

The process we've put in place proceeds report by report, in order to make sure, first, that we understand what the assumptions are beneath each recommendation and, then, that we have an opportunity to develop an action plan that is shared with the representative and effectively signed off with them, which would be our guide towards delivering on the recommendation. And then there is the time, basically, that it takes to actually complete much of what it says, which often — as one of the members has noted — takes at least months, if not years.

Frankly, there are very few recommendations where the ministry would say: "No, we do not agree." In fact, I can't think of any. There may be one or two from the early mid-2000s that I may have glossed over. But it's more a matter of how we achieve the recommendation and over what period of time. Frankly, part of the challenge for us, as Cory has sort of noted, isn't the recommendations themselves. It's the detail associated with the recommendations.

Each of the recommendations, as I'm sure members of the committee have noted, often has a number of details associated with it, and the details are in fact little paragraphs regarding how we should approach each of the recommendations there. Really, they're mini-recommendations in and of themselves. It's not just the recommendation that gets discussed with the representative's office. It's each of the little, individual details and how we respond to those details in the individual action plans.

Summing up, again, where Cory has gone with how we're responding to each of the recommendations, MCFD feels extremely confident in its ability to speak to each and every recommendation that comes from the representative's office.

[0955]

However, it ends up being a patient process because there are so many and because there is so much work associated with tracking and delivering on each of the
[ Page 161 ]
recommendations.

We would look forward to an opportunity to explore report by report or by some sampling of reports — practically, probably, that makes more sense, frankly — to give you some amount of feel for the nature of the work that goes into responding to each of the recommendations. But they're treated with earnest attention from the ministry.

M. Karagianis: My question probably follows nicely on Donna's question here.

As I asked earlier, about budget implications, there have got to be cost pressures associated with a number of these. I'd like to know how many of them are currently perhaps not concluded because of cost pressures. And what's the trade-off? If following through on some of these recommendations takes money from the ministry, how do you reconcile that?

There's been a commitment, generally, and I think unanimously by government, on accepting reports by the children's representative. So where there are cost implications that are significant, how do you deal with that internally? What's the trade-off there? And is it possible for us to know how many of these have cost implications that perhaps have some influence on how quickly they can be instituted and not instituted within the ministry?

M. Sieben: They all have cost implications. The process that we've outlined for you here has cost implications in and of itself. This is a process and this is an allocation that simply did not exist pre-2006. This has really evolved as the relationship with the representative's office and the ministry's response to the individual reports have evolved.

I'd also note — I promise I'll come back to the question — that in addition to each of the individual steps associated with responding to each of the individual reports, as Cory has noted, there are also regular recommendation tracking meetings that Cory and her staff have with staff from the representative's office, where we do much of what has been discussed here. The chart that you see there, basically, is visited to make sure that we have clarity on where one another stands and what's going to happen next so that we have the same understanding of where we're at in that process.

In response to the question regarding when reports have budget implications for the ministry and how we manage that, my view is that the ministry's primary responsibility is to deliver services according to our six service lines. The Minister of Children and Family Development has a letter of accountability, which the ministry is pleased to assist the minister in delivering on.

Those are our primary responsibilities, in my mind. The representative's reports and her recommendations are guidance to us on how we can achieve those primary responsibilities more effectively. I wouldn't think the representative's office would see it the other way around, either. Their guidance, for us, is how we can do a better job of delivering service under those six service lines. But that is the primary responsibility — the actual delivering of service — and a budget is allocated by government for that purpose.

When we approach the recommendations, my view is that we approach the recommendations within the budget that government has given us or within the budget that is otherwise available to government if it's in a matter that is relevant but not part of MCFD's six lines of service delivery.

That's how we inform our action plans, and that's how we inform our discussion with the representative's office. It's likely more the representative's office's purview to comment on whether or not that is adequate and where there is a shortcoming.

[1000]

D. Plecas: Mark, can you tell us basically what the expected cost is per year of delivering on those recommendations? On the six lines that you have to be attentive to…. You must know, because we've been doing this since 2006, that there is going to be a certain added cost as a consequence of responding to recommendations.

I'm asking that because I'm wondering — you didn't imply this — if there should be some added budget provision to enable you to respond to those recommendations as you think you should. Particularly since you said that you respond to, virtually, and agree with every one that comes to you, one would think that there would be some kind of budget that's going to allow you to respond.

M. Sieben: That would certainly make things easier for ourselves and the representative's office, I think.

It's an interesting question. I can't say that I have such a valuation. It would be a very interesting exercise, I think, both for ourselves and for the representative's office, and I'm sure many others would find that of interest too.

In my mind, our focus has been on…. We are, by all accounts, a lean ministry. Our management-to-service-delivery ratio is modest in comparison to other parts of government. We're oriented towards action. Our focus is on what we can do as opposed to what we can't do, and our discussion, through the action plans, is what we can do to achieve the outcomes anticipated through the recommendations, through the resources, through the staffing and through the guile that we have available, as opposed to what is left outstanding.

Again, the representative's office is extremely able to comment on where there is a gap and where there might be more.

C. James (Deputy Chair): Two areas to ask a question on. The first one is: who takes accountability for recommendations that have been routed to other ministries?
[ Page 162 ]
Does the accountability ultimately still stay with the ministry, or does the representative's office then have to have accountability back with the other ministry? The one I think about, of course, is report 21 on the list, which is about youth mental health, where it says that the recommendations are being led by the Ministry of Health, but youth mental health is within the Ministry of Children and Families. So there is a link there.

I think of the recommendation around a secretary of state for youth mental health. That was a recommendation that came forward that hasn't been implemented and that may not be agreed to, and that may be a recommendation that has gone to Health, which is why it wasn't included.

Just a question around accountability, and then I'll come back to my second issue.

M. Sieben: That's a really good question, because there are a number of recommendations that have a number of different layers of complexity for government to try to digest and determine who best is able to respond, and the work that was led on the domestic violence front is a really good example of that. Sometimes it takes a while for consideration of who is best able to lead and where the responsibility and the onus, at the end of the day, really lies.

In anticipation of a representative's report being received, there is some amount of discussion within government about where the primary responsibility lies and which ministry and which minister is going to speak to that. In conjunction with that specific report, it's the Ministry of Health and the Health Minister that has assumed the role as a lead on the overall response, which is a new experience for MCFD, frankly.

[1005]

Usually it's MCFD that's in the lead. When we're in the lead, the way we see it, there are certain component parts that we're primarily responsible for. Then there is some amount of cajoling and accountability that is required for us to make sure we have enough to go forward and speak with the representative about when we have our regular meetings.

The representative's office, to give them credit, in conjunction with their report being delivered, often visit with a number of ministers and a number of deputy ministers in order to let them know what's in the report and what is relevant to them, and then their ministry decides who's going to lead. So sometimes some of those meetings that Cory has referenced may have senior leaders from other ministries participating in them too.

I might comment further and tie back to the Chair's question regarding where some of the challenge might be with some of the other players, including the federal government. For example, the most recent report that we received….

The third recommendation from the representative's office — I'll read it quickly:

"That the Ministry of Children and Family Development, in consultation with delegated aboriginal agencies, the Ministry of Education and Aboriginal Affairs and Northern Development Canada, ensure that special needs services are provided to First Nations children and youth living on reserve on at least an equal basis with all other children in a manner that is effective and responsible to the needs of children and youth."

And then there are nine paragraphs that are listed as details beneath that, with the request that "the effective provision of CYSN services for First Nations children and youth living on reserve will be implemented no later than October 1, 2014."

That's a challenging recommendation, one of six recommendations that are associated with that report, and recommendations 1, 2 and 3 are equally as big. So in conjunction with the previous report that the representative delivered to government, which MCFD is responding to and whose recommendations we accepted, I sought out a meeting with the assistant deputy minister responsible for child welfare in Aboriginal Affairs and Northern Development Canada. I'm looking forward to having a discussion with that assistant deputy minister next week.

In the meantime, we have had the benefit of receiving this further report. The Chair's question is right — ensuring that there is impetus and importance placed on the recommendation and the work beneath it, and that it is taken as seriously in other places as it is at MCFD, can be a challenge. At the same time, I'm aware that the representative's office and the representative herself have been in touch directly with some of those same officials in order to perhaps lay down some of the foundation for us to further those discussions.

So in conjunction with this recommendation, and given the process that Cory has laid out, what we have before us, as Cory noted…. We reviewed recommendations associated with this report at our ministry executive this past Monday. It takes that amount of time between when the report is released and now to digest it, to question some of the assumptions and ask for clarification.

And then there is a question about taking a recommendation like the one that I described to you: "Okay, now, exactly who among our leadership team, small but mighty though we might be…? Who's going to do that?"

Who exactly is going to be responsible for coordinating a meeting with the Ministry of Education and working with delegated agencies and then hopefully being able to encourage or inveigle INAC to come forward in order to discuss special needs services on reserve? Everyone on the provincial government side would certainly agree it's an area of federal responsibility in which MCFD is already occupying some amount of space.

[1010]

That makes a very challenging working environment, keeping in mind that again, as I noted before, our primary responsibility is delivering services under our existing six service lines. So that October 1, 2014 date is
[ Page 163 ]
a challenging date.

Part of our discussion with the representative's office will really be about what, on a practical and realistic and even ambitious time frame, is likely to happen on or before or shortly after October 1.

C. James (Deputy Chair): A second area just to touch on is, again, the follow-up to the reports and perhaps its definition, as well — looking at definition.

When you talk about closing out a report with the representative's office, does that mean that you have agreed, then, on the strategy and the action plan that's in place? Or does that mean that there is agreement that the implementation has happened? It's a question around definition.

Following up on that, and perhaps this will come in our discussion later in the morning when we're talking about data, is the follow-up that occurs after a recommendation has begun implementation.

Just to give you an example, a recommendation that more plans of care be implemented, in a report that had come in the previous time — is there tracking then that happens after that has been implemented, where you say: "Yes, we're going to put more plans of care in place"?

Is there tracking then to say, "We've improved; year over year we can show that we've done a better job of plans of care or specialized residential beds," for example? What kind of follow-up is then done with your data-collecting and with your performance measures to show that these recommendations actually have occurred?

C. Heavener: The first question around the close-out reports. The close-out reports, as I mentioned, are a process where we would put together the actions that we've taken in response to a recommendation and send that forward. Then there would quite often be discussion of: have we met what the intent of the recommendation was or not?

Our intention is to have a discussion with the representative's office by providing the close-out report, saying: "We think we're there. Do you think we're there?" So it's not as simple as: we send the report saying we're done, it's closed out, and we leave it at that. It does have further discussion.

The ones that are awaiting response now — we're looking forward to hearing back on what the status of that is. It may be that we need to go back and do a bit more work on some of those. But it's more, I think at this point, of a process where we start to say: "This is where we think we're at. We think we're at the end of the completion of those recommendations."

M. Sieben: The vice-Chair is right. I think some of that we'll be able to cover in our performance management discussion. However, generally, for things such as completion rates for plans of care, those have a monitoring process that we have in place in any event.

It's just a matter of us being able to make sure that we can provide the information when it's called upon, not only for our own needs but also within the timelines as suggested by the representative's office. Or if we can't meet those timelines, then we can identify what the reason is and what we're going to need, whether that is information, a resource or more time, in order to work within that.

J. Thornthwaite (Chair): I'm seeing no other questions.

In closing out, I just wanted to summarize. The Clerk had recommended this with regard to Darryl's question. If you could just provide us, in writing, with some more specifics that Cory had referred to, then we can get back to you if there's anything more that we need. Then we don't have to get you to come back for another meeting. Just give us something in writing. Then Darryl can take a look at it, we can follow it up in writing, and that can be given to the group. That might simplify the process.

The other thing, just before we move on. I really do appreciate your "small but mighty" leadership team and all the work that you have to do. There's a lot that you do that isn't actually rep-related. You've got a whole other part of your job.

My question is a little bit more provocative. I appreciate that the rep is doing her work with the other jurisdictions, the federal government, etc. But there seems to be…. In a lot of these reports a lot of the jurisdiction requirements are actually not ours, meaning the province's. They're somebody else's.

[1015]

I'm looking forward to the rep really, really reaching out to those other jurisdictions, including the federal government.

But from your perspective, the ministry's perspective, do you have an ideal, crystal-ball plan for working with these other jurisdictions, or any recommendations on what you would like to see in terms of working with them?

M. Sieben: The response likely varies a little bit, subject to what the topic or the theme is.

J. Thornthwaite (Chair): I see.

M. Sieben: I think on the whole, the representative's office and MCFD share many of the same goals, particularly when it comes to improving services for First Nations children on reserve. The federal government, whether it's the Health Ministry or whether it's the Aboriginal Affairs Ministry, is simply a central player to those discussions.

There are discussions that occur cross-jurisdictionally
[ Page 164 ]
regarding the bilateral agreements that are in place in other provinces that aren't in place here in B.C. and how we can work more effectively to try to achieve what we think is deserved here for First Nations children and families on reserve. That is work I think we have a common interest in. The recommendations are sort of helpful in that regard.

We're aware that while the recommendations aren't…. The representative's office is an entity of provincial jurisdiction. Their energy and the representative's effort to make sure that there is discussion, relevance and awareness amongst federal officials of the nature of the need — frankly, not only here in B.C. but in some other places — is extremely useful to all of us and gives a strong platform from which discussions from deputy ministers and ministers can proceed.

Some of the other themes that are more cross-jurisdictional in nature, such as the interprovincial protocol…. There are a number of tables by which that can be facilitated. The provincial directors of child welfare table is a longstanding forum that I used to sit at and that Cory now sits at. That is focused primarily on child welfare policy and practice initiatives.

In addition to that, there is a deputy ministers of social services table that gets together, let's say occasionally. It had a history of not meeting for a period of about four years but in the last couple of years has met twice. Similarly, there is a table of social services ministers, both PT and FPT, that Alberta is the current lead of. I know there is discussion amongst the ministers regarding sharing interests on common social services matters.

I should note that part of the challenge is the breadth of duties and responsibilities of ministries and governments on the social services side. The social services deputies table and the ministers table encapsulate everything from jobs and training to income assistance to Community Living B.C. to child care to child welfare. All of those are topics that you can invest any number of weeks in exploring.

[1020]

Something, for example, like the federal government's somewhat recent decision regarding the labour market development agreement ends up becoming a primary focus to many social services deputies across the province, given the change in service delivery and given the dollars that are attached to that. The different topics necessarily compete for time amongst those decisions.

Those are some of the primary forums which we use in order to make the points that arise from the representative's report a little bit more real.

J. Thornthwaite (Chair): You've given us a better appreciation of your work as well. I appreciate that.

Carole's got a quick follow-up, and then we'll move on.

C. James (Deputy Chair): I want to say my appreciation, as well, to staff in the ministry. I know one of the first requests of this committee was to receive updates on a regular basis on the recommendations from the representative's office and how the ministry was dealing with them.

I appreciate that it takes some work until we get into a regular routine, but I think the fact that the committee has asked for this a couple of times a year will ease the workload in bringing it forward, once we start catching up on the recommendations. I think it provides good accountability for us as committee members around the reports and a better understanding of how those are implemented.

I appreciate it and look forward to this being now a regular occurrence a couple of times a year, where we're able to look at the recommendations and get updates on them and do the follow-up that needs to be done. So thank you.

J. Thornthwaite (Chair): Thank you very much. It's been very enlightening for us, for the committee, and I really cannot overstate the committee's thanks.

Why don't we take a little break, just because we're going to change a few chairs, and maybe grab a coffee. Maybe three minutes, and then we'll come back. Is everybody okay with that?

The committee recessed from 10:21 a.m. to 10:31 a.m.

[J. Thornthwaite in the chair.]

J. Thornthwaite (Chair): Mark, we'll just throw it over to you, and you can continue on.

M. Sieben: Sure. This is a big piece of work, as I noted in the introduction, that continues to evolve. I would note that it's also a project of a lot of interest to the representative's office. They continue to provide advice regarding shape and form and look for MCFD to do more in this area, as they should.

At the same time — as we'll cover from a Canadian jurisdiction perspective, in any event — we're thinking we're providing more than exists anywhere else. That isn't a platitude or a platform from which we should get laurels; it simply gives us an opportunity to do better. As Martin will note, there are many points in which we can and should and will do better in terms of having data and development of outcomes across all of our six service lines.

With that said, I'll turn it over to Martin to lead us through the delivery of the second version of the Operational Performance and Strategic Management Report, and identify when we anticipate getting the third volume.
[ Page 165 ]

MCFD Operational and
Strategic Directional Plan

M. Wright: Thank you, Mark.

Good morning, committee members. I'm very happy to be back to talk a little bit more about the performance management system within the Ministry of Children and Family Development and, particularly, the Operational Performance and Strategic Management Report. I know it's rather a mouthful. It's certainly connected with the strategic operational and directional plan, but they're two distinct things. I just wanted to make that point.

The minister had made a public commitment, as you'll no doubt recall, some time ago that we would publicly report, twice annually, on performance indicators related to the ministry. We've committed to do that — once in the winter and again in the summer. We've done two reports to date, and as Mark mentioned, we're busily working on the third. We anticipate that will be ready for publication sometime later this winter.

The objectives of the report. Really, public accountability, of course, is number one. We want to be able to tell the public what we're doing and how we're doing it. Numbers 1 and 3 there really are related to public accountability, where we facilitate the public understanding of the work of MCFD as well as inform people about what we do and how we do it.

The other key objective too, of course, is that we want to improve. We want a culture of continuous improvement within the ministry. We have one now. We want to continue to foster that. This report certainly is conducive to that end. That's really a central thrust of this report.

Before we get into some of the content, I just wanted to give you a little graphic here of what we're really talking about with performance management within the Ministry of Children and Family Development. All of our service lines are included. The green boxes there — each one represents one of our six service lines. At the top there you'll see "Operations" written and at the bottom, "Outcomes."

[1035]

These are the two dimensions on top of the six service lines and the geographic depiction of our data both provincially and across our service delivery areas. "Operations" refers to things internal to the ministry. So we might look at utilization rates. We might look at unit costs or particular aspects of practice that are preferred measures of practice.

On the outcome side, this is where we really start to look at the well-being of the children, youth and families that we serve. The two are certainly related, but we draw that distinction. Really, all this slide talks to is that we're taking a comprehensive approach to performance improvement within the ministry. We're not focusing just on one service line in particular or aspects of outcomes in particular. We're looking at lots of things.

That said, we are further ahead in some areas than we are in others. While we have information and some activities around performance improvement in each of the service lines, in some where we're blessed with more data, we're able to produce more performance indicators. As has always been the intention with this report, of course, it's evolving. It's being developed and implemented incrementally over time.

I just wanted to provide some content here. Members, of course you have the actual report in front of you as well as the slide deck. These are just some of the highlights that I've looked at. If any of the members want to talk about any of the other elements of the report, I'd be happy to do so.

When we last spoke about this in November, placement stability was of interest to the members. So I homed in on this one first of all. It's a key performance indicator for the ministry, and this is what I would call an outcome performance indicator.

We understand that when a child comes into care, stability of placement is key for not only the security and feeling of safety of that child but also an opportunity for attachment. If a child is going to remain in care, an attachment to a caregiver is fundamental to well-being.

Amongst the indicators of placement stability is looking at the first year of care. The reason for doing that is that about half of all moves occur within the first year of care. Then the other half occur usually for children who have already moved within that first year of care. So the idea here is that if we can focus on children when they first come into care, within that first year, and we can provide placement stability for those kids, we prevent additional instability later. So we can have the largest impact in that first year.

Here we've got one indicator but two data points, one representing each of the two reports. In red there, that's the first report. April 1, 2012, to September 30, 2012, is the time period. In the blue is the second report, the six-month period after that, from October 2012 to March 2013.

What we're looking at here, first off, on the left-hand side is "did not move." Those are the children within their first year of care that did not move. They have placement stability. So you can see there for the initial report, the first report, that number was 67.6. That increased by 1.1 percentage points to 68.7. So that's good. That's what we want to see.

The children that moved once declined slightly, as did the children that moved twice or more. Again, not unexpected of course, because the children that did not move increased. So this in and of itself is an important performance indicator and is telling us that between those two reporting periods, the ministry moved in the right direction. Clearly, with a ministry as large as ours, we're not going to get huge shifts in this indicator. It's going to be incremental.
[ Page 166 ]

I wanted to show this to you as well. This is something which you won't find in the report because we've really just done this, but it will probably be in the next one. I wanted to show you how this is performing over time, this particular indicator of children that did not move in their first year of care.

[1040]

We do some work back at the ministry with our service delivery areas in providing them with additional information related to the indicators that you'll see in the report. When we trace this back to April 2011 to October 2013 — two and a half years — you'll see that there's some volatility. But overall, the proportion of children in care in their first year who had placement stability is increasing. Again, from a performance improvement standpoint this is the kind of thing that we would want to see.

I do also want to point out that the other reason for showing you this is that when looking at performance indicators in a ministry like ours, sometimes we get some small numbers involved in some of our performance indicators. We're not the Ministry of Health; we're the Ministry of Children and Family Development, so we deal with far fewer cases, in many respects, in our programs. So you do get volatility.

If we're going to draw great conclusions about how we're doing, we need to be able to distinguish between: what is a change in performance — be it good or not — and what is volatility? What would we normally see?

That's one of the reasons why we've mapped the data from month to month — to see over a longer period of time what's actually happening. Can we say, with some statistical certainty, that things are indeed getting better?
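[Editor's note: as an illustration of the distinction the witness draws between a real change in performance and ordinary month-to-month volatility, here is a minimal Python sketch that fits a simple linear trend to monthly indicator values and reports the slope with a rough standard error. The monthly figures, and the method itself, are hypothetical examples for illustration only, not MCFD data or the ministry's actual analysis.]

# Illustrative sketch only: separating a sustained trend from
# month-to-month volatility in a performance indicator.
import math

# Hypothetical monthly proportions (%) of children with placement
# stability in their first year of care, April 2011 onward.
monthly = [66.1, 67.4, 65.8, 66.9, 67.2, 66.5, 67.8, 68.0, 67.1,
           68.3, 67.6, 68.1, 68.7, 67.9, 68.4, 69.0, 68.2, 68.8]

def trend_with_uncertainty(values):
    """Least-squares slope per month, plus a rough standard error,
    so a sustained trend can be separated from ordinary noise."""
    n = len(values)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(values) / n
    sxx = sum((x - x_mean) ** 2 for x in xs)
    sxy = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, values))
    slope = sxy / sxx
    intercept = y_mean - slope * x_mean
    residuals = [y - (intercept + slope * x) for x, y in zip(xs, values)]
    resid_var = sum(r ** 2 for r in residuals) / (n - 2)
    slope_se = math.sqrt(resid_var / sxx)
    return slope, slope_se

slope, se = trend_with_uncertainty(monthly)
print(f"Estimated trend: {slope:+.3f} points per month (std. error {se:.3f})")
# A slope well outside roughly +/- 2 standard errors suggests a real
# improvement rather than month-to-month noise.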

Here's another indicator that we've used. This one is more on the operational side, an example where we're looking at child and family services. We have a service called family development response, which is one response to a protection report. It's a collaborative response, working with families in order to provide services quickly, predominantly to families on the lower-risk end of the continuum. The idea is that we can provide services quickly to families to keep children safe without the need for a more intrusive and less cooperative child protection investigation.

On the right-hand side there, if I can draw your attention to the squiggly lines on the right-hand side of that graph, in the blue is the proportion of protection reports that we investigated. So we did do a comprehensive and fairly intrusive investigation. The green is the family development response. There is a break there. It represents a little bit of data loss because of the switchover to ICM in April 2012. But you can see that we've got data past then.

You can see very clearly, then, the change in the practice, where we're doing fewer investigations and more family development response. That's good. That's what the ministry wants to be able to do. It wants to do more FDR and fewer intrusive investigations.

Not to say that we'll end up doing no investigations. We don't. We still do. You can see there that we do. About 20 percent of all child protection reports are investigated, but only 40 percent of those investigations have a finding "in need of protection." So maybe we don't need to do as many labour-intensive, intrusive, trust-squashing investigations with families. Perhaps we'll get further with FDR, and we know that to be the case.

One of the indicators that we have in the performance management report, which you'll also see in the service plan, is the actual ratio of FDR to investigations. You can see there that that number is 2.2 over the period of October 2012 to March 2013. In other words, we did more than twice as many FDR responses than we did investigations. The pie chart there shows you the actual numbers. That's an example of an operational performance indicator.

This is where we sort of need to look at two sides of the coin in terms of what's an operational indicator and what's an outcome indicator. I'm going to show, again, the family development response and how this now flips around to an outcome indicator for children, youth and families.

We've just talked about how the family development response numbers have been increasing dramatically — more than 20-fold since the inception of this program. But at the same time, how has that affected children in particular?

[1045]

One way to look at that is to see how many of those children, after an FDR has closed, actually come back into the child welfare system. We get another report, but this time we have to do an investigation because it's serious, and we find there is in fact a need for protection. This happens because no service here is foolproof. Circumstances change, or we have breakdown.

You can see there — this one runs from April 2011 right up to March 2013 — that in April 2011 just over 8 percent of FDRs did in fact break down. We ended up doing an investigation and having a finding in need of protection. But by March 2013 you can see that's 4.9 percent.

Again, this is what we would expect to see in the ministry. This is what we want to see. We want to see that recurrence rate of need for service and protection decline. So for family development response, this is that. Again, there will be some volatility from month to month, so we've looked at it over a period of time to just try and make sure that in fact that is a real change and not just natural volatility.
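[Editor's note: a minimal sketch of how a recurrence indicator of this kind might be computed, assuming a simple record of closed FDRs and of subsequent investigations with a protection finding. The record layout, family identifiers, dates and the 12-month follow-up window are assumptions for illustration, not the ministry's actual definitions or data.]

# Illustrative sketch only: share of closed family development responses
# followed by an investigation with a finding of need of protection.
from datetime import date
from collections import defaultdict

# Hypothetical closed FDR records: (family_id, fdr_close_date)
closed_fdrs = [
    ("F001", date(2012, 5, 10)),
    ("F002", date(2012, 6, 2)),
    ("F003", date(2012, 6, 20)),
]

# Hypothetical investigations with a protection finding: (family_id, finding_date)
protection_findings = [
    ("F002", date(2012, 11, 15)),
]

FOLLOW_UP_DAYS = 365  # assumed follow-up window

findings_by_family = defaultdict(list)
for family_id, finding_date in protection_findings:
    findings_by_family[family_id].append(finding_date)

recurrences = 0
for family_id, close_date in closed_fdrs:
    # A recurrence is any protection finding within the window after closure.
    recurred = any(
        0 < (finding_date - close_date).days <= FOLLOW_UP_DAYS
        for finding_date in findings_by_family[family_id]
    )
    recurrences += recurred

rate = 100.0 * recurrences / len(closed_fdrs)
print(f"Recurrence rate: {rate:.1f}% of closed FDRs")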

Another example of an outcome indicator, of course, is adoption for children and youth in care, particularly for the continuing custody orders I'm talking about now — permanent wards. Ideally, of course, we would like to place all children that are eligible for adoption. Sadly, that's not the case.
[ Page 167 ]

Looking at the two reports we've produced so far, we've looked at the proportion of those children that are eligible for adoption and how many of those were actually placed over a 12-month period. In the red you'll see the initial report. We reported that 13 percent of those children were placed. Then we've looked at another 12-month period, between April 2012 and March 2013, where you'll see that 12 percent of those children were placed. At the bottom there I've just provided you with how that looks over time.

This is an area in the ministry where we know we would like to redouble our efforts and increase the proportion of kids that are eligible for adoption who can be adopted. This is an area we'll be looking at in the future and expecting to see some improvement.

Youth justice — another service line. I wanted to share with you what we have in the report for youth justice. Again, it's just a highlight of some of the key indicators that we have there. Recidivism — the youth that commit a new offence or in this case do not commit a new offence. We've looked at three elements of youth justice.

Diversion, which really is outside of the sentencing structure, is an agreement between the Crown and the youth for them to agree to do some other community service, perhaps. It's a community-based approach. You'll see there "not committing a new offence" — those numbers are very high. In 2006, 69.8 percent of the young offenders did not commit a new offence within the following five years. That increased to 71.4 percent in 2007.

First community sentence — that is inside, of course, the sentencing structure. It's a first sentence and community-oriented. We had 49.2 percent in 2006, and that increased to 49.6 percent. Again, these are the percentages of the children or the youth that do not commit a new offence within five years.

Then we have the first custody sentence, which are the most serious offences and the most entrenched cases. You would expect those recidivism numbers here to be lower, and they are. In 2006, 19.3 percent of those youth within a first custody sentence did not commit a new offence within the next five years. That increased to 20.7 percent in 2007.

[1050]

That's sort of the edited highlights, if I can be so crass as to say that, of what's in the report. There is a plethora of indicators in there. We're working on more for the next report. In particular, we have some additional education indicators for children and youth in care. We hope to be able to take that to our executive later on this winter for the third report to be published later this year.

I'll be happy to field any questions.

J. Thornthwaite (Chair): Thank you very much, Martin.

D. Donaldson: Thank you for the report. I like charts and graphs, so I've got a few questions for you. I know these are highlights that you've picked out of the main report, although, just quickly, I don't see…. Is this kind of chart work in the main report too — the ones you've highlighted here? Is that in the main report, in the same chart format?

M. Wright: The data are in there. For the purposes of today, I did present them slightly differently, just so that it would work on the PowerPoint. But the data are in there. You will find them in there.

D. Donaldson: I'll just do one question. Then I'll come back on round 2.

The charts around the performance indicators about stability, placement stability — is there a goal? That's not marked on here. I mean, is the achievement…? What is the indicator that the ministry has in that placement stability around the number of moves in the first year?

I recall a report from the Representative for Children and Youth on not being able to ascertain the number of times a child moves while in care, not in the first year. Is that information in the report? Is that being able to be tracked now — the number of times, when the child is in care, they've moved not just in the first year of care but overall?

M. Wright: I'll do the second question first. For the number of times a child is moved, we are tracking that. That information is available to our workers and everybody in the ministry in our corporate data warehouse. That's fairly new. Yeah, that's new within the last few months.

M. Sieben: On the question of whether or not the ministry has a target for that measure — at this point, no. Part of the rationale for that is there hasn't been agreement or comfort, with some, in actually establishing targets on a number of child welfare dimensions.

I like targets, so we'll be introducing, increasingly, targets associated with child welfare performance. At this point, through the two reports that we have right now, what we have is a desire to do better. There isn't an intention, in my mind, to suggest, through the first two bars, that we're doing well. I don't accept, frankly, that 68.7 percent is an indicator of doing really well. We can do better.

Also, something to keep in mind is that no matter how well we do on that particular indicator, each of those numbers is associated with a child. For the "moved twice or more" graph, as small as it is, those are individual cases that in the absolute sense still need to be treated in a focused, concerted effort.

D. Barnett: What is the average length of time, provincially, from when somebody applies to adopt a child to
[ Page 168 ]
when the actual adoption takes place? Do you have any provincial averages on that?

M. Wright: Yes. Do I know what they are? We can certainly provide that, yeah.

[1055]

D. Plecas: Martin, if we could just flip to the chart on the recidivism rates….

M. Wright: For youth justice?

D. Plecas: Yes.

I guess I found those numbers rather disturbing. I mean, I know what great progress we've made in youth justice in B.C. relative to the rest of Canada. It's a pretty impressive history, but this chart here is not inspiring, with an 80-plus-percent recidivism rate on a first custody sentence. That's pretty much off the charts in terms of recidivism. In effect, the vast majority of people given their first custody sentence are recidivating — and there's an unusually high recidivism rate on diversion. I guess, for me, there are two things. I think it's more….

Looking at the rest of your performance indicators and thinking how impressive they are, I'm not sure that this adds credibility — for them to be presented in this way. I would call this minimizing the reality of the situation. It would be more honest if we were presenting them in the reverse. But I guess the question for me is: what is going on, or more properly, what is not going on with those kinds of recidivism rates? Somebody is not being helped here.

I'm reminded of what I know about recidivism rates for the worst of the worst of the worst offenders in this country, and it is not a quarter of that.

M. Wright: I think, certainly in this particular chart, what's missing that's essential context is the number of youth that have actually been incarcerated.

I wonder if I could ask the members to turn to page 69 of their reports. You'll see a table there of the number of clients receiving first custody sentence services, and in 2007 the number of clients is 184.

Certainly, that's a lot of youth, but for the context of what that actually means, on the next page, page 70, in the graph at the bottom there, you'll see the rate of youth in custody per 10,000 youth in British Columbia and Canada. The graph runs from 2000-01 to 2011-12. There you can see the decline in the custody rates. Not only can you see the decline within both British Columbia and Canada, but you'll see that British Columbia has less than half the Canadian rate of youth in custody.

I think the member makes a very good point in that nobody wants to see youth reoffend, but particularly with the first custody sentence youth, of course, these are perhaps the more challenging cases to deal with. I'm not an expert on youth justice, and I would defer to my colleagues in youth justice on that. But just in terms of context around the numbers, overall, the numbers are very small, and within the Canadian context, British Columbia has the lowest rate of incarcerated youth.

C. James (Deputy Chair): Thank you for the information. I think this is going to be very helpful, particularly, as you say, over time. When this starts coming out on a regular basis, I think it'll be important information.

I have a couple of questions. One is just around how you make a decision around which indicators to choose. I think that's key in any kind of data that's collected. It's very easy to collect certain kinds of data for certain kinds of outcomes. I wonder if you could talk a little bit about the process around choosing indicators.

[1100]

Certainly, in looking through this, I can think of a couple of different…. There's caseload, which by itself doesn't necessarily indicate good or bad practice but does have one piece of information, or perhaps personal contacts between workers and clients. Again, we've talked about that as an important indicator around a connection for the young person or for the person in care. I wonder if you could speak about that.

Then the second question I have…. And it may come back to your point at the beginning that you're starting to gather more information. As you do, these will become more robust and you'll have more information to put in.

On the youth mental health, for example, you have client satisfaction questions included in there — helpful information, I think vital information. Is there a plan to expand those kinds of questions? I would think of children leaving care, for example — kids aging out — satisfaction questions, questions around how things have gone. Those kinds of client questions, I think, could really add to the information that's being gathered here. I wonder whether that was a start in youth mental health or why you picked that area.

M. Wright: I'll perhaps begin with the question around how indicators are determined, and then we'll go from there.

We have a number of principles that we agree to in the ministry on indicator development. I should, as well, point out that included in this particular report, in addition to performance indicators, is a description of the numbers of children, youth and families that we serve, as well as expenditures and budget. Not all of them are what we would call performance indicators. They're there for contextual purposes.

Some of the principles that we have for performance indicators are a number of things. First of all, we want them to be based in evidence so that when we're choosing a particular indicator, there's a strong rationale there
[ Page 169 ]
in the secondary literature that it actually makes a difference for kids. For example, if we're looking at numbers of moves in care, which we've been talking about, or looking at recurrence rates of maltreatment or looking at educational indicators, we want to be sure that we're covering the right things and in the right way. So evidence-based is really key.

Another principle is that we want people to be able to have buy-in to these. We want not only the program areas but stakeholders as well to be able to see, in those indicators, that they actually have relevance. Ultimately, what we're trying to do here is just inform our practitioners of what's actually happening so that we can improve. There needs to be buy-in for that. Program areas are involved and stakeholders are involved in looking at what the indicators are. We have a table for each service line that helps us to look at and evaluate each of the indicators.

A third area is what we often call SMART, which are measurable and achievable indicators, really. That's just trying to ensure that what we do come up with is actually measurable. Often it's the case that we would love to have a number of indicators for which we'd have no data, so it becomes either something that we look at for the future or that we put aside. Those, in essence, are some of the principles that we have.

Another that I think is really important in a report like this, when we're producing them every six months and want to inform our practitioners, is that we want to have indicators that are responsive to practice. Often it's the case in performance indicators that you have to wait three years or something to see if something's changed. Now, I've just shown you one here in youth justice where we're talking about five years in that case.

In many respects, what we've done in other indicators throughout the report is we've tried to make them responsive to change so that we can pick up on what's actually happening out there, so people can see their practice reflected quickly in the indicators. That was something that was important to us. That's essentially how we've selected the indicators.

Moving on now to child and youth mental health in particular, again, we have a process for selecting indicators. We have a number that we would love to look at. We know, for example, that anxiety is hugely important in child and youth mental health, as is depression, self-regulation — those kinds of things.

[1105]

We have some data on those when children present. We don't have enough data on them, really, to be able to say: "Okay, what happened after six months of treatment?"

We're working on strengthening the data at the front end when children present so that we know some of the conditions and their functioning. As well, we're looking at ways to strengthen, after a period of time of receiving treatment, what is actually happening to those children.

That's the direction we're trying to move in. There is a fair bit of change management that needs to take place on that. Some of that involves information systems and changes to information systems.

One example is some wait-list information around child and youth mental health. Some of those systems changes are occurring right now. We hope to be able to start looking at some of those data later on this year.

Things are happening in child and youth mental health. It's an area that we know is vitally important. Unfortunately, we don't have enough information right now, but we will. We will get there, I'm confident of that, in the near future.

In terms of doing client-satisfaction surveys and so forth, I think it's very important, and I think it's been tried a number of times in the ministry. It's something which we're going to look at again, in particular for children and youth who are leaving care. That's something which we will bring back to the table in determining what the performance indicators are and how we would actually do that. That particular item is being looked at and is up for discussion.

M. Karagianis: I want to ask a couple of questions here on your adoption numbers. First of all, on the chart that you had displayed to us…. You've got the "Eligible for adoption" chart here. The blue line is from April 2012 to March 2013. The orange line is from October 2011 to September 2012.

I mean, those overlap, actually — those data-collecting periods. I'm curious as to why they're not the same — the April to March appears to be a fiscal year — and why those two overlap rather than show us clear indicators of that.

Also, I noted in the larger report here…. If adoptions have dropped by about 19 percent, then your chart on page 65, where it says the proportion of children aged 12 and older eligible for adoption has been increasing — those children are aging within that graph. Again, what it says is that children are not being adopted at an earlier age. Over the course of this time period they're obviously aging into that specific demographic.

Just trying to sort this information out to make it really relevant. What are the challenges, if any, that we should be looking at around these adoptions and what that data says to us?

M. Wright: A very astute observation around the numbers. That's great.

The reason, very simply, is that the reports are over a six-month period. In many of these indicators, we're comparing six months with the next six months and so on and so forth.

With respect to adoptions, we're looking at a 12-month rolling average — a rolling number — over every six
[ Page 170 ]
months. The reason for that, believe it or not, is that there is some seasonality in the data. There is in much of our data.

Some of that relates to how work is performed and how children and administration go through the system. When we looked at putting this one together, we found that when we compared the latter six months with the former six months, it really didn't compare. In order to be able to make comparisons, we're just sticking with a 12-month total, and every six months that'll be renewed. So there will be some overlap each time.

[1110]

From a performance perspective, looking at that over time, it's not going to compromise the measure at all. If anything, it'll probably dampen any change that's occurred. But it's not really going to cause us any challenges. Really, it's an artifact of the nature of the data, which is why we did that, so that we have comparable numbers. Thank you for the question.
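[Editor's note: a minimal sketch of the overlapping 12-month reporting windows the witness describes, using hypothetical monthly adoption-placement counts. With 24 months of data and a window refreshed every six months, the last two windows happen to line up with the two reporting periods the member raised, and each full-year total absorbs the seasonal swings visible in the raw monthly counts. The numbers are invented for illustration only.]

# Illustrative sketch only: overlapping 12-month totals refreshed every
# six months, to damp seasonality in monthly counts.

# 24 hypothetical monthly adoption placements, April 2011 to March 2013.
monthly_placements = [
    10, 8, 7, 6, 5, 9, 12, 11, 14, 13, 9, 8,    # year 1 (seasonal swings)
    11, 9, 8, 7, 6, 10, 13, 12, 15, 14, 10, 9,  # year 2
]

WINDOW = 12   # months per reporting window
STEP = 6      # a new report every six months, so windows overlap

for start in range(0, len(monthly_placements) - WINDOW + 1, STEP):
    window = monthly_placements[start:start + WINDOW]
    print(f"Months {start + 1:2d}-{start + WINDOW:2d}: "
          f"{sum(window)} placements over 12 months")
# Consecutive windows share six months of data, which is the overlap the
# member observed; comparing like-for-like full years smooths the
# seasonal pattern that a six-month-to-six-month comparison would not.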

M. Karagianis: Maybe Mark could address the other piece.

M. Sieben: I'll give it a shot. I'm going to make a couple of quick points on the broader issue regarding what the ministry intent and responsibility is to use data as it becomes available, and then I'll speak specifically to adoption permanency and what we're looking to do here.

First, it's one thing to be able to…. To a certain extent, it's a luxury of riches. We have a lot of information. But to the former point of actually being able to take the information and make decisions regarding how best to frame up specific indicators or outcomes by which we and others can judge our performance, it's a different sort of challenge. That's part of what we're looking to do here.

What you see in this report is a mix of both, frankly — some areas where we have the benefit of doing a lot of work over time where we can give some specific outcomes, and others, frankly, are more data-based because we haven't got there yet.

Part of the accountability that Martin referenced speaks to accountability at a provincial level; it speaks to our accountability through the representative's office, perhaps through this committee; and externally. Some of the accountability, however, also happens inside MCFD, not only at an executive table but within our service delivery areas.

How we use this information — that, too, is something that's beginning to evolve. You see within the report that some of the tables, at least, have some information that is specific to service delivery areas themselves, and that's purposeful. Each of the slides, in fact, that Martin has shown…. There is the availability of SDA breakdown so that then we can meet with our executive directors of service out of the SDAs and present the variances.

My experience is…. A former boss of mine used to call these integrity reports. They're not necessarily a yes or a no because there are often really good reasons why there is some amount of variance. But it's of interest to ask one executive director of service why their FDR recurrence to a finding of a child needing protection rate is better or worse by some degree than another's. Maybe that has to do with staffing and allocation — something a guy like me can handle. Maybe it has to do with the model of supervision that they've got going on within the region. But the point is that it's one thing to have the information. It's another to actually use it in order to effect change. And that's what Martin is trying to get after.

On the adoption side, it's linked to those broader points that I noted. Part of the accountability is presenting information where we're not necessarily pleased with the trend lines that we've been showing and are looking to correct a deviation in that trend. That's part of the purpose as well.

We are looking forward to a report from the representative's office pertaining to adoption some time later this spring. It's going to speak more to some of the work that's been going on in adoptions and some of the work that hasn't been going on and that needs to.

In my mind, between 2006 and 2011-ish, frankly, there was some amount of drift in terms of priority of adoption as an area of focus. Providing this amount of opportunity and being accountable for it allows us, then, to really focus on the need to refocus and re-energize an area that obviously needs that, both from the trend line but, more importantly, from the point of view of an individual permanency adoption caseworker and the kids that are on those caseloads.

[1115]

That's part of what's on display here. It's our intent, when we're providing information, to do so inside the organization, make decisions about how we're going to have to readjust or recalibrate or reorganize our resources, and then speak to that in public and semi-public forums such as this.

J. Martin: Thank you so much for taking us through this. If I could just momentarily return to the issue around youth justice. We know that dispositions for youth are often heavily influenced by what resources may be available. I'm wondering if there are some jurisdictional concerns, where certain territories, certain parts of the province, may be able to make the case that they're underserviced in terms of having these community resources…. I wonder if those, to some extent, may be driving custody rates in some jurisdictions.

M. Wright: Very good question. I'll take it back, and I'll ask the question of our youth justice program area, and we'll have an answer for you.

M. Bernier: Thanks again for all this information.
[ Page 171 ]
I was browsing quickly through the entire operational strategy here and looking at all the different graphs. There's a lot of useful information there, especially comparing around the province, region to region, some of the differences. I think that's quite useful. I want to look through it, obviously, a lot more thoroughly.

I'd like to talk about slide 8, if I could, around the adoption. One of the things on that, which I noticed, is…. If you look at 2007-08 right in the middle of the graph there, you see that blip. So in 2003 it's high. We're going down, and all of a sudden it's a jump straight up. Are you able to tell me: is that because of…? Was there an internal policy change, a direction around adoption, a legislative change around adoption? Or do you know if it was maybe actually a public campaign around awareness of adoption?

When I see something like that quickly jump up, I want to know why, because that's what we want to replicate. So I'm just curious if you have any knowledge around that.

M. Sieben: Your question is specific to 2003 — why the numbers are so high there?

M. Bernier: Well, in 2003 they're high. They start dropping down, and then in 2007-08 they go right back up again. I was just curious if you remember or know why, when you did the study of…. An explanation, you know. When you look at a graph technically, there's a trend. When you see a blip in that, I'm kind of curious why.

M. Sieben: Your instincts are right, and it's part of what I was speaking to with the previous question. That's exactly what happened in 2003 and 2004. There was a concerted effort for a certain period of time in order to focus on doing more adoptions. Then the same thing happened with the use of, if I can recall right, some money that became available, which we used strategically in order to up those numbers.

Then there was a focus for a duration of a year or two. It was not sustained. Therefore, the question becomes: how best do we re-sustain, and how do we continue to sustain the opportunities that are there?

In one of the previous member's questions we talked about whether or not certain outcomes related to practice might benefit from having targets. I believe they do. Adoption is an area where I think there is opportunity for discussion about what an appropriate target for numbers of adoptions and permanency should be.

[1120]

That's something we'll be looking at in determining what is going to be necessary within our allocation in order to do that. Every three weeks or so I have a video chat with team leaders. I do that according to function in the ministry. I haven't gotten to everybody yet, but so far I've met with the intake team leaders, integrated team leaders, child and youth mental health team leaders and permanency team leaders. Then last week was the resource office, the guys who look after foster homes.

Part of the intent here is that we're able to use those forums to tell them what they're seeing, because they're the ones experiencing it in real life. Maybe they didn't see the graph, but they've probably felt what the work was like for the last two and a half years. So this validates that, and then we can talk about where we can go based on what resources the ministry has and what type of priority we want to put on the work.

You're exactly right that there was specific effort that was done in 2003-04 and in '07-08 in order to bring adoption numbers up. The challenge now is a way to reinvigorate and then sustain.

M. Bernier: Thanks for that. I'm sorry if some of that was answered already while I was getting absorbed in the numbers. To me it's really important, because when we see successes we want to be able to capitalize and, hopefully, repeat and make it better. So I really appreciate that, because to me it's important. Are we looking at policy? Are we looking at legislation? Or are we looking at just awareness? Those are different things sometimes.

M. Sieben: May I add further to that? Part of that is the challenge. None of it is easy work, but there's a….

We're talking about adoptions here of children in ministry care — right, Martin?

M. Wright: Yes.

M. Sieben: We're not talking about international adoptions. We're talking about children who are available for adoption in the Ministry of Children and Family Development. Each of those children either has been relinquished for the purpose of adoption — and those are few, frankly — or is in care under a continuing custody order.

As of the end of December there were something like 8,160 children in the ministry's care, which is down historically from trend lines or from the past 15 to 20 years. Our trend lines for kids in care have gone down. Inclusive of First Nations' aboriginal children, the number is slightly down. The proportion of First Nations' aboriginal children in care is higher, though, because our decline in the children-in-care population is significantly more for our non-aboriginal children in care.

Amongst our continuing custody caseload are, I think, 4,200-ish of that 8,100. More than 50 percent of the children in care are continuing custody kids, for whom the eventual plan may be adoption.

Martin's slides there show that there's somewhere a little over 1,000 who are available for adoption at this point. Many of those kids are First Nations' aboriginal kids. The challenges associated with permanency planning for First Nations kids and adoptions are something that we're sensitive to.
[ Page 172 ]

While all of those kids are absolutely deserving of a permanency plan, adoption within the construct in which it developed — sort of normative child welfare — is difficult for many First Nations communities to absorb, and rightfully so, given the history of child welfare practice in a number of those jurisdictions.

Part of our work is developing new forms of adoption services and programming that will work within that context, different forms of permanency that can provide care to those kids.

That's a bit of a story behind some of the numbers. I didn't want to make it sound like it was easy to set a target and just go ahead and do it. There's a lot more behind it than that. We have to take into account the nature of the kids and the cohorts that we do and who we're going to work with in those children's communities in order to accomplish that.

J. Thornthwaite (Chair): Last question goes to Doug.

D. Donaldson: I had a question on the recurrence rate of FDR to investigation with protection finding…. The slide on page 4. You mentioned perhaps over time you'll have a better idea if this is natural volatility, this improvement, or what factors are at play. But I have a question about perhaps other factors.

[1125]

On the previous slide you mention the switch to the ICM system. I'm curious as to whether…. It appears to me that the data in the chart that I'm talking about or the graph has the same period as the ICM switch. Was there any different way of either collecting, reporting or entering the data under the ICM that would perhaps account for this improvement as well, or not?

M. Wright: The trends that you see there are not because of a change in information system. With ICM, there was a change, of course, in how data are entered into a system, and there was a change in some of how the work was done and what was counted. But those changes are not reflected in this graph. So what you see there, in February 2013, is comparable to what you see in April 2007.

D. Donaldson: Okay. It's the next graph that I was talking about. It's not reflected in that one, right?

M. Wright: In the recurrence rate?

D. Donaldson: Yeah.

M. Wright: No. That change is not in any way determined by changes in information systems.

J. Thornthwaite (Chair): Okay. Thank you very much for coming in and answering our questions. We look forward to ongoing dialogue and also for the follow-up that the members had requested. What we're going to do is just take a brief break so that you can depart.

Then we've got another item, just to have a brief discussion on our special project and our next meetings. So let's recess now for two minutes to allow our guests to leave.

Thank you very much for coming.

The committee recessed from 11:27 a.m. to 11:31 a.m.

[J. Thornthwaite in the chair.]

Update on Proposal for
Youth Mental Health Project

J. Thornthwaite (Chair): We just have one other item to briefly cover, and then talk about the next meeting. Everybody's got a handout right now, just kind of a brief update. I just wanted to get a show of hands, so to speak, on moving forward.

The next step is to actually determine who the individuals are that we are going to have come to our special project on youth and mental health. I think what we decided last time, and the vice-Chair can confirm this, is that we are in the process right now of receiving input from you, from the Chair and from the vice-Chair. I think the rep has probably also provided us with some suggestions. But the decision, if you're comfortable, would be made between the Chair and the vice-Chair. I'm getting nods.

We do have a preliminary list, both myself and the vice-Chair, if you want to take a look at it, but we just thought that to simplify things we wouldn't take that time today to go through it.

Interjection.

J. Thornthwaite (Chair): Okay, good. I'm seeing lots of nodding heads. The Chair and the vice-Chair will meet soon to discuss that, but as you'll see in your handout here, we do have an option — a couple of options, actually — of the format. I just kind of wanted to go over what you thought about those two options, option A and option B. The Clerk has cautioned us that it is recommended that if we are going to have youth appear in an in-camera session, we provide professional services for that youth, along with their guardian.

Did you want to comment on that?

K. Ryan-Lloyd (Clerk of Committees): Just very briefly, our recommendation, as the Chair has noted, would be that we would work with a pediatric psychiatrist to ensure that we have the benefit of professional advice with respect to determining the competence of any youth who have an interest in proceeding. Prior to their in-camera proceeding we would ensure that we have parental and/or guardian consent to their participation and
[ Page 173 ]
that the youth would be fully informed and understand, to the best of their ability, what the context of a parliamentary committee proceeding is.

Although we would have the consent in hand and also the services of a professional, if required, to determine the competency, we'd also recommend a support person be there for the youth. It could be, again, the parent or guardian, but somebody to attend the meeting alongside the individual to ensure that the experience is one that is as positive as it can be for the youth in terms of the formalities of the proceedings.

We are looking at some best practices from other parliamentary jurisdictions who have had youth appear before committees, and we'll come back with a more detailed recommendation at your next meeting.

[1135]

J. Thornthwaite (Chair): Looking at option A and looking at option B, does anybody have any opinions either way?

C. James (Deputy Chair): I like option A. I think if we're going to bring in youth and families, you need a full day. It's almost impossible for us to do that in less than a full day. I think that to try and squeeze it into a half day and expect to do something else…. I think we need the full day, so I would recommend option A.

If we want to look at including youth council representatives, I think we just have to look at adding some more time on another day, but to mix the two is too much. You need a full day with youth and family and then your second day with your other witnesses, with groups and organizations.

The one other piece I'd add in here — and it's challenging because it's front-line staff — is an opportunity to be able to talk to people, both within organizations but also within government, who work with youth, to provide their feedback around what they're seeing.

I know there's sensitivity around all of that for staff who are within employment, but I think the opportunity to hear from people who are directly dealing with the challenges would be valuable for us as well.

For sure, option A — from my perspective.

J. Thornthwaite (Chair): Anybody else got any comments, suggestions?

D. Barnett: Option A and in Victoria.

J. Thornthwaite (Chair): In Victoria?

D. Barnett: Yes.

J. Thornthwaite (Chair): That's a good point. I thought we were going to do it in Vancouver.

M. Bernier: It says Vancouver right now.

D. Barnett: Does it?

K. Ryan-Lloyd (Clerk of Committees): I think that part of the consideration may have been that some of the preliminary groups that have been identified as potential expert witnesses or stakeholders, who would come to, particularly, the second day, are based in the Lower Mainland, although they are reflective, on the whole, of the entire province at this preliminary stage.

Certainly, once we have more certainty as to what the final list of invited witnesses would be, I mean, that would be a consideration — whether the meeting should be held in Victoria or Vancouver. But I think that's where the Vancouver suggestion came from originally.

M. Karagianis: Yeah. I think that around the issue of whether it's Vancouver or Victoria, it really needs to be about whether either choice is an impediment to having people come and if there are cost pressures or anything like that. We want to freely hear from people, so I think we should pick the location where people have the easiest opportunity, where it doesn't cost them to get there to talk with us.

As much as it's nice to have it here in Victoria, I think that Vancouver probably gives us a better opportunity to get people there than the larger number of people from Vancouver having to find their way here, pay for ferries or whatever and possibly even hotels. I think those are kind of unfair burdens to put on people who might want to come and participate in this. I'd prefer to see it someplace where it's just really easy for everybody.

D. Plecas: I just want to echo Maurine's comments.

J. Thornthwaite (Chair): Okay. That's what we did the last time. We did it in Vancouver, and I get that. So if there's not any overt….

D. Barnett: Whatever's easiest.

J. Thornthwaite (Chair): Whatever's easiest. Okay, so that's good.

Then, the last thing is just to confirm that we are also going to open it up to written submissions before and after. So if you are not the chosen panellist to come in, then you will certainly have an opportunity to provide input.

M. Karagianis: Well, I think that, again, is an opportunity that speaks to this issue of whether people can afford to travel or if it's even an available option for them and for people all over the province. As Doug was saying, the people in Stikine would probably like to participate in this, but they're unlikely to be able to afford to come down and do this.

I think that written submissions are a really important part of this, because then it does open it up to anybody in the province. They don't have to be in Vancouver in front of us in order to have their story heard. I'm very happy about that piece being included.

D. Plecas: In addition, perhaps I should have called attention to this earlier — the whole matter of youth in conflict with the law, and mental illness. I'm just wondering if we should somehow have, within what we do, an opportunity to visit, for example, a youth facility and/or talk to youth who are incarcerated. Again, just reminded of how big an issue that is across the system overall — youth or adults.

J. Thornthwaite (Chair): You're making a suggestion that, on another day, the committee do that?

M. Bernier: He's talking about a field trip.

J. Thornthwaite (Chair): On a field trip.

D. Plecas: Yes, as part of that.

[1140]

J. Thornthwaite (Chair): Okay. Obviously, when the vice-Chair and I meet with regard to the choice of individuals, we'll hash that out as well. But that's actually not a bad idea — to go on a field trip. I don't know how anybody else feels about that. On another day.

Okay. That's a good idea. Then the last thing is just to discuss the format of the proposed meetings — number, length of presentations. I thought if you've got any input, go to the vice-Chair or the Chair on all of these things so that we can at least make sure that we're encompassing what everybody on the committee thinks and we're representing each other.

Then we'll get together, the Chair and the vice-Chair, and make a decision for the next meeting. It's coming up in March.

If that is agreeable to everybody, then…. Okay?

Okay, do you want any more comments?

K. Ryan-Lloyd (Clerk of Committees): No. Just to note as well that Byron and Aaron, the committee research staff, have begun, based on the direction received from the Chair and the Deputy Chair, preparing a draft reading list on the topic of youth mental health as well as lists of proposed witnesses, both expert witnesses regarding youth and mental health and also some options for the youth and family witnesses who might appear on day 1.

Those are things that are actively on their desks at the present time. We hope to have an update to everybody by the next meeting.

Other Business

J. Thornthwaite (Chair): Then the last item. Any other business is the next meeting. Just to let you know that we do have one rep report that will be presented: Lost in the Shadows.

We do have another item about something that's going to have to come up in the next year, actually, about reviewing the terms of reference of the committee. But we can give you more information on that later, because it actually doesn't have to happen for a while. But given the legislative schedule, we have to plan ahead.

Then we'll give another update on the special project after the discussion that the Chair and the vice-Chair have had.

Our next meeting is Wednesday, March 26, nine to 12, in this room. I'm just making sure everybody has got that in their calendars.

If there's not anything else, then I would propose that we adjourn. So moved by Donna Barnett.

Motion approved.

The committee adjourned at 11:42 a.m.

