Children's Minister Hilary Armstrong was due today to outline what could become one of Project Blair's most ambitious, misguided and hubristic projects yet. The Government will attempt to identify children at risk of failure, violent behaviour or criminality at birth, and take the necessary corrective actions to steer them onto a law-abiding and successful path.
Ironically, Armstrong is floating these proposals just as this same predictive approach to future behaviour patterns is becoming discredited. A couple of national newspapers, the Independent and The Observer, appear to have seen outlines of the plans. According to the Independent, midwives, doctors and nurses are to be "asked to identify 'chaotic' families whose babies are in danger of growing up to be delinquents, drug addicts and violent criminals." The plan will be backed up by "research" which "shows that children from the most dysfunctional families are 100 times more likely to abuse alcohol, commit crimes or take drugs", and a "source" close to Armstrong says: "It is the 'supernanny' model. There is no reason why midwives who ask mothers lots of questions anyway can't ask a few more about the family circumstances and identify families where there may be problems. We need to intervene early to stop the cycle that leads to social exclusion."
Actually there's every reason, as we will explain shortly. The point worth hanging onto here, however, is that there is a world of difference between taking action to deal with existing problems (which of course will usually grow if left alone) and predicting that problems will exist, and attempting to head them off via early intervention. If you're 'successful', how do you know for sure? And can you be sure that the 'clear' signs of developing problems you've identified have not to a large extent been generated by the monitoring systems you've put in place? This road leads into a swamp of junk science, hokum and voodoo, and down that road we find Tony Blair, who believes that it is possible to predict which two year old is going to turn out to be a troublemaker.
But Action on Rights for Children (ARCH) points out that evidence is stacking up against early intervention, and says: "Far from being, at worst, ineffective, a growing body of research suggests that it can actively do harm." ARCH produces an extremely useful breakdown of Government children's databases (which are far more numerous than you might think) in the form of a blog, and points to the contribution made to a recent LSE conference (Children: Over Surveilled, Under Protected) by Jean Hine of De Montfort University, who is involved in a five university ESRC-funded research programme into "Pathways into Crime."
In Hine's view the thinking underlying initiatives of this sort (which of course hadn't yet been announced) is approximately as follows. Policy focuses on children as 'becomings', i.e. they are to some extent blank sheets that will develop into... Well, the Government tends to think of them developing into one of two categories, good or bad. Children are viewed as "passive recipients" of stimulus and intervention, and assessment tools can be used effectively to identify future offenders. Intervention programmes themselves focus on perceived deficits and problems, and tend to ignore strengths. Intervention, the Government policy wonks think, can successfully divert children from the problem path to the 'normal' one.
Unfortunately for the Government (or more accurately, for the future generations now being herded into the labs), the output of the assessment tools is starting to look like voodoo, and in real life, when non-factual data (i.e., value judgments) is poured into data sharing systems, it breeds imaginative and semi-fictional narratives, and in the case of social work, invents whole cases.
The problem with prediction is that although it is possible to identify 'tell-tale' signs in actual offenders, the presence of these does not necessarily identify future offenders. Start with the real villains and work backwards, and the signs were all obviously there, but studies that start with the signs and work forwards don't end up separating the serial criminals from the law abiding. So yes, it may still seem 'obvious' that you can figure out what made people bad and go back to childhood and fix it, but right now you haven't been able to prove it. So stop experimenting on whole generations until you have proved it, OK?
The basic fallacies of the Government's data collection and sharing plans (in the case of children, the Information Sharing Index is central to these) were covered more broadly by the LSE conference. Eileen Munro of the LSE's Department of Social Policy described collecting data as "probably the least useful approach", and noted that although the Government describes information sharing as "the key to successful collaborative working", there is "no empirical evidence" to support this.
The information they're sharing, meanwhile, will become more junk-like as the boxes they need to check and the fields they need to fill in multiply. Social workers, police, anyone who's given the job of spotting early warning signs will feel the need to put something in the box, for all too obvious reasons. What's it going to look like in five years' time when some kid on your books gets beaten to death, and it turns out you didn't notice anything? The empty box clearly indicates negligence on your part. So the slightest, part-imagined 'signs' will go down, and the people you're sharing the data with will see this 'concern' flagged and put in some 'signs' of their own. And as Brian Sheldon, Emeritus Professor, University of Exeter and former director of the Centre for Evidence-Based Social Work puts it, once social workers decide people need visiting, "they need visiting a lot." Or as Hine says, "if you're looking for problems, you will find problems."
The cases will tend to build themselves, the effect much magnified by the 'share and deploy' approach, and they'll also tend to focus on the easier cases. The ones who're easier to get at and who're on the receiving end of self-generating warning signs will get lots of attention (despite quite possibly never having needed any in the first place), and quite possibly acquire real problems because of this, while harder cases of real need may not get any attention at all.
At ground level, midwives (and one presumes other professionals) are beginning to see the collateral damage of the Blair Project's data kleptocracy (Sheldon diagnoses this as symptomatic of a country suffering from obsessive-compulsive disorder). Some of the women midwives are dealing with have noticed that their histories can be taken down and used against them, and that it does not matter whether or not they have successfully coped, or are successfully coping with, whatever the problem might have been. If you tell someone, it will be flagged as a 'concern' and will breed more concerns, and turn you into a 'case'. So they're starting to withhold information, and as midwives and other professionals continue to ask "a few more" questions, people on the receiving end of the data kleptocracy will start to go underground.
Leaving systems built on junk science sharing junk data in pursuit of imaginary concerns and a pre-defined criminal underclass, while the rest of us hide. Welcome to virtual reality social work, welcome to Project Blair. ®