The City of Toronto plans to pay 100 people $100 each to
pose as homeless people for a day. Each person will first attend a 30-minute training seminar on how to convincingly look homeless. The reason: the decoys will act as a "control measure" during the city's upcoming survey of the homeless population.
I guess I don't sufficiently understand the methodology of conducting surveys, because this doesn't make any sense to me.
Comments
So they have the decoys stand in specific spots that are supposed to be covered by the headcount, and after the process is done they ask each decoy whether they were counted. In this way they'd get a survey of how accurately each area was counted - if five out of five decoys were approached, the headcount is more accurate than if some were missed.
But it still sounds weird.
http://www.torontosun.com/news/torontoandgta/2009/03/23/8848476-sun.html
...which states: "In an e-mail sent out last week to different social agencies and individuals, city employee Monica Waldman said they are looking for "tons" of people to sign up to be "decoys" on April 15, the night scheduled for the city's second homeless head count.
"Decoys are essentially 'faux' homeless people for quality assurance purposes in this research," she wrote. "As a decoy, you would need to come to a 30-minute training session and then be deployed to various sites throughout the city where you will wait to be approached by the research volunteers."
Each decoy will receive a $100 pre-paid Visa card as an honorarium."
The "decoys" will answer the survey in character - there's some first-class 'quality assurance' for you... and don't forget that a higher headcount means more allocation funds and more justification for the surveyors. Why don't they just give the money to real homeless people? Sometimes I don't think I'm cynical enough...
If, say, two-thirds of the decoys were found, you take the number of homeless people the surveyors counted, multiply it by 1.5, and you get a much more accurate count than if you hadn't corrected for the counting discrepancy.
More often it's arranged so that the "survey" gives a lovely, rosy picture, not reality!
As a student, my (now) wife once worked for a cheap market research company; I remember groups of us routinely sitting around filling in fake survey results because (surprisingly enough) no one on the streets wanted to stop and do a 40-minute interview in exchange for a crappy branded ballpoint pen or some similarly motivating incentive.
Most reputable research orgs will do call-backs to check that the reported survey respondents really were interviewed, but with only some homeless people having phones (and most probably being unwilling to be kept on a research company's database anyway) this would not be a reliable option.
For example, let's say that they did not have fake homeless out there. They do the count, and come up with 1,000 homeless. So, did they count them all? Who knows?
But now, let's say they seed the town with 10 fake homeless people. They again do the count and come up with 1,000 homeless - but 5 of those were fakes, i.e. half of the fake ones. Since half the decoys were missed, they can surmise the count caught only about half the real homeless too: 995 real ones counted, plus an equal number that were not counted, for a total of 1,990. They do the same amount of counting work, but get a much more accurate answer.
That is also one of the reasons for tagging animals. When they do the same survey of animal populations the next year, they see what percentage were previously tagged and can then make an estimate as to the total animal population in the area.
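The arithmetic in the example above can be sketched in a few lines. This is a minimal illustration of the decoy-correction idea, not the city's actual methodology; the function name and the assumption that decoys are found at the same rate as real homeless people are mine.

```python
def estimate_total(counted_total, decoys_planted, decoys_found):
    """Estimate the true population from a headcount seeded with decoys.

    Assumes decoys are spotted at the same rate as the real population,
    so the fraction of decoys found estimates the overall detection rate.
    """
    if decoys_found == 0:
        raise ValueError("no decoys found; detection rate is unknown")
    detection_rate = decoys_found / decoys_planted
    real_counted = counted_total - decoys_found
    # Scale the counted real population up by the detection rate.
    return real_counted / detection_rate

# The example from the comment above: 1,000 counted, 10 decoys
# planted, 5 of them found (50% detection rate).
print(estimate_total(1000, 10, 5))  # 1990.0
```

The same logic underlies tag-and-recapture estimates of animal populations: the tagged animals play the role of the decoys.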
I'm no statistician, but I don't think a mere 100 posed homeless controls would be much use in checking the efficiency of a headcount - not when the 2003 Housing and Homelessness Report Card reports that '31,985 homeless individuals (including 4,779 children) stayed in a Toronto shelter at least once during 2002.' That's too high a base rate for much statistical significance...
The survey, though, is likely to involve a relatively small number of fairly in-depth interviews, so the 100 controls *might* be enough. They would need to be appropriately seeded through the response population, of course. That seems to me more difficult to achieve: I assume it would need either prior recruitment of respondents or recruitment in specific places such as shelters, and either of these methodologies might well prejudice the findings. It's not an easy population to survey, though...
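For what it's worth, a rough back-of-the-envelope check of the "are 100 decoys enough?" question is easy to do: the uncertainty in the estimated detection rate depends on the number of decoys, not on the size of the homeless population. This is a normal-approximation confidence interval for a proportion; the numbers plugged in (80 of 100 decoys found) are purely hypothetical.

```python
import math

def detection_rate_ci(found, planted, z=1.96):
    """Approximate 95% confidence interval for the detection rate
    estimated from a decoy experiment (normal approximation)."""
    p = found / planted
    se = math.sqrt(p * (1 - p) / planted)
    return p - z * se, p + z * se

# If 80 of 100 decoys were approached, the detection rate is 0.80
# give or take roughly 8 percentage points:
lo, hi = detection_rate_ci(80, 100)
print(round(lo, 3), round(hi, 3))  # 0.722 0.878
```

So 100 decoys pin the detection rate down only to within several percentage points either way, which translates into an error of hundreds of people on a population of several thousand - consistent with the worry expressed above.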
And all this time I've been giving it away ...
Straw men and righteous indignation go together quite well with the anti-intellectual, common sense, plain talking folk crowd. This is a case in point; the issue isn't in spending $5000 to have people pretend they're homeless (5% of the 100K budget for a homeless survey). The issue should be about the efficacy of doing this survey in the first place. I don't have enough information to decide that, but hey! not my city. Commenting on and casting judgement on whether it should be done in the first place is fair game.
Casting derision on a statistical research method should be left to those who understand how numbers work.
Putting "decoys" out in the field and assessing how many of them get approached by fieldworkers increases the validity and significance of the survey. It. Simply. Does. I could point you to a page that talks about experimental design, but really, I doubt you'd understand it. It was hard for me when I studied it (and used it in my own research) back in the day.
Which is the real point here. On the surface, this seems like an easy outrage to attack. "How dare they pay someone to pretend to be homeless when there are real homeless people out there!" they decry. The irony of all this is that the very same whiny voices you hear, outraged by this seeming misuse of money, are the very same who will normally complain about the homeless in the first place. Talk about arguing both sides of the equation. A very thoughtful approach, that.
If you are uneducated in science, or mathematics, or experimental design, you really need to chill a bit and let those who do know a thing or two about those arcane issues do their job. No really, if you gave up on numbers or formulae or other complicated subjects in high school or university, you have lost your right to comment intelligently.
Notice I'm not commenting on spending $100K on surveying the homeless. That's an entirely separate issue. I can see both pros and cons to the approach, and am willing to let the experts in that field make their case.
I have to admit, though, that when I commented above I hadn't realized that the survey really is just a headcount... The methodology is not detailed, but that does raise again the question of whether 100 decoys is going to yield a verification which is statistically significant when the estimated homeless population exceeds 5000 (possibly by a very large margin if shelter usage rates are any guide). That's something it's not possible to calculate without knowing the full methodology...
Without knowing the methodology, it's also hard to tell if the survey money is being well-spent at all - but that's because it's hard to tell if the methods will provide an accurate finding rather than because of the cost.
One potential issue is that even if the decoy rate is sufficient, there is still the question of whether (as 'Homeless advocate Michael Shapcott of the Wellesley Institute' claims) 'a lot of homeless people don't look like homeless people.' Personally I wonder how true that is - but it looks like a hypothesis that needs testing before the headcount survey is done. It would only take a few such variables to raise the error margins on this survey to the point where it'll be useless.
If the survey is adequately designed (which I doubt), there can be little doubt it'll be money well spent. From a brief skim of the media, Toronto spends 'between $150 million and $300 million per year (depending on who you're listening to) on the homeless', and expensive proposals for new measures have been debated for at least the past couple of years. With this scale of spending, an accurate picture of the scale of the problem is well worth 100K: a difference of a few hundred in the homelessness rates will make a difference of many hundreds of thousands, even millions, in investment needs... and this is well within the sizeable margins of error on a mere 'educated guess' (look at the differences between the official homelessness estimates (around 5000) and shelter use rates (around 25000!)).
The above assumes acceptance of the idea that Toronto should invest taxpayer money in resolving the homelessness problem, of course, which I certainly accept but which a libertarian would largely deny:)