Note the sample method from their site:

"The survey was conducted by fax and online from April 18 to May 22, 2012. DPMAF obtained the office fax numbers of 36,000 doctors in active clinical practice, and 16,227 faxes were successfully delivered. Doctors were asked to return their completed surveys by fax, or online at a web address included in the faxed copy. Browser rules prevented doctors from filing duplicate surveys, and respondents were asked to provide personal identification for verification. The response rate was 4.3% for a total of 699 completed surveys."

That's not a scientific sample by any means. They faxed the questionnaire to doctors and asked them to respond. The sample was self-selecting, and we all know that people who are unhappy about something are more likely to complain. I'm afraid this survey is pretty worthless.
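As a quick back-of-the-envelope check, the 4.3% figure does follow directly from the two numbers the methodology section gives (16,227 delivered faxes, 699 completed surveys):

```python
# Figures quoted in the survey's methodology section.
faxes_delivered = 16_227
completed_surveys = 699

response_rate = completed_surveys / faxes_delivered
print(f"Response rate: {response_rate:.1%}")  # prints "Response rate: 4.3%"
```

So the arithmetic is internally consistent; the dispute below is about whether such a low, self-selected response rate can say anything about doctors as a whole.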
I don't know where you're coming from on Obamacare. When Santorum was running, you sounded like a libertarian. He attacked him as a big gov't Republican. But now you're defending Obamacare. Where's the consistency?
Grifman said:
"That's not a scientific sample by any means. They faxed the questionnaire to doctors, asking them to respond. The questionnaire was self selecting. And we all know that people who are not happy about something are more likely to complain. I'm afraid this survey is pretty worthless."

I don't care very much one way or the other about where doctors stand on this issue, but I'm not sure how this is functionally different from any other polling operation. Perhaps you can explain how this is "pretty worthless" in more detail.
1) They sent the survey to a group of doctors. How do we know that this group is representative of doctors as a whole?

2) The sample depended on which doctors actually answered. Again, how do we know that the doctors who responded are representative of doctors as a whole?

Pollsters go to great lengths to devise samples that reflect the group they are sampling. This poll does none of that.
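To make the second worry concrete, here is a minimal simulation (entirely my own illustration with made-up numbers, not anything from the actual survey) of how differential response rates distort a voluntary poll: if unhappy doctors are merely more likely to reply, the share of complaints among respondents can be far higher than the true share in the population.

```python
import random

random.seed(0)

# Hypothetical population: suppose only 30% of doctors actually hold a
# negative view, but unhappy doctors respond at 12% vs. 2% for the rest.
population = 16_227
true_negative_share = 0.30
p_respond_if_negative = 0.12
p_respond_if_positive = 0.02

responses = []
for _ in range(population):
    negative = random.random() < true_negative_share
    p = p_respond_if_negative if negative else p_respond_if_positive
    if random.random() < p:
        responses.append(negative)

measured_share = sum(responses) / len(responses)
print(f"True negative share:     {true_negative_share:.0%}")
print(f"Share among respondents: {measured_share:.0%}")
```

With these assumed numbers the respondents report a negative share of roughly 70%, even though the true figure is 30% — which is exactly why a 4.3% self-selected response rate, with no weighting or sampling design, can't be read as "83% of doctors."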
Grifman said:
"How do we know that this group is representative of doctors as a whole?"

It seemed to poll doctors who were actively practicing. That does exclude those who are temporarily (or permanently) no longer practicing. So you're right that it is not representative of all doctors everywhere, but, from what I understand, it is more representative of the kinds of doctors who are most affected by the legislation.

"2) The sample was dependent upon the doctors that actually answered. Again, how do we know that the doctors that actually responded were representative of doctors as a whole?"

Isn't this a problem for all polling data everywhere? Perhaps I haven't read as much literature on polling methodology as you have.

"Pollsters go to great lengths to devise samples that are reflective of the group they are sampling."

Well, those with lots of funding do, at least!
Thanks for your comment, Grifman.

I could be wrong, but I don't think this was meant to be scientific. Although, like Matt asked, I don't know that a survey or poll of this sort is necessarily entirely worthless. But I do agree these sorts of things can be self-selecting.

At the same time, I think it's not insignificant that many doctors are complaining. Even if it's a minority, there seems to be a more or less unified minority voice against the plan, at least from what I can tell. Anecdotally, I've had conversations with newly minted physicians who have (I think reasonable) concerns over the future of health care. A couple of them have even considered moving elsewhere to practice medicine if Obama's health care plan is fully implemented. For more anecdotal evidence, you can get a decent sampling if you check out the various forums at the widely followed and popular SDN. Check out the forum for a specialty like anesthesiology, for example.

Personally, I do think we need health care reform, but I don't agree with Obama's plan. That position has to be spelled out further, which I might try to do someday if I can find the time.
Ok, if it's not meant to be scientific, then saying that "83% of doctors consider quitting due to Obamacare" is very misleading, if not deliberately deceptive. It's no better than someone going and polling a bunch of Democrats and saying "95% of Americans support Obamacare". That's my point.
At least to the extent I understand how this works (and maybe I don't), a news report will sometimes use a striking headline to catch a reader's attention and draw them into the rest of the article, which is more specific.
I should add that the article itself implicitly recognizes the survey's limitations when it mentions that the AMA and AAFP wish to study the survey further before commenting on it. Also, the article contains a couple of other points I thought noteworthy besides the poll results themselves (e.g., the forthcoming physician shortage).
The survey aside, don't we know by now how badly the federal gov't administers programs, not to mention state gov'ts (e.g., California)? Based on ample precedent, can't we predict that Obamacare will be a boondoggle? How many examples of gov't mismanagement do you need before the light goes on?