How an "expert" will dismiss the death data analysis
I got this email from one of my readers (Jeff) who explained why "experts" won't believe what the objective death data indicates.
The plan is to have a third party survey company ask people survey questions like:
Did anyone in your family die in the last 18 months?
When did they die (date)?
How old were they?
Did they ever receive a COVID vaccine prior to death?
From this data we can figure out whether the vaccine is safe or not. This is not opinion; it is objective data. The questions themselves carry no bias: you can't tell whether the survey comes from a pro-vax company, an anti-vax company, or a company that can't figure it out either way.
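To make concrete how such survey responses could be tabulated, here is a minimal Python sketch. All counts and the background vaccination rate are invented for illustration; a real analysis would need far larger samples and would have to adjust for age and other confounders.

```python
# Hypothetical tabulation of the survey responses described above.
# Idea: among reported deaths, compare the vaccinated share to the
# background vaccination rate in the population. Every number here
# is made up purely to show the arithmetic.

deaths = [
    {"age": 72, "vaccinated": True},
    {"age": 68, "vaccinated": False},
    {"age": 80, "vaccinated": True},
    {"age": 55, "vaccinated": False},
]

background_rate = 0.70  # assumed (hypothetical) population vaccination rate

vaxxed = sum(d["vaccinated"] for d in deaths)  # booleans sum as 0/1
share = vaxxed / len(deaths)
print(f"vaccinated share of deaths: {share:.0%} vs background {background_rate:.0%}")
```

If the vaccinated share of deaths were meaningfully above the background rate after age adjustment, that would be the signal the survey is designed to surface.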
“Jeff” wrote me:
I passed along the survey concept to "prove" the increase in all-cause mortality to see how it would be received by a Ph.D. in social sciences / marketing, someone I know personally who is also an expert in social statistics and who works close enough to vaccine proponents to indicate how they would likely reject my representation of what I think you have in mind. Below you will kindly find his response, which, if not addressed, is likely the kind of resistance you will encounter from the pro-vaxx side of the discourse.
That is the best I can offer you. I hope it saves time and money and increases the likelihood that your survey results are accepted both by anti-vaxxers and by those who support COVID vaccinations.
His friend wrote to him:
A social survey is an inappropriate method out of the box.
Social surveys about knowing others who had X happen after Y experience are 100% hearsay, but could look and feel "legitimate," which is an issue. It's called bias.
It won't likely fly in either law or psychological (or opinion) measurement. You might be able to flip it, asking beliefs about cov vax efficacy and safety, and then ask if they know anyone personally who they believe was injured by administration of the cov vax. That gives a measure of belief and the potential reason for the belief. Social surveys, e.g. opinion surveys, are good at getting at beliefs, poor at predicting behavior based on beliefs, and awful at medical diagnosis.
The method itself is inappropriate, even if you could ask valid and reliable questions.
Valid and reliable questions require at least three samples (of say 500) depending on how many constructs are included, and the indicators used to measure the constructs.
In my example above you would need to test at least 3-4 questions that measure beliefs about cov vax efficacy and safety (I suspect the questions are out there already tested by public health related centers, schools or health communications studies programs. Might go quick.) The questions about personally knowing someone you believe was injured by cov vax, would take some testing to get valid/reliable phrasing. Then you have belief.
Since the effort is geared from the outset to prove rather than test, it's biased in formulation. Another reason I wouldn't go that route. It's the exact same thing that the parties involved are alleging others are doing, ironically.
In other words, there will be people you will never convince no matter how hard you try.
I thought this was so absurd that I wanted to share it with you.
Coincidence is the leading cause of death these days…
If you want to make a difference then you will embrace accepted statistical methods to properly and objectively test a hypothesis.
I am not vaccinated and have taken graduate-level epidemiology and biostatistics courses, but your dismissal of accepted data norms does nothing but reduce your credibility in the eyes of people with an actual formal background in data science.
As much as you don't want to believe it, there is a proper way to look for this data, and the surveys you continue to discuss aren't part of the proper way to analyze this type of data.
Imagine how much you would shred the other side if they used a simple survey like yours to promote vaccination.