To its credit, the wonkosphere has been a-tizzy about the fact that the three Presidential candidates have all endorsed the scientifically untenable view that vaccination causes autism, or at least the idea that there is a genuine controversy. (Canada has had its own moral panic in regard to BPA, and our politicians have reacted with similar integrity and respect for science. Long story short: babies will be protected from a non-existent health problem by giving them breakable glass bottles.)
Inevitably, some smartass shows up in the comments box and points out that none of the empirical studies prove the lack of a link. Rather, they just don't demonstrate a link. And then the smartass inevitably says, "Absence of evidence isn't evidence of absence." (These are always the same dudes who tell you that "correlation isn't evidence of causation" and that Karl Popper is relevant to some matter at hand.)
B is evidence of A if p(A|B)>p(A). In other words, if your belief that something is the case is rationally stronger once B is known than it had been before, you have evidence.
Bayes' theorem tells us that p(A|B) = p(B|A)p(A)/p(B), where all quantities are greater than 0 and less than or equal to 1.
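To see the definition and the theorem working together, here is a quick numeric sketch. The numbers are made up for illustration (a prior of 0.1, a study that finds evidence 80% of the time when a link exists, and a 5% false-positive rate are my assumptions, not anything from the studies):

```python
# Toy numbers (illustrative assumptions, not real vaccine data):
# A = "the link exists", B = "a study finds evidence of the link".
p_A = 0.1              # prior probability that the link exists
p_B_given_A = 0.8      # a sound study would likely find evidence if the link exists
p_B_given_notA = 0.05  # chance of a spurious positive finding if there's no link

# Law of total probability for B, then Bayes' theorem for p(A|B):
p_B = p_B_given_A * p_A + p_B_given_notA * (1 - p_A)
p_A_given_B = p_B_given_A * p_A / p_B

print(f"p(A)   = {p_A}")
print(f"p(A|B) = {p_A_given_B:.3f}")
print("B is evidence of A:", p_A_given_B > p_A)
```

With these numbers the posterior jumps from 0.1 to 0.64, so B is evidence of A by the definition above.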
Let X be that there is a causal relationship between autism and vaccination. Let Y be that there is evidence of such a link after a number of methodologically sound studies.
There is evidence of absence if p(~X|~Y)>p(~X)
p(~X|~Y)>p(~X) if and only if p(~X|~Y)/p(~X)>1
By Bayes' theorem, p(~X|~Y)/p(~X)=p(~Y|~X)/p(~Y)
Therefore, there is evidence of absence if and only if p(~Y|~X)/p(~Y)>1 or, equivalently, if p(~Y|~X)>p(~Y)
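The chain of steps above can be checked numerically. Reusing the same made-up numbers as before (my assumptions, not data), the two ratios from Bayes' theorem come out identical, and no evidence raises the probability of no link:

```python
# X = "link exists", Y = "sound studies find evidence of it".
p_X = 0.1              # assumed prior
p_Y_given_X = 0.8      # assumed: evidence is likely if the link exists
p_Y_given_notX = 0.05  # assumed false-positive rate

p_notX = 1 - p_X
p_Y = p_Y_given_X * p_X + p_Y_given_notX * p_notX  # total probability
p_notY = 1 - p_Y
p_notY_given_notX = 1 - p_Y_given_notX

# Bayes' theorem applied to the complements:
p_notX_given_notY = p_notY_given_notX * p_notX / p_notY

left = p_notX_given_notY / p_notX   # p(~X|~Y)/p(~X)
right = p_notY_given_notX / p_notY  # p(~Y|~X)/p(~Y)
print(f"left = {left:.4f}, right = {right:.4f}")  # equal, per Bayes
print("evidence of absence:", p_notX_given_notY > p_notX)
```

Both ratios exceed 1, so absence of evidence is, here, evidence of absence.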
The absence of a causal relationship between autism and vaccination is never going to make it more likely that there will be evidence of such a relationship: p(Y|~X)≤p(Y), which is just another way of saying p(~Y|~X)≥p(~Y). So the only plausible case where p(~X|~Y) will not be greater than p(~X) is the boundary case where p(~Y|~X)=p(~Y).
In English, the absence of evidence is evidence of absence of a link or an entity when it is more likely that there will be no evidence if the link or entity doesn't exist than if it does. In better English, if we expect that something will have observable effects if it exists, and it doesn't have observable effects, it probably doesn't exist. The stronger our prior belief that something would, if it existed, have observable effects, the more absence of evidence is evidence of absence.
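The last sentence can be made concrete: hold the false-positive rate fixed and vary how likely evidence would be *if* the link existed. (All numbers here are my illustrative assumptions.) The stronger that expectation, the further a null result drives down the posterior probability that the link exists:

```python
p_X = 0.1              # assumed prior that the link exists
p_Y_given_notX = 0.05  # assumed chance of spurious "evidence" with no link

posteriors = []
for p_Y_given_X in (0.2, 0.5, 0.9, 0.99):
    # Total probability of finding evidence, then Bayes for p(X|~Y):
    p_Y = p_Y_given_X * p_X + p_Y_given_notX * (1 - p_X)
    p_X_given_notY = (1 - p_Y_given_X) * p_X / (1 - p_Y)
    posteriors.append(p_X_given_notY)
    print(f"p(Y|X)={p_Y_given_X:.2f} -> p(X|~Y)={p_X_given_notY:.4f}")
```

As p(Y|X) climbs from 0.2 to 0.99, the posterior p(X|~Y) falls steadily from the 0.1 prior toward zero: the more we expected the link to show up, the more its failure to show up counts against it.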
We should expect very, very, very small causal links out there in the world to have no detectable effects in ordinary studies. If exposure to the polio vaccine raised your baby's autism risk by one billionth, we'd never know. However, very, very, very small causal links between exposure to a chemical and bad health outcomes just aren't worth worrying about. To be more precise, no matter how risk-averse you may be, very small risks are not worth any cost, certainly not the cost of risking a polio epidemic. As the hypothesized risk gets larger, the evidence of absence from the existing studies gets stronger, so we could in fact state a level of risk we are highly certain (nineteen times out of twenty) is an upper bound on the actual risk.
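One standard way to put a number on that upper bound (not from the original studies, just a textbook approximation) is the "rule of three": if zero excess cases are observed among n subjects, an approximate 95% upper confidence bound on the true risk is 3/n. Bigger studies squeeze the bound tighter:

```python
# "Rule of three" sketch: with zero observed events in n trials, an
# approximate 95% upper confidence bound on the event rate is 3/n.
def upper_bound_95(n):
    return 3 / n

for n in (1_000, 100_000, 10_000_000):
    print(f"n = {n:>10,}: risk below {upper_bound_95(n):.2e} "
          "(with ~95% confidence)")
```

So a clean null result in a ten-million-subject study caps the plausible risk at roughly three in ten million, far below any level worth trading a polio epidemic for.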
You could say the same thing about God. If your conception of God makes it reasonably likely that, if God existed, there would be evidence, and there is no evidence, then that counts against your conception of God.