First, I am not an expert on population genetics, so my posts should not be taken as gospel; anyone should feel free to point out any errors.
The latest critique addresses parts of Venema’s argument that the previous critique did not. Specifically, Gauger and Buggs address the linkage disequilibrium evidence, which uses the rate at which the genome gets shuffled by recombination to estimate past population sizes. From the EN&ST article, which cites Buggs:
“This study depends critically on knowing the recombination rate of the populations. Recombination rate is used both to calculate effective population size (from LD) and to estimate the time point that this is being measured for (from distance between loci). But the main method used to estimate the recombination rate by the authors is patterns of LD. Linkage disequilibrium patterns are also being used to calculate the effective population sizes given a known recombination rate. A degree of circular reasoning seems to be inevitable here. When the authors use a slightly different method to estimate recombination rate (which also relies upon measures of LD), all their estimates of Ne dropped by a mean of 27%. Thus, with the best will in the world, all we have here are ballpark figures for past effective population sizes. I am sure the authors of the study would not view their results as being of equivalent certainty to heliocentrism.”
So there are two things to unpack from that. First, they claim that recombination rates are determined from linkage disequilibrium (LD) data, and that this rate is then used to estimate population sizes from the same LD data. In other words, a circular argument. While I am not an expert and don’t know whether these are truly independent measures, there is another way to find the recombination rate: directly measure it in sperm and eggs. As it turns out, the two are in strong agreement.
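To make the circularity concern concrete, here is a minimal sketch of how LD gets converted into an effective population size. It assumes the classic Sved (1971) approximation, E[r²] ≈ 1/(1 + 4·Ne·c); the paper Venema references uses a more sophisticated treatment, and the numbers below are made up purely for illustration.

```python
# Minimal sketch: effective population size (Ne) from linkage
# disequilibrium, assuming the classic Sved (1971) approximation
#   E[r^2] ~ 1 / (1 + 4 * Ne * c)
# where r^2 measures LD between two loci and c is the recombination
# fraction between them per generation. Solving for Ne:
#   Ne ~ (1/r^2 - 1) / (4 * c)

def ne_from_ld(r_squared: float, c: float) -> float:
    """Effective population size implied by observed LD (r^2)
    between loci separated by recombination fraction c."""
    return (1.0 / r_squared - 1.0) / (4.0 * c)

# Illustrative (made-up) numbers: loci 100 kb apart, at a
# recombination rate of 0.01 per Mb -> c = 0.001 per generation.
c = 0.01 * 0.1      # 0.01 events per Mb * 0.1 Mb
r2 = 0.11           # hypothetical observed mean r^2

print(ne_from_ld(r2, c))  # ~2023, same ballpark as the paper's Ne of ~2000

# The circularity worry in a nutshell: Ne scales as 1/c, so if c itself
# were inferred from LD patterns, any error in c feeds straight into Ne.
```

This is why an independent, direct measurement of c matters so much, which brings us to the gamete data.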
Recombination rate used by the paper Venema references: "For a segment length of 25 Mb and an effective population size of 2000, the chosen input parameters equate to a mutation rate of 10^−8 per nucleotide and a recombination rate of 0.01 per Mb."
Recombination rate observed in human gametes: "Following the procedure described by Coop and colleagues, we localized crossovers at high-resolution in 68 nuclear families with at least two children and examined variation in fine-scale recombination patterns among individuals. We observed an average of 41.7 (40.2–43.3 95%CI) and 27.7 (26.9–28.4 95%CI) recombination events among maternal and paternal transmissions, respectively, in close agreement with published estimates."
That looks like a pretty good match to me. Venema’s reference used 0.01 events per Mb, which works out to 60 recombination events across a 6-billion-base (diploid) human genome. The observed rate in the other paper was 41.7 from mom and 27.7 from dad, for a total of 69.4. 60 vs. 69.4 is pretty darn good.
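The back-of-the-envelope comparison is easy to check. A quick sketch (my own arithmetic from the two quoted figures, not from either paper):

```python
# Comparing the simulation input rate (Venema's reference) with the
# rate directly observed in gametes (the pedigree study).

rate_per_mb = 0.01            # input parameter: 0.01 events per Mb
diploid_genome_mb = 6000      # ~6 billion bases across both parental genomes

expected_events = rate_per_mb * diploid_genome_mb
print(expected_events)        # 60.0

# Directly observed in 68 nuclear families:
maternal = 41.7
paternal = 27.7
observed_events = maternal + paternal
print(observed_events)        # 69.4

# Relative difference between the two independent estimates:
print(abs(expected_events - observed_events) / observed_events)  # ~0.14, i.e. ~14%
```

Two estimates arrived at by completely different routes landing within ~14% of each other is hard to square with the charge that the recombination rate rests on nothing but circular LD reasoning.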
Next, Buggs claims that using a different estimate of the recombination rate reduces the effective population size estimates. That isn’t the whole story. From the paper itself:
"The second method, which estimates recombination rates between each pair of adjacent HapMap markers from a model-free method that detects recombination hotspots from LD (Clarke and Cardon 2005; Visscher and Hill 2006), changed the estimate of Ne between +33% and −45% with an average reduction of 27% (mean Ne = 1901) when compared with that obtained from the first method (results not shown in Tables). "
A 27% reduction is the mean, but the alternative method also increases the effective population size in some instances, by as much as 33%. Buggs seems to imply that the second method always reduces the effective population size, but that isn’t the case.
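To put the quoted spread in perspective, here is the range of Ne values implied by the numbers in the quote (my own arithmetic, not the paper’s tables; note the +33%/−45% figures apply to individual estimates rather than the mean, so applying them to the mean is only illustrative):

```python
# The quote gives: mean Ne = 1901 after an average 27% reduction,
# with individual changes ranging from +33% to -45%.

mean_ne_method2 = 1901
mean_ne_method1 = mean_ne_method2 / (1 - 0.27)  # implied by a 27% mean drop
print(round(mean_ne_method1))                   # ~2604

# Applying the extreme per-estimate changes to that implied mean:
print(round(mean_ne_method1 * 1.33))            # ~3463 (the +33% case)
print(round(mean_ne_method1 * 0.55))            # ~1432 (the -45% case)

# Even the most pessimistic figure stays in the same order of magnitude.
# Switching recombination-rate methods shifts the ballpark; it does not
# overturn it.
```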
In my opinion, these criticisms really miss the mark, but I am more than willing to change my mind if someone can demonstrate any errors that I have made.