Tuesday, September 26, 2023

Retroaction is not quite a retraction

In a 2021 post, Cauchy priors, I made a gigantic blunder in misinterpreting a Twitter response to a question posed by Victor Chernozhukov about the mean E[X|X+Y] when X is standard Gaussian and Y is independent standard Cauchy.  I reformulated this as:  suppose Y|T ~ N(T,1) and T ~ Cauchy, what is E(T|Y=y)?  This is a standard Bayesian problem, with the idea of Cauchy priors going back to Jeffreys and explored more recently by Berger and others.  My blunder was elementary: I failed to remember that the normalizing factor for the conditional density depends on y.  Once this was fixed, I got the figure below.  To accentuate the flat portion of the posterior mean I've reduced the scale of the Cauchy to 0.1 rather than 1.  The interpretation of the figure is quite intuitive:  when y is near zero, and therefore in agreement with the prior, the posterior mean is aggressively shrunken toward zero.  However, when |y| is far from zero, the prior says, "well, that could happen," and the posterior mean eventually looks indistinguishable from y.
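For concreteness, the quantity being plotted can be written as a ratio of two integrals, with φ denoting the standard normal density and s the Cauchy scale (0.1 here):

```latex
E(T \mid Y = y)
  = \frac{\int_{-\infty}^{\infty} t \,\varphi(y - t)\,
          \frac{1}{\pi s \,\bigl(1 + t^2/s^2\bigr)} \, dt}
         {\int_{-\infty}^{\infty} \varphi(y - t)\,
          \frac{1}{\pi s \,\bigl(1 + t^2/s^2\bigr)} \, dt}
```

The denominator is the marginal density of y, the quantity whose dependence on y I originally neglected.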



There is a mildly amusing story associated with how I came to revisit this problem.  I have been reading a recent JPE paper, A/B Testing with Fat Tails, that employs Student t priors with low degrees of freedom in an essential way.  Having totally forgotten about the previous blog post, I proceeded to investigate how to compute this posterior mean, and not surprisingly my initial attempts faltered a bit, so I started to google around to see what was "out there" in webland.  Early on I found a nice paper by Guy Nason that dealt with the case of the Student t with 3 degrees of freedom.  It mentioned that there was a 1939 David Kendall paper that treated the Cauchy case.  This must have been written when Kendall was still a grad student.  It involves some quite exotic complex analysis, and among others cites a 1935 paper by Robert Oppenheimer!  If I interpret Nason correctly, the Kendall paper produces a "closed form" expression for the marginal density of a Cauchy mixture of Gaussians.  Kendall comments rather drolly that the expression isn't useful for computations because there was no tabulated version of the erfc function for complex arguments.  This lack has been rectified in the intervening years, but my attempts in R, and then in Mathematica, to check that Kendall's expression integrated to one failed.  Instead, the integral seemed to diverge slowly.  I would be grateful for any and all suggestions about this, but I rather expect that it is all lost in the mists of time.

Meanwhile, fortunately, it is easy to cook up a numerical version of the posterior mean solution that I will append here:

# Berger problem: posterior mean E(T | Y = y) with a N(t, 1) likelihood
# and a Cauchy prior with scale s
s <- 0.1
# unnormalized posterior density of t given data x
f <- function(t, x) dnorm(x - t, sd = 1) * dt(t/s, 1)/s
# normalizing constant, i.e. the marginal density of x
k <- function(x) integrate(f, -Inf, Inf, x = x)$value
# integrand for the posterior first moment
g <- function(t, x) t * f(t, x)
h <- function(x) integrate(g, -Inf, Inf, x = x)$value
# evaluate the posterior mean on a grid of y values
x <- -100:100/10
m <- x
for (i in seq_along(x)) m[i] <- h(x[i])/k(x[i])
png("Cauchy.png")
plot(x, m, type = "l", xlab = "y", ylab = expression(E(theta ~ "|" ~ Y == y)))
abline(c(0, 1), col = 2)  # 45-degree line: no shrinkage
abline(h = 0, col = 2)    # full shrinkage to the prior median
dev.off()
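For readers who want to verify the computation outside of R, here is a quick cross-check sketch in Python with SciPy's quad (function names like post_mean are mine, not from the post); it reproduces the same ratio-of-integrals calculation with the prior spike and likelihood peak flagged as quadrature break points:

```python
from scipy.integrate import quad
from scipy.stats import norm, cauchy

S = 0.1  # Cauchy prior scale, as in the R code above

def joint(t, y):
    """Unnormalized posterior density: N(t, 1) likelihood times Cauchy(0, S) prior."""
    return norm.pdf(y - t) * cauchy.pdf(t, scale=S)

def post_mean(y, lo=-50.0, hi=50.0):
    """Posterior mean E(T | Y = y) by quadrature on a wide finite interval."""
    pts = sorted({0.0, y})  # break points: the prior spike at 0, likelihood peak at y
    num, _ = quad(lambda t: t * joint(t, y), lo, hi, points=pts, limit=200)
    den, _ = quad(lambda t: joint(t, y), lo, hi, points=pts, limit=200)
    return num / den
```

The behavior matches the figure: post_mean(0.5) is heavily shrunken toward zero, while post_mean(10) is already very close to 10.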


