Confidence sets for a linear function $\mu + \lambda\sigma^2$ of the mean $\mu$ and variance $\sigma^2$ of a normal distribution are defined by inverting the uniformly most powerful unbiased (UMPU) level-$\alpha$ tests of the hypotheses $H_0(\lambda, m): \mu + \lambda\sigma^2 = m$ against the two-sided alternatives $H_1(\lambda, m): \mu + \lambda\sigma^2 \neq m$, $-\infty < m < \infty$, for fixed $\alpha$ and $\lambda$. These confidence sets are shown to be intervals whenever the number of degrees of freedom available for estimating $\sigma^2$ is $\geq 2$.
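To make the estimand concrete, the sketch below computes a simple Wald-type (asymptotic) two-sided interval for $\mu + \lambda\sigma^2$ from a normal sample. This is only a hypothetical illustration of the quantity being covered; it is *not* the UMPU-test-based construction of the paper, and the function name and setup are assumptions for the example.

```python
import math
import random

def naive_ci_mu_plus_lambda_sigma2(x, lam, z=1.959963984540054):
    """Wald-type interval for mu + lam*sigma^2 (illustration only;
    NOT the UMPU-test-based interval studied in the paper).

    z defaults to the standard normal 0.975 quantile (alpha = 0.05).
    """
    n = len(x)
    xbar = sum(x) / n
    # Unbiased sample variance, n - 1 degrees of freedom.
    s2 = sum((xi - xbar) ** 2 for xi in x) / (n - 1)
    # Under normality, xbar and s2 are independent with
    # Var(xbar) = sigma^2 / n and Var(s2) = 2 sigma^4 / (n - 1),
    # so a plug-in standard error for xbar + lam * s2 is:
    se = math.sqrt(s2 / n + lam ** 2 * 2 * s2 ** 2 / (n - 1))
    est = xbar + lam * s2
    return est - z * se, est + z * se

if __name__ == "__main__":
    random.seed(0)
    sample = [random.gauss(1.0, 2.0) for _ in range(200)]
    lo, hi = naive_ci_mu_plus_lambda_sigma2(sample, lam=0.5)
    print(lo, hi)
```

Note that such an asymptotic interval is always an interval by construction; the point of the paper is the less obvious fact that the exact sets obtained by inverting the UMPU tests are also intervals.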
"A Note on Two-Sided Confidence Intervals for Linear Functions of the Normal Mean and Variance." Ann. Statist. 1 (5) 940 - 943, September, 1973. https://doi.org/10.1214/aos/1176342514