01/06/2025: Substantially extended versions of two conference papers are here and here. The former establishes the asymptotic efficiency and Gaussianity of likelihood-based IMs via a novel possibilistic Bernstein--von Mises theorem, and the latter develops a variational-like approximation of the same possibilistic IM, simplifying the relevant computations.
10/03/2024: A new paper entitled Regularized e-processes: anytime valid inference with knowledge-based efficiency gains is available here and here. In it, I propose to boost a given e-process's efficiency by incorporating (incomplete) prior knowledge. This incomplete prior knowledge comes in the form of an imprecise probability, which is appropriately encoded as a regularizer and combined with the e-process to form a regularized e-process. The regularization penalizes incompatibility with the prior knowledge by inflating the proposed e-process, thereby boosting efficiency. A generalized Ville's inequality implies, among other things, that tests and confidence sets derived from regularized e-processes are anytime valid in a novel, prior knowledge-dependent sense.
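To convey the flavor of the regularization idea, here is a minimal toy sketch, not the paper's construction: I assume a standard likelihood-ratio e-process for Bernoulli data and a hypothetical regularizer that divides by a prior plausibility contour `prior_plaus`, so that hypotheses the prior deems implausible get an inflated e-value. All names and details here are my illustrative assumptions.

```python
def eprocess_lr(xs, theta0, theta1):
    """Likelihood-ratio e-process for Bernoulli data, testing H0: p = theta0
    against the simple alternative p = theta1 (a standard e-process)."""
    e = 1.0
    for x in xs:
        e *= (theta1 if x else 1 - theta1) / (theta0 if x else 1 - theta0)
    return e

def regularized_eprocess(xs, theta0, theta1, prior_plaus):
    """Toy regularization: inflate the e-process for hypotheses that the
    prior deems implausible, by dividing by the prior plausibility contour
    prior_plaus(theta0), a value in (0, 1]."""
    return eprocess_lr(xs, theta0, theta1) / prior_plaus(theta0)
```

With this form, low prior plausibility of the null strictly increases the e-value, so evidence against prior-incompatible hypotheses accumulates faster; a contour identically equal to 1 recovers the unregularized e-process.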
10/01/2023: A new paper entitled Valid and efficient imprecise-probabilistic inference with partial priors, III. Marginalization is now available here and here. This is a follow-up to the investigations started in Parts I & II described below. What's new here is a focus on marginal inference. I propose a general marginalization strategy for possibilistic IMs, one that relies on profile likelihoods and can accommodate partial prior information, if available. Validity properties are established and lots of illustrations are given.
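To illustrate the profile-likelihood ingredient, here is a small sketch, under my own simplifying assumptions rather than the paper's general construction: for a normal model with interest parameter mu and nuisance variance, the relative profile likelihood maximizes out the variance at each fixed mu.

```python
def profile_relative_likelihood(xs, mu):
    """Relative profile likelihood for the mean mu of a normal model,
    profiling out the nuisance variance via its MLE at fixed mu."""
    n = len(xs)
    xbar = sum(xs) / n
    s2_hat = sum((x - xbar) ** 2 for x in xs) / n  # global MLE of variance
    s2_mu = sum((x - mu) ** 2 for x in xs) / n     # profile MLE at fixed mu
    # Ratio of profiled likelihood at mu to the maximized likelihood;
    # equals 1 at mu = xbar and decays as mu moves away from the data.
    return (s2_hat / s2_mu) ** (n / 2)
```

This contour-like function, peaked at the MLE, is the raw material on which a marginal possibilistic IM for mu can be built; the validity step is a separate calibration not shown here.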
11/29/2022: A new paper entitled Valid and efficient imprecise-probabilistic inference with partial priors, II. General framework is now available here and here. This is a follow-up to Part I mentioned below. What I didn't do in Part I was explain how valid and efficient imprecise-probabilistic inference with partial priors can be achieved. This new paper describes a valid and efficient inferential model (IM) construction, which turns out to be practical, conceptually simple, and not so different from familiar methods. Very strong properties are established for this IM, and lots of examples are given.
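A minimal sketch of the kind of IM contour such a construction can produce, in the simplest vacuous-prior case and with my own illustrative choices: take the relative likelihood for a N(theta, 1) mean and apply the probability-to-possibility transform, pi_x(theta) = P_theta{ R(X, theta) <= R(x, theta) }, here approximated by Monte Carlo. This is a toy, not the paper's general recipe.

```python
import math
import random

def relative_likelihood(xbar, theta, n):
    """Relative likelihood for the mean theta of a N(theta, 1) model,
    based on the sample mean xbar of n observations."""
    return math.exp(-0.5 * n * (xbar - theta) ** 2)

def im_contour(xbar, theta, n, mc=20000, seed=0):
    """Possibilistic IM contour via the probability-to-possibility
    transform: the P_theta-probability that a fresh sample's relative
    likelihood is no larger than the observed one (Monte Carlo estimate)."""
    rng = random.Random(seed)
    r_obs = relative_likelihood(xbar, theta, n)
    count = 0
    for _ in range(mc):
        xb = rng.gauss(theta, 1.0 / math.sqrt(n))  # sample mean under theta
        if relative_likelihood(xb, theta, n) <= r_obs:
            count += 1
    return count / mc
```

The contour equals 1 at the MLE and decays toward 0 for implausible theta; thresholding it at alpha yields a confidence set, which is one way the strong validity properties manifest in practice.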
11/29/2022: A new revision of Valid and efficient imprecise-probabilistic inference with partial priors, I. First results is now available here and here. There I interpret the "no-prior" perspective taken by non-Bayesians in an imprecise-probabilistic way, as "every-prior". This simple adjustment creates an opportunity for meaningful unification between different schools of thought. Useful positive and negative results concerning validity and efficiency are also presented.
12/20/2021: Version 3 of the paper on an imprecise-probabilistic characterization of frequentist inference is available here and here. I show that what is typically referred to as "frequentist inference", i.e., hypothesis testing and confidence regions, is best understood in the context of imprecise probability, especially possibility theory. Moreover, I show that for every test or confidence procedure that provably controls error rates, there exists an inferential model that admits a procedure that's no less efficient. So, frequentism does have a rigorous uncertainty-quantification underpinning, not unlike Bayesianism; it just takes a different mathematical form.