Transit fitting#
import exoplanet
exoplanet.utils.docs_setup()
print(f"exoplanet.__version__ = '{exoplanet.__version__}'")
exoplanet.__version__ = '0.5.4.dev27+g75d7fcc'
exoplanet includes methods for computing the light curves of transiting planets. In its simplest form, this can be used to evaluate a light curve as you would with batman, for example:
import numpy as np
import matplotlib.pyplot as plt
import exoplanet as xo
# The light curve calculation requires an orbit
orbit = xo.orbits.KeplerianOrbit(period=3.456)
# Compute a limb-darkened light curve using starry
t = np.linspace(-0.1, 0.1, 1000)
u = [0.3, 0.2]
light_curve = (
    xo.LimbDarkLightCurve(*u)
    .get_light_curve(orbit=orbit, r=0.1, t=t, texp=0.02)
    .eval()
)
# Note: the `eval` is needed because this is using Theano in
# the background
plt.plot(t, light_curve, color="C0", lw=2)
plt.ylabel("relative flux")
plt.xlabel("time [days]")
_ = plt.xlim(t.min(), t.max())
But the real power comes from the fact that this is defined as an Aesara/Theano operation, so it can be combined with PyMC3 to do gradient-based inference.
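To illustrate what that buys you, here is a minimal sketch (not part of the original tutorial) that differentiates the light curve with respect to the planet radius. It assumes the Theano(-PyMC) backend bundled with PyMC3 3.x is importable as theano; the variable names are ours for illustration.
import numpy as np
import theano
import theano.tensor as tt
import exoplanet as xo

# Treat the radius ratio as a symbolic variable so we can differentiate
# with respect to it
r = tt.dscalar("r")

orbit = xo.orbits.KeplerianOrbit(period=3.456)
t = np.linspace(-0.1, 0.1, 1000)
lc = xo.LimbDarkLightCurve(0.3, 0.2).get_light_curve(orbit=orbit, r=r, t=t)

# Gradient of the summed flux with respect to the radius, computed by
# Theano's automatic differentiation through the transit op
total = tt.sum(lc)
grad = theano.grad(total, r)

f = theano.function([r], [total, grad])
print(f(0.1))
This is the same machinery that PyMC3's NUTS sampler uses under the hood when it needs gradients of the log posterior.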
The transit model in PyMC3#
In this section, we will construct a simple transit fit model using PyMC3 and then fit a two-planet model to simulated data. To start, let's randomly sample some periods and phases and then define the time sampling:
np.random.seed(123)
periods = np.random.uniform(5, 20, 2)
t0s = periods * np.random.rand(2)
t = np.arange(0, 80, 0.02)
yerr = 5e-4
Then, define the parameters.
In this simple model, we’ll just fit for the limb darkening parameters of the star, and the period, phase, impact parameter, and radius ratio of the planets (note: this is already 10 parameters and running MCMC to convergence using emcee would probably take at least an hour).
For the limb darkening, we’ll use a quadratic law as parameterized by Kipping (2013).
This reparameterization is implemented in exoplanet as the custom PyMC3 distribution exoplanet.distributions.QuadLimbDark.
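For reference, the Kipping (2013) scheme samples two quantities (q1, q2) uniformly on the unit square and maps them to the quadratic coefficients as u1 = 2*sqrt(q1)*q2 and u2 = sqrt(q1)*(1 - 2*q2), which guarantees a physically valid limb darkening profile. Here is a minimal NumPy sketch of that transformation (the helper name is ours for illustration; QuadLimbDark handles this internally):
import numpy as np

def kipping_to_quad(q1, q2):
    # Kipping (2013): (q1, q2) uniform on [0, 1]^2 maps to physically
    # allowed quadratic limb darkening coefficients (u1, u2)
    sqrt_q1 = np.sqrt(q1)
    u1 = 2 * sqrt_q1 * q2
    u2 = sqrt_q1 * (1 - 2 * q2)
    return u1, u2

# Any point in the unit square gives a valid (u1, u2) pair
print(kipping_to_quad(0.5, 0.3))  # (0.424..., 0.282...)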
import pymc3 as pm
import pymc3_ext as pmx
with pm.Model() as model:

    # The baseline flux
    mean = pm.Normal("mean", mu=0.0, sd=1.0)

    # The time of a reference transit for each planet
    t0 = pm.Normal("t0", mu=t0s, sd=1.0, shape=2)

    # The log period; also tracking the period itself
    logP = pm.Normal("logP", mu=np.log(periods), sd=0.1, shape=2)
    period = pm.Deterministic("period", pm.math.exp(logP))

    # The Kipping (2013) parameterization for quadratic limb darkening parameters
    u = xo.distributions.QuadLimbDark("u", testval=np.array([0.3, 0.2]))

    r = pm.Uniform(
        "r", lower=0.01, upper=0.1, shape=2, testval=np.array([0.04, 0.06])
    )
    b = xo.distributions.ImpactParameter(
        "b", ror=r, shape=2, testval=np.random.rand(2)
    )

    # Set up a Keplerian orbit for the planets
    orbit = xo.orbits.KeplerianOrbit(period=period, t0=t0, b=b)

    # Compute the model light curve using starry
    light_curves = xo.LimbDarkLightCurve(u[0], u[1]).get_light_curve(
        orbit=orbit, r=r, t=t
    )
    light_curve = pm.math.sum(light_curves, axis=-1) + mean

    # Here we track the value of the model light curve for plotting
    # purposes
    pm.Deterministic("light_curves", light_curves)

    # ******************************************************************* #
    # On the following lines, we simulate the dataset that we will fit    #
    #                                                                     #
    # NOTE: if you are fitting real data, you shouldn't include this line #
    # because you already have data!                                      #
    # ******************************************************************* #
    y = pmx.eval_in_model(light_curve)
    y += yerr * np.random.randn(len(y))
    # ******************************************************************* #
    # End of fake data creation; you want to include the following lines  #
    # ******************************************************************* #

    # The likelihood function assuming known Gaussian uncertainty
    pm.Normal("obs", mu=light_curve, sd=yerr, observed=y)

    # Fit for the maximum a posteriori parameters given the simulated
    # dataset
    map_soln = pmx.optimize(start=model.test_point)
optimizing logp for variables: [b, r, u, logP, t0, mean]
message: Desired error not necessarily achieved due to precision loss.
logp: 24787.977771807484 -> 24793.5394256112
Now we can plot the simulated data and the maximum a posteriori model to make sure that our initialization looks ok.
plt.plot(t, y, ".k", ms=4, label="data")
for i, l in enumerate("bc"):
    plt.plot(
        t, map_soln["light_curves"][:, i], lw=1, label="planet {0}".format(l)
    )
plt.xlim(t.min(), t.max())
plt.ylabel("relative flux")
plt.xlabel("time [days]")
plt.legend(fontsize=10)
_ = plt.title("map model")
Sampling#
Now, let’s sample from the posterior defined by this model.
As usual, there are strong covariances between some of the parameters, so we'll use pmx.sample from pymc3-ext.
np.random.seed(42)
with model:
    trace = pmx.sample(
        tune=1000,
        draws=1000,
        start=map_soln,
        cores=2,
        chains=2,
        target_accept=0.9,
        return_inferencedata=True,
    )
Multiprocess sampling (2 chains in 2 jobs)
NUTS: [b, r, u, logP, t0, mean]
Sampling 2 chains for 1_000 tune and 1_000 draw iterations (2_000 + 2_000 draws total) took 25 seconds.
After sampling, it’s important that we assess convergence.
We can do that using the summary function from ArviZ:
import arviz as az
az.summary(trace, var_names=["period", "t0", "r", "b", "u", "mean"])
|           | mean   | sd    | hdi_3% | hdi_97% | mcse_mean | mcse_sd | ess_bulk | ess_tail | r_hat |
|-----------|--------|-------|--------|---------|-----------|---------|----------|----------|-------|
| period[0] | 15.448 | 0.002 | 15.443 | 15.452  | 0.000     | 0.000   | 571.0    | 878.0    | 1.00  |
| period[1] | 9.292  | 0.000 | 9.292  | 9.293   | 0.000     | 0.000   | 1705.0   | 1529.0   | 1.00  |
| t0[0]     | 3.503  | 0.006 | 3.493  | 3.515   | 0.000     | 0.000   | 647.0    | 1136.0   | 1.00  |
| t0[1]     | 5.121  | 0.001 | 5.119  | 5.124   | 0.000     | 0.000   | 2262.0   | 1601.0   | 1.00  |
| r[0]      | 0.039  | 0.002 | 0.037  | 0.042   | 0.000     | 0.000   | 725.0    | 1268.0   | 1.00  |
| r[1]      | 0.058  | 0.001 | 0.056  | 0.060   | 0.000     | 0.000   | 889.0    | 1501.0   | 1.00  |
| b[0]      | 0.667  | 0.049 | 0.568  | 0.739   | 0.002     | 0.002   | 516.0    | 762.0    | 1.00  |
| b[1]      | 0.403  | 0.038 | 0.333  | 0.469   | 0.001     | 0.001   | 947.0    | 1251.0   | 1.00  |
| u[0]      | 0.366  | 0.212 | 0.000  | 0.715   | 0.006     | 0.004   | 1204.0   | 909.0    | 1.00  |
| u[1]      | 0.289  | 0.347 | -0.323 | 0.873   | 0.012     | 0.009   | 759.0    | 576.0    | 1.01  |
| mean      | 0.000  | 0.000 | -0.000 | 0.000   | 0.000     | 0.000   | 2687.0   | 1394.0   | 1.00  |
That looks pretty good! Fitting this without exoplanet would have taken a lot more patience.
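If you want to automate this check rather than eyeball the table, one option (an illustrative sketch, not from the original tutorial; the thresholds are a matter of taste) is to flag parameters with a potential scale reduction factor far from 1 or a small effective sample size:
summary = az.summary(trace, var_names=["period", "t0", "r", "b", "u", "mean"])

# Flag anything with large r_hat or a small bulk effective sample size
# (thresholds here are illustrative, not prescriptive)
bad = summary[(summary["r_hat"] > 1.01) | (summary["ess_bulk"] < 400)]
print(bad if len(bad) else "No convergence warnings")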
Now we can also look at the corner plot of some of the parameters of interest:
import corner
truth = dict(
    zip(
        ["period", "r"],
        pmx.eval_in_model([period, r], model.test_point, model=model),
    )
)
_ = corner.corner(
    trace,
    var_names=["period", "r"],
    truths=truth,
)
Phase plots#
Like in the Radial velocity fitting tutorial, we can make plots of the model predictions for each planet.
for n, letter in enumerate("bc"):
    plt.figure()

    # Get the posterior median orbital parameters
    period_trace = trace.posterior["period"].values[:, :, n]
    p = np.median(period_trace)
    t0 = np.median(trace.posterior["t0"].values[:, :, n])

    # Compute the median of the posterior estimate of the contribution from
    # the other planet. Then we can remove this from the data to plot
    # just the planet we care about.
    lcs = trace.posterior["light_curves"].values
    other = np.median(lcs[:, :, :, (n + 1) % 2], axis=(0, 1))

    # Plot the folded data
    x_fold = (t - t0 + 0.5 * p) % p - 0.5 * p
    plt.errorbar(
        x_fold, y - other, yerr=yerr, fmt=".k", label="data", zorder=-1000
    )

    # Plot the folded model
    inds = np.argsort(x_fold)
    inds = inds[np.abs(x_fold)[inds] < 0.3]
    pred = lcs[:, :, inds, n] + trace.posterior["mean"].values[:, :, None]
    pred = np.median(pred, axis=(0, 1))
    plt.plot(x_fold[inds], pred, color="C1", label="model")

    # Annotate the plot with the planet's period
    txt = "period = {0:.4f} +/- {1:.4f} d".format(
        np.mean(period_trace), np.std(period_trace)
    )
    plt.annotate(
        txt,
        (0, 0),
        xycoords="axes fraction",
        xytext=(5, 5),
        textcoords="offset points",
        ha="left",
        va="bottom",
        fontsize=12,
    )

    plt.legend(fontsize=10, loc=4)
    plt.xlim(-0.5 * p, 0.5 * p)
    plt.xlabel("time since transit [days]")
    plt.ylabel("relative flux")
    plt.title("planet {0}".format(letter))
    plt.xlim(-0.3, 0.3)
Citations#
As described in the citation tutorial, we can use citations.get_citations_for_model to construct an acknowledgement and BibTeX listing that includes the relevant citations for this model.
with model:
    txt, bib = xo.citations.get_citations_for_model()
print(txt)
This research made use of \textsf{exoplanet} \citep{exoplanet:joss,
exoplanet:zenodo} and its dependencies \citep{exoplanet:agol20,
exoplanet:arviz, exoplanet:astropy13, exoplanet:astropy18, exoplanet:kipping13,
exoplanet:luger18, exoplanet:pymc3, exoplanet:theano}.
print(bib.split("\n\n")[0] + "\n\n...")
@article{exoplanet:joss,
author = {{Foreman-Mackey}, Daniel and {Luger}, Rodrigo and {Agol}, Eric
and {Barclay}, Thomas and {Bouma}, Luke G. and {Brandt},
Timothy D. and {Czekala}, Ian and {David}, Trevor J. and
{Dong}, Jiayin and {Gilbert}, Emily A. and {Gordon}, Tyler A.
and {Hedges}, Christina and {Hey}, Daniel R. and {Morris},
Brett M. and {Price-Whelan}, Adrian M. and {Savel}, Arjun B.},
title = "{exoplanet: Gradient-based probabilistic inference for
exoplanet data \& other astronomical time series}",
journal = {arXiv e-prints},
year = 2021,
month = may,
eid = {arXiv:2105.01994},
pages = {arXiv:2105.01994},
archivePrefix = {arXiv},
eprint = {2105.01994},
primaryClass = {astro-ph.IM},
adsurl = {https://ui.adsabs.harvard.edu/abs/2021arXiv210501994F},
adsnote = {Provided by the SAO/NASA Astrophysics Data System}
}
...