Commit
refs #92 add tutorial 3
AnthonyLim23 committed Nov 12, 2024
1 parent 9c6c3fa commit 8c36e1c
Showing 6 changed files with 359 additions and 6 deletions.
5 changes: 3 additions & 2 deletions docs/source/cf_methods.rst
@@ -91,9 +91,10 @@ To evaluate the odds factor, the probability of the data given the model needs t
This is written as

.. math::
-   P(D | M) = \int_\omega d\underline{\theta} \quad P(D| \underline{\theta}, M)P(\underline{\theta} | M)
+   P(D | M) = \int_\Omega d\underline{\theta} \quad P(D| \underline{\theta}, M)P(\underline{\theta} | M)
-where the integral over :math:`\omega` is over the available parameter space for :math:`\underline{\theta}`.
+where the integral over :math:`\Omega` is over the available parameter space for :math:`\underline{\theta}`.
This quantity can be evaluated using either Markov Chain Monte Carlo (MCMC) or nested sampling.
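The marginal likelihood above can be sketched numerically. The following toy is an illustration only (not the quickBayes implementation, which uses MCMC or nested sampling): it draws samples of :math:`\theta` from the prior and averages the likelihood, which is the simplest Monte Carlo estimate of the evidence. The data values, noise level, and prior range are all invented for the example.

```python
import numpy as np

# Illustrative sketch: estimate P(D|M) = ∫ P(D|θ,M) P(θ|M) dθ by
# drawing θ from the prior and averaging the likelihood. Real
# analyses use MCMC or nested sampling, which cope far better with
# sharply peaked integrands in many dimensions.

rng = np.random.default_rng(0)

# Toy data: noisy measurements of an unknown mean θ (hypothetical values).
data = np.array([0.9, 1.1, 1.05, 0.95])
sigma = 0.1  # assumed known noise level

def likelihood(theta):
    # P(D | θ, M) for independent Gaussian noise on each point
    return np.prod(np.exp(-0.5 * ((data - theta) / sigma) ** 2)
                   / (sigma * np.sqrt(2.0 * np.pi)))

# Uniform prior over Ω = [0, 2], so P(θ|M) = 1/2 on that interval;
# averaging the likelihood over prior draws estimates the integral.
samples = rng.uniform(0.0, 2.0, size=20_000)
evidence = np.mean([likelihood(t) for t in samples])
print(f"Monte Carlo estimate of P(D|M): {evidence:.3f}")
```

The estimate converges slowly when the likelihood is much narrower than the prior, which is exactly why nested sampling is the preferred tool here.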



@@ -8,7 +8,7 @@
"QENS Example\n",
"------------\n",
"\n",
-"This example is based on Quasi-Elastic Neutron (QENS) data. It is used to examine the a variety of molecular motions from within a sample. For example, diffusion hopping, rotation modes of molecults and electronic transitions. This example uses real data collected at the ISIS neutron and muon source.\n",
+"This example is based on Quasi-Elastic Neutron Scattering (QENS) data. It is used to examine a variety of molecular motions within a sample, for example diffusion hopping, rotational modes of molecules and electronic transitions. This example uses real data collected at the ISIS neutron and muon source.\n",
"\n",
"This example will demonstrate the `QLData` workflow for determining the number of Lorentzians in a sample. The first step is to import the correct packages. From `quickBayes` there are three imports; \n",
"- The workflow `QLData`\n",
@@ -35,7 +35,7 @@
"id": "b30df950-1ab9-434f-9ba0-8c0655efdd77",
"metadata": {},
"source": [
-"The data is contained within the test's for `quickBayes`. Analysing QENS data requires both the sample measurements and the resolution. The resolution encompasses the noise of the instrument."
+"The data is contained within the tests for `quickBayes`. Analysing QENS data requires both the sample measurements and the resolution. The resolution is used to reduce the background noise to zero, making it easier to identify the functional form of the data."
]
},
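As background to why the resolution measurement matters, the measured QENS spectrum is (in the standard treatment) the underlying model convolved with the instrument resolution. The sketch below is a generic NumPy illustration of that idea, not the quickBayes workflow; the grid, widths, and line shapes are invented for the example.

```python
import numpy as np

# Illustrative sketch: a measured QENS spectrum is the underlying
# signal convolved with the instrument resolution, so a fit needs
# both the sample data and a resolution measurement.
x = np.linspace(-0.5, 0.5, 101)                  # toy energy-transfer grid

resolution = np.exp(-x**2 / (2 * 0.01**2))       # toy Gaussian resolution
resolution /= resolution.sum()                   # normalise the kernel

gamma = 0.02                                     # toy Lorentzian half-width
lorentzian = gamma / (np.pi * (x**2 + gamma**2))  # toy QENS signal

# Convolving with the (normalised) resolution broadens the peak.
measured = np.convolve(lorentzian, resolution, mode="same")
print(measured.shape)
```

Because the kernel is normalised, convolution preserves the integrated intensity while lowering and widening the peak, which is the broadening a fit must account for.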
{
@@ -83,7 +83,7 @@
"id": "27ae815d-045a-4f11-a7dd-8aa186a945c2",
"metadata": {},
"source": [
-"The next step is to set the problem parameters. The `results` and `results_errors` are empty as we are doing a fresh calculation. The start value is chosen to be $-0.4$ and the end value is $0.4$, this is to make sure that there is some data that contains predominantly just background measurements. The last value is the maximum number of Lorentzians to consider, in this case three."
+"The next step is to set the problem parameters. The `results` and `results_errors` are empty as we are doing a fresh calculation. The start value is chosen to be $-0.4$ and the end value is $0.4$; this ensures that there is some data that can be used for fitting the background values. The last value is the maximum number of Lorentzians to consider, in this case three."
]
},
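The effect of the start and end values can be illustrated with a generic cropping sketch (this is not the quickBayes call itself; the grid and spectrum are invented): points outside the chosen window are dropped so the fit sees both the peak and enough flat region to constrain the background.

```python
import numpy as np

# Illustrative sketch: restrict the data to the fitting window
# [start_x, end_x] before fitting, keeping some background region.
x = np.linspace(-0.6, 0.6, 121)        # toy energy-transfer grid
y = np.exp(-x**2 / 0.01) + 0.05        # toy peak plus flat background

start_x, end_x = -0.4, 0.4
mask = (x >= start_x) & (x <= end_x)   # keep only points in the window
x_fit, y_fit = x[mask], y[mask]
print(x_fit.size, "of", x.size, "points kept")
```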
{
@@ -15,3 +15,4 @@ Contents

muon.ipynb
QENS.ipynb
+   sin_wave.ipynb
2 changes: 1 addition & 1 deletion docs/source/examples/muon.ipynb
@@ -204,7 +204,7 @@
"id": "28fb2a5e-2d00-4f39-9c62-131541d04ee3",
"metadata": {},
"source": [
-"The results and errors of the calculation can be obtained from the `get_parameters_and_errors` method. This returns two dictionaries, one for the values and one for the errors. The keys indicate the parameter, with the `Nx` showing how many features (decays) were used in the calculation. Within the `results` are the loglikelihoods ($log_{10}(P)$), which are the logs of the unnormalized posterior probability. Hence, the most likely model is the one with the largest value. "
+"The results and errors of the calculation can be obtained from the `get_parameters_and_errors` method. This returns two dictionaries, one for the values and one for the errors. The keys indicate the parameter, with the `Nx` showing how many features (decays) were used in the calculation. Within the `results` are the loglikelihoods ($\\log_{10}(P)$), which are the logs of the unnormalized posterior probability. Hence, the most likely model is the one with the largest value. When the loglikelihood is calculated, it does not take the background into account (other than through its contribution to the $\\chi^2$ value). Since the background is the same function in all of the models, it only adds a constant offset to the loglikelihood, so it is neglected. "
]
},
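The model-selection step described above reduces to an argmax over the loglikelihoods. A minimal sketch (the dictionary keys follow the `Nx` naming from the text, but the values and structure here are invented, not actual `get_parameters_and_errors` output):

```python
# Illustrative sketch: given log10 of the unnormalised posterior
# probability for each number of features, the most likely model is
# the one with the largest value.
loglikelihoods = {"N1": -120.3, "N2": -45.7, "N3": -48.1}  # example values

best_model = max(loglikelihoods, key=loglikelihoods.get)
print(best_model)  # the key with the largest loglikelihood
```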
{