WIP: Doctest ng #11

Closed
wants to merge 38 commits into from

Changes from all commits (38 commits)
7640d32
WIP: record the refguide-check log
ev-br Jun 12, 2022
c536397
ENH: add tools/doctest_public_modules
ev-br Jun 12, 2022
70aeec6
MAINT: compare refguide-check and doctest_public_modules.py
ev-br Jun 12, 2022
28e6824
ENH: plumb doctest_public_submodules.py through to dev.py (née do.py)
ev-br Jun 18, 2022
d49df76
MAINT: close the LOGFILE
ev-br Feb 16, 2024
b750bad
CI: run doctesting on GH actions -- add a CLI to tools/doctest_...
ev-br Jun 18, 2022
ac46f16
WIP: cmdline API for doctest_public_modules
ev-br Jun 25, 2022
dd87201
WIP: temporarily comment out the verbosity setting
ev-br Jun 26, 2022
971df5e
BUG: run all modules if not provided
ev-br Jun 26, 2022
3d96ab2
ENH: doctests: implement the -t option to test a single file
ev-br Jun 27, 2022
6adda13
WIP: dev.py/refguide-check/doctest
ev-br Jun 28, 2022
3d1b5e6
ENH: doctests: run doctesting on tutorials
ev-br Jul 3, 2022
86b9de7
ENH: allow linalg.norm to emit warnings
ev-br Jul 3, 2022
6ad7972
MAINT: doctests: clean up dead code/comments
ev-br Jul 4, 2022
e14be1b
ENH: plumb doctest_public_modules.py through to dev.py
ev-br Feb 16, 2024
f5d9aa2
ENH: control the rng state
ev-br Feb 16, 2024
0e00ec0
BUG: close the log file
ev-br Feb 16, 2024
9050f09
WIP: signal.normalize emits warnings (rework the example? + filter ou…
ev-br Feb 16, 2024
af2e078
DOC: fft: fix docstring examples in scipy.fft
ev-br Feb 17, 2024
b71962c
MAINT: fftpack: skip doctesting things from numpy
ev-br Feb 17, 2024
08615f4
MAINT: integrate: adapt `simpson` docstring to deprecations
ev-br Feb 17, 2024
474255b
MAINT: interpolate: account for interp2d being deprecated
ev-br Feb 17, 2024
314c3fa
MAINT: doctests: rm scipy.misc, add scipy.datasets
ev-br Feb 17, 2024
96ccea2
DOC: optimize: fix errors in milp and basinhopping docstrings
ev-br Feb 17, 2024
03b9294
BUG: signal: temporarily skip doctesting normalize
ev-br Feb 17, 2024
8fb496c
DOC: sparse: fix the module docstring example
ev-br Feb 17, 2024
f4fe0fb
MAINT: sparse.linalg: fix docstring examples (warnings, deprecations)
ev-br Feb 17, 2024
ff6dbd0
DOC: sparse.csgraph: bump lobpcg tolerance to 1e-2
ev-br Feb 17, 2024
07a56ac
MAINT: special: make docstrings raw for TeX
ev-br Feb 17, 2024
b42d190
MAINT: stats: allow warnings in several stats docstrings
ev-br Feb 17, 2024
10eed17
MAINT: special: remove duplicate filters
ev-br Feb 17, 2024
3b363e0
MAINT: stats.qmc: add __all__ for the doctester
ev-br Feb 17, 2024
fdae089
MAINT: stats.sampling: add __all__ list for the doctester
ev-br Feb 17, 2024
dbf0fa9
TST: tutorial: fix examples in csgraph and integrate tutorials
ev-br Feb 17, 2024
a7c15d9
DOC: doctest tutorial/io.rst
ev-br Jul 6, 2022
d4daf0c
ENH: doctest: recurse into tutorial subdirs; fix easy errors, skip th…
ev-br Feb 17, 2024
059bfb4
CI: add a CI run on Circle
ev-br Feb 17, 2024
1f2a951
CI: run doctests on GH actions & CircleCI
ev-br Feb 17, 2024
19 changes: 19 additions & 0 deletions .circleci/config.yml
@@ -174,6 +174,25 @@ jobs:
export PYTHONPATH=$PWD/build-install/lib/python3.11/site-packages
python dev.py --no-build refguide-check

# Reference guide checking
refguide_check_ng:
<<: *defaults
steps:
- attach_workspace:
at: ~/

- check-skip
- apt-install

- run:
name: refguide_check_ng
no_output_timeout: 25m
command: |
sudo apt-get install -y wamerican-small
export PYTHONPATH=$PWD/build-install/lib/python3.11/site-packages
python dev.py --no-build doctest


# Upload build output to scipy/devdocs repository, using SSH deploy keys.
# The keys are only available for builds on main branch.
# https://developer.github.com/guides/managing-deploy-keys/
7 changes: 7 additions & 0 deletions .github/workflows/linux_meson.yml
@@ -123,6 +123,13 @@ jobs:

python -u dev.py mypy

- name: doctest_ng
run: |
sudo apt-get install -y wamerican-small threadpoolctl
python -m pip install git+https://github.com/ev-br/scpdt.git
python -m pip install matplotlib
python dev.py doctest

- name: Test SciPy
run: |
export OMP_NUM_THREADS=2
46 changes: 45 additions & 1 deletion dev.py
@@ -188,7 +188,7 @@ class EMOJI:
},
{
"name": "documentation",
"commands": ["doc", "refguide-check"],
"commands": ["doc", "refguide-check", "doctest"],
},
{
"name": "release",
@@ -1073,6 +1073,7 @@ class RefguideCheck(Task):
verbose = Option(
['--verbose', '-v'], default=False, is_flag=True, help="verbosity")


@classmethod
def task_meta(cls, **kwargs):
kwargs.update(cls.ctx.get())
@@ -1095,6 +1096,49 @@ def task_meta(cls, **kwargs):
}


@cli.cls_cmd('doctest')
class Doctest(Task):
""":wrench: Run doctests via CLI."""
ctx = CONTEXT

submodule = Option(
['--submodule', '-s'], default=None, metavar='SUBMODULE',
help="Submodule whose tests to run (cluster, constants, ...)")
verbose = Option(
['--verbose', '-v'], default=False, is_flag=True, help="verbosity")
filename = Option(
['-t', '--filename'], default=None, metavar='FILENAME',
help="Specify a .py file to check")
fail_fast = Option(
['--fail-fast', '-x'], default=False, is_flag=True,
help="fail on first error")

@classmethod
def task_meta(cls, **kwargs):
kwargs.update(cls.ctx.get())
Args = namedtuple('Args', [k for k in kwargs.keys()])
args = Args(**kwargs)
dirs = Dirs(args)

cmd = [f'{sys.executable}',
str(dirs.root / 'tools' / 'doctest_public_modules.py'),
]
if args.verbose:
cmd += ['-vv']
if args.filename:
cmd += ['-t', args.filename]
if args.submodule:
cmd += ['-s', args.submodule]
if args.fail_fast:
cmd += ['-x']
cmd_str = ' '.join(cmd)
return {
'actions': [f'env PYTHONPATH={dirs.site} {cmd_str}'],
'task_dep': ['build'],
'io': {'capture': False},
}

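The option-to-flag plumbing in ``task_meta`` can be exercised in isolation. The sketch below is a simplified stand-in for the dev.py code in the diff (``build_doctest_cmd`` is a hypothetical helper, not part of dev.py), showing how each CLI option maps onto an argument of ``tools/doctest_public_modules.py``:

```python
import sys
from collections import namedtuple

def build_doctest_cmd(**kwargs):
    # Simplified stand-in for Doctest.task_meta above: map the CLI
    # options onto arguments of tools/doctest_public_modules.py.
    defaults = dict(submodule=None, verbose=False, filename=None, fail_fast=False)
    defaults.update(kwargs)
    Args = namedtuple('Args', defaults.keys())
    args = Args(**defaults)

    cmd = [sys.executable, 'tools/doctest_public_modules.py']
    if args.verbose:
        cmd += ['-vv']
    if args.filename:
        cmd += ['-t', args.filename]
    if args.submodule:
        cmd += ['-s', args.submodule]
    if args.fail_fast:
        cmd += ['-x']
    return cmd

print(build_doctest_cmd(submodule='cluster', fail_fast=True)[2:])
# ['-s', 'cluster', '-x']
```

With this mapping, `python dev.py doctest -s cluster -x` ends up invoking the doctester script with `-s cluster -x` appended.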

##########################################
# ENVS

3 changes: 2 additions & 1 deletion doc/source/tutorial/csgraph.rst
@@ -35,7 +35,8 @@ first create this list. The system word lists consist of a file with one
word per line. The following should be modified to use the particular word
list you have available::

>>> word_list = open('/usr/share/dict/words').readlines()
>>> with open('/usr/share/dict/words') as f:
... word_list = f.readlines()
>>> word_list = map(str.strip, word_list)

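The same read-strip-filter pattern can be tried without a system dictionary; the in-memory word list below is a hypothetical stand-in for ``/usr/share/dict/words``:

```python
import io

# Hypothetical stand-in for /usr/share/dict/words; any newline-separated
# word list behaves the same way.
with io.StringIO("ape\nbear\ncat\ndog\nemu\nfox\n") as f:
    word_list = f.readlines()
word_list = [w.strip() for w in word_list]

# Keep only the three-letter words, as the tutorial does next.
word_list = [w for w in word_list if len(w) == 3]
print(word_list)
# ['ape', 'cat', 'dog', 'emu', 'fox']
```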
We want to look at words of length 3, so let's select just those words of the
4 changes: 2 additions & 2 deletions doc/source/tutorial/integrate.rst
@@ -317,7 +317,7 @@ of order 2 or less.
>>> x = np.array([1,3,4])
>>> y1 = f1(x)
>>> from scipy import integrate
>>> I1 = integrate.simpson(y1, x)
>>> I1 = integrate.simpson(y1, x=x)
>>> print(I1)
21.0

@@ -331,7 +331,7 @@ This corresponds exactly to
whereas integrating the second function

>>> y2 = f2(x)
>>> I2 = integrate.simpson(y2, x)
>>> I2 = integrate.simpson(y2, x=x)
>>> print(I2)
61.5

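Both results can be checked without SciPy: for three (possibly unevenly spaced) points, Simpson's rule integrates the unique quadratic through them exactly. A sketch under that assumption (``simpson3`` is a hypothetical helper, not SciPy API):

```python
def simpson3(y, x):
    # Integral of the quadratic through three (possibly unevenly spaced)
    # points -- the building block of composite Simpson's rule.
    x0, x1, x2 = x
    y0, y1, y2 = y
    h0, h1 = x1 - x0, x2 - x1
    return (h0 + h1) / 6.0 * (
        (2 - h1 / h0) * y0
        + (h0 + h1) ** 2 / (h0 * h1) * y1
        + (2 - h0 / h1) * y2
    )

x = [1, 3, 4]
print(simpson3([xi**2 for xi in x], x))  # 21.0 -- exact for a quadratic
print(simpson3([xi**3 for xi in x], x))  # 61.5 -- inexact for a cubic on uneven points
```

The second value differs from the true integral of the cubic (63.75), which is exactly the point the tutorial makes about higher-order integrands.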
2 changes: 1 addition & 1 deletion doc/source/tutorial/interpolate/1D.rst
@@ -135,7 +135,7 @@ B-splines form an alternative (if formally equivalent) representation of piecewi
>>> xx = np.linspace(0, 3/2, 51)
>>> plt.plot(xx, bspl(xx), '--', label=r'$\sin(\pi x)$ approx')
>>> plt.plot(x, y, 'o', label='data')
>>> plt.plot(xx, der(xx)/np.pi, '--', label='$d \sin(\pi x)/dx / \pi$ approx')
>>> plt.plot(xx, der(xx)/np.pi, '--', label=r'$d \sin(\pi x)/dx / \pi$ approx')
>>> plt.legend()
>>> plt.show()

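The raw-string prefix added above keeps the TeX backslashes intact instead of letting Python process them as escapes. A minimal illustration:

```python
# '\t' in a normal string collapses to a single tab character, while the
# raw string r'\t' keeps backslash + 't' -- which is what TeX markup such
# as r'$d \sin(\pi x)/dx / \pi$' needs to survive untouched.
plain = 'a\tb'
raw = r'a\tb'
print(len(plain), len(raw))
# 3 4
```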
5 changes: 3 additions & 2 deletions doc/source/tutorial/interpolate/ND_regular_grid.rst
@@ -20,6 +20,7 @@ using each method.

.. plot::

>>> import numpy as np
>>> import matplotlib.pyplot as plt
>>> from scipy.interpolate import RegularGridInterpolator

@@ -90,8 +91,8 @@ controlled by the ``fill_value`` keyword parameter:
>>> data = np.array([[0], [5], [10]])
>>> rgi = RegularGridInterpolator((x, y), data,
... bounds_error=False, fill_value=None)
>>> rgi([(2, 0), (2, 1), (2, -1)])
array([2., 2., 2.]) # extrapolate the value on the axis
>>> rgi([(2, 0), (2, 1), (2, -1)]) # extrapolates the value on the axis
array([2., 2., 2.])
>>> rgi.fill_value = -101
>>> rgi([(2, 0), (2, 1), (2, -1)])
array([2., -101., -101.])
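The ``fill_value`` policy can be sketched in 1-D. The helper below (``interp1_oob``, a hypothetical name) is a deliberately simplified stand-in: it uses nearest-value rather than linear extrapolation when ``fill_value=None``, unlike the real interpolator, but the out-of-bounds branching is the same:

```python
def interp1_oob(xp, fp, x, bounds_error=True, fill_value=None):
    # 1-D sketch of the out-of-bounds policy: bounds_error raises,
    # fill_value=None "extrapolates" (here: nearest grid value, a
    # simplification), and a numeric fill_value is returned verbatim
    # outside the grid.
    if xp[0] <= x <= xp[-1]:
        for i in range(len(xp) - 1):
            if xp[i] <= x <= xp[i + 1]:
                # naive linear interpolation inside the grid
                t = (x - xp[i]) / (xp[i + 1] - xp[i])
                return fp[i] + t * (fp[i + 1] - fp[i])
    if bounds_error:
        raise ValueError(f"{x} is out of bounds")
    if fill_value is None:
        return fp[0] if x < xp[0] else fp[-1]
    return fill_value

xp, fp = [0.0, 1.0, 2.0], [0.0, 5.0, 10.0]
print(interp1_oob(xp, fp, 3.0, bounds_error=False))                   # 10.0
print(interp1_oob(xp, fp, 3.0, bounds_error=False, fill_value=-101))  # -101
```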
5 changes: 3 additions & 2 deletions doc/source/tutorial/interpolate/ND_unstructured.rst
@@ -15,6 +15,7 @@ that do not form a regular grid.

Suppose we want to interpolate the 2-D function

>>> import numpy as np
>>> def func(x, y):
... return x*(1-x)*np.cos(4*np.pi*x) * np.sin(4*np.pi*y**2)**2

@@ -114,7 +115,7 @@ classes from the `scipy.interpolate` module.
>>> ius = InterpolatedUnivariateSpline(x, y)
>>> yi = ius(xi)

>>> plt.subplot(2, 1, 1)
>>> plt.subplot(211)
>>> plt.plot(x, y, 'bo')
>>> plt.plot(xi, yi, 'g')
>>> plt.plot(xi, np.sin(xi), 'r')
@@ -124,7 +125,7 @@ classes from the `scipy.interpolate` module.
>>> rbf = RBFInterpolator(x, y)
>>> fi = rbf(xi)

>>> plt.subplot(2, 1, 2)
>>> plt.subplot(212)
>>> plt.plot(x, y, 'bo')
>>> plt.plot(xi, fi, 'g')
>>> plt.plot(xi, np.sin(xi), 'r')
3 changes: 2 additions & 1 deletion doc/source/tutorial/interpolate/splines_and_polynomials.rst
@@ -51,6 +51,7 @@ Manipulating `PPoly` objects
and antiderivatives, computing integrals and root-finding. For example, we
tabulate the sine function and find the roots of its derivative.

>>> import numpy as np
>>> from scipy.interpolate import CubicSpline
>>> x = np.linspace(0, 10, 71)
>>> y = np.sin(x)
@@ -109,7 +110,7 @@ PCHIP interpolant (we could as well have used a `CubicSpline`):

>>> from scipy.interpolate import PchipInterpolator
>>> x = np.linspace(0, np.pi/2, 70)
>>> y = (1 - m*np.sin(x)**2))**(-1/2)
>>> y = (1 - m*np.sin(x)**2)**(-1/2)
>>> spl = PchipInterpolator(x, y)

and integrate
26 changes: 15 additions & 11 deletions doc/source/tutorial/io.rst
@@ -85,13 +85,13 @@ Now, to Python:

>>> mat_contents = sio.loadmat('octave_a.mat')
>>> mat_contents
{'a': array([[[ 1., 4., 7., 10.],
[ 2., 5., 8., 11.],
[ 3., 6., 9., 12.]]]),
'__version__': '1.0',
'__header__': 'MATLAB 5.0 MAT-file, written by
Octave 3.6.3, 2013-02-17 21:02:11 UTC',
'__globals__': []}
{'__header__': b'MATLAB 5.0 MAT-file, written
by Octave 3.2.3, 2010-05-30 02:13:40 UTC',
'__version__': '1.0',
'__globals__': [],
'a': array([[[ 1., 4., 7., 10.],
[ 2., 5., 8., 11.],
[ 3., 6., 9., 12.]]])}
>>> oct_a = mat_contents['a']
>>> oct_a
array([[[ 1., 4., 7., 10.],
@@ -156,8 +156,12 @@ We can load this in Python:

>>> mat_contents = sio.loadmat('octave_struct.mat')
>>> mat_contents
{'my_struct': array([[([[1.0]], [[2.0]])]],
dtype=[('field1', 'O'), ('field2', 'O')]), '__version__': '1.0', '__header__': 'MATLAB 5.0 MAT-file, written by Octave 3.6.3, 2013-02-17 21:23:14 UTC', '__globals__': []}
{'__header__': b'MATLAB 5.0 MAT-file, written by Octave 3.2.3, 2010-05-30 02:00:26 UTC',
'__version__': '1.0',
'__globals__': [],
'my_struct': array([[(array([[1.]]), array([[2.]]))]], dtype=[('field1', 'O'), ('field2', 'O')])
}

>>> oct_struct = mat_contents['my_struct']
>>> oct_struct.shape
(1, 1)
@@ -214,7 +218,7 @@ this, use the ``struct_as_record=False`` parameter setting to ``loadmat``.
File "<stdin>", line 1, in <module>
AttributeError: 'mat_struct' object has no attribute 'shape'
>>> type(oct_struct)
<class 'scipy.io.matlab.mio5_params.mat_struct'>
<class 'scipy.io.matlab._mio5_params.mat_struct'>
>>> oct_struct.field1
1.0

@@ -287,12 +291,12 @@ Back to Python:

Saving to a MATLAB cell array just involves making a NumPy object array:

>>> obj_arr = np.zeros((2,), dtype=np.object)
>>> obj_arr = np.zeros((2,), dtype=object)
>>> obj_arr[0] = 1
>>> obj_arr[1] = 'a string'
>>> obj_arr
array([1, 'a string'], dtype=object)
>>> sio.savemat('np_cells.mat', {'obj_arr':obj_arr})
>>> sio.savemat('np_cells.mat', {'obj_arr': obj_arr})

.. sourcecode:: octave

17 changes: 9 additions & 8 deletions doc/source/tutorial/stats/continuous_kstwo.rst
@@ -35,6 +35,7 @@ with asymptotic estimates of Li-Chien, Pelz and Good to compute the CDF with 5-1
Examples
--------

>>> import numpy as np
>>> from scipy.stats import kstwo

Show the probability of a gap at least as big as 0, 0.5 and 1.0 for a sample of size 5
@@ -50,17 +51,17 @@
>>> gendist = norm(0.5, 1) # Normal distribution, mean 0.5, stddev 1
>>> x = np.sort(gendist.rvs(size=n, random_state=np.random.default_rng()))
>>> x
array([-1.59113056, -0.66335147, 0.54791569, 0.78009321, 1.27641365])
array([-1.59113056, -0.66335147, 0.54791569, 0.78009321, 1.27641365]) # may vary
>>> target = norm(0, 1)
>>> cdfs = target.cdf(x)
>>> cdfs
array([0.0557901 , 0.25355274, 0.7081251 , 0.78233199, 0.89909533])
# Construct the Empirical CDF and the K-S statistics (Dn+, Dn-, Dn)
array([0.0557901 , 0.25355274, 0.7081251 , 0.78233199, 0.89909533]) # may vary
>>> # Construct the Empirical CDF and the K-S statistics (Dn+, Dn-, Dn)
>>> ecdfs = np.arange(n+1, dtype=float)/n
>>> cols = np.column_stack([x, ecdfs[1:], cdfs, cdfs - ecdfs[:n], ecdfs[1:] - cdfs])
>>> np.set_printoptions(precision=3)
>>> cols
array([[-1.591, 0.2 , 0.056, 0.056, 0.144],
array([[-1.591, 0.2 , 0.056, 0.056, 0.144], # may vary
[-0.663, 0.4 , 0.254, 0.054, 0.146],
[ 0.548, 0.6 , 0.708, 0.308, -0.108],
[ 0.78 , 0.8 , 0.782, 0.182, 0.018],
@@ -70,17 +71,17 @@
>>> Dn = np.max(Dnpm)
>>> iminus, iplus = np.argmax(gaps, axis=0)
>>> print('Dn- = %f (at x=%.2f)' % (Dnpm[0], x[iminus]))
Dn- = 0.308125 (at x=0.55)
Dn- = 0.246201 (at x=-0.14)
>>> print('Dn+ = %f (at x=%.2f)' % (Dnpm[1], x[iplus]))
Dn+ = 0.146447 (at x=-0.66)
Dn+ = 0.224726 (at x=0.19)
>>> print('Dn = %f' % (Dn))
Dn = 0.308125
Dn = 0.246201

>>> probs = kstwo.sf(Dn, n)
>>> print(chr(10).join(['For a sample of size %d drawn from a N(0, 1) distribution:' % n,
... ' Kolmogorov-Smirnov 2-sided n=%d: Prob(Dn >= %f) = %.4f' % (n, Dn, probs)]))
For a sample of size 5 drawn from a N(0, 1) distribution:
Kolmogorov-Smirnov 2-sided n=5: Prob(Dn >= 0.308125) = 0.6319
Kolmogorov-Smirnov 2-sided n=5: Prob(Dn >= 0.246201) = 0.8562

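Since the statistics depend on the random sample, the new output lines above come from a different draw. For the fixed sample actually printed in this example, Dn-, Dn+ and Dn can be recomputed with the standard library alone (``math.erf`` supplies the N(0, 1) CDF):

```python
import math

# Recompute Dn-, Dn+ and Dn for the fixed sample printed above.
x = [-1.59113056, -0.66335147, 0.54791569, 0.78009321, 1.27641365]
n = len(x)

def norm_cdf(t):
    return 0.5 * (1 + math.erf(t / math.sqrt(2)))

cdfs = [norm_cdf(t) for t in x]
ecdf = [i / n for i in range(n + 1)]   # empirical CDF steps 0, 0.2, ..., 1

dn_minus = max(c - e for c, e in zip(cdfs, ecdf[:-1]))
dn_plus = max(e - c for c, e in zip(cdfs, ecdf[1:]))
dn = max(dn_minus, dn_plus)
print(round(dn_minus, 6), round(dn_plus, 6), round(dn, 6))
# 0.308125 0.146447 0.308125  (matches this particular sample)
```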
Plot the Empirical CDF against the target N(0, 1) CDF

8 changes: 4 additions & 4 deletions doc/source/tutorial/stats/resampling.rst
@@ -48,8 +48,8 @@ Your brother Kyle is the analytical one. He answers:
>>> std = math.sqrt(n*p*(1-p))
>>> # CDF of the normal distribution. (Unfortunately, Kyle forgets a continuity correction that would produce a more accurate answer.)
>>> prob = 0.5 * (1 + math.erf((x - mean) / (std * math.sqrt(2))))
>>> print(f"The normal approximation estimates the probability as {prob}")
The normal approximation estimates the probability as 0.15865525393145713
>>> print(f"The normal approximation estimates the probability as {prob:.3f}")
The normal approximation estimates the probability as 0.159

You are a little more practical, so you decide to take a computational
approach (or more precisely, a Monte Carlo approach): just simulate many
@@ -63,8 +63,8 @@ count does not exceed 45.
>>> simulation = rng.random(size=(n, N)) < p # False for tails, True for heads
>>> counts = np.sum(simulation, axis=0) # count the number of heads each trial
>>> prob = np.sum(counts <= x) / N # estimate the probability as the observed proportion of cases in which the count did not exceed 45
>>> print(f"The Monte Carlo approach estimates the probability as {prob}")
The Monte Carlo approach estimates the probability as 0.18348
>>> print(f"The Monte Carlo approach estimates the probability as {prob:.3f}")
The Monte Carlo approach estimates the probability as 0.187

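The same Monte Carlo estimate can be sketched with the standard library alone (the tutorial itself uses NumPy); the seed here is an arbitrary choice for reproducibility:

```python
import random

# Simulate N experiments of n fair coin flips and estimate the
# probability that the head count does not exceed x.
random.seed(12345)
n, p, x, N = 100, 0.5, 45, 20_000

hits = 0
for _ in range(N):
    heads = sum(random.random() < p for _ in range(n))
    if heads <= x:
        hits += 1
prob = hits / N
print(f"The Monte Carlo approach estimates the probability as roughly {prob:.2f}")
```

The estimate lands near the exact binomial value (about 0.18), illustrating why the uncorrected normal approximation of 0.159 undershoots.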
The demon replies:

8 changes: 5 additions & 3 deletions doc/source/tutorial/stats/sampling.rst
@@ -147,7 +147,8 @@ An example of this interface is shown below:
... return -x * exp(-0.5 * x*x)
...
>>> dist = StandardNormal()
>>>
>>>
>>> import numpy as np
>>> urng = np.random.default_rng()
>>> rng = TransformedDensityRejection(dist, random_state=urng)

@@ -238,7 +239,8 @@ by visualizing the histogram of the samples:
:class:`~TransformedDensityRejection` would not be the same even for
the same ``random_state``:

>>> from scipy.stats.sampling import norm, TransformedDensityRejection
>>> from scipy.stats import norm
>>> from scipy.stats.sampling import TransformedDensityRejection
>>> from copy import copy
>>> dist = StandardNormal()
>>> urng1 = np.random.default_rng()
@@ -253,7 +255,7 @@ We can pass a ``domain`` parameter to truncate the distribution:

>>> rng = TransformedDensityRejection(dist, domain=(-1, 1), random_state=urng)
>>> rng.rvs((5, 3))
array([[-0.99865691, 0.38104014, 0.31633526],
array([[-0.99865691, 0.38104014, 0.31633526], # may vary
[ 0.88433909, -0.45181849, 0.78574461],
[ 0.3337244 , 0.12924307, 0.40499404],
[-0.51865761, 0.43252222, -0.6514866 ],
Expand Down
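The effect of ``domain`` can be illustrated with a far simpler sampler: plain rejection sampling of N(0, 1) restricted to (-1, 1). This is a hypothetical sketch of the truncation idea, not how ``TransformedDensityRejection`` works internally:

```python
import math
import random

def truncated_normal_rvs(size, a=-1.0, b=1.0, seed=1234):
    # Plain rejection sampling: propose uniformly on (a, b), accept with
    # probability exp(-x**2 / 2) -- the unnormalized N(0, 1) density,
    # which never exceeds 1, so a unit envelope suffices.
    rng = random.Random(seed)
    out = []
    while len(out) < size:
        x = rng.uniform(a, b)
        if rng.random() < math.exp(-0.5 * x * x):
            out.append(x)
    return out

samples = truncated_normal_rvs(15)
print(all(-1.0 <= s <= 1.0 for s in samples))
# True -- every sample respects the truncated domain
```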