NumericalIntegrator
NumericalIntegrator()
A numerical integrator for high-dimensional integrals.
Methods
| Name | Description |
|---|---|
| __copy__ | Copy the grid without any unprocessed samples. |
| add_training_samples | Add the samples and their corresponding function evaluations to the grid. |
| continuous | Create a new continuous grid for the numerical integrator. |
| discrete | Create a new discrete grid for the numerical integrator. |
| export_grid | Export the grid, so that it can be sent to another thread or machine. |
| get_live_estimate | Get the estimate of the average, error, chi-squared, maximum negative and positive evaluations, and the number of processed samples for the current iteration. |
| import_grid | Import an exported grid from another thread or machine. |
| integrate | Integrate the function integrand that maps a list of samples to a list of floats. |
| merge | Add the accumulated training samples from the grid other to the current grid. |
| probe | Probe the Jacobian weight for a region in the grid. |
| rng | Create a new random number generator, suitable for use with the integrator. |
| sample | Sample num_samples points from the grid using the random number generator rng. |
| uniform | Create a new uniform layered grid for the numerical integrator. |
| update | Update the grid using the discrete_learning_rate and continuous_learning_rate. |
__copy__
NumericalIntegrator.__copy__() -> NumericalIntegrator
Copy the grid without any unprocessed samples.
add_training_samples
NumericalIntegrator.add_training_samples(samples: Sequence[Sample], evals: Sequence[float]) -> None
Add the samples and their corresponding function evaluations to the grid. Call update afterwards to update the grid and obtain the new expected value for the integral.
Parameters
- samples (Sequence[Sample]): The samples to add or process.
- evals (Sequence[float]): The function evaluations associated with the samples.
continuous
NumericalIntegrator.continuous(
    n_dims: int,
    n_bins: int = 128,
    min_samples_for_update: int = 100,
    bin_number_evolution: Sequence[int] | None = None,
    train_on_avg: bool = False,
) -> NumericalIntegrator
Create a new continuous grid for the numerical integrator.
Parameters
- n_dims (int): The number of continuous integration dimensions.
- n_bins (int): The number of bins per continuous dimension.
- min_samples_for_update (int): The minimum number of samples to accumulate before updating the grid.
- bin_number_evolution (Sequence[int] | None): An optional schedule that changes the number of bins during training.
- train_on_avg (bool): Whether integrator training should use average sample values.
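Conceptually, each continuous dimension is importance-sampled from its bins: every sample carries a Jacobian weight determined by the bin it fell in, which keeps the estimate unbiased. A minimal pure-Python sketch of this mechanism for a single dimension with fixed, uniform bin edges (an illustration of the idea only, not Symbolica's implementation, which additionally adapts the edges during training):

```python
import random

def sample_from_bins(edges, rng):
    # Pick a bin uniformly, then a point uniformly inside it. The
    # Jacobian weight is (number of bins) * (bin width), so that the
    # weighted average remains an unbiased estimate of the integral.
    i = rng.randrange(len(edges) - 1)
    lo, hi = edges[i], edges[i + 1]
    x = lo + (hi - lo) * rng.random()
    weight = (len(edges) - 1) * (hi - lo)
    return x, weight

def estimate(f, edges, n, rng):
    # Monte Carlo estimate of the integral of f over [edges[0], edges[-1]].
    total = 0.0
    for _ in range(n):
        x, w = sample_from_bins(edges, rng)
        total += w * f(x)
    return total / n

rng = random.Random(1)
# Eight uniform bins on [0, 1]; integrating x^2 should give about 1/3.
edges = [i / 8 for i in range(9)]
approx = estimate(lambda x: x * x, edges, 50_000, rng)
```

With adapted, non-uniform edges, narrow bins over peaks of the integrand are sampled more densely while the weight compensates, which is what the learning rates in update control.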
discrete
NumericalIntegrator.discrete(
    bins: Sequence[NumericalIntegrator | None],
    max_prob_ratio: float = 100.0,
    train_on_avg: bool = False,
) -> NumericalIntegrator
Create a new discrete grid for the numerical integrator. Each bin can have a sub-grid.
Examples
from symbolica import NumericalIntegrator, Sample

def integrand(samples: list[Sample]):
    res = []
    for sample in samples:
        if sample.d[0] == 0:
            res.append(sample.c[0]**2)
        else:
            res.append(sample.c[0]**(1/2))
    return res

integrator = NumericalIntegrator.discrete(
    [NumericalIntegrator.continuous(1), NumericalIntegrator.continuous(1)])
integrator.integrate(integrand, show_stats=True, max_n_iter=10, n_samples_per_iter=10000)

Parameters
- bins (Sequence[NumericalIntegrator | None]): The optional subgrid assigned to each discrete bin.
- max_prob_ratio (float): The maximum probability ratio allowed between bins.
- train_on_avg (bool): Whether integrator training should use average sample values.
export_grid
NumericalIntegrator.export_grid(export_samples: bool = True) -> bytes
Export the grid, so that it can be sent to another thread or machine. If you are exporting your main grid, make sure to set export_samples to False to avoid copying unprocessed samples.
Use import_grid to load the grid.
Parameters
- export_samples (bool): Whether pending samples should be included in the exported grid.
get_live_estimate
NumericalIntegrator.get_live_estimate() -> tuple[float, float, float, float, float, int]
Get the estimate of the average, error, chi-squared, maximum negative and positive evaluations, and the number of processed samples for the current iteration, including the points submitted in the current iteration.
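The returned values correspond to standard Monte Carlo estimators. A rough sketch of how most of them could be computed for a single batch of unweighted evaluations (the chi-squared, which compares estimates across iterations, is omitted; this is an illustration, not the library's code):

```python
import math

def live_estimate(evals: list[float]):
    # Integral estimate: the mean of the function evaluations.
    n = len(evals)
    avg = sum(evals) / n
    # Statistical error: the standard error of the mean.
    var = sum((e - avg) ** 2 for e in evals) / (n - 1)
    err = math.sqrt(var / n)
    # Largest negative and positive evaluations, useful for spotting
    # spikes in the integrand.
    max_neg = min((e for e in evals if e < 0), default=0.0)
    max_pos = max((e for e in evals if e > 0), default=0.0)
    return avg, err, max_neg, max_pos, n

avg, err, max_neg, max_pos, n = live_estimate([1.0, 2.0, 3.0, -0.5])
```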
import_grid
NumericalIntegrator.import_grid(grid: bytes) -> NumericalIntegrator
Import an exported grid from another thread or machine. Use export_grid to export the grid.
Parameters
- grid (bytes): The serialized integration grid to import.
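A common pattern with export_grid/import_grid is fan-out: export the main grid once with export_samples=False, let each worker import it and accumulate training samples, then merge the workers' grids back (see merge). The data flow can be sketched in plain Python, using pickle and a dict as stand-ins for the grid's opaque byte format and state:

```python
import pickle

# A toy "grid": per-bin accumulated weights (a stand-in for the real state).
main_grid = {"bins": [0.0, 0.0, 0.0, 0.0]}

# Main thread: export once, without pending samples
# (plays the role of export_grid(export_samples=False)).
blob = pickle.dumps(main_grid)

# Each worker: import the grid and train locally
# (plays the role of import_grid(blob) + add_training_samples).
worker_grids = []
for worker_id in range(3):
    g = pickle.loads(blob)
    g["bins"][worker_id % 4] += 1.0  # accumulate local training data
    worker_grids.append(g)

# Main thread: fold the workers back in, like NumericalIntegrator.merge.
for g in worker_grids:
    main_grid["bins"] = [a + b for a, b in zip(main_grid["bins"], g["bins"])]
```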
integrate
NumericalIntegrator.integrate(
    integrand: Callable[[Sequence[Sample]], list[float]],
    max_n_iter: int = 10000000,
    min_error: float = 0.01,
    n_samples_per_iter: int = 10000,
    seed: int = 0,
    show_stats: bool = True,
) -> tuple[float, float, float]
Integrate the function integrand that maps a list of Samples to a list of floats. The return value is the average, the statistical error, and the chi-squared of the integral.
With show_stats=True, intermediate statistics will be printed. max_n_iter determines the maximum number of iterations and n_samples_per_iter determines the number of samples per iteration; the integrand function is called with this many samples at a time.
For more flexibility, use sample, add_training_samples and update. See update for an example.
Examples
from symbolica import NumericalIntegrator, Sample

def integrand(samples: list[Sample]):
    res = []
    for sample in samples:
        res.append(sample.c[0]**2 + sample.c[1]**2)
    return res

avg, err, chi_sq = NumericalIntegrator.continuous(2).integrate(
    integrand, max_n_iter=10, n_samples_per_iter=100000)
print('Result: {} +- {}'.format(avg, err))

Parameters
- integrand (Callable[[Sequence[Sample]], list[float]]): The function to integrate.
- max_n_iter (int): The maximum number of integration iterations.
- min_error (float): The target statistical error.
- n_samples_per_iter (int): The number of samples drawn per integration iteration.
- seed (int): The seed used to initialize the random number generator.
- show_stats (bool): Whether intermediate integration statistics should be shown.
merge
NumericalIntegrator.merge(other: NumericalIntegrator) -> None
Add the accumulated training samples from the grid other to the current grid. The grid structure of self and other must be equivalent.
Parameters
- other (NumericalIntegrator): The grid whose accumulated training samples are added to this grid.
probe
NumericalIntegrator.probe(probe: Probe) -> float
Probe the Jacobian weight for a region in the grid.
Parameters
- probe (Probe): The probe that identifies the region of interest.
rng
NumericalIntegrator.rng(seed: int, stream_id: int) -> RandomNumberGenerator
Create a new random number generator, suitable for use with the integrator. Each thread or instance of the integrator should have its own random number generator, initialized with the same seed but with a different stream id.
Parameters
- seed (int): The seed used to initialize the random number generator.
- stream_id (int): The stream identifier for the random number generator.
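The seed/stream split means every worker can be seeded identically while still drawing independent sample streams, and any (seed, stream_id) pair is reproducible. A sketch of the pattern with Python's random module, where combining the shared seed and the stream id into a single string seed is a hypothetical stand-in for a true multi-stream generator:

```python
import random

SEED = 42

def make_worker_rng(stream_id: int) -> random.Random:
    # Every worker shares the same seed but gets its own stream id,
    # mirroring the seed/stream_id arguments of NumericalIntegrator.rng.
    return random.Random(f"{SEED}-{stream_id}")

# Distinct streams produce distinct draws...
draws = [make_worker_rng(i).random() for i in range(4)]
# ...and the same (seed, stream) pair reproduces the same sequence.
repeat = make_worker_rng(2).random()
```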
sample
NumericalIntegrator.sample(num_samples: int, rng: RandomNumberGenerator) -> list[Sample]
Sample num_samples points from the grid using the random number generator rng. See rng() for how to create a random number generator.
Parameters
- num_samples (int): The number of samples to draw.
- rng (RandomNumberGenerator): The random number generator used to draw the samples.
uniform
NumericalIntegrator.uniform(
    bins: Sequence[int],
    continuous_subgrid: NumericalIntegrator,
) -> NumericalIntegrator
Create a new uniform layered grid for the numerical integrator. len(bins) specifies the number of discrete layers, and each entry in bins specifies the number of bins in that layer. Each discrete bin has equal probability.
Examples
from collections.abc import Sequence
from symbolica import NumericalIntegrator, Sample

def integrand(samples: Sequence[Sample]) -> list[float]:
    res = []
    for sample in samples:
        if sample.d[0] == 0:
            res.append(sample.c[0]**2)
        else:
            res.append(sample.c[0]**3)
    return res

integrator = NumericalIntegrator.uniform(
    [2], NumericalIntegrator.continuous(1))
integrator.integrate(integrand, min_error=1e-3)

Parameters
- bins (Sequence[int]): The number of bins in each discrete layer.
- continuous_subgrid (NumericalIntegrator): The continuous subgrid attached beneath the discrete layers.
update
NumericalIntegrator.update(
    discrete_learning_rate: float,
    continuous_learning_rate: float,
) -> tuple[float, float, float]
Update the grid using the discrete_learning_rate and continuous_learning_rate. The return value is the average, the statistical error, and the chi-squared for the current iteration.
Examples
from symbolica import NumericalIntegrator, Sample

def integrand(samples: list[Sample]):
    res = []
    for sample in samples:
        res.append(sample.c[0]**2 + sample.c[1]**2)
    return res

integrator = NumericalIntegrator.continuous(2)
rng = integrator.rng(seed=0, stream_id=0)
for i in range(10):
    samples = integrator.sample(10000 + i * 1000, rng)
    res = integrand(samples)
    integrator.add_training_samples(samples, res)
    avg, err, chi_sq = integrator.update(1.5, 1.5)
    print('Iteration {}: {:.6} +- {:.6}, chi={:.6}'.format(i + 1, avg, err, chi_sq))

Parameters
- discrete_learning_rate (float): The learning rate for discrete layers.
- continuous_learning_rate (float): The learning rate for continuous layers.