fitproblem - Interface between models and fitters

Fitness: Manage parameters, data, and theory function evaluation.
FitProblem: Return a fit problem instance for the fitness function(s).
load_problem: Load a problem definition from a python script file.
BaseFitProblem: See FitProblem().
MultiFitProblem: Weighted fits for multiple models.

Interface between the models and the fitters.

Fitness defines the interface that model evaluators can follow. These models can be bundled together into a FitProblem() and sent to bumps.fitters.FitDriver for optimization and uncertainty analysis.

Summary of problem attributes:

# Used by fitters
nllf(p: Optional[Vector]) -> float  # main calculation
bounds() -> Tuple[Vector, Vector]    # or equivalent sequence
setp(p: Vector) -> None
getp() -> Vector
residuals() -> Vector  # for LM, MPFit
parameter_residuals() -> Vector  # for LM, MPFit
constraints_nllf() -> float # for LM, MPFit;  constraint cost is spread across the individual residuals
randomize() -> None # for multistart
resynth_data() -> None  # for Monte Carlo resampling of maximum likelihood
restore_data() -> None # for Monte Carlo resampling of maximum likelihood
name: str  # DREAM uses this
chisq() -> float
chisq_str() -> str
labels() -> List[str]
summarize() -> str
show() -> None
load(input_path: str) -> None
save(output_path: str) -> None
plot(figfile: str, view: str) -> None

# Set/used by bumps.cli
model_reset() -> None # called by load_model
path: str  # set by load_model
name: str # set by load_model
title: str = filename # set by load_model
options: List[str]  # from sys.argv[1:]
undefined: List[int]  # when loading a saved .par file, these parameters weren't defined
store: str # set by make_store
output_path: str # set by make_store
simulate_data(noise: float) -> None # for --simulate in opts
cov() -> Matrix   # for --cov in opts
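
As a sketch of how a fitter exercises this interface (assuming problem is an existing FitProblem instance):

    p0 = problem.getp()          # current parameter vector
    lo, hi = problem.bounds()    # lower and upper bounds, one pair per parameter
    cost = problem.nllf(p0)      # negative log likelihood at p0
    problem.setp(p0)             # restore the starting point
    print(problem.chisq_str())   # normalized chisq equivalent of the nllf
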
class bumps.fitproblem.Fitness[source]

Bases: object

Manage parameters, data, and theory function evaluation.

See Complex models for a detailed explanation.

nllf()[source]

Return the negative log likelihood value of the current parameter set.

numpoints()[source]

Return the number of data points.

parameters()[source]

Return the parameters in the model.

Model parameters are a hierarchical structure of lists and dictionaries.

plot(view='linear')[source]

Plot the model to the current figure. You only get one figure, but you can make it as complex as you want. This will be saved as a png on the server, and composed onto a results web page.

residuals()[source]

Return residuals for current theory minus data.

Used for Levenberg-Marquardt, and for plotting.

restore_data()[source]

Restore the original data in the model (after resynth).

resynth_data()[source]

Generate fake data based on the uncertainties in the real data, for Monte Carlo resynth-refit uncertainty analysis (a form of parametric bootstrap).

save(basename)[source]

Save the model to a file based on basename+extension. The basename may point to a directory on a remote machine, so don't make any assumptions about information stored on the server. Return the set of files saved so that the monitor software can make a pretty web page.

to_dict()[source]
update()[source]

Called when parameters have been updated. Any cached values will need to be cleared and the model reevaluated.
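
As a hedged sketch of this interface (not part of bumps itself), a minimal Fitness-style model for a straight line might look like the following; the class name SimpleLine and the data arrays are illustrative only, and it is assumed that bumps.parameter.Parameter accepts an initial value and a name:

    import numpy as np
    from bumps.parameter import Parameter

    class SimpleLine(object):
        """Straight-line model following the Fitness interface."""
        def __init__(self, x, y, dy):
            self.x, self.y, self.dy = x, y, dy
            self.slope = Parameter(1.0, name="slope")
            self.intercept = Parameter(0.0, name="intercept")

        def parameters(self):
            # hierarchical structure of parameters; a flat dict is enough here
            return {"slope": self.slope, "intercept": self.intercept}

        def numpoints(self):
            return len(self.y)

        def update(self):
            pass  # nothing is cached, so nothing needs to be cleared

        def theory(self):
            return self.slope.value * self.x + self.intercept.value

        def residuals(self):
            return (self.theory() - self.y) / self.dy

        def nllf(self):
            return 0.5 * np.sum(self.residuals() ** 2)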

bumps.fitproblem.FitProblem(*args, **kw)[source]

Return a fit problem instance for the fitness function(s).

For an individual model:

fitness is a Fitness instance.

For a set of models:

models is a sequence of Fitness instances.

weights is an optional scale factor for each model. A weighted fit returns nllf \(L = \sum w_k^2 L_k\). If an individual nllf is the sum squared residuals then this is equivalent to scaling the measurement uncertainty by \(1/w\). Unless the measurement uncertainty is unknown, weights should be in [0, 1], representing an unknown systematic uncertainty spread across the individual measurements.

freevars is a parameter.FreeVariables instance defining the per-model parameter assignments. See Free Variables for details.

Additional parameters:

name is the name of the problem.

constraints is a function which returns the negative log likelihood of seeing the parameters independent from the fitness function. Use this for example to check for feasible regions of the search space, or to add constraints that cannot be easily calculated per parameter. Ideally, the constraints nllf will increase as you go farther from the feasible region so that the fit will be directed toward feasible values.

soft_limit is the constraints function cutoff, beyond which the penalty_nllf will be used and fitness nllf will not be calculated.

penalty_nllf is the nllf to use for fitness when constraints is greater than soft_limit.

The total nllf is the sum of the parameter nllf, the constraints nllf, and either the fitness nllf or the penalty nllf, depending on whether the constraints nllf is greater than soft_limit.

New in 0.9.0: weights are now squared when computing the sum rather than linear.
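
A usage sketch following the standard bumps curve-fitting pattern; the line function and data below are invented for illustration:

    import numpy as np
    from bumps.names import Curve, FitProblem

    def line(x, m, b):
        return m * x + b

    x = np.linspace(0, 10, 21)
    dy = 0.5 * np.ones_like(x)
    y1 = line(x, 2.0, 1.0) + np.random.normal(scale=0.5, size=x.shape)
    y2 = line(x, 2.0, -1.0) + np.random.normal(scale=0.5, size=x.shape)

    m1 = Curve(line, x, y1, dy, m=1.0, b=0.0)
    m2 = Curve(line, x, y2, dy, m=1.0, b=0.0)
    for model in (m1, m2):
        model.m.range(0, 5)
        model.b.range(-5, 5)

    problem = FitProblem(m1)                            # single model
    problem = FitProblem([m1, m2], weights=[1.0, 0.5])  # weighted multi-model fit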

bumps.fitproblem.load_problem(filename, options=None)[source]

Load a problem definition from a python script file.

sys.argv is set to [file] + options within the context of the script.

The user must define problem=FitProblem(...) within the script.

Raises ValueError if the script does not define problem.
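For example, given a hypothetical script model.py that ends with problem = FitProblem(...), it can be loaded with the following (the options list is illustrative only and is simply passed through as sys.argv[1:] inside the script):

    from bumps.fitproblem import load_problem

    problem = load_problem("model.py", options=["--n=5"])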

class bumps.fitproblem.BaseFitProblem(fitness, name=None, constraints=None, penalty_nllf=inf, soft_limit=inf, partial=False)[source]

Bases: object

See FitProblem()

bounds()[source]

Return the bounds for each parameter as a 2 x N array.

chisq()[source]

Return sum squared residuals normalized by the degrees of freedom.

In the context of a composite fit, the reduced chisq on the individual models only considers the points and the fitted parameters within the individual model.

Note that this does not include cost factors due to constraints on the parameters, such as sample_offset ~ N(0,0.01).

chisq_str()[source]

Return a string representing the chisq equivalent of the nllf.

If the model has strictly gaussian independent uncertainties then the negative log likelihood function will return 0.5*sum(residuals**2), which is 1/2*chisq. Since we are printing normalized chisq, we multiply the model nllf by 2/DOF before displaying the value. This is different from the problem nllf function, which includes the cost of the prior parameters and the cost of the penalty constraints in the total nllf. The constraint value is displayed separately.
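For a model with strictly gaussian independent uncertainties, the relationship can be sketched directly (assuming problem is a BaseFitProblem whose fitness defines residuals):

    import numpy as np

    residuals = problem.residuals()
    dof = len(residuals) - len(problem.getp())   # data points minus fitted parameters
    chisq_norm = np.sum(residuals**2) / dof      # matches problem.chisq()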

constraints_nllf()[source]

Returns the cost of all constraints.

cov()[source]

Return the covariance matrix as computed by numdifftools from the Hessian matrix for the problem at the current parameter values.

getp()[source]

Returns the current value of the parameter vector.

has_residuals

True if the underlying fitness function defines residuals.

labels()[source]

Return the list of labels, one per fitted parameter.

model_nllf()[source]

Negative log likelihood of seeing data given model.

model_parameters()[source]

Parameters associated with the model.

model_points()[source]

Number of data points associated with the model.

model_reset()[source]

Prepare for the fit.

This sets the parameters and the bounds properties that the solver is expecting from the fittable object. We also compute the degrees of freedom so that we can return a normalized fit likelihood.

If the set of fit parameters changes, then model_reset must be called.

model_update()[source]

Update the model according to the changed parameters.

nllf(pvec=None)[source]

Compute the cost function for a new parameter set p.

This is not simply the sum-squared residuals; it is the negative log likelihood of seeing the data given the model parameters plus the negative log likelihood of seeing the model parameters. The value is used for a likelihood ratio test, so normalization constants can be ignored. There is an additional penalty value provided by the model, which can be used to implement inequality constraints. Any penalty should be large enough that it is effectively excluded from the parameter space returned by uncertainty analysis.

The model is not actually calculated if the parameter nllf plus the constraint nllf is bigger than soft_limit; instead it is assigned a value of penalty_nllf. This prevents expensive models from spending time computing values in the infeasible region.
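
The evaluation order can be sketched as follows (a simplified sketch, not the actual implementation; it assumes the soft_limit and penalty_nllf constructor arguments are stored as attributes of the same name):

    def sketch_nllf(problem, pvec):
        problem.setp(pvec)
        cost = problem.parameter_nllf() + problem.constraints_nllf()
        if cost >= problem.soft_limit:
            # skip the (possibly expensive) model evaluation in the infeasible region
            return cost + problem.penalty_nllf
        return cost + problem.model_nllf()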

parameter_nllf()[source]

Returns negative log likelihood of seeing parameters p.

parameter_residuals()[source]

Returns the residuals from the priors on the parameters, so that least-squares fitters such as Levenberg-Marquardt can include the parameter cost alongside the data residuals.

plot(p=None, fignum=None, figfile=None, view=None)[source]

Plot the problem state for the current parameter set.

The underlying Fitness object plot method is called with view. It should produce its plot on the current matplotlib figure. This method will add chisq to the plot and save it to a file.

randomize(n=None)[source]

Generates a random model.

randomize() sets the model to a random value.

randomize(n) returns a population of n random models.

For indefinite bounds, the random population distribution is centered on the initial value of the parameter, or on 1.0 if the initial value is not finite.

residuals()[source]

Return the model residuals.

If the model is defined by \(y = f(x) + \epsilon\) for normally distributed error in the measurement \(y\) equal to \(\epsilon \sim N(0, \sigma^2)\), then residuals will be defined by \(R = (y - f(x))/\sigma\). If the measurement uncertainty is not normal, then the normal equivalent residuals should be defined so that the Levenberg-Marquardt fit behaves reasonably, and the plot of residuals gives an indication of which points are driving the fit.

restore_data()[source]

Restore original data after resynthesis.

resynth_data()[source]

Resynthesize data with noise from the uncertainty estimates.

save(basename)[source]

Save the problem state for the current parameter set.

The underlying Fitness object save method is called, if it exists, so that theory values can be saved in a format suitable to the problem.

Uses basename as the base of any files that are created.

setp(pvec)[source]

Set a new value for the parameters into the model. If the model is valid, calls model_update to signal that the model should be recalculated.

Returns True if the value is valid and the parameters were set, otherwise returns False.

show(_subs={})[source]

Print the available parameters to the console as a tree.

simulate_data(noise=None)[source]

Simulate data with added noise

stderr()[source]

Return the 1-sigma uncertainty estimate for each parameter and the correlation matrix R as computed from the covariance returned by cov.
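
These quantities follow directly from the covariance matrix; as a sketch:

    import numpy as np

    C = problem.cov()              # covariance from the numerical Hessian
    dp = np.sqrt(np.diag(C))       # 1-sigma uncertainty for each fitted parameter
    R = C / np.outer(dp, dp)       # correlation matrix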

summarize()[source]

Return a table of current parameter values with range bars.

to_dict()[source]
valid(pvec)[source]

Return True if the point is in the feasible region.

class bumps.fitproblem.MultiFitProblem(models, weights=None, name=None, constraints=None, soft_limit=inf, penalty_nllf=1000000.0, freevars=None)[source]

Bases: bumps.fitproblem.BaseFitProblem

Weighted fits for multiple models. See FitProblem() for an explanation of weights.

bounds()

Return the bounds for each parameter as a 2 x N array.

chisq()

Return sum squared residuals normalized by the degrees of freedom.

In the context of a composite fit, the reduced chisq on the individual models only considers the points and the fitted parameters within the individual model.

Note that this does not include cost factors due to constraints on the parameters, such as sample_offset ~ N(0,0.01).

chisq_str()

Return a string representing the chisq equivalent of the nllf.

If the model has strictly gaussian independent uncertainties then the negative log likelihood function will return 0.5*sum(residuals**2), which is 1/2*chisq. Since we are printing normalized chisq, we multiply the model nllf by 2/DOF before displaying the value. This is different from the problem nllf function, which includes the cost of the prior parameters and the cost of the penalty constraints in the total nllf. The constraint value is displayed separately.

constraints_nllf()[source]

Return the cost function for all constraints

cov()

Return the covariance matrix as computed by numdifftools from the Hessian matrix for the problem at the current parameter values.

getp()

Returns the current value of the parameter vector.

has_residuals

True if all underlying fitness functions define residuals.

labels()

Return the list of labels, one per fitted parameter.

model_nllf()[source]

Return cost function for all data sets

model_parameters()[source]

Return parameters from all models

model_points()[source]

Return number of points in all models

model_reset()

Prepare for the fit.

This sets the parameters and the bounds properties that the solver is expecting from the fittable object. We also compute the degrees of freedom so that we can return a normalized fit likelihood.

If the set of fit parameters changes, then model_reset must be called.

model_update()[source]

Let all models know they need to be recalculated

models

Iterate over models, with free parameters set from model values

nllf(pvec=None)

Compute the cost function for a new parameter set p.

This is not simply the sum-squared residuals; it is the negative log likelihood of seeing the data given the model parameters plus the negative log likelihood of seeing the model parameters. The value is used for a likelihood ratio test, so normalization constants can be ignored. There is an additional penalty value provided by the model, which can be used to implement inequality constraints. Any penalty should be large enough that it is effectively excluded from the parameter space returned by uncertainty analysis.

The model is not actually calculated if the parameter nllf plus the constraint nllf is bigger than soft_limit; instead it is assigned a value of penalty_nllf. This prevents expensive models from spending time computing values in the infeasible region.

parameter_nllf()

Returns negative log likelihood of seeing parameters p.

parameter_residuals()

Returns the residuals from the priors on the parameters, so that least-squares fitters such as Levenberg-Marquardt can include the parameter cost alongside the data residuals.

plot(p=None, fignum=1, figfile=None, view=None)[source]

Plot the problem state for the current parameter set.

The underlying Fitness object plot method is called with view. It should produce its plot on the current matplotlib figure. This method will add chisq to the plot and save it to a file.

randomize(n=None)

Generates a random model.

randomize() sets the model to a random value.

randomize(n) returns a population of n random models.

For indefinite bounds, the random population distribution is centered on the initial value of the parameter, or on 1.0 if the initial value is not finite.

residuals()[source]

Return the model residuals.

If the model is defined by \(y = f(x) + \epsilon\) for normally distributed error in the measurement \(y\) equal to \(\epsilon \sim N(0, \sigma^2)\), then residuals will be defined by \(R = (y - f(x))/\sigma\). If the measurement uncertainty is not normal, then the normal equivalent residuals should be defined so that the Levenberg-Marquardt fit behaves reasonably, and the plot of residuals gives an indication of which points are driving the fit.

restore_data()[source]

Restore original data after resynthesis.

resynth_data()[source]

Resynthesize data with noise from the uncertainty estimates.

save(basename)[source]

Save the problem state for the current parameter set.

The underlying Fitness object save method is called, if it exists, so that theory values can be saved in a format suitable to the problem.

Uses basename as the base of any files that are created.

set_active_model(i)[source]

Use free parameters from model i

setp(pvec)

Set a new value for the parameters into the model. If the model is valid, calls model_update to signal that the model should be recalculated.

Returns True if the value is valid and the parameters were set, otherwise returns False.

show()[source]

Print the available parameters to the console as a tree.

simulate_data(noise=None)[source]

Simulate data with added noise

stderr()

Return the 1-sigma uncertainty estimate for each parameter and the correlation matrix R as computed from the covariance returned by cov.

summarize()

Return a table of current parameter values with range bars.

to_dict()[source]
valid(pvec)

Return True if the point is in the feasible region.