:py:mod:`nessai.flowmodel`
==========================

.. py:module:: nessai.flowmodel

.. autoapi-nested-parse::

   Object and functions to handle training the normalising flow.


Module Contents
---------------

Classes
~~~~~~~

.. autoapisummary::

   nessai.flowmodel.FlowModel


Functions
~~~~~~~~~

.. autoapisummary::

   nessai.flowmodel.update_config


Attributes
~~~~~~~~~~

.. autoapisummary::

   nessai.flowmodel.logger


.. py:data:: logger

.. py:function:: update_config(d)

   Update the configuration dictionary to include the defaults.

   :Parameters:

       **d** : dict
           Dictionary with the configuration.

   :Returns:

       dict
           Dictionary with the updated default configuration.

   .. rubric:: Notes

   The default configuration is::

       lr=0.001
       annealing=False
       batch_size=100
       val_size=0.1
       max_epochs=500
       patience=20
       noise_scale=0.0
       use_dataloader=False
       optimiser='adam'
       optimiser_kwargs={}
       model_config=default_model

   where ``default_model`` is::

       n_neurons=32
       n_blocks=4
       n_layers=2
       ftype='RealNVP'
       device_tag='cpu'
       flow=None
       inference_device_tag=None
       kwargs={batch_norm_between_layers=True, linear_transform='lu'}

   ``kwargs`` can contain any additional keyword arguments that are specific
   to the type of flow being used.

.. py:class:: FlowModel(config=None, output='./')

   Object that contains the normalising flow and handles training and data
   pre-processing.

   Does NOT use structured arrays for live points; the
   :obj:`~nessai.proposal.base.Proposal` object should act as the interface
   between the structured arrays used by the sampler and the unstructured
   arrays of live points used for training.

   :Parameters:

       **config** : dict, optional
           Configuration used for the normalising flow. If None, default
           values are used.

       **output** : str, optional
           Path for output; this includes weights files and the loss plot.

   .. py:attribute:: model_config

   .. py:method:: save_input(self, config, output_file=None)

      Save the dictionary used as input as a JSON file in the output
      directory.

      :Parameters:

          **config** : dict
              Dictionary to save.

          **output_file** : str, optional
              File to save the config to.

   .. py:method:: setup_from_input_dict(self, config)

      Set up the trainer from a dictionary; all keys in the dictionary are
      added as attributes of the object. The input is automatically saved.

      :Parameters:

          **config** : dict
              Dictionary with parameters that are used to update the
              defaults.

   .. py:method:: update_mask(self)

      Method to update the mask upon calling ``initialise``.

      By default the mask is left unchanged.

   .. py:method:: get_optimiser(self, optimiser='adam', **kwargs)

      Get the optimiser and ensure it is always correctly initialised.

      :Returns:

          :obj:`torch.optim.Adam`
              Instance of the Adam optimiser from ``torch.optim``.

   .. py:method:: initialise(self)

      Initialise the model and optimiser.

      This includes:

      - Updating the model configuration
      - Initialising the normalising flow
      - Initialising the optimiser
      - Configuring the inference device

   .. py:method:: move_to(self, device, update_default=False)

      Move the flow to a different device.

      :Parameters:

          **device** : str
              Device to move the flow to.

          **update_default** : bool, optional
              If True, the default device for the flow (and data) will be
              updated.
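   As a rough illustration of how the defaults listed under ``update_config``
   map onto this class, the sketch below constructs and initialises a model.
   The ``n_inputs`` key (the dimensionality of the training data) and the
   ``outdir`` path are illustrative assumptions rather than documented
   defaults::

      from nessai.flowmodel import FlowModel

      # Keys mirror the defaults listed in update_config; anything omitted
      # falls back to those defaults.
      config = dict(
          lr=0.001,
          batch_size=100,
          max_epochs=500,
          patience=20,
          model_config=dict(
              n_blocks=4,
              n_layers=2,
              n_neurons=32,
              ftype='RealNVP',
              n_inputs=2,  # assumed key: dimensionality of the training data
          ),
      )

      # 'outdir' is an arbitrary example path
      flow_model = FlowModel(config=config, output='outdir')
      flow_model.initialise()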
   .. py:method:: prep_data(self, samples, val_size, batch_size, use_dataloader=False)

      Prepare the data and return dataloaders or tensors for training and
      validation.

      :Parameters:

          **samples** : array_like
              Array of samples to split into training and validation sets.

          **val_size** : float
              Float between 0 and 1 that defines the fraction of data used
              for validation.

          **batch_size** : int
              Batch size used when constructing dataloaders.

          **use_dataloader** : bool, optional
              If True the data is returned in dataloaders, else tensors are
              returned.

      :Returns:

          **train_data, val_data**
              Training and validation data as either tensors or dataloaders.

   .. py:method:: train(self, samples, max_epochs=None, patience=None, output=None, val_size=None, plot=True)

      Train the flow on a set of samples.

      Allows for training parameters to be specified instead of those given
      in the initial config.

      :Parameters:

          **samples** : ndarray
              Unstructured numpy array containing the data to train on.

          **max_epochs** : int, optional
              Maximum number of epochs that is used instead of the value in
              the configuration.

          **patience** : int, optional
              Patience in number of epochs that is used instead of the value
              in the configuration.

          **val_size** : float, optional
              Fraction of the samples to use for validation.

          **output** : str, optional
              Path to the output directory that is used instead of the path
              specified when the object was initialised.

          **plot** : bool, optional
              Boolean to enable or disable plotting the loss function.

   .. py:method:: save_weights(self, weights_file)

      Save the weights file. If the file already exists move it to ``.old``
      and then save the file.

      :Parameters:

          **weights_file** : str
              Path to the file in which to save the weights. Recommended file
              type is ``.pt``.

   .. py:method:: load_weights(self, weights_file)

      Load weights for the model and initialise the model if it has not been
      initialised. The ``weights_file`` attribute is also updated.

      The model is loaded in evaluation mode (``model.eval()``).

      :Parameters:

          **weights_file** : str
              Path to the weights file.

   .. py:method:: reload_weights(self, weights_file)

      Try to load the given weights file and, if none is provided, load the
      weights file stored internally.

      :Parameters:

          **weights_file** : str
              Path to the weights file.

   .. py:method:: reset_model(self, weights=True, permutations=False)

      Reset the weights of the model and optimiser.

      :Parameters:

          **weights** : bool, optional
              If True the model weights are reset.

          **permutations** : bool, optional
              If True any permutations (linear transforms) are reset.

   .. py:method:: forward_and_log_prob(self, x)

      Forward pass through the model; returns the samples in the latent
      space with their log probabilities.

      :Parameters:

          **x** : ndarray
              Array of samples.

      :Returns:

          **z** : ndarray
              Samples in the latent space.

          **log_prob** : ndarray
              Log probabilities for each sample.
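   A minimal usage sketch for training and the forward pass, assuming the
   ``flow_model`` constructed in the earlier example; the two-dimensional
   Gaussian samples are purely illustrative::

      import numpy as np

      # Unstructured (N, dims) array of training samples; a stand-in
      # two-dimensional Gaussian is used here.
      samples = np.random.randn(1000, 2)

      # Train with the settings from the config; values such as max_epochs
      # can also be overridden per call.
      flow_model.train(samples, max_epochs=100)

      # Map new samples to the latent space with their log probabilities.
      z, log_prob = flow_model.forward_and_log_prob(np.random.randn(10, 2))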
   .. py:method:: sample_and_log_prob(self, N=1, z=None, alt_dist=None)

      Generate samples from the base distribution, or from an alternative
      distribution given provided latent samples.

      :Parameters:

          **N** : int, optional
              Number of samples to draw if ``z`` is not specified.

          **z** : ndarray, optional
              Array of latent samples to map to the data space. If
              ``alt_dist`` is not specified they are assumed to be drawn from
              the base distribution of the flow.

          **alt_dist** : :obj:`nflows.distribution.Distribution`
              Distribution object from which the latent samples ``z`` were
              drawn. Must have a ``log_prob`` method that accepts an instance
              of ``torch.Tensor``.

      :Returns:

          **samples** : ndarray
              Array containing the generated samples.

          **log_prob** : ndarray
              Array containing the log probability that corresponds to each
              sample.
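   As a closing sketch, new samples can be drawn from the trained flow
   together with their log probabilities; the number of samples is
   arbitrary::

      # Draw 100 samples from the base distribution and map them through the
      # flow, returning the samples and their log probabilities.
      samples, log_prob = flow_model.sample_and_log_prob(N=100)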