:py:mod:`nessai.flows.realnvp`
==============================

.. py:module:: nessai.flows.realnvp

.. autoapi-nested-parse::

   Implementation of Real Non Volume Preserving flows.


Module Contents
---------------

Classes
~~~~~~~

.. autoapisummary::

   nessai.flows.realnvp.RealNVP


Attributes
~~~~~~~~~~

.. autoapisummary::

   nessai.flows.realnvp.logger


.. py:data:: logger


.. py:class:: RealNVP(features, hidden_features, num_layers, num_blocks_per_layer, mask=None, context_features=None, net='resnet', use_volume_preserving=False, activation=F.relu, dropout_probability=0.0, batch_norm_within_layers=False, batch_norm_between_layers=False, linear_transform=None, distribution=None)

   Bases: :py:obj:`nessai.flows.base.NFlow`

   Implementation of RealNVP.

   This class modifies ``SimpleRealNVP`` from nflows to allow a custom mask
   to be passed as a numpy array and to allow an MLP to be used in place of
   a ResNet.

   > L. Dinh et al., Density estimation using Real NVP, ICLR 2017.

   :Parameters:

       **features** : int
           Number of features (dimensions) in the data space.

       **hidden_features** : int
           Number of neurons per layer in each neural network.

       **num_layers** : int
           Number of coupling transformations.

       **num_blocks_per_layer** : int
           Number of layers (or blocks for a ResNet) per neural network for
           each coupling transform.

       **mask** : array_like, optional
           Custom mask to use between coupling transforms. Can either be a
           single array with the same length as the number of features or a
           two-dimensional array of shape (# features, # num_layers). Must
           use -1 and 1 to indicate not updated and updated respectively.

       **context_features** : int, optional
           Number of context (conditional) parameters.

       **net** : {'resnet', 'mlp'}
           Type of neural network to use.

       **use_volume_preserving** : bool, optional (False)
           Use volume-preserving flows, which use only addition and no
           scaling.

       **activation** : function
           Activation function implemented in torch.

       **dropout_probability** : float, optional (0.0)
           Dropout probability used in each layer of the neural network.

       **batch_norm_within_layers** : bool, optional (False)
           Enable or disable batch norm within the neural network for each
           coupling transform.

       **batch_norm_between_layers** : bool, optional (False)
           Enable or disable batch norm between coupling transforms.

       **linear_transform** : {'permutation', 'lu', 'svd', None}
           Linear transform to use between coupling layers. Not recommended
           when using a custom mask.
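
   The following is a minimal usage sketch, not part of the generated API
   reference. It assumes the constructor signature documented above and the
   ``forward``, ``log_prob`` and ``sample`` methods inherited from
   :py:obj:`nessai.flows.base.NFlow`; the parameter values and mask are
   illustrative only.

   .. code-block:: python

      # Minimal sketch: build a 4-dimensional RealNVP flow with a custom
      # alternating mask and an MLP coupling network, then evaluate it.
      import numpy as np
      import torch

      from nessai.flows.realnvp import RealNVP

      flow = RealNVP(
          features=4,                     # dimensionality of the data space
          hidden_features=32,             # neurons per layer in each network
          num_layers=4,                   # number of coupling transforms
          num_blocks_per_layer=2,         # layers/blocks per network
          mask=np.array([1, -1, 1, -1]),  # 1 = updated, -1 = not updated
          net="mlp",                      # use an MLP instead of a ResNet
      )

      x = torch.randn(10, 4)              # batch of 10 points in 4 dimensions
      z, log_jac = flow.forward(x)        # map data to the latent space
      log_prob = flow.log_prob(x)         # log-probability under the flow
      samples = flow.sample(5)            # draw 5 new samples from the flow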