crested.tl.zoo.dilated_cnn_decoupled
- crested.tl.zoo.dilated_cnn_decoupled(seq_len, num_classes, first_conv_filters=512, first_conv_filter_size=5, first_conv_pool_size=0, first_conv_activation='gelu', first_conv_l2=1e-05, first_conv_dropout=0.1, n_dil_layers=8, num_filters=512, filter_size=3, activation='relu', output_activation='softplus', l2=1e-05, dropout=0.1, batch_norm=True, dense_bias=True)
Construct a CNN using dilated convolutions with a separate dense head per output class.
- Parameters:
  - seq_len (int) – Width of the input region.
  - num_classes (int) – Number of classes to predict.
  - first_conv_filters (int, default: 512) – Number of filters in the first convolutional layer.
  - first_conv_filter_size (int, default: 5) – Size of the kernel in the first convolutional layer.
  - first_conv_pool_size (int, default: 0) – Size of the pooling kernel in the first convolutional layer.
  - first_conv_activation (str, default: 'gelu') – Activation function in the first convolutional layer.
  - first_conv_l2 (float, default: 1e-05) – L2 regularization for the first convolutional layer.
  - first_conv_dropout (float, default: 0.1) – Dropout rate for the first convolutional layer.
  - n_dil_layers (int, default: 8) – Number of dilated convolutional layers.
  - num_filters (int, default: 512) – Number of filters in the dilated convolutional layers.
  - filter_size (int, default: 3) – Size of the kernel in the dilated convolutional layers.
  - activation (str, default: 'relu') – Activation function in the dilated convolutional layers.
  - output_activation (str, default: 'softplus') – Activation function for the output layer.
  - l2 (float, default: 1e-05) – L2 regularization for the dilated convolutional layers.
  - dropout (float, default: 0.1) – Dropout rate for the dilated convolutional layers.
  - batch_norm (bool, default: True) – Whether or not to use batch normalization.
  - dense_bias (bool, default: True) – Whether or not to add a bias to the dense layers.
- Return type:
  Model
- Returns:
  A Keras model.
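Since the model's field of view is set by `first_conv_filter_size`, `filter_size`, and `n_dil_layers`, it can help to check the receptive field before picking `seq_len`. The sketch below assumes the dilation rate doubles with each dilated layer (2, 4, 8, ...), a common schedule for this architecture family; consult the CREsted source for the exact schedule used.

```python
def receptive_field(first_conv_filter_size=5, n_dil_layers=8, filter_size=3):
    """Approximate receptive field (in bp) of the dilated convolution stack.

    Assumes dilation rate 2**i at dilated layer i (a doubling schedule);
    this is an illustrative assumption, not taken from the CREsted docs.
    """
    rf = first_conv_filter_size  # first (non-dilated) convolution
    for layer in range(1, n_dil_layers + 1):
        # each dilated conv widens the field by (kernel - 1) * dilation
        rf += (filter_size - 1) * 2 ** layer
    return rf

# With the defaults (kernel 5, then 8 dilated layers of kernel 3):
print(receptive_field())  # 1025 under the assumed doubling schedule
```

If the receptive field exceeds `seq_len`, the outermost layers see mostly padding, so increasing `n_dil_layers` beyond that point yields diminishing returns.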