LeakyReLU
- class torch.ao.nn.quantized.LeakyReLU(scale, zero_point, negative_slope=0.01, inplace=False, device=None, dtype=None)
This is the quantized equivalent of LeakyReLU.

- Parameters:
  - scale – quantization scale of the output tensor
  - zero_point – quantization zero point of the output tensor
  - negative_slope – Controls the angle of the negative slope. Default: 1e-2
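A minimal usage sketch: the module operates on quantized tensors, so the input is first quantized with torch.quantize_per_tensor; the scale and zero_point values below are illustrative choices, not prescribed defaults.

```python
import torch

# Quantize a float tensor to quint8 (scale/zero_point chosen for illustration)
x = torch.randn(4)
qx = torch.quantize_per_tensor(x, scale=0.1, zero_point=128, dtype=torch.quint8)

# Quantized LeakyReLU; scale and zero_point describe the OUTPUT tensor's
# quantization parameters, not the input's
m = torch.ao.nn.quantized.LeakyReLU(scale=0.1, zero_point=128, negative_slope=0.01)
y = m(qx)  # quantized tensor with the requested output scale/zero_point
```

To inspect the result, `y.dequantize()` returns the float approximation, and `y.q_scale()` / `y.q_zero_point()` report the output quantization parameters set at construction.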