Defensive Distillation
Defensive distillation is a technique for hardening machine-learning classifiers against adversarial attacks. A first (teacher) network is trained with its softmax taken at an elevated temperature T, which smooths its output probabilities; a second (distilled) network of the same architecture is then trained on those softened labels at the same temperature. At inference time the temperature is reset to 1, which flattens the gradients of the model's outputs with respect to its inputs, making it harder for attackers to craft small input perturbations that flip the model's predictions without detection.
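The sketch below illustrates the two ingredients of the technique: a temperature-scaled softmax and the distillation loss that trains the student on the teacher's soft labels. It is a minimal sketch assuming PyTorch; the temperature value, function names, and toy logits are illustrative, not taken from any particular implementation.

```python
import torch
import torch.nn.functional as F

T = 20.0  # distillation temperature (hyperparameter; larger T smooths more)

def softened_probs(logits: torch.Tensor, T: float) -> torch.Tensor:
    """Softmax at temperature T: dividing logits by T pushes the
    resulting probability distribution toward uniform."""
    return F.softmax(logits / T, dim=-1)

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      T: float) -> torch.Tensor:
    """Cross-entropy between the teacher's soft labels and the student's
    temperature-scaled predictions, both computed at the same T."""
    teacher_probs = softened_probs(teacher_logits, T)
    student_log_probs = F.log_softmax(student_logits / T, dim=-1)
    return -(teacher_probs * student_log_probs).sum(dim=-1).mean()

# Toy demonstration: the same logits at two temperatures.
logits = torch.tensor([[4.0, 1.0, 0.2]])
print(softened_probs(logits, T=1.0))  # sharp, near one-hot
print(softened_probs(logits, T=T))    # smoothed toward uniform
```

Because the distilled network is trained at temperature T but deployed at temperature 1, its logits are effectively scaled up at inference, which saturates the softmax and shrinks the input gradients that gradient-based attack methods rely on.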