A case for knowledge distillation

As part of our ongoing machine learning improvements, our engineers needed to reduce the complexity of their model without a drastic drop in accuracy. We identified knowledge distillation as the technique best suited to our needs. This article consolidates the various points of information and aims to deliver a strong understanding of how knowledge distillation works.

We are presenting at EMNLP in Brussels

We’re presenting our toxic language research at EMNLP on October 31, 2018. Isuru Gunasekara, co-author of the publication “A Review of Standard Text Classification Practices for Multi-label Toxicity Identification of Online Content”, will highlight the research findings and demonstrate live results.

Henri Kuschkowitz