LAD: Layer-Wise Adaptive Distillation for BERT Model Compression

Recent advances with large-scale pre-trained language models (e.g., BERT) have brought significant potential to natural language processing. However, the large model size hinders their use in IoT and edge devices. Several studies have utilized task-specific knowledge distillation to compress the pre-trained language models. However, to reduce the number…
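
For orientation, below is a minimal sketch of the soft-label distillation objective that task-specific knowledge distillation builds on, assuming a PyTorch setup; the function name distillation_loss and the temperature/alpha values are illustrative and do not reproduce LAD's layer-wise adaptive scheme itself.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    # Soft targets: soften both distributions with a temperature, then
    # match the student to the teacher with KL divergence (scaled by T^2
    # so gradients keep a comparable magnitude).
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    kd = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * temperature ** 2
    # Hard targets: ordinary cross-entropy on the downstream task labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

# Toy usage: batch of 4 examples, 3-way classification.
student = torch.randn(4, 3, requires_grad=True)
teacher = torch.randn(4, 3)
labels = torch.tensor([0, 2, 1, 0])
loss = distillation_loss(student, teacher, labels)
loss.backward()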

Concise review of mechanisms of bacterial adhesion to biomaterials and of techniques used in estimating bacteria-material interactions

This article reviews the mechanisms of bacterial adhesion to biomaterial surfaces, the factors affecting the adhesion, the techniques used in estimating bacteria-material interactions, and the models that have been developed to predict adhesion. The process of bacterial adhesion includes an initial physicochemical interaction phase and a later…
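
As an example of the kind of predictive model such reviews cover, classical DLVO theory describes the initial physicochemical phase by summing the interaction free energies between the bacterium and the substratum at separation distance d; the notation below is illustrative and not quoted from the article.

% Classical DLVO: total interaction energy as the sum of the
% Lifshitz-van der Waals and electrostatic double-layer contributions;
% extended (XDLVO) versions add a Lewis acid-base term G^{AB}(d).
\begin{equation}
  G^{\mathrm{tot}}(d) = G^{\mathrm{LW}}(d) + G^{\mathrm{EL}}(d)
  \qquad \bigl[\, +\, G^{\mathrm{AB}}(d) \ \text{in XDLVO} \,\bigr]
\end{equation}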
