Customer-obsessed science
- December 1, 2025 (8 min read): “Network language models” will coordinate complex interactions among intelligent components, computational infrastructure, access points, data centers, and more.
Featured news
- IJCNN 2022: Beyond accuracy, diversity has become a crucial factor in evaluating a recommendation system, as higher diversity helps mitigate the echo-chamber issue and improve user satisfaction. Recently, great progress has been made in improving diversity, but the approaches often come at the cost of much lower accuracy. In this work, we propose contrastive co-training for diversified recommendation, which improves diversity greatly. (An illustrative intra-list-diversity sketch follows this list.)
- ICPR 2022: A major challenge encountered in the offline evaluation of machine learning models before they are released to production is the discrepancy between the distributions of the offline test data and the online data, due to, e.g., biased sampling schemes, data aging, and regime shifts. Consequently, offline evaluation metrics often do not reflect the actual performance of the model. (An illustrative importance-weighting sketch follows this list.)
- NAACL 2022: In production SLU systems, new training data becomes available over time, so ML models need to be updated on a regular basis. Specifically, releasing new features adds new classes of data while the old data remains constant. However, retraining the full model from scratch at each release is computationally expensive. To address this problem, we propose to consider production releases from the curriculum learning perspective.
- Improving distantly supervised document-level relation extraction through natural language inference (NAACL 2022 Workshop on Deep Learning for Low-Resource NLP): The distant supervision (DS) paradigm has been widely used for relation extraction (RE) to alleviate the need for expensive annotations. However, it suffers from noisy labels, which leads to worse performance than models trained on human-annotated data, even when trained on hundreds of times more data. We present a systematic study on the use of natural language inference (NLI) to improve distantly supervised relation extraction. (An illustrative NLI-filtering sketch follows this list.)
- PRX Quantum 2022: We study the dynamical properties of the bosonic quantum East model at low temperature. We show that a naive generalization of the corresponding spin-1/2 quantum East model does not possess analogous slow dynamical properties. In particular, in contrast to the spin case, the bosonic ground state turns out not to be localized. We restore localization by introducing a repulsive interaction term.
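As a rough companion to the IJCNN 2022 item above, the sketch below computes intra-list diversity, i.e., the average pairwise cosine distance among the items in one recommendation list. This is only a generic illustration of how recommendation diversity can be measured, not the paper's contrastive co-training method; the item embeddings are random placeholders.

```python
# Illustrative sketch only: one common recommendation-diversity measure
# (intra-list diversity), not the contrastive co-training method from the paper.
# The embeddings and the recommended list below are made-up placeholders.
import numpy as np

def intra_list_diversity(item_embeddings: np.ndarray) -> float:
    """Average pairwise cosine distance among recommended items.

    item_embeddings: array of shape (k, d), one row per recommended item.
    Higher values mean the recommendation list covers more dissimilar items.
    """
    normed = item_embeddings / np.linalg.norm(item_embeddings, axis=1, keepdims=True)
    sims = normed @ normed.T                  # k x k pairwise cosine similarities
    k = len(item_embeddings)
    # Average the off-diagonal similarities, then convert similarity to distance.
    pairwise_sum = sims.sum() - np.trace(sims)
    return 1.0 - pairwise_sum / (k * (k - 1))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    recommended = rng.normal(size=(10, 64))   # 10 recommended items, 64-dim embeddings
    print(f"intra-list diversity: {intra_list_diversity(recommended):.3f}")
```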
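For the ICPR 2022 item on the offline/online distribution mismatch, one standard textbook remedy, shown here purely as an illustration and not as the paper's approach, is importance weighting: fit a domain classifier to estimate the density ratio p_online(x)/p_offline(x) and reweight the offline metric with it. All data in the sketch is synthetic.

```python
# Illustrative sketch only, not the ICPR 2022 paper's method: estimate importance
# weights p_online(x) / p_offline(x) with a domain classifier and use them to
# reweight an offline evaluation metric. All data below is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

def estimate_importance_weights(offline_x: np.ndarray, online_x: np.ndarray) -> np.ndarray:
    """Train a classifier to distinguish online from offline samples; the odds of
    its predictions on offline points approximate the density ratio."""
    X = np.vstack([offline_x, online_x])
    y = np.concatenate([np.zeros(len(offline_x)), np.ones(len(online_x))])
    clf = LogisticRegression(max_iter=1000).fit(X, y)
    p_online = clf.predict_proba(offline_x)[:, 1]
    # Convert probability to odds and correct for unequal sample sizes.
    return (len(offline_x) / len(online_x)) * p_online / (1.0 - p_online)

def weighted_accuracy(y_true: np.ndarray, y_pred: np.ndarray, weights: np.ndarray) -> float:
    """Offline accuracy reweighted toward the online distribution."""
    correct = (y_true == y_pred).astype(float)
    return float(np.average(correct, weights=weights))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    offline_x = rng.normal(0.0, 1.0, size=(500, 5))   # offline test features
    online_x = rng.normal(0.5, 1.0, size=(500, 5))    # shifted online features
    y_true = (offline_x[:, 0] > 0).astype(int)
    y_pred = (offline_x[:, 0] + rng.normal(0, 0.5, 500) > 0).astype(int)
    w = estimate_importance_weights(offline_x, online_x)
    print(f"plain offline accuracy:    {np.mean(y_true == y_pred):.3f}")
    print(f"weighted offline accuracy: {weighted_accuracy(y_true, y_pred, w):.3f}")
```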
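For the NAACL workshop item on NLI and distant supervision, the sketch below illustrates the general idea of filtering distantly supervised relation labels with an entailment check: keep an example only if the document entails a verbalized form of the candidate relation. The `entailment_score` callable and the relation templates are hypothetical placeholders, and this is not the paper's actual method.

```python
# Illustrative sketch only, not the paper's method: filter distantly supervised
# relation labels by checking that the document entails a verbalized hypothesis.
# `entailment_score` and RELATION_TEMPLATES are hypothetical placeholders; in
# practice they would wrap a pretrained NLI model and a per-relation template set.
from typing import Callable, List, Tuple

RELATION_TEMPLATES = {
    "founded_by": "{head} was founded by {tail}.",
    "born_in": "{head} was born in {tail}.",
}

def filter_ds_examples(
    examples: List[Tuple[str, str, str, str]],   # (document, head, tail, relation)
    entailment_score: Callable[[str, str], float],
    threshold: float = 0.5,
) -> List[Tuple[str, str, str, str]]:
    """Keep only examples whose document entails the relation hypothesis."""
    kept = []
    for document, head, tail, relation in examples:
        hypothesis = RELATION_TEMPLATES[relation].format(head=head, tail=tail)
        if entailment_score(document, hypothesis) >= threshold:
            kept.append((document, head, tail, relation))
    return kept

if __name__ == "__main__":
    # Trivial stand-in scorer so the sketch runs end to end.
    def toy_scorer(premise: str, hypothesis: str) -> float:
        return 1.0 if hypothesis.split()[0] in premise else 0.0

    examples = [
        ("Acme Corp was founded by Jane Doe in 1999.", "Acme Corp", "Jane Doe", "founded_by"),
        ("The weather in Paris was sunny.", "Acme Corp", "Jane Doe", "founded_by"),
    ]
    print(filter_ds_examples(examples, toy_scorer))
```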
Collaborations
Whether you're a faculty member or student, there are a number of ways you can engage with Amazon.