
Joint Multilingual Supervision for Cross-lingual Entity Linking.pdf Download

  • Updated: 2024-07-16 11:02:38
  • Size: 1.95MB
  • Rating: ★★★★★
  • Source: Shared by user upload
  • Category: Deep Learning - Artificial Intelligence
  • Format: PDF

Resource Description

Cross-lingual Entity Linking (XEL) aims to ground entity mentions written in any language to an English Knowledge Base (KB), such as Wikipedia. XEL for most languages is challenging, owing to the limited availability of resources as supervision. We address this challenge by developing the first XEL approach that combines supervision from multiple languages jointly. This enables our approach to: (a) augment the limited supervision in the target language with additional supervision from a high-resource language (like English), and (b) train a single entity linking model for multiple languages, improving upon individually trained models for each language. Extensive evaluation on three benchmark datasets across 8 languages shows that our approach significantly improves over the current state-of-the-art. We also provide analyses in two limited-resource settings: (a) the zero-shot setting, when no supervision in the target language is available, and (b) the low-resource setting, when some supervision in the target language is available. Our analysis provides insights into the limitations of zero-shot XEL approaches in realistic scenarios, and shows the value of joint supervision in low-resource settings.
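
The core idea described above is to pool supervision from a high-resource language with the limited supervision in the target language and train one shared linking model. Below is a minimal, hypothetical sketch of that pooling-and-joint-training pattern; the toy feature names (prior probability, name similarity, context overlap), the logistic candidate scorer, and all data in it are illustrative assumptions, not the paper's actual architecture or results.

import random
import math

def make_toy_examples(n, seed):
    """Generate toy (mention-candidate feature vector, label) pairs.

    Features are placeholders: [prior_prob, name_similarity, context_overlap].
    """
    rng = random.Random(seed)
    examples = []
    for _ in range(n):
        prior, name_sim, ctx = rng.random(), rng.random(), rng.random()
        # Toy labeling rule: correct candidates tend to score higher overall.
        label = 1 if prior + name_sim + ctx + rng.gauss(0, 0.5) > 1.5 else 0
        examples.append(([prior, name_sim, ctx], label))
    return examples

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(examples, epochs=50, lr=0.1):
    """Fit a single linear candidate scorer on pooled examples with SGD."""
    w, b = [0.0, 0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in examples:
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            g = p - y
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

# Supervision sources: many English pairs, few target-language pairs
# (the low-resource setting the abstract refers to).
english_pairs = make_toy_examples(5000, seed=0)
target_pairs = make_toy_examples(50, seed=1)

# Joint supervision: one model trained on the union of both sources,
# instead of a separate model per language.
joint_model = train_logistic(english_pairs + target_pairs)

# Baseline for comparison: a target-language-only model.
target_only_model = train_logistic(target_pairs)

In the zero-shot setting the target-language pairs would be absent entirely, and the shared model would rely on the high-resource supervision alone; the design choice of a single scorer over pooled data is what lets added English examples benefit the target language.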