It could learn them all. But will it?
Subscribe and turn on notifications so you don't miss any videos: http://goo.gl/0bsAjO
Large language models are astonishingly good at understanding and producing language. But there’s an often-overlooked bias toward languages that are already well represented on the internet. That means some languages might lose out on AI’s big technical advances.
Some researchers are looking into how that works — and how the balance might be shifted from these “high-resource” languages to ones that don’t yet have a huge online footprint. Their approaches range from creating original datasets, to studying the outputs of large language models, to training open source alternatives.
Watch the video above to learn more.
Further reading:
https://ruth-ann.notion.site/ruth-ann/JamPatoisNLI-A-Jamaican-Patois-Natural-Language-Inference-Dataset-91523ec89af24bfdbcb9c1ec7e28cc3c
This is the hub for Ruth-Ann Armstrong’s JamPatoisNLI. You can see the dataset and read the paper.
https://arxiv.org/search/cs?searchtype=author&query=Melero%2C+M
You can read Maite Melero’s work on Catalan here.
https://huggingface.co/bigscience/bloom
This is the Hugging Face home for BLOOM, the open source large language model.
Make sure you never miss behind-the-scenes content in the Vox Video newsletter. Sign up here: http://vox.com/video-newsletter
Vox is an explanatory newsroom on a mission to help everyone understand our weird, wonderful, complicated world, so that we can all help shape it. Part of that mission is keeping our work free.
You can help us do that by making a gift: http://www.vox.com/contribute-now
Watch our full video catalog: http://goo.gl/IZONyE
Follow Vox on TikTok: http://tiktok.com/@voxdotcom
Check out our articles: https://www.vox.com/
Listen to our podcasts: https://www.vox.com/podcasts