340199 VU Advanced Machine Translation (2024W)
Continuous assessment course
Registration/Deregistration
Note: The time of your registration within the registration period has no effect on the allocation of places (no "first come, first served").
- Registration from Mon 16.09.2024 09:00 to Fri 27.09.2024 17:00
- Registration from Mon 14.10.2024 09:00 to Fri 18.10.2024 17:00
- Deregistration until Thu 31.10.2024 23:59
Details
max. 25 participants
Language: English
Lecturers
Dates (iCal) - the next date is marked with N
- Wednesday 16.10. 16:45 - 19:00 Medienlabor II ZfT Gymnasiumstraße 50 4th floor
- Wednesday 23.10. 16:45 - 19:00 Medienlabor II ZfT Gymnasiumstraße 50 4th floor
- Wednesday 30.10. 16:45 - 19:00 Medienlabor II ZfT Gymnasiumstraße 50 4th floor
- Wednesday 06.11. 16:45 - 19:00 Medienlabor II ZfT Gymnasiumstraße 50 4th floor
- Wednesday 13.11. 16:45 - 19:00 Medienlabor II ZfT Gymnasiumstraße 50 4th floor
- Wednesday 20.11. 16:45 - 19:00 Medienlabor II ZfT Gymnasiumstraße 50 4th floor
- Wednesday 04.12. 16:45 - 19:00 Medienlabor II ZfT Gymnasiumstraße 50 4th floor
- Wednesday 11.12. 16:45 - 19:00 Medienlabor II ZfT Gymnasiumstraße 50 4th floor
- Wednesday 08.01. 16:45 - 19:00 Medienlabor II ZfT Gymnasiumstraße 50 4th floor
- Wednesday 15.01. 16:45 - 19:00 Medienlabor II ZfT Gymnasiumstraße 50 4th floor
- N Wednesday 22.01. 16:45 - 19:00 Medienlabor II ZfT Gymnasiumstraße 50 4th floor
Information
Aims, contents and method of the course
Type of assessment and permitted materials
Continuous evaluation:
- Weekly reflections and paper presentations count for 40% of the mark.
- The "Decoding for NMT" task deliverable counts for 20% of the mark.
- The "Benchmarking of domain adaptation approaches for NMT" task deliverable counts for 40% of the mark.
Minimum requirements and assessment criteria
In order to pass this module, a student needs to reach at least the grade threshold of 4 (genügend).
MT marking map:
- excellent - sehr gut (1)
- good - gut (2)
- average - befriedigend (3)
- sufficient - genügend (4)
- insufficient - nicht genügend (5)
Examination topics
- Self-attention architectures
- Multilingual NMT
- Multilingual NMT domain adaptation
- NMT decoding
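For orientation only (not part of the official course materials): the "NMT decoding" topic can be tried out with beam search in the Hugging Face transformers library. The model name facebook/m2m100_418M, the language pair, and all decoding settings below are assumptions chosen purely for illustration.

# Illustrative sketch only: beam-search decoding with a multilingual NMT model.
# The model name (facebook/m2m100_418M) and decoding settings are assumptions
# for demonstration, not prescribed by the course.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "facebook/m2m100_418M"  # assumed public multilingual NMT model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

tokenizer.src_lang = "de"  # source language: German
inputs = tokenizer("Maschinelle Übersetzung ist faszinierend.", return_tensors="pt")

# Beam search: keep the 5 best partial hypotheses at each step and
# return the 3 highest-scoring finished translations.
outputs = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.get_lang_id("en"),  # target language: English
    num_beams=5,
    num_return_sequences=3,
    max_new_tokens=64,
)

for hypothesis in tokenizer.batch_decode(outputs, skip_special_tokens=True):
    print(hypothesis)

Comparing this beam-search output with greedy decoding (num_beams=1) is one simple way to explore the trade-offs covered under this topic.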
Literature
Core texts:
- Kenny, Dorothy. 2022. Machine translation for everyone: Empowering users in the age of artificial intelligence. (Translation and Multilingual Natural Language Processing 18). Berlin: Language Science Press. DOI: 10.5281/zenodo.6653406 (url: https://langsci-press.org/catalog/book/342)
- Koehn, P. 2020. Neural Machine Translation. Cambridge University Press
- Peter F. Brown, Stephen A. Della Pietra, Vincent J. Della Pietra, and Robert L. Mercer. 1993. The Mathematics of Statistical Machine Translation: Parameter Estimation. Computational Linguistics, 19(2):263–311.
- Collins, Michael. “Statistical Machine Translation: IBM Models 1 and 2.” (2011).
- Yi Tay, Mostafa Dehghani, Dara Bahri, and Donald Metzler. 2022. Efficient Transformers: A Survey. ACM Comput. Surv. 55, 6, Article 109 (June 2023), 28 pages. https://doi.org/10.1145/3530811
- Danielle Saunders. 2022. Domain Adaptation for Neural Machine Translation. In Proceedings of the 23rd Annual Conference of the European Association for Machine Translation, pages 9–10, Ghent, Belgium. European Association for Machine Translation.
- Xu, Lingling et al. 2023. Parameter-Efficient Fine-Tuning Methods for Pretrained Language Models: A Critical Review and Assessment. arXiv preprint arXiv:2312.12148.
Additional recommended resources:
- Visualizing A Neural Machine Translation Model (Mechanics of Seq2seq Models With Attention). https://jalammar.github.io/visualizing-neural-machine-translation-mechanics-of-seq2seq-models-with-attention/
- Voita, Lena. Neural Machine Translation Inside Out. https://lena-voita.github.io/posts/nmt_inside_out.html
- Huang, A., Subramanian, S., Sum, J., Almubarak, K., Biderman, S., & Rush, S. (2022). The Annotated Transformer. URL https://nlp.seas.harvard.edu/annotated-transformer/
Association in the course directory
Last modified: Mon 07.10.2024 10:07
Students will acquire specialised and practical knowledge of neural machine translation (NMT), self-attention architectures, multilingual NMT, domain adaptation approaches for NMT, and NMT decoding.
Using state-of-the-art technologies, students will learn to apply different approaches to customise multilingual NMT models.
Content:
- Multilingual NMT
- Domain adaptation for multilingual NMT
- Efficient transformer architectures
- Decoding for NMT
Didactic approach:
Students will need to complete practical assignments involving a range of approaches for multilingual NMT and domain adaptation. Students will also gain experience with deep generative models and decoding for NMT. The course will be taught in English, with some opportunities to use other languages to complete the coursework.
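As a purely illustrative sketch of one possible customisation approach from the reading list (parameter-efficient fine-tuning, cf. Xu et al. 2023), the snippet below attaches LoRA adapters to a pretrained translation model and runs a single fine-tuning step on a toy in-domain sentence pair. The model name Helsinki-NLP/opus-mt-de-en, the target modules, and all hyperparameters are assumptions for illustration, not requirements of the coursework.

# Illustrative sketch only: parameter-efficient domain adaptation of an NMT model
# with LoRA adapters (via the peft library). Model name, target modules and all
# hyperparameters are assumptions for the example, not course requirements.
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
from peft import LoraConfig, TaskType, get_peft_model

model_name = "Helsinki-NLP/opus-mt-de-en"  # assumed pretrained German-English NMT model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Freeze the base model and train only small low-rank adapter matrices.
lora_config = LoraConfig(
    task_type=TaskType.SEQ_2_SEQ_LM,
    r=8, lora_alpha=16, lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # attention projections in Marian models
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()

# One toy in-domain (medical) sentence pair; a real exercise would use a parallel corpus.
batch = tokenizer(
    ["Der Patient erhielt eine intravenöse Infusion."],
    text_target=["The patient received an intravenous infusion."],
    return_tensors="pt", padding=True,
)

optimizer = torch.optim.AdamW((p for p in model.parameters() if p.requires_grad), lr=1e-4)
model.train()
loss = model(**batch).loss  # standard cross-entropy NMT loss over the target tokens
loss.backward()
optimizer.step()
print(f"toy training-step loss: {loss.item():.3f}")

A real benchmarking exercise would train on an in-domain parallel corpus and compare the adapted model against the unadapted baseline.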