Maksym Del

I am a PhD student in the Natural Language Processing group at the University of Tartu, specializing in language model interpretability and advised by Mark Fishel.

During my PhD, I taught the Mechanistic Interpretability seminar (Fall 2024) and served as a TA for Natural Language Processing (Spring 2019) and Neural Machine Translation (Fall 2017, 2018). I also gained R&D experience with the Bergamot project.

I hold an MS in Computer Science (AI focus) from the University of Tartu, where I researched neural machine translation, and a BS in Software Engineering from the Kyiv Polytechnic Institute, where I focused on software development and machine learning.

Email  /  Twitter (X)  /  GitHub  /  Google Scholar  /  LinkedIn


Research

I'm interested in mechanistic interpretability and other approaches to AI safety.


To Err Is Human, but Llamas Can Learn It Too


Agnes Luhtaru*, Taido Purason*, Martin Vainikko, Maksym Del, Mark Fishel
preprint, 2024
arxiv /

We show how to effectively use large language models for artificial error generation and grammatical error correction.


True Detective: A Deep Abductive Reasoning Benchmark Undoable for GPT-3 and Challenging for GPT-4


Maksym Del, Mark Fishel
*SEM, 2023
paper / code /

We show that GPT-4 cannot reliably solve short-form detective puzzles, even when given a chain-of-thought reasoning trace hinting at the correct answer.


Cross-lingual Similarity of Multilingual Representations Revisited


Maksym Del, Mark Fishel
AACL, 2022
paper / code /

We demonstrate that the internal cross-lingual structure is universal across multilingual language models.


Similarity of Sentence Representations in Multilingual LMs: Resolving Conflicting Literature and a Case Study of Baltic Languages


Maksym Del, Mark Fishel
BJMC, 2022
paper / code /

We resolve confusion in prior work regarding the internal cross-lingual structure of multilingual language models.


Translation Transformers Rediscover Inherent Data Domains


Maksym Del*, Elizaveta Korotkova*, Mark Fishel
WMT, 2021
paper / code /

We show that translation transformers keep their internal representations of different data domains apart.


Grammatical Error Correction and Style Transfer via Zero-shot Monolingual Translation


Elizaveta Korotkova, Agnes Luhtaru, Maksym Del, Krista Liin, Daiga Deksne, Mark Fishel
preprint, 2019
arxiv /

We present an approach that performs both grammatical error correction and style transfer out of the box with a single multilingual translation model.


Phrase-based Unsupervised Machine Translation with Compositional Phrase Embeddings


Maksym Del, Andre Tattar, Mark Fishel
WMT, 2018
paper /

We propose compositional phrase embeddings for unsupervised machine translation.


C-3MA: Tartu-Riga-Zurich Translation Systems for WMT17


Matīss Rikters, Chantal Amrhein, Maksym Del, Mark Fishel
WMT, 2017
paper / code /

We describe the neural machine translation systems of the University of Latvia, the University of Zurich, and the University of Tartu submitted as part of the WMT17 shared task.





Design and source code from Jon Barron's website