Handwriting Transformers

ICCV 2021

Ankan Kumar Bhunia1,    Salman Khan1,2,    Hisham Cholakkal1,    Rao Muhammad Anwer1
Fahad Shahbaz Khan1,4,    Mubarak Shah3

2Australian National University, Australia
3University of Central Florida, USA
4Linköping University, Sweden


Our method can mimic a person's handwriting style from a few sample images.

Formally, given (a) a set of handwritten word images serving as few-shot calligraphic style examples of one writer, and (b) a query text from an unconstrained vocabulary, our model strives to generate handwritten images of the query text in the writing style of the given writer.

  1. Our proposed HWT imitates the style of a writer for a given query content through self- and encoder-decoder attention, which emphasizes the self-attentive style features relevant to each character in that query.
  2. This enables us to (a) capture style-content entanglement at the character level, and (b) model both the global and the local style features of a given calligraphic style.
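The attention scheme described above can be sketched in a few lines of NumPy. This is a minimal illustration only: the array shapes, variable names, and the plain single-head scaled dot-product formulation are assumptions for exposition, not the authors' implementation. Style features from the few-shot example images are first self-attended (global style), and each query character embedding then cross-attends to them, yielding a character-specific style representation (local, character-level style).

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # scaled dot-product attention: softmax(QK^T / sqrt(d)) V
    d = q.shape[-1]
    return softmax(q @ k.T / np.sqrt(d)) @ v

rng = np.random.default_rng(0)
d = 64
style_feats = rng.standard_normal((15, d))  # hypothetical features from few-shot style images
char_embeds = rng.standard_normal((7, d))   # hypothetical embeddings of the query characters

# Self-attention over style features: models global calligraphic style
style_ctx = attention(style_feats, style_feats, style_feats)

# Encoder-decoder (cross) attention: each query character attends to the
# self-attentive style features, giving a per-character style representation
char_style = attention(char_embeds, style_ctx, style_ctx)
print(char_style.shape)  # one style vector per query character
```

The key point the sketch captures is that the cross-attention output has one row per query character, so style and content are entangled at the character level rather than via a single global style vector.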


@InProceedings{Bhunia_2021_ICCV,
    author    = {Bhunia, Ankan Kumar and Khan, Salman and Cholakkal, Hisham and Anwer, Rao Muhammad and Khan, Fahad Shahbaz and Shah, Mubarak},
    title     = {Handwriting Transformers},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
    month     = {October},
    year      = {2021},
    pages     = {1086-1094}}