Abstract:
Sign language is the primary means of communication for the hearing impaired. However, retrieving information from sign language is not as straightforward as in spoken languages. In this thesis, three methods of information retrieval are proposed for sign language. Firstly, a Dynamic Time Warping based Query-by-Example search technique is developed to enable the Deaf to search for information in their native language. Secondly, a translation-based cross-lingual keyword search method is proposed, which enables people outside of the Deaf community to learn sign language in context by issuing queries in the written language. Finally, as the main contribution of this thesis, a weakly supervised keyword search technique is designed based on neural word embeddings and the attention mechanism from neural machine translation. This technique is shown to be capable of performing both gloss search and cross-lingual written keyword search, and it can be used together with different input features such as human pose estimates and various hand shape features. Our experiments, conducted on three datasets, namely HospiSign, RWTH-Phoenix-Weather 2014T, and the MeineDGS corpus, indicate that human pose estimates extracted with the OpenPose framework generally perform well across different sign language retrieval tasks, especially when combined with Spatio-Temporal Graph Convolution. Furthermore, attention-based models are found to be able to temporally localize the keywords as a by-product of weakly supervised training.