Abstract: Recent advances in Deep Learning (DL) and Artificial Intelligence (AI) algorithms have proven greatly beneficial in many applications. The success of such algorithms over the last decade was built mainly on supervised training of very complex models using large labeled datasets. However, the data labeling process is expensive and time-consuming, especially for applications with sequential data. Self-Supervised Learning (SSL) has emerged as a way to train complex models on unlabeled datasets. The main idea behind SSL is to create auxiliary tasks based on properties of the dataset in order to learn a latent space that captures the data semantics. Models trained with SSL can later be fine-tuned on much smaller labeled datasets. This learning scheme has been very successful in Computer Vision (CV) and Natural Language Processing (NLP) applications but has not yet generalized to other sequential data applications. In this work, we propose a novel DL framework for sequential data processing that adopts SSL techniques to make use of unlabeled data. Our framework is evaluated on two applications from two different domains: Crop Identification and mRNA Vaccine Degradation Prediction. Details of the datasets, pre-processing steps, and learning objectives of both applications are explained. Experimental results demonstrate that the proposed framework offers a promising approach for incorporating SSL into the training pipeline of commonly used DL models.