Predicting gene expression from histone modifications with self-attention based neural networks and transfer learning
Yuchi Chen, Minzhu Xie, Jie Wen
- PMID: 36588793
- PMCID: PMC9797047
- DOI: 10.3389/fgene.2022.1081842
Abstract
It is well known that histone modifications play an important part in various chromatin-dependent processes such as DNA replication, repair, and transcription. Computational models that predict gene expression from histone modifications have been intensively studied. However, the accuracy of existing models still has room for improvement, especially for cross-cell-line gene expression prediction. In this work, we propose a new deep learning model, TransferChrome, to predict gene expression from histone modifications. The model uses a densely connected convolutional network to capture features of histone modification data and self-attention layers to aggregate global features. For cross-cell-line gene expression prediction, TransferChrome adopts transfer learning to improve prediction accuracy. We trained and tested our model on 56 different cell lines from the REMC database. The experimental results show that our model achieved an average Area Under the Curve (AUC) score of 84.79%. Compared to three state-of-the-art models, TransferChrome improves prediction performance on most cell lines. In cross-cell-line prediction experiments, TransferChrome performed best, demonstrating that it is an efficient model for cross-cell-line gene expression prediction.
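The architecture outlined in the abstract (densely connected convolutions over binned histone-modification signals, followed by self-attention and a binary expression classifier) can be illustrated with a minimal sketch. The PyTorch code below is not the authors' published implementation; the class name TransferChromeSketch, the layer sizes, bin counts, and pooling choices are assumptions made purely for illustration.

```python
import torch
import torch.nn as nn

class TransferChromeSketch(nn.Module):
    """Illustrative sketch (not the authors' code): dense 1-D convolutions
    over binned histone-modification signals, self-attention over bins,
    and a binary high/low expression classifier."""

    def __init__(self, n_marks=5, growth=32, n_layers=3, attn_dim=128, n_heads=4):
        super().__init__()
        # Densely connected convolutions: each layer sees all earlier feature maps.
        self.dense_layers = nn.ModuleList()
        channels = n_marks
        for _ in range(n_layers):
            self.dense_layers.append(nn.Sequential(
                nn.Conv1d(channels, growth, kernel_size=5, padding=2),
                nn.BatchNorm1d(growth),
                nn.ReLU(),
            ))
            channels += growth  # dense connectivity grows the channel count
        self.proj = nn.Linear(channels, attn_dim)
        # Self-attention aggregates global context across all bins.
        self.attn = nn.TransformerEncoderLayer(
            d_model=attn_dim, nhead=n_heads, dim_feedforward=256, batch_first=True)
        self.classifier = nn.Linear(attn_dim, 1)  # logit for high vs. low expression

    def forward(self, x):
        # x: (batch, n_marks, n_bins) histone-modification signal matrix per gene
        feats = [x]
        for layer in self.dense_layers:
            feats.append(layer(torch.cat(feats, dim=1)))
        h = torch.cat(feats, dim=1).transpose(1, 2)  # (batch, n_bins, channels)
        h = self.attn(self.proj(h))                  # global self-attention over bins
        return self.classifier(h.mean(dim=1))        # pooled logit per gene
```

Under this sketch, the transfer-learning step would amount to training the network on source cell lines and then fine-tuning it on the target cell line, typically with a smaller learning rate.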
Keywords: convolutional neural network; deep learning; gene expression; histone modification; transfer learning.
Copyright © 2022 Chen, Xie and Wen.