epbd_bert.datasets package#
Submodules#
epbd_bert.datasets.data_collators module#
epbd_bert.datasets.sequence_dataset module#
epbd_bert.datasets.sequence_epbd_dataset module#
- class epbd_bert.datasets.sequence_epbd_dataset.SequenceEPBDDataset(data_path: str, pydnaepbd_features_path: str, tokenizer: PreTrainedTokenizer, home_dir='')[source]#
Bases: SequenceDataset
Dataset for supervised fine-tuning from sequence and EPBD features.
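The library's own loading code is not shown on this page, so the following is only a minimal, self-contained sketch of the pattern the constructor arguments suggest: pairing each tokenized DNA sequence with its precomputed EPBD feature vector. The class name `ToySequenceEPBDDataset`, the toy vocabulary, and the in-memory lists are all illustrative stand-ins, not the actual `SequenceEPBDDataset` implementation.

```python
class ToySequenceEPBDDataset:
    """Hypothetical sketch: pair a tokenized DNA sequence with its EPBD features,
    mirroring the (data_path, pydnaepbd_features_path, tokenizer) arguments above."""

    def __init__(self, sequences, epbd_features):
        # sequences: list of DNA strings; epbd_features: per-position floats per sequence
        assert len(sequences) == len(epbd_features)
        self.sequences = sequences
        self.epbd_features = epbd_features

    def __len__(self):
        return len(self.sequences)

    def __getitem__(self, idx):
        seq = self.sequences[idx]
        # stand-in for a PreTrainedTokenizer: map each base to an integer id
        vocab = {"A": 0, "C": 1, "G": 2, "T": 3}
        input_ids = [vocab[base] for base in seq]
        return {"input_ids": input_ids, "epbd_features": self.epbd_features[idx]}


ds = ToySequenceEPBDDataset(["ACGT"], [[0.1, 0.2, 0.3, 0.4]])
item = ds[0]
```

In the real class, a Hugging Face `PreTrainedTokenizer` would replace the toy vocabulary and the features would be read from `pydnaepbd_features_path` rather than held in memory.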
epbd_bert.datasets.sequence_epbd_multimodal_dataset module#
- class epbd_bert.datasets.sequence_epbd_multimodal_dataset.SequenceEPBDMultiModalDataset(data_path: str, pydnaepbd_features_path: str, tokenizer: PreTrainedTokenizer, home_dir='')[source]#
Bases: SequenceEPBDDataset
Dataset for the multi-modal transformer.
epbd_bert.datasets.sequence_epbd_multimodal_labelspecific_dataset module#
- class epbd_bert.datasets.sequence_epbd_multimodal_labelspecific_dataset.SequenceEPBDMultiModalLabelSpecificDataset(data_path: str, pydnaepbd_features_path: str, tokenizer: PreTrainedTokenizer, label='wgEncodeAwgTfbsBroadDnd41CtcfUniPk', home_dir='')[source]#
Bases: SequenceEPBDMultiModalDataset
Dataset for the multi-modal transformer, restricted to a single label (default: wgEncodeAwgTfbsBroadDnd41CtcfUniPk).
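The `label` constructor argument suggests this dataset narrows a multi-label TF-binding table down to one track. Below is a hedged, self-contained sketch of that filtering step; `filter_by_label` and the in-memory records are hypothetical names for illustration, not the library's API.

```python
# Hypothetical sketch: reduce multi-label records to a binary target for one
# track, as the `label` argument (e.g. "wgEncodeAwgTfbsBroadDnd41CtcfUniPk")
# suggests the label-specific dataset does.
def filter_by_label(records, label):
    """records: iterable of (sequence, set_of_labels). Returns (sequence, 0/1) pairs."""
    return [(seq, 1 if label in labels else 0) for seq, labels in records]


records = [
    ("ACGT", {"wgEncodeAwgTfbsBroadDnd41CtcfUniPk"}),
    ("TTTT", set()),
]
binary = filter_by_label(records, "wgEncodeAwgTfbsBroadDnd41CtcfUniPk")
```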
epbd_bert.datasets.sequence_randepbd_dataset module#
- class epbd_bert.datasets.sequence_randepbd_dataset.SequenceRandEPBDDataset(data_path: str, pydnaepbd_features_path: str, tokenizer: PreTrainedTokenizer, home_dir='')[source]#
Bases: SequenceEPBDDataset
epbd_bert.datasets.sequence_randepbd_multimodal_dataset module#
- class epbd_bert.datasets.sequence_randepbd_multimodal_dataset.SequenceRandEPBDMultiModalDataset(data_path: str, pydnaepbd_features_path: str, tokenizer: PreTrainedTokenizer, home_dir='')[source]#
Bases: SequenceEPBDDataset
Dataset for the multi-modal transformer.
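This page does not say what the "RandEPBD" variants do, but the naming suggests they substitute random values for the real pyDNA-EPBD features, a common ablation/control setup. Under that assumption, here is a minimal sketch of generating reproducible per-position random features; `random_epbd_features` is a hypothetical helper, not part of the library.

```python
import random


def random_epbd_features(seq_len, seed=0):
    """Hypothetical: draw one random feature value per sequence position.
    A fixed seed keeps the ablation reproducible across runs."""
    rng = random.Random(seed)
    return [rng.random() for _ in range(seq_len)]


feats = random_epbd_features(4, seed=42)
```

Seeding per call (rather than globally) keeps each sequence's random features independent of iteration order, which matters if the dataset is shuffled between epochs.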