STonKGs

STonKGs model architecture components.

class BertForPreTrainingOutputWithPooling(loss=None, prediction_logits=None, seq_relationship_logits=None, hidden_states=None, attentions=None, pooler_output=None)[source]

Extends the BertForPreTrainingOutput class to additionally include the pooled output (pooler_output).

class STonKGsELMPredictionHead(config)[source]

Custom masked entity and language modeling (ELM) head used to predict both entities and text tokens.

Initialize the ELM head based on the (hyper)parameters in the provided BertConfig.

forward(hidden_states)[source]

Map the hidden states to logits over the text vocabulary (first half of the output) and the KG vocabulary (second half).
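
The half-and-half output layout can be sketched in plain Python. The helper name `split_elm_logits` and the toy sizes below are illustrative assumptions, not part of the STonKGs API:

```python
# Hypothetical sketch: the ELM head emits one combined logit vector whose
# first `text_vocab_size` entries score text tokens and whose remaining
# entries score KG entities.

def split_elm_logits(logits, text_vocab_size):
    """Split combined ELM logits into (text_logits, kg_logits)."""
    return logits[:text_vocab_size], logits[text_vocab_size:]

# Toy example: 5 text-vocab scores followed by 3 KG-vocab scores.
combined = [0.1, 0.4, 0.2, 0.0, 0.3, 0.9, 0.8, 0.7]
text_logits, kg_logits = split_elm_logits(combined, text_vocab_size=5)
```

In the real head, both halves come from a single decoder projection, so masked text tokens and masked entities can be predicted by one forward pass.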

class STonKGsForPreTraining(config, nlp_model_type='dmis-lab/biobert-v1.1', kg_embedding_dict_path='/home/docs/checkouts/readthedocs.org/user_builds/stonkgs/checkouts/stable/models/kg-hpo/embeddings_best_model.tsv')[source]

Create the pre-training part of the STonKGs model based on both text and entity embeddings.

Initialize the model architecture components of STonKGs.

forward(input_ids=None, attention_mask=None, token_type_ids=None, masked_lm_labels=None, ent_masked_lm_labels=None, next_sentence_labels=None, return_dict=None, head_mask=None)[source]

Perform one forward pass for a given sequence of text_input_ids + ent_input_ids.
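
One plausible way such a combined input could be assembled is sketched below. The names `text_input_ids` and `ent_input_ids` follow the docstring; the helper itself and the 0/1 token-type convention are assumptions for illustration, not the library's actual preprocessing:

```python
# Hedged sketch: STonKGs consumes a sequence of text token ids followed by
# entity ids, so a minimal input builder concatenates the two and marks
# which segment each position belongs to.

def build_stonkgs_inputs(text_input_ids, ent_input_ids):
    """Assemble a combined text + entity input (illustrative only)."""
    input_ids = text_input_ids + ent_input_ids
    # Assumption: segment 0 for text positions, segment 1 for entity positions.
    token_type_ids = [0] * len(text_input_ids) + [1] * len(ent_input_ids)
    attention_mask = [1] * len(input_ids)
    return {
        "input_ids": input_ids,
        "token_type_ids": token_type_ids,
        "attention_mask": attention_mask,
    }

batch = build_stonkgs_inputs([101, 2023, 102], [7, 8])
```

The resulting dictionary mirrors the keyword arguments of forward() above (input_ids, token_type_ids, attention_mask).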

classmethod from_default_pretrained(**kwargs)[source]

Get the default pre-trained STonKGs model.

Return type

STonKGsForPreTraining
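
The convenience-constructor pattern behind from_default_pretrained can be sketched with a stand-in class; the class name, the DEFAULT_NAME value, and the stored attributes below are all hypothetical, shown only to illustrate how a classmethod can forward fixed defaults to the regular loading path:

```python
# Illustrative sketch of a "from_default_pretrained"-style convenience
# classmethod: it delegates to the normal constructor with a fixed default
# checkpoint identifier, passing any extra keyword arguments through.

class PretrainedModelSketch:
    DEFAULT_NAME = "stonkgs/default-checkpoint"  # hypothetical identifier

    def __init__(self, name, **kwargs):
        self.name = name
        self.options = kwargs

    @classmethod
    def from_default_pretrained(cls, **kwargs):
        # Forward to the normal loading path with the default checkpoint.
        return cls(cls.DEFAULT_NAME, **kwargs)

model = PretrainedModelSketch.from_default_pretrained(output_attentions=True)
```

Calling the real STonKGsForPreTraining.from_default_pretrained() analogously returns a STonKGsForPreTraining instance initialized from the package's default pre-trained weights.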