BoxSquaredELModule

class mowl.nn.BoxSquaredELModule(nb_ont_classes, nb_rels, embed_dim=50, gamma=0, delta=2, reg_factor=0.05)[source]

Bases: ELModule

Implementation of Box\(^2\)EL from [jackermeier2023].
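
A minimal instantiation sketch (illustrative only): the class and relation counts below are hypothetical placeholders; in practice they come from the ontology, e.g. via the ELDataset referenced throughout this page. The JVM initialisation line follows the pattern used elsewhere in mOWL's documentation and may be unnecessary if the JVM is already running.

    import torch
    import mowl
    mowl.init_jvm("2g")  # start mOWL's JVM, commonly done once before importing mOWL submodules
    from mowl.nn import BoxSquaredELModule

    # Hypothetical sizes; real values come from the ontology / ELDataset.
    nb_classes, nb_relations = 100, 5
    module = BoxSquaredELModule(nb_classes, nb_relations, embed_dim=50)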

Methods Summary

gci0_bot_loss(data[, neg])

Loss function for GCI0 with bottom concept: \(C \sqsubseteq \perp\).

gci0_loss(data[, neg])

Loss function for GCI0: \(C \sqsubseteq D\).

gci1_bot_loss(data[, neg])

Loss function for GCI1 with bottom concept: \(C_1 \sqcap C_2 \sqsubseteq \perp\).

gci1_loss(data[, neg])

Loss function for GCI1: \(C_1 \sqcap C_2 \sqsubseteq D\).

gci2_loss(data[, neg])

Loss function for GCI2: \(C \sqsubseteq \exists R.D\).

gci3_bot_loss(data[, neg])

Loss function for GCI3 with bottom concept: \(\exists R.C \sqsubseteq \perp\).

gci3_loss(data[, neg])

Loss function for GCI3: \(\exists R.C \sqsubseteq D\).

init_embeddings(num_entities, embed_dim[, ...])

Methods Documentation

gci0_bot_loss(data, neg=False)[source]

Loss function for GCI0 with bottom concept: \(C \sqsubseteq \perp\).

Parameters:
  • data (torch.Tensor) – Input tensor of shape (*,2) where C class indices are at data[:,0] and bottom class indices are at data[:,1]. It is recommended to use the ELDataset.

  • neg (bool, optional) – If True, the negative version of this loss function is used. Defaults to False.

gci0_loss(data, neg=False)[source]

Loss function for GCI0: \(C \sqsubseteq D\).

Parameters:
  • data (torch.Tensor) – Input tensor of shape (*,2) where C class indices are at data[:,0] and D class indices are at data[:,1]. It is recommended to use the ELDataset.

  • neg (bool, optional) – If True, the negative version of this loss function is used. Defaults to False.
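
An illustrative call, assuming module was built as in the instantiation sketch above; the indices are hypothetical placeholders rather than real ELDataset output:

    import torch

    # Batch of GCI0 axioms C ⊑ D, encoded as (C index, D index) pairs; shape (N, 2).
    gci0_data = torch.tensor([[0, 1],
                              [2, 3]], dtype=torch.long)
    pos_loss = module.gci0_loss(gci0_data)
    neg_loss = module.gci0_loss(gci0_data, neg=True)  # negative version of the loss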

gci1_bot_loss(data, neg=False)[source]

Loss function for GCI1 with bottom concept: \(C_1 \sqcap C_2 \sqsubseteq \perp\).

Parameters:
  • data (torch.Tensor) – Input tensor of shape (*,3) where C1 class indices are at data[:,0], C2 class indices are at data[:,1] and bottom class indices are at data[:,2]. It is recommended to use the ELDataset.

  • neg (bool, optional) – If True, the negative version of this loss function is used. Defaults to False.

gci1_loss(data, neg=False)[source]

Loss function for GCI1: \(C_1 \sqcap C_2 \sqsubseteq D\).

Parameters:
  • data (torch.Tensor) – Input tensor of shape (*,3) where C1 class indices are at data[:,0], C2 class indices are at data[:,1] and D class indices are at data[:,2]. It is recommended to use the ELDataset.

  • neg (bool, optional) – If True, the negative version of this loss function is used. Defaults to False.
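
Illustrative usage with placeholder indices, following the column layout described above and assuming module from the instantiation sketch:

    import torch

    # Batch of GCI1 axioms C1 ⊓ C2 ⊑ D; columns are (C1, C2, D) indices, shape (N, 3).
    gci1_data = torch.tensor([[0, 1, 2],
                              [3, 4, 5]], dtype=torch.long)
    loss = module.gci1_loss(gci1_data)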

gci2_loss(data, neg=False)[source]

Loss function for GCI2: \(C \sqsubseteq \exists R.D\).

Parameters:
  • data (torch.Tensor) – Input tensor of shape (*,3) where C class indices are at data[:,0], R object property indices are at data[:,1] and D class indices are at data[:,2]. It is recommended to use the ELDataset.

  • neg (bool, optional) – If True, the negative version of this loss function is used. Defaults to False.
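
Illustrative usage with placeholder class and relation indices, assuming module from the instantiation sketch above:

    import torch

    # Batch of GCI2 axioms C ⊑ ∃R.D; columns are (C, R, D) indices, shape (N, 3).
    gci2_data = torch.tensor([[0, 0, 1],
                              [2, 1, 3]], dtype=torch.long)
    loss = module.gci2_loss(gci2_data)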

gci3_bot_loss(data, neg=False)[source]

Loss function for GCI3 with bottom concept: \(\exists R.C \sqsubseteq \perp\).

Parameters:
  • data (torch.Tensor) – Input tensor of shape (*,3) where R object property indices are at data[:,0], C class indices are at data[:,1] and bottom class indices are at data[:,2]. It is recommended to use the ELDataset.

  • neg (bool, optional) – If True, the negative version of this loss function is used. Defaults to False.

gci3_loss(data, neg=False)[source]

Loss function for GCI3: \(\exists R.C \sqsubseteq D\).

Parameters:
  • data (torch.Tensor) – Input tensor of shape (*,3) where R object property indices are at data[:,0], C class indices are at data[:,1] and D class indices are at data[:,2]. It is recommended to use the ELDataset.

  • neg (bool, optional) – If True, the negative version of this loss function is used. Defaults to False.
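
Illustrative usage with placeholder indices, following the column layout described above and assuming module from the instantiation sketch:

    import torch

    # Batch of GCI3 axioms ∃R.C ⊑ D; columns are (R, C, D) indices, shape (N, 3).
    gci3_data = torch.tensor([[0, 1, 2],
                              [1, 3, 4]], dtype=torch.long)
    loss = module.gci3_loss(gci3_data)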

init_embeddings(num_entities, embed_dim, min=-1, max=1)[source]
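
No description is given for init_embeddings. A common pattern matching this signature in EL embedding modules is to build an embedding table whose weights are drawn uniformly from [min, max]; the sketch below illustrates that pattern under this assumption and is not mOWL's verified implementation.

    import torch.nn as nn

    def init_embeddings(num_entities, embed_dim, min=-1, max=1):
        # Assumed behaviour for this signature: an nn.Embedding whose weights
        # are initialised uniformly in [min, max]. This mirrors a common
        # EL-embedding pattern, not necessarily mOWL's exact code.
        emb = nn.Embedding(num_entities, embed_dim)
        nn.init.uniform_(emb.weight, a=min, b=max)
        return emb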