add low- and high-level Concept-Based Memory Reasoner (CMR) implementation #12
daviddebot wants to merge 2 commits into pyc-team:dev from
Conversation
…moryConceptExogenousToConcept with unit tests and example
…ow, explicit CMRLoss, docs, and tests
Codecov Report ❌ Patch coverage is
    'task_target': target[:, task_indices],
}


def shared_step(self, batch, step):
Is it possible to avoid overriding the Lightning shared step? It is OK to instead override the model forward, since each model can have its own. I see the problem is the task predictor, which needs to be run twice. Is it possible to handle this with the inference query in the forward?
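A minimal sketch of the suggestion above, with assumed class and attribute names (`CMRModel`, `concept_encoder`, `task_predictor` are hypothetical, not the actual PR code): the model's `forward` runs the task predictor twice itself and returns everything in one dictionary, so the generic Lightning shared step can stay model-agnostic.

```python
import torch
from torch import nn


class CMRModel(nn.Module):
    """Hypothetical sketch: both task-predictor passes live in forward,
    so no override of the Lightning shared step is needed."""

    def __init__(self, n_features, n_concepts, n_tasks):
        super().__init__()
        self.concept_encoder = nn.Linear(n_features, n_concepts)
        self.task_predictor = nn.Linear(n_concepts, n_tasks)

    def forward(self, x):
        c_logits = self.concept_encoder(x)
        c_probs = torch.sigmoid(c_logits)
        # First pass: task prediction from soft concept probabilities.
        t_soft = self.task_predictor(c_probs)
        # Second pass: task prediction from hard (thresholded) concepts,
        # folded into forward instead of into a custom shared_step.
        t_hard = self.task_predictor((c_probs > 0.5).float())
        return {"c_logits": c_logits, "t_soft": t_soft, "t_hard": t_hard}


model = CMRModel(n_features=8, n_concepts=4, n_tasks=2)
out = model(torch.randn(3, 8))
```

The generic step can then pull whichever outputs it needs from the returned dict, and the double task-predictor pass no longer leaks into the training loop.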
return c_loss * self.concept_weight + t_loss * self.task_weight


class CMRLoss(nn.Module):
It would be great not to have a dedicated loss for every model, and instead to decompose the CMR loss into modular pieces. I would expect you just need to create an additional piece for the recursion part. To combine the traditional loss with this additional piece, see how losses can now be summed in the example
pytorch_concepts/examples/utilization/2.2_model/13_composite_loss.py
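A rough sketch of the decomposition being suggested, with made-up names (`WeightedLoss`, `SumLoss`, and the term keys are all assumptions for illustration, not the library's actual composite-loss API): each term is an independent `nn.Module`, and the CMR-specific recursion penalty would be just one more summand rather than a monolithic `CMRLoss`.

```python
import torch
from torch import nn


class WeightedLoss(nn.Module):
    """Wrap any loss with a scalar weight (hypothetical helper)."""

    def __init__(self, loss, weight):
        super().__init__()
        self.loss = loss
        self.weight = weight

    def forward(self, pred, target):
        return self.weight * self.loss(pred, target)


class SumLoss(nn.Module):
    """Combine independent, keyed loss terms by summation."""

    def __init__(self, **terms):
        super().__init__()
        self.terms = nn.ModuleDict(terms)

    def forward(self, preds, targets):
        return sum(self.terms[k](preds[k], targets[k]) for k in self.terms)


criterion = SumLoss(
    concept=WeightedLoss(nn.BCEWithLogitsLoss(), weight=1.0),
    task=WeightedLoss(nn.BCEWithLogitsLoss(), weight=0.5),
    # A CMR-specific recursion term would be just another entry here.
)
preds = {"concept": torch.randn(4, 3), "task": torch.randn(4, 2)}
targets = {
    "concept": torch.randint(0, 2, (4, 3)).float(),
    "task": torch.randint(0, 2, (4, 2)).float(),
}
total = criterion(preds, targets)
```

With this shape, swapping or reweighting terms needs no new loss class, which is the modularity the review asks for.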
This PR adds CMR to both the low-level and high-level model APIs, with documentation updates and unit tests.