TM activeCells_ is a SDR #442
base: master
Conversation
Looking for help with making activeCells_ an SDR and with handling the extraActive inputs.
for(const auto &active : extraActive) {
  NTA_ASSERT( active < extra_ );
  activeCells_.push_back( static_cast<CellIdx>(active + numberOfCells()) );
  activeVec.push_back( static_cast<CellIdx>(active + numberOfCells()) );
continuing the discussion from #437 (comment)
This code breaks because we're adding extraActive indices offset by numberOfCells(), which does not pass the bounds check in the SDR used.
@ctrl-z-9000-times do you have a proposal how to proceed?
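For reference, a minimal sketch of the failure mode, assuming an htm.core-style SDR API (htm::SDR with getSparse()/setSparse()); the 3-cell size is only illustrative:
#include <htm/types/Sdr.hpp>
using htm::SDR;

int main() {
  SDR activeCells({ 3u });           // a TM with 3 cells -> valid indices are 0..2
  auto sparse = activeCells.getSparse();
  // An extra input with index 0, offset by numberOfCells() == 3, lands at index 3,
  // which is outside the SDR's declared size.
  sparse.push_back(3u);
  activeCells.setSparse(sparse);     // fails the SDR's bounds check (assert in debug builds)
}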
Yes, make two new vectors named something like presynActive & presynWinner. Copy the TM's active and winner cells into them, and concatenate the extra inputs into those vectors too.
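A rough sketch of that suggestion against the diff above (presynActive is a hypothetical name; activeCells_, extraActive, extra_ and numberOfCells() are as in the TM code under review):
// Build a combined presynaptic vector instead of pushing offset indices into activeCells_.
const auto &tmActive = activeCells_.getSparse();
std::vector<CellIdx> presynActive( tmActive.begin(), tmActive.end() );  // copy of the TM's own active cells
for (const auto &active : extraActive) {
  NTA_ASSERT( active < extra_ );
  presynActive.push_back( static_cast<CellIdx>(active + numberOfCells()) );
}
// presynActive now holds this TM's cells plus the offset external inputs,
// while activeCells_ itself stays within its own bounds.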
The external predictive inputs are not part of the active cells of this TM, and shouldn't be added to the activeCells. This is a hack; really it should concatenate both the activeCells and the externalPredictiveCells into a new vector, instead of repurposing an existing vector.
This is the culprit!
The external predictive inputs are not part of the active cells
OK, let's do it properly and not add them to activeCells_.
Proposal: Should we use a member SDR extraActivations_? (A small sketch follows below.)
- It would allow us to (again) unify the activateDendrites API (no separate version for extra*).
- Users would call setExtra*Cells(SDR) to set them. After compute, the SDR gets cleared again.
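A minimal sketch of that flow, assuming the htm.core SDR API (setSDR(), zero()); extraActivations_ and setExtraActiveCells() are hypothetical names from this proposal, not existing TemporalMemory members:
#include <htm/types/Sdr.hpp>
#include <vector>
using htm::SDR;

struct TmExtraSketch {                          // stand-in for the TM, not the real class
  SDR extraActivations_{ std::vector<htm::UInt>{ 1024u } };  // sized to the external input

  void setExtraActiveCells(const SDR &extra) {
    extraActivations_.setSDR(extra);            // user sets the external input before compute()
  }

  void compute(/* activeColumns, learn, ... */) {
    // activateDendrites() would read extraActivations_ here instead of taking
    // separate extra* parameters, so the API stays unified.
    extraActivations_.zero();                   // cleared again after compute()
  }
};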
are not part of the active cells of this TM
but they do refer to cells in this TM, right?
Sorry, again, what is the meaning of externalActive/WinnerCells? I thought it was a "deus ex" activation of the TM's cells.
So, if I have a TM with only 3 cells {0,1,2}, we now encode the extras with an offset, so {0,1,2, ex3, ex4, ex5}.
Assume this state (dense repr): {101 |extra: 111}
Does this translate to:
- {111} active cells?
- or do the extras only add +3 inputs toward passing the threshold, so 5 active inputs in total (2 of them from this TM)?
Another question: how should we treat the extra activations in the context of anomalies?
- include them in the computation,
- or ignore them?
but they do refer to cells in this TM, right?
Sorry, again, what is the meaning of externalActive/WinnerCells? I thought it was a "deus ex" activation of the TM's cells.
No, external here means that it comes from outside of the TM.
For example, I could have 2 brain areas with 2 TMs looking at different sensors, and one TM can help the other make predictions.
Another example is to model L4 with a TM, generate a location signal, and feed it to the TM. Numenta did this in their "columns" paper.
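To make that concrete with the 3-cell example from the question above: the external bits never become active cells of this TM; they only extend the presynaptic input space. A small sketch using the concatenation approach adopted later in this PR (htm.core SDR API assumed; the numbers are only illustrative):
#include <htm/types/Sdr.hpp>
#include <iostream>
using htm::SDR;

int main() {
  SDR active({ 3u });                       // this TM's cells, dense 101
  htm::SDR_dense_t activeDense{ 1, 0, 1 };
  active.setDense(activeDense);

  SDR extra({ 3u });                        // external input, dense 111
  htm::SDR_dense_t extraDense{ 1, 1, 1 };
  extra.setDense(extraDense);

  SDR presyn({ 6u });                       // concatenated presynaptic input space
  presyn.concatenate(active, extra);        // the external indices are offset to 3, 4, 5
  for (const auto idx : presyn.getSparse()) std::cout << idx << " ";  // prints: 0 2 3 4 5
  // The TM itself still has only 2 active cells (0 and 2); bits 3..5 are external
  // presynaptic input that can help cells reach threshold, and they are included
  // in the anomaly computation when given (see the reply below).
}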
Another question: how should we treat the extra activations in the context of anomalies?
They should be included if they are given.
Another view is that it's valid and working to keep activeCells_, winnerCells_ as vectors because of the extra* functionality, and we cannot do that with an SDR that strictly checks that its members are within bounds. Also, having active/winner cells as SDRs does not give any big advantage. So we could very well just close this PR.
thoughts?
I can work on this a bit.
I will change the name from extra to something more descriptive.
TMReg default output
The TM has multiple outputs, all of which are valid. IMO it's better to not have a default at all and to force the user to specify what they want.
👍 thank you!
IMO it's better to not have a default at all and to force the user to specify what they want
You're talking about the TM; I agree. I was talking about TMRegion, where NetworkAPI forces us to pick a default output. Would winner cells generally be the more useful / "the nicest" choice?
Would winner cells generally be the more useful choice?
It depends on what you're doing.
Winner cells are the cells which the TM has designated to represent the current state of its world, so they should be used for learning.
[winner cells] represent the current state of its world,
Sounds like that should be the default output of TMRegion; I'll update that.
Should the anomaly also be computed from the winner/predictive cells?
This is now a nice, small, passing PR.
Ok, I will fix the issue with extras and then we should be able to merge this.
It still doesn't work, but I think it gets a little further before failing now.
#endif
  activeCells_.concatenate(activeCells_, extraActive);
  SDR dendriteInputs({ activeCells_.size + extraActive.size });
  dendriteInputs.concatenate(activeCells_.flatten(), extraActive.flatten());
Oh, thank you! I'd totally forgotten about this problem.
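A note on why the flatten() calls help here: assuming SDR::concatenate requires its inputs to agree on every dimension except the concatenation axis, and that the TM's cell SDR may be multi-dimensional (e.g. columns x cellsPerColumn) while the external input is 1-D, both inputs are first viewed as flat 1-D SDRs. A sketch with illustrative shapes:
SDR cells({ 2048u, 32u });                          // e.g. columns x cellsPerColumn
SDR extra({ 4096u });                               // external predictive input
SDR dendriteInputs({ cells.size + extra.size });    // flat, 2048*32 + 4096 bits
// flatten() returns a 1-D view of each input, so they can be concatenated
// along axis 0 without copying their data first.
dendriteInputs.concatenate(cells.flatten(), extra.flatten());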
@@ -426,6 +426,9 @@ namespace sdr {
    SDR::setDenseInplace();
  }

  Reshape SparseDistributedRepresentation::flatten() const
    { return Reshape(*this, {size} ); }
Do we want Reshape, or would a plain SDR of {size} + setSparse do?
That would work too, but it would copy the data. The Reshape class does not copy data.
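For comparison, a sketch of the two options (SDR/Reshape API as in the diff above; the shapes and sparsity are only illustrative):
SDR original({ 64u, 64u });
original.randomize(0.02f);                   // some example contents

// Option A (this PR): a Reshape view onto `original`; no copy of the underlying data.
Reshape flatView(original, { original.size });

// Option B: a plain flat SDR; this copies the sparse indices out of `original`.
auto sparse = original.getSparse();          // copy of the flat indices
SDR flatCopy({ original.size });
flatCopy.setSparse(sparse);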
Follow-up to #437.
Let's continue the discussion from #437 (comment)