Speed Detection to Adjust Timer

Add the ability to detect offsets between the timer input and the learned sequence model, and to use those offsets to speed up or slow down the global interval timer.

This item relates to the broader goal of modeling object behaviors in Monty. For more details on time representations and processing, see our future work page on the interval timer.

The LM has an expectation of when it will sense the next feature at the current location. This expectation is stored as the interval duration that arrived from the global interval timer during learning, at the same time as the sensory input.
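As a rough sketch of what this could look like (the class and method names below are hypothetical, not part of Monty's current API), the LM might record the timer interval alongside the feature it expects at each location:

```python
from dataclasses import dataclass, field


@dataclass
class LearnedSequenceModel:
    """Hypothetical store mapping each location to the feature the LM expects
    there and the interval duration reported by the global interval timer when
    that feature was sensed during learning."""

    expectations: dict = field(default_factory=dict)

    def learn(self, location, feature, timer_interval):
        # The interval arrives alongside the sensory input, so both are
        # recorded together as the expectation for this location.
        self.expectations[location] = (feature, timer_interval)

    def expected_interval(self, location):
        # Return the stored interval for the expected feature, or None if
        # nothing has been learned at this location yet.
        entry = self.expectations.get(location)
        return entry[1] if entry is not None else None
```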

If the next expected feature at the current location appears earlier than what is stored in the model (i.e. timer input < stored interval), the LM sends a signal to the global timer to speed up (by the magnitude of the difference).

If the next expected feature at the current location appears later than what is stored in the model (i.e. timer input > stored interval), the LM sends a signal to the global timer to slow down.
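A minimal sketch of the adjustment logic described above, again with hypothetical names (`GlobalTimer`, `send_timer_adjustment`) and an illustrative rate update:

```python
class GlobalTimer:
    """Hypothetical stand-in for the global interval timer; only the rate
    adjustment interface matters for this sketch."""

    def __init__(self, rate=1.0):
        self.rate = rate

    def speed_up(self, amount):
        self.rate += amount

    def slow_down(self, amount):
        self.rate -= amount


def send_timer_adjustment(timer, stored_interval, timer_input):
    """Compare the incoming timer interval against the stored expectation and
    signal the global timer by the magnitude of the difference."""
    offset = stored_interval - timer_input
    if offset > 0:
        # Feature arrived earlier than expected (timer input < stored interval).
        timer.speed_up(offset)
    elif offset < 0:
        # Feature arrived later than expected (timer input > stored interval).
        timer.slow_down(-offset)
```

For example, if the stored interval is 0.5 and the timer input is 0.3, the feature arrived early and the timer would be asked to speed up by 0.2.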

Note: This might be a noisy process and may require voting to work well.
