I built an unconventional application of sequence-to-sequence (Seq2Seq) neural networks: predicting the intermediate steps of manual multiplication from the final answer alone. The model receives just the digits of a product (e.g., 56088 = 123 × 456) and must reconstruct the nine hidden cell values of the 3×3 Gelosia lattice that produced it.
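To make the target concrete, here is a worked example of how the nine lattice cells arise for 123 × 456 and how they recombine into 56088. The row-major cell ordering is my assumption for illustration; the repo may serialize the lattice differently.

```python
# Worked example: the nine Gelosia lattice cells behind 123 x 456 = 56088.
# Note: the row-major cell ordering here is an assumption; the repo may
# serialize the lattice differently.

def lattice_cells(a: int, b: int) -> list[list[int]]:
    """One single-digit product per cell: cell[i][j] = digit_i(a) * digit_j(b)."""
    ad = [int(d) for d in str(a)]
    bd = [int(d) for d in str(b)]
    return [[x * y for y in bd] for x in ad]

def lattice_product(cells: list[list[int]]) -> int:
    """Recombine the cells by place value, as the lattice's diagonal sums do."""
    n, m = len(cells), len(cells[0])
    return sum(
        cells[i][j] * 10 ** ((n - 1 - i) + (m - 1 - j))
        for i in range(n)
        for j in range(m)
    )

cells = lattice_cells(123, 456)
print(cells)                        # [[4, 5, 6], [8, 10, 12], [12, 15, 18]]
assert lattice_product(cells) == 56088
```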
This project turns a deterministic math problem on its head: instead of computing forward, an LSTM must learn the underlying arithmetic relationships implicitly, from examples alone. It's a fun exploration of how ML can map a final, "collapsed" sequence of information back to its higher-dimensional hidden components (a minimal model sketch follows below). Check out the code and project details here:
https://gitlab.com/9o1d/gelosia
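For intuition, here is a generic encoder-decoder sketch in PyTorch of the kind of model the task implies. The hidden size, vocabularies, and token encodings are illustrative assumptions, not the repo's actual configuration.

```python
import torch
import torch.nn as nn

# Generic Seq2Seq sketch: encode the product's digits with an LSTM, then
# decode the nine lattice-cell values. All sizes below are assumptions.

class ProductToLattice(nn.Module):
    def __init__(self, in_vocab=11, out_vocab=100, hidden=128):
        super().__init__()
        self.embed_in = nn.Embedding(in_vocab, hidden)    # digits 0-9 + pad
        self.embed_out = nn.Embedding(out_vocab, hidden)  # cell values (max 9*9=81)
        self.encoder = nn.LSTM(hidden, hidden, batch_first=True)
        self.decoder = nn.LSTM(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, out_vocab)

    def forward(self, product_digits, cell_targets):
        # Encode the product; the final LSTM state seeds the decoder.
        _, state = self.encoder(self.embed_in(product_digits))
        # Teacher forcing: feed target cells as decoder inputs (a start
        # token and one-step shift are omitted here for brevity).
        dec_out, _ = self.decoder(self.embed_out(cell_targets), state)
        return self.head(dec_out)  # logits over possible cell values

model = ProductToLattice()
product = torch.tensor([[5, 6, 0, 8, 8]])                 # digits of 56088
cells = torch.tensor([[4, 5, 6, 8, 10, 12, 12, 15, 18]])  # flattened lattice
logits = model(product, cells)                            # shape (1, 9, 100)
```

Treating each cell value as a single output token keeps the decoder to exactly nine steps, one per lattice cell; that design choice is mine for the sketch, and the actual tokenization lives in the repo.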