Discussion about this post

Hollis Robbins (@Anecdotal):

Yes and I am going to spend the day speculating how it all ties back to Attention Is All You Need. https://arxiv.org/abs/1706.03762 " Attention mechanisms have become an integral part of compelling sequence modeling and transduction models in various tasks, allowing modeling of dependencies without regard to their distance in the input or output sequences"

Shawn Mealey:

Jeez, this is well written. I feel like it touches on some of the things I have been feeling myself. "Succinctly… everything feels like crypto now?" is actually quite close to what I was saying to my friends recently.

77 more comments...
