4 Comments
rina nicolae

this was amazing

Neural Foundry

The Stack Overflow example is brilliant because it shows how the same memory that enables a tool can also disable the system that created it. Once LLMs consumed all that tacit knowledge as training data, the incentive to document new edge cases basically vanished. I've noticed this in my own work too: people default to asking an LLM instead of writing down solutions that could compound over time.

Altar Of Now

Recently learned of a TV mechanism called the ident bumper; it came to mind when reading your perspective on micro trends.

Thank you for sharing

Sachin

Ooh, a new term for me — thank you!