AI Unlearning

When we talk about artificial intelligence, we often think about how well it remembers: data, facts, styles, even our writing tone. But what happens when we need it to forget? That's the tricky challenge of AI unlearning, and a new paper, "Distribution Preference Optimization: A Fine-grained Perspective for LLM Unlearning", offers a smart new angle on how to do it.

Why forgetting matters

As AI models grow larger, they absorb massive amounts of information, some of it private, copyrighted, or simply outdated. From user data that should have been deleted to training examples that contain bias or sensitive facts, keeping everything isn't always safe or ethical. ...

October 24, 2025 · 4 min

Hello World is All You Need

"Attention is All You Need" - the paper that revolutionized artificial intelligence and gave birth to the transformer architecture that powers modern AI systems like GPT, BERT, and countless others.

The Revolutionary Paper

Published in 2017 by Vaswani et al., "Attention is All You Need" introduced the transformer architecture that would become the foundation of modern AI. This paper marked a paradigm shift from recurrent neural networks (RNNs) and convolutional neural networks (CNNs) to attention-based architectures. ...

January 15, 2024 · 2 min