Vibe Coding – Programming by Feel with AI

Summary “A new way of coding is emerging where AI writes most of the code and humans just guide the process — meet vibe coding.” Vibe coding is a buzzworthy approach to software development where developers (and even non-developers) describe what they want in natural language, and a Large Language Model (LLM) generates the code. Instead of meticulously crafting every line, humans focus on testing, tweaking, and guiding the AI’s output—letting the “vibes” steer the process. ...

June 10, 2025 · 2 min

How to Write Like a Human (Hint: Don’t Be Too Polished)

The Human Checklist Want to prove you’re not a machine? Easy. Forget polish. Forget elegance. Forget grammar. Here’s the new checklist for authentic human writing: Include at least one typo (preferably in the first line ;). Use a run-on sentence that wheezes and collapses halfway. Sprinkle punctuation like you’re seasoning a cheap steak. Congratulations. You’ve passed the humanity test. Robots need not apply. Why the Paranoia? We’re living in a time where teachers, employers, and strangers on the internet suspect every clean sentence of being auto-generated. Detection tools scan essays for “AI fingerprints.” Bosses double-check reports. Nothing says authentic human soul quite like misspelling “definitely” three different ways. ...

March 21, 2025 · 2 min

Life Is a Sine Wave — And That’s a Good Thing

Right now, we’re somewhere on the curve — maybe climbing towards a high, maybe coasting down the other side. Either way, the wave keeps moving. Life’s a sine wave — gentle rises, soft falls, and all the in-between stretches we barely notice. The highs feel brilliant, but they don’t last. The lows can feel endless, but they pass too. Most of our days happen between those extremes — in the quiet climbs, steady plateaus, and slow descents. ...

January 15, 2025 · 1 min

Why AI Forgets: The Context Window Explained

You tell ChatGPT something in the morning and by afternoon it’s forgotten. Is it lying, lazy, or limited? Spoiler: it’s limited — by design. The Goldfish Problem LLMs (large language models) aren’t “forgetful” in the human sense. They simply work with a limited sliding window of awareness. Once you’ve gone beyond it, earlier content disappears like the view in your rear-view mirror. In Kiwi terms, it’s like watching a rugby match but only remembering the last few plays, not the whole season. ...
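The sliding window described above can be sketched in a few lines of Python. This is a toy illustration only: the one-word-per-token counter and the `trim_to_window` helper are hypothetical stand-ins for a real tokenizer and a real model’s context handling.

```python
def count_tokens(text):
    # Crude stand-in for a real tokenizer: one token per word.
    return len(text.split())

def trim_to_window(messages, max_tokens):
    """Keep only the most recent messages that fit within max_tokens.

    Walks backwards from the newest message; once the budget is
    exceeded, everything earlier simply falls out of the window.
    """
    kept = []
    total = 0
    for msg in reversed(messages):
        cost = count_tokens(msg)
        if total + cost > max_tokens:
            break
        kept.append(msg)
        total += cost
    return list(reversed(kept))

history = [
    "good morning here is my plan for the day",
    "remind me to buy milk",
    "what was my plan again",
]
# With a budget of 8 toy tokens, the morning messages have already
# slipped out of view — only the latest question survives.
print(trim_to_window(history, 8))  # → ['what was my plan again']
```

The point of the sketch: nothing is “forgotten” deliberately; older content is simply never passed back in once the window is full.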

November 14, 2024 · 3 min

Enterprise Architecture in the World of Agile and AI

Enterprise Architecture (EA) has long been about the big picture—vision, strategy, and ensuring the technology landscape doesn’t collapse under the weight of its own complexity. Agile, on the other hand, is all about delivering value fast, embracing change, and avoiding big upfront designs. When these two worlds meet… sparks fly. Agile vs EA — The Friendly (Mostly) Tension From Agile’s perspective: “Why are we waiting for an architecture review when customers want this feature yesterday?” “We value working software over comprehensive documentation… remember?” From EA’s perspective: ...

June 12, 2024 · 2 min

Context Is All You Need — Why AI Cares About Context

In the world of AI, context isn’t just helpful—it’s everything. Whether it’s the thousands of tokens held in a language model’s attention mechanism, the rapidly shifting mosaic of user intent, or the structured metadata keeping facts grounded, context defines what AI can do. 1. A Foundation Built on Attention The 2017 paper “Attention Is All You Need” introduced the Transformer model, which revolutionized deep learning by letting models process all parts of a sequence in parallel using self-attention. No more recurrence—just powerful, context-aware computation. Since then, it’s become the backbone of modern models like GPT, BERT, and beyond. ...
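The parallel, context-aware computation mentioned above can be sketched as toy scaled dot-product self-attention in NumPy. This is an illustrative simplification: real Transformers use learned query/key/value projections and multiple heads, which this deliberately omits.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    """Toy self-attention: every position attends to every other, in parallel.

    X has shape (sequence_length, embedding_dim). Queries, keys, and
    values are all X itself here — a simplification of the real model.
    """
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)   # pairwise similarity of all tokens
    weights = softmax(scores)       # each row is an attention distribution
    return weights @ X              # context-aware mix of all positions

X = np.eye(3)                       # three toy one-hot token embeddings
out = self_attention(X)
print(out.shape)                    # → (3, 3)
```

Each output row blends information from the whole sequence at once — the property that lets Transformers dispense with recurrence.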

April 23, 2024 · 2 min

Hello World is All You Need

“Attention is All You Need” - the paper that revolutionized artificial intelligence and gave birth to the transformer architecture that powers modern AI systems like GPT, BERT, and countless others. The Revolutionary Paper Published in 2017 by Vaswani et al., “Attention is All You Need” introduced the transformer architecture that would become the foundation of modern AI. This paper marked a paradigm shift from recurrent neural networks (RNNs) and convolutional neural networks (CNNs) to attention-based architectures. ...

January 15, 2024 · 2 min