I've been a web developer for a decade. When I start something new, my reflex is npx create-next-app. It's fast, it just works, and deploying to Vercel is painless.

So when I started building Reddit Toolbox, a marketing tool for indie makers, I didn't think twice. I set up a slick Next.js dashboard, wired up Supabase for the backend, and ran the scraping inside serverless functions. Everything worked perfectly on localhost. In production, nothing did.

The "Cloud IP" Trap

Here's what happens when you try to automate Reddit (or LinkedIn, or Twitter) from a normal web server: you aren't "you" anymore. To Reddit's anti-spam systems, you're AWS us-east-1. You're an anonymous Google Cloud host. You share your IP range with thousands of bots, crawlers, and scrapers.

Within a day of launch, my test logins were being silently shadow-blocked. I hadn't sent any spam. Nothing shady happened. The browser fingerprint simply screamed "this is headless Chrome running in a data center."

I tried rotating proxies. I tried stealth plugins. Nothing was reliable, because a request coming out of Node.js carries a TLS fingerprint that looks nothing like a real browser's.

The Hard Pivot: Going Back to Desktop 🖥️

I realized that to build a tool that genuinely protects its users' accounts, I had to get out of the browser. So I did something that felt like moving backward: I rewrote the entire app as a desktop application using Electron and Python.

Why? Three reasons:

1. Distributed residential IPs. The automation runs on the user's own machine, so every request comes from a real home connection, Comcast, Verizon, whatever they have. No data-center smell. Just someone in Ohio browsing Reddit like a normal person.

2. Real hardware fingerprints. The Python engine runs on the user's actual device, so it inherits their real fonts, canvas rendering, and GPU details. It looks like a regular user because, at the hardware level, it is one.

3. Cost. My bill went from $200 a month for premium residential scraping proxies to zero, because the user's machine does the work.

(If you're curious how the two halves talk to each other, there's a rough sketch of the Electron-to-Python plumbing at the end of this post.)

Conclusion

If you're building a CRUD app, stay on the web. But if you're building anything in the gray zone of automation or scraping, go local-first. It's harder to build, and debugging packaged binaries is genuinely painful, but against modern bot detection it's the only approach that holds up.

(I'm still refining this architecture, but if you want to poke around the beta, it's live here: Reddit Toolbox)
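
Appendix: the Electron-to-Python plumbing

For the curious, here's a minimal sketch of how a setup like mine can wire the Electron main process to a local Python worker. This is illustrative, not my production code: the worker.py path, the line-delimited JSON protocol, and the "run-task" / "task-result" channel names are placeholders I made up for this example.

```ts
// main.ts: Electron main process (sketch).
// Assumes a bundled Python script at resources/worker.py that reads one
// JSON task per line on stdin and writes one JSON result per line on stdout.
import { app, BrowserWindow, ipcMain } from "electron";
import { spawn, ChildProcessWithoutNullStreams } from "child_process";
import * as path from "path";
import * as readline from "readline";

let worker: ChildProcessWithoutNullStreams;

function startPythonWorker(): void {
  const script = path.join(process.resourcesPath, "worker.py");

  // The worker runs on the user's machine, so every network request it
  // makes originates from their residential IP and real hardware.
  worker = spawn("python3", [script]);

  // Forward each JSON result line from Python to the renderer UI.
  const rl = readline.createInterface({ input: worker.stdout });
  rl.on("line", (line) => {
    const result = JSON.parse(line);
    for (const win of BrowserWindow.getAllWindows()) {
      win.webContents.send("task-result", result);
    }
  });

  worker.stderr.on("data", (chunk) =>
    console.error("[worker]", chunk.toString())
  );
}

// The renderer asks for a task to run; pass it straight to Python.
ipcMain.on("run-task", (_event, task) => {
  worker.stdin.write(JSON.stringify(task) + "\n");
});

app.whenReady().then(() => {
  startPythonWorker();
  // ...create the BrowserWindow for the dashboard UI here...
});

app.on("will-quit", () => worker.kill());
```

I like line-delimited JSON over stdio for this kind of thing because it needs no sockets, no ports, and no extra dependencies, which matters once everything is packaged into a single installer.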