PWA on iOS, Part 1: The Anatomy of an App That Doesn’t Exist in the App Store

Written by shkurko83ios | Published 2026/03/29
Tech Story Tags: progressive-web-app | ios-development | pwa-ios-limitations | apple-pwa-support | web-apps-vs-native-apps | safari-pwa-support | webclip-ios | pwa-architecture

TL;DR: This article traces the evolution of Progressive Web Apps from early experiments like Microsoft's HTA to modern implementations on iOS. It highlights how Apple initially embraced web-based apps, later deprioritized them in favor of the App Store, and eventually reintroduced partial support under external pressure. The piece sets the stage for understanding the technical and strategic tensions that continue to shape PWA behavior on iOS today.

The Ghost of Steve Jobs and the Stolen Idea

Nearly twenty years ago, Apple invented the future — then forgot about it. Today, that future is knocking on your iPhone.


One day, I needed to build a PWA for iOS — figured it'd take a couple of days. Several months later, I was still at it. Safari was behaving like a saboteur, the documentation was lying, and Stack Overflow kept serving up three-year-old solutions that no longer worked. Eventually, I realized: to work properly with PWA on iOS, you need to understand not just how things work, but why they're built the way they are — with all these quirks, limitations, and historical skeletons in the closet.

This series is the result of that investigation. Welcome to the autopsy.


1999: The One Everyone Forgot

Redmond, Washington. 1999. The dot-com bubble is inflating to obscene proportions. Every other startup promises to change the world through a browser. Every single one of them flames out.

But somewhere inside Microsoft, a group of engineers is thinking about something different. They're asking themselves an uncomfortable question: why does a website have to live inside a browser at all? Why can't it break free — become a real program, with its own icon, its own access to files, no address bar, no window chrome?

Their answer was a technology with the driest corporate name imaginable — HTA, HTML Applications.

The idea was audacious. Take a regular HTML file, change the extension from .html to .hta — and the browser disappears. In its place: a fully-fledged desktop application with its own icon, its own window, and direct access to the file system and Windows registry. No address bar. No tabs. Just your interface, written in plain HTML and JavaScript.

It was remarkable. It was dangerous. And it worked.

For the technically curious: HTA didn't run through the browser — it launched as a separate process, mshta.exe. Under the hood it used the Internet Explorer engine, but stripped away every security restriction. That meant HTA could do anything a C++ program could do: read and write files, poke around in the registry, spin up processes. Web technologies with desktop-level privileges. In 2003, Microsoft even patented the concept.
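To make that concrete, here is a minimal sketch of what an HTA file looked like. The filenames and paths are illustrative, and this only runs on Windows, where mshta.exe launches it:

```html
<!-- hello.hta: save any HTML with an .hta extension and Windows
     launches it via mshta.exe instead of the browser. -->
<html>
<head>
  <title>Admin Tool</title>
  <!-- The hta:application tag controls window chrome: no address bar, no tabs -->
  <hta:application id="app" border="thin" maximizebutton="no" />
  <script language="JScript">
    // Full trust: plain script can touch the file system via ActiveX,
    // something no ordinary web page is allowed to do.
    function readConfig() {
      var fso = new ActiveXObject("Scripting.FileSystemObject");
      var file = fso.OpenTextFile("C:\\config.txt", 1); // 1 = ForReading
      document.getElementById("out").innerText = file.ReadAll();
      file.Close();
    }
  </script>
</head>
<body>
  <button onclick="readConfig()">Read C:\config.txt</button>
  <pre id="out"></pre>
</body>
</html>
```

That `ActiveXObject` line is the whole story in miniature: the same script that would be blocked cold inside Internet Explorer reads an arbitrary file from disk here, which is exactly why sysadmins loved HTA and malware authors loved it more.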

System administrators were thrilled. They started building everything in HTA — internal tooling, server configuration utilities, admin dashboards. Why bother learning C++ or VB when you could do the same thing in HTML?

But there was one small, catastrophic catch.

HTA only worked on Windows. Only with Internet Explorer. No Mac, no Linux, no mobile. This wasn't a cross-platform revolution — it was Microsoft's own walled garden, just with a different fence.

And there was another problem: HTA was fully trusted. No isolation, no sandbox. A webpage with admin privileges isn't the technology of the future — it's a welcome mat for malware. Hackers figured this out quickly. HTA became a favourite delivery mechanism for malicious software — all you had to do was convince a user to click the file. Eventually, Microsoft was forced to restrict HTA from launching directly from the browser, purely on security grounds.

By the mid-2000s, HTA had quietly faded into the background. It didn't die — it still runs on Windows 11 — but it stopped being the future. It belonged to old-school sysadmins and malware authors. Nobody wrote about it in magazines. Nobody cited it at conferences.

The idea of turning a webpage into an application had been buried alive.

Eight years passed. The world forgot about HTA. Forgot that someone had already tried to tear down the wall between the browser and the desktop — and failed.

Then a man in a black turtleneck walked on stage. And it turned out the idea had never really gone anywhere. It was just waiting for the right moment. And the right phone.

But the story of this technology reads like a detective thriller. There's the visionary prophet (Steve Jobs), who showed the world a light — then turned away from it. There's the villain (the App Store), which held back the natural evolution of the web for a decade. There are the hero-engineers at Google, who picked up the discarded idea and took it to completion. And there's the modern antihero — Apple, forced to play by someone else's rules, doing so with barely concealed resentment.

In this series, we're going to dig deep into the internals of PWA on iOS. We'll become digital forensic pathologists, dissecting the process from the moment you tap "Add to Home Screen" to the moment that icon comes alive and launches a fully isolated application.

But before we pick up the scalpel, let's establish the most important thing: where this technology came from, and why Apple is simultaneously its parent and its reluctant foster child.


2007: A Prophet in His Own Land

San Francisco, Moscone Center, January 9, 2007. The keynote of the year. Steve Jobs walks onto the stage in his trademark black turtleneck and jeans. "Today, Apple reinvents the phone," he says. The room erupts.

But there's one detail that only hardcore fans and tech historians remember now. At that January event, Jobs said absolutely nothing about third-party apps. Not a word. As if the question simply didn't exist.

"Wait," you might say. "What about developers? How are they supposed to build for this new device?"

The answer came six months later — at WWDC in June 2007. And it was not what anyone expected.

Jobs took the stage and announced: no SDK, no app store. Just the web. "The full Safari engine is inside of iPhone. And so, you can write amazing Web 2.0 and Ajax apps that look exactly and behave exactly like apps on the iPhone. We think we've got a very sweet story for you."

Apps were going to be websites. He called them Web Apps.

And here's the thing — it worked. Developers could build sites that, when added to the home screen, turned into icons, opened without the browser chrome, and even sent notifications (primitive ones, but still). Apple even coined a specific internal term for this: WebClip.

For the technically curious: WebClip is still the internal iOS name for a PWA installed to the home screen. When you tap "Add to Home Screen," the system creates a WebClip — a special container that stores the icon, metadata, and the launch URL.
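For illustration, these are the proprietary tags Apple shipped for that mechanism. A couple of them arrived in later iOS releases, but all of them predate the standard Web App Manifest, and Safari honors them to this day:

```html
<!-- The Apple-proprietary way to turn a page into a WebClip. -->
<!-- Icon shown on the home screen -->
<link rel="apple-touch-icon" href="/icons/icon-180.png">
<!-- Launch without Safari's address bar and tabs ("standalone" mode) -->
<meta name="apple-mobile-web-app-capable" content="yes">
<!-- Style the iOS status bar when launched from the home screen -->
<meta name="apple-mobile-web-app-status-bar-style" content="black">
<!-- Title under the icon (added in a later iOS version) -->
<meta name="apple-mobile-web-app-title" content="MyApp">
```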

Jobs was looking at the future. He saw a world where developers didn't have to pass through gates of censorship, where apps didn't consume gigabytes of storage, and updated instantly because they lived on a server.

The idyll didn't last long.


2008: Betrayal

Nearly a year had passed. Users loved the iPhone. Developers were complaining. They needed direct access to hardware — the camera, the contacts. They needed money — a way to actually sell what they built.

Apple blinked.

In March 2008, Apple announced the iPhone SDK, and that July, iPhone OS 2.0 shipped. With it came the App Store. It was a tectonic shift.

The irony: the App Store became so successful that it killed Apple's interest in web apps entirely. Why invest in complex web technologies when developers were lining up to hand over 30% and beg for a spot in the storefront?

For the next ten years, web apps on iOS went into hibernation. WebClip remained, but nothing moved forward. Like an abandoned garden — planted by a great gardener, then left without water.

While Apple rested on its laurels, a revolution was taking shape. And its forge was the headquarters of Apple's biggest rival.


2015: The Term That Changed Everything

In 2015, two people — designer Frances Berriman and Google Chrome engineer Alex Russell — sat down and tried to come up with a name for a new concept. They were tired of explaining to clients that websites could work offline and send notifications. They needed a term that would stick.

And so Progressive Web App was born.

Why "progressive"? Because the same app runs in any browser but fully unlocks its potential in a modern one. The older the device, the simpler the experience, yet it works everywhere.

Google threw itself into the work with engineering fanaticism. They invented the Service Worker — the "invisible office worker" running inside your phone, caching files and catching notifications. They created the manifest — the application's passport, through which the browser understands that it's looking at more than just a website.
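As a rough sketch of what that "invisible office worker" looks like in practice (filenames and cache names here are illustrative, and this code runs only in a browser's worker context, not in Node):

```javascript
// service-worker.js: a minimal caching sketch.
// "self" and "caches" are Service Worker globals provided by the browser.
const CACHE = "app-v1";

self.addEventListener("install", (event) => {
  // Pre-cache the app shell while the worker installs
  event.waitUntil(
    caches.open(CACHE).then((cache) =>
      cache.addAll(["/", "/app.js", "/app.css"])
    )
  );
});

self.addEventListener("fetch", (event) => {
  // Cache-first: serve from cache, fall back to the network
  event.respondWith(
    caches.match(event.request).then((hit) => hit || fetch(event.request))
  );
});
```

A page registers it with `navigator.serviceWorker.register("/service-worker.js")`, and the manifest is just a small JSON file (`name`, `icons`, `start_url`, `"display": "standalone"`) linked via `<link rel="manifest" href="/manifest.json">`. That is the entire "passport": once both pieces are present, the browser knows it is looking at more than a website.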

By 2017, PWA had become the de facto standard in the Android world. Chrome persistently nudged users to install websites to their home screen. Users in India and Brazil saved data and storage because PWAs weighed almost nothing compared to native apps.

But there was one problem. A large, Californian problem, with a bitten apple on its side.

Apple pretended none of this was happening.


2017–2018: The Siege of Cupertino

Developers around the world stared at Apple and screamed: "Give us PWA! Give us Service Workers!" Apple said nothing.

Safari's WebKit was the only rendering engine allowed on iOS: Apple banned third-party engines, so Chrome and Firefox on iPhone were just shells around WebKit. And WebKit didn't support the core PWA technologies. This wasn't a bug. It was a feature. A strategy.

Why? The answer was hiding in plain sight: the App Store was printing money. Thirty percent of every in-app purchase — an enormous take. PWA would let developers collect revenue without touching the App Store at all (through Stripe, PayPal, you name it). Why would Apple shoot itself in the foot?

But the pressure kept building.

  • Developers wrote petitions.
  • Major companies — Twitter, Uber, Pinterest — built PWAs and lobbied for support.
  • Europe started looking at the App Store monopoly with a cold, regulatory eye.

And then, in March 2018, something remarkable happened. iOS 11.3 shipped, and buried in the release notes — small print, somewhere between Battery Health and ARKit improvements — was this: "Safari now supports Service Workers and Web App Manifest."

Apple had capitulated. But had it surrendered unconditionally?


Epilogue: A Pyrrhic Victory

iOS 11.3 genuinely brought PWA to iPhone. But it was support with a sour face.

Apple did the bare minimum required to avoid being accused of sabotage. Service Workers worked, the manifest was read — but:

  • No push notifications (those wouldn't arrive until iOS 16.4 in 2023)
  • No background sync
  • Icons still had to be declared with ancient apple-touch-icon tags from 2007, because Safari parsed icons from the modern manifest inconsistently
  • PWA data was deleted if you hadn't opened the app in 7 days (a side effect of Safari's tracking-prevention storage caps, untangled only over several years)

It was a victory. But one that required prying every single stone from the wall by hand.


What's Next

So now you know the history. PWA isn't some Google vanity project or a 2020s hype cycle. It's an idea that Apple invented, buried, and was eventually forced to resurrect under pressure.

A prophet, a betrayal, ten years of silence, and a capitulation with a sour face.

But history is only half the story. The other half is mechanics.

In the next part, we'll break down how three deceptively boring technical standards — Service Worker, the manifest, and HTTPS — force iOS to do exactly what Apple resisted for so long: treat a website like a real application.

It's going to get technical. But it'll be honest.


Written by shkurko83ios | Lead mobile app developer at JSC Rosselkhozbank
Published by HackerNoon on 2026/03/29