An Interview With PowerShell Inventor Jeffrey Snover, by @elizabethlvova

Microsoft Technical Fellow and CTO Jeffrey Snover is the inventor of Windows PowerShell, a distributed automation engine, scripting language, and command line shell that he has been developing for more than 15 years. The Monad Manifesto became the starting point for the development of Windows PowerShell. Snover says people give him too much credit for PowerShell, crediting instead the engineers who built it on the manifesto's conceptual framework. PowerShell has been available for both Linux and macOS since 2016.


We had a great talk with Jeffrey Snover, Microsoft Technical Fellow, and CTO for Modern Workforce Transformation. Snover is the inventor of Windows PowerShell, an object-based distributed automation engine, scripting language, and command line shell.

Enjoy the full interview below!

The Interview

Evrone: For more than 15 years, PowerShell has helped automate routine processes and perform tasks on local and remote systems. The Monad Manifesto raises a "chicken or the egg" question: did the document become the starting point for PowerShell development, or was it just a reflection of the principles of an already-built system?

Jeffrey: PowerShell was a journey that had three distinct "points of clarity." The first came with the unique design of a command line tool called WMIC. I designed it specifically to get around some crazy internal product-prevention processes at Microsoft. It was a pipeline processor of XML documents with XSLT transforms. That architecture allowed me to add an incredible amount of power with little effort. I looked at that and said, “wow”.

The second point of clarity is when I tried to convince a team porting ksh to Windows to use the core ideas I had in WMIC. I wanted them to implement my ideas using .NET to create a general-purpose shell. They looked at me like I had a rat’s tail hanging out of my mouth - they had no idea what I was talking about. Eventually, I just closed my office door and a month later I had a 10KLOC prototype, which had all of the core concepts of what we now call PowerShell. Once I was able to show people the ideas, they understood and wanted to go down this path.

We got funding but the bulk of the development team was in India. That was a disaster, as none of us knew how to do distributed development. To me, the problem was that my dev team didn’t ‘get’ the problem or the approach, and could not make independent decisions that added up to a coherent solution. So I took the time to write the Monad Manifesto, which documented the soul of the idea. That, coupled with moving all the development back to Redmond, WA, enabled the project to succeed.

People give me too much credit for PowerShell. PowerShell is great because we had a series of rockstar engineers add their awesomeness to the project. What I will take credit for is that the Monad Manifesto created a conceptual framework, which created clarity. That allowed those rockstars to contribute in a way that their IQs added together to produce a coherent solution. Empirical evidence indicates that that is a rare thing.

Evrone: PowerShell has been available for both Linux and macOS since 2016. Meanwhile, many system administrators and developers are not fully aware of the opportunities for applying it; they continue using traditional automation methods and standard shells. Could you specify which tasks are a perfect fit for PowerShell on Linux and macOS?

Jeffrey: I focus on people being successful instead of promoting PowerShell. If someone is succeeding with the tools they have, why change? That said, a lot of people are masking over a lot of problems, and they would be well served by trying PowerShell. PowerShell replaces fragile, prayer-based text parsing with pipelines of objects (structured data). To some, this sounds like a complication but in practice, it is a dramatic simplification of the problem space.
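To make that contrast concrete, here is a minimal sketch of the object pipeline (the CPU threshold and field names are just illustrative choices):

```powershell
# List the most CPU-hungry processes. Each pipeline stage receives
# structured objects with typed properties, so nothing is parsed as text.
Get-Process |
    Where-Object { $_.CPU -gt 100 } |      # filter on a typed property
    Sort-Object CPU -Descending |          # sort on the same property
    Select-Object -First 5 Name, Id, CPU   # project the fields we care about
```

The equivalent `ps aux | awk ...` approach would depend on column positions and whitespace, which is exactly the fragile text parsing being replaced.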

And this gets to the question, why invent PowerShell, why not just use ksh or bash? I’m a long-time Unix dev so that was my first instinct. I tried and failed. There is a core architectural difference between Unix and Windows. Linux is a file-oriented OS and Windows is an API-oriented OS. In Linux, if you can modify a file and run a process, you can manage anything. That is why awk, sed, and grep are management tools. At the time, nothing on Windows worked that way. Everything was behind an API that returned a structured object. That is why awk doesn’t work with WMI, sed doesn’t work with Active Directory, and grep doesn’t work against the registry. I had to invent a new tool that manipulated this environment.

The interesting thing is that the Windows approach is winning in the world and that makes PowerShell the best tool for the modern world. That sounds controversial but it is clearly true - most of the world is moving towards REST APIs returning structured objects (JSON documents). That is where PowerShell hits a home run.
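A small sketch of what that looks like in practice: `Invoke-RestMethod` deserializes a JSON response into objects that flow straight into the pipeline (the public GitHub API is used here purely as an example endpoint):

```powershell
# The JSON response becomes objects; no manual parsing step is needed.
$repos = Invoke-RestMethod -Uri 'https://api.github.com/orgs/PowerShell/repos'
$repos |
    Sort-Object stargazers_count -Descending |       # sort on a JSON field
    Select-Object -First 3 name, stargazers_count    # project two fields
```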

Evrone: When we speak about studying PowerShell, there is a famous book Mastering PowerShell by Dr. Tobias Weltner. What else should we add to our must-read list?

Jeffrey: There are LOTS of great books but if I had to add three I would say that Don Jones’ PowerShell in a Month of Lunches is widely regarded as a great ‘getting started’ book. Lee Holmes’ PowerShell Cookbook shows you how to use PowerShell to solve real-world problems. Bruce Payette was one of the co-designers of the language and he wrote a book that every expert should have called PowerShell in Action.

Evrone: Systems engineers adore developing fun and sometimes not very safe projects, such as a "Russian Roulette" based on PowerShell. Have you encountered any out-of-the-ordinary things made with PowerShell?

Jeffrey: Every couple of weeks I get surprised by how people are using PowerShell. That is the fun of creating a technology - you never really know what will happen when brilliant people take and apply their creativity to it. You just get to sit back and admire it.

Evrone: Application stability requires a lot of work from programmers and testers. One of the most interesting recent developments is the public preview of Azure Chaos Studio, a chaos engineering platform. Could you tell us more about how this platform will help developers improve their code?

Jeffrey: I’ve been doing technology for over 40 years and here is what I’ve learned, software works when it works and fails when it fails. That sounds stupid but it isn’t. Most programmers focus on success. They get a clear vision of success, they budget their time for success, and they get emotionally centered on the success of their technology. When their code works, it works. BUT, it turns out that the world is not perfect. There are problems. APIs don’t always succeed. Many engineers half-ass their error handling and in lots of cases, that error handling does not work. When their code fails, it fails.

Systematically introducing ‘chaos’ into a system is the best way to find out whether your code is going to work when it fails. In my early days, I would pull the network cable or the power plug to see how robust things were. That got my head in the right place, but it was hit-or-miss and low volume. Later, I would write fault injectors into all my code so that I could emulate APIs failing. That was a huge benefit. Now the industry has a name for this model, “Chaos,” and we have standardized tools to introduce it into our systems. And I love it. Few things improve the quality and robustness of your code better than chaos. I'm particularly optimistic about the role of conflict-free replicated data types, or CRDTs.
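The fault-injector idea described above can be sketched in a few lines. This is a toy illustration, not Azure Chaos Studio's mechanism; `$serviceUri` is a hypothetical placeholder:

```powershell
# A toy fault injector: wraps a script block and fails a configurable
# fraction of calls, so error-handling paths actually get exercised.
function Invoke-WithChaos {
    param(
        [scriptblock] $Action,
        [double] $FailureRate = 0.2
    )
    if ((Get-Random -Minimum 0.0 -Maximum 1.0) -lt $FailureRate) {
        throw "Injected fault (rate $FailureRate)"
    }
    & $Action
}

# Callers must now handle the same failures a flaky API would produce.
try {
    Invoke-WithChaos -FailureRate 0.3 -Action {
        Invoke-RestMethod -Uri $serviceUri   # $serviceUri: hypothetical endpoint
    }
} catch {
    Write-Warning "Request failed: $_"       # retry, back off, or degrade
}
```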

Evrone: Windows Server is becoming more and more integrated with Azure with each new version. Let's take a look into the future: is it right to suppose that all workloads could run entirely in the cloud, and Windows Server will become only a way of representing cloud resources for customer infrastructure?

Jeffrey: Hybrid computing is required to achieve the proper balance between cost, latency, performance, robustness, and scale - which is to say that computing will happen in cloud data centers, in the network fabric out to the edge, and at the edge and user devices as well. I’m of the view that software is still in its infancy and that, as an industry, we still have no clue how to write important software that works well across all these factors. Step by step, we are getting specific things in focus and better patterns emerge but I expect that architectures, patterns, and anti-patterns will continue to evolve for quite some time. 

As always, Microsoft is focused on “Developers! Developers! Developers!”. The big difference is that historical hubris has been replaced with empathy and humility. Our partnership with developers is closer now than it has ever been, and we are much better at listening and solving problems together with our developers and the community. Specifically, we are focused on giving developers what they need and what they want. In addition to investing in Windows, we have huge investments in Linux and other open-source projects and communities. The big revolution, and a lesson in humility, came when we started running our own services. It turns out that when you call a developer at 3 a.m. to fix a bug, they start writing better software. More importantly, we are able to see how our customers are really using our software; where it is working well, and where it is not. This allows us to have much richer conversations with our customers and serve them better than ever before.

Evrone: COVID-19 made it clear that offline businesses can break at any time, while online ones carry on. How has the pandemic affected the Microsoft cloud services architecture?

Jeffrey: COVID-19 stress-tested all our designs. I was tapped to take the graveyard shift as Executive Incident Manager as we raced to scale up Teams while entire school systems and organizations shifted to online everything. It was an incredible stress test. We had to buy and provision as many servers as we could get our hands on, decommission and downscale other workloads to free up capacity, and then scale up new Teams capacity at an unheard-of rate. At one point we even chartered a plane and shipped network equipment to a country whose internet connections would not support the schools moving to online classes. The amazing thing is that it all worked. The wheels on the bus got wobbly at times but it worked. It was a resounding affirmation of the business value of cloud architecture.

While the pandemic has not changed our cloud services architecture, it reinforced how important reliability and robustness are.

Evrone: Data security has been and remains one of the main priorities for most customers. What should we pay special attention to when designing secure applications and deploying them in a cloud infrastructure?

Jeffrey: You have to decide whether security is important or not. If you decide it is important, you allocate the resources and follow the well-established Security Development Lifecycle patterns and practices. Lip service doesn’t get the job done. Neither does hope, hiring and ignoring a security expert, or throwing money at the problem. Security requires rolling up your sleeves and doing the hard, often unglamorous, work. If you aren’t using multi-factor auth, I’m talking about you.

Securing cloud applications is very different from securing on-prem applications. My view is that it is much easier to make cloud applications much more secure: many security aspects are handled by Microsoft, and Azure provides a rich, comprehensive set of security controls and protections. BUT, you need to use them.

Evrone: Today our world is rapidly changing and sometimes it's quite difficult to adapt to it. Could you share any secrets of success: how do you manage to "stay on the crest of the wave" of technology and make the world a better place?

Jeffrey: Here are the physics: as information explodes, expertise narrows. As paradigms shift, expertise expires. This is a tough industry and I constantly feel like I’m on my heels and can’t keep up. To me, the necessary, but difficult, step was shifting my focus and feeling of self-worth from ‘knowing’ to ‘learning’. Sounds safe and simple, but when you are the most senior person in a room and you have no idea what people are talking about, that is an uncomfortable chair to sit in. I take a deep breath, lean in, give myself permission to ask ‘dumb’ questions and suddenly, my anxiety gets replaced with excitement. It turns out that my hard-earned scar tissue is useful in seeing patterns, knowing what matters, how and when to listen to customers, knowing when to be cautious and when to be bold, how to create a roadmap, and most importantly, how to work with a wide range of people to accomplish results. When in doubt, focus on people. Always.

The Conclusion

We’re grateful for the opportunity we had to interview Jeffrey and learn from his valuable experiences and years of expertise.

We also want to thank our friend, Nikolai Rubanov from Selectel, for assisting with the questions for this interview.

Also published as "Windows PowerShell Inventor Jeffrey Snover Interview." The interviewer is the Chief Editor at Evrone.
