In 2015, I found myself talking to a smart entrepreneur who had just raised millions of dollars in venture capital from a prominent venture capital (VC) firm. He was flying high after getting substantial press and recruiting a number of early employees. His business also wholly depended on a popular open source project.
As part of his strategy, the CEO gave equity to a core developer on the open source project. With this, he boasted, he would get critical information sooner, influence the roadmap for his company’s benefit, and recruit developers from the project to his company.
I was doubtful this connection would ever be publicized by the open source developer. Would he speak glowingly about the company at a conference, despite private reservations? How would he influence the decisions of his fellow developers? Would developers early in their career follow his missives without adequate reflection?
Computer engineers regularly deal with exploits against their technical systems, where an external party takes advantage of a bug or vulnerability. An “exploit” or social engineering of our personal information sources (motivated parties influencing subreddits, conferences, meetups, blogs, journalists, professors, industry analysts) should be viewed the same way. As this example shows, even your peers can be influenced.
As engineers, it’s valuable to develop the antibodies that enable us to make the best decisions — both for our teams and for our own careers — despite the desires of these motivated parties. Beyond identifying ways we can be influenced, I list out three techniques to help us make better decisions.
I’ve created a GitHub repo listing some of the inputs that shape startup engineering decisions. For each, I’ve noted how it works and how it can be exploited.
This is just one example of a larger war raging in this global era of computing: the war for the hearts of engineers.
Many groups will try to shape our views.
The content they create and sponsor (press releases, meetups, blog posts, conferences, speaker talks) plays a critical role in the decisions we make, from the technology we adopt to the companies we join.
A press release indicating large traction (often with many missing numbers) may be a primary reason to join a startup. A blog post or popular talk on the latest database technology — like NoSQL — may be a reason to experiment and later switch. A college lecturer may partly be there to pitch their own company’s technology. The most upvoted Hacker News (HN) posts over a year may influence what technologies we choose to adopt. And yet, as consumers of this information, many of us don’t always critically examine the information and its source.
In startups, bad technology choices made on the basis of these inputs can divert focus and put the mission at risk. For an individual engineer, learning the wrong technology wastes time, and joining a company that sells us false promises can set our career back.
Case Study: Dev Tool Marketing at MongoDB
MongoDB, one of the fastest-growing NoSQL databases of the late 2000s, promised a “modern” database experience.
In my deep dive into the early marketing strategy behind MongoDB, I pointed out issues relevant to dev tools marketing, including:
- Speakers invited to many MongoDB meetups/conferences would have their own motivations (trying to sell their consulting services, marketing for their own dev tools products)
- Hack academies and early programmer blogs were valuable allies for pitching the MEAN stack, while not seeming like a sales pitch when seen on HN/Reddit
- MongoDB would argue that nearly every use case should be solved with their database, without highlighting the tradeoffs in different use cases
- College hackathons were a way to influence early developers who may not have fully understood the tradeoffs they were making
Even though I point out the issues in this case study, I am far from a dispassionate scientist when on a mission for my team. Given that, it’s at least important to ensure that all engineers have the general tools to understand some of the messages we see. And in this era of fake news, selective facts, and troll armies, these lessons can also help us assess information far beyond engineering content.
Unlike previous generations of marketing, the external influences in engineering media today are often disguised.
The purpose of a press release or a sales call is clear. When that content is repurposed into a conference talk or inspires someone else to write a blog post, the line blurs. With a powerful marketing budget or a large fundraise, it becomes increasingly easy to publicize one’s message and convince others of it, regardless of its veracity.
In Silicon Valley, well-placed friends tell me that they shudder at the information sources available to most, and at how uncritically different messages are internalized. I worry especially about junior engineers (such as those in college and on r/learnprogramming) and those far from top tech centers, who may not realize how “the sausage is made.”
Case Study: Training Programs
Free Code Camp, a popular developer blog with 350k followers, carried a recent post that argued REST is dead and GraphQL is the future (“REST in Peace. Long Live GraphQL”). The article had been posted by the head of Free Code Camp and written by an author who was concurrently selling a GraphQL training program (my critique here).
Reflecting back on my own early days, I worry about junior engineers who might feel they have to learn GraphQL based on an article like this, rather than recognizing all the conflicts involved. An older example from them is “The Real Reason to learn the MEAN stack: Employability,” which many of my most thoughtful friends would have strenuously disagreed with. I especially worry about what content like this does in early developer communities, where there may not be enough thoughtful engineers to disagree with these viewpoints.
Beyond direct influence, a surprising percentage of posts on engineering social media are content marketing. These posts aim to add value — but are written for the primary purpose of helping their companies sell their product. Readers can sometimes internalize them as “journalism,” even though their objective is to make a sale, influence a mindset shift, or improve SEO.
Case Study: Content Marketing
Union Square Ventures’ Fred Wilson notes how pervasive content marketing is in tech and encourages critical assessment of the message (emphasis added):
So how should entrepreneurs use this knowledge that is being imparted by VCs [like myself] …? Well first and foremost, you should see it as content marketing…
That doesn’t mean it isn’t useful or insightful. It may well be. But you should understand the business model supporting all of this free content. It is being generated to get you to come visit that VC and offer them to participate in your Seed or Series A round. That blog post that Joe claimed is not scripture in his tweet is actually an advertisement. Kind of the opposite of scripture, right?
For all these reasons and more, my most thoughtful friends use their own networks to understand reality instead: college/graduate school friends from select universities, frank talks with portfolio CEOs (for investors), ‘off the record’ conversations with connections at the various platform providers, trenchant blogs from front line engineers. Their information sources are less influenced by self-interest, providing a more objective view of the world. People can also say things privately they wouldn’t feel comfortable sharing broadly.
These friends of mine also have a deep-seated desire (specifically, a financial or technical motivation) to get at the truth, and so aim to understand the incentives driving every one of their sources. Though they may be outwardly agreeable, internally they’re deeply critical thinkers.
These issues have a basic lesson: nothing is a replacement for surrounding oneself with primary sources, a community interested in the truth, and thoughtful engineers.
For more, see my Hack an Engineer GitHub repo.
Beyond the self-interested factors, engineering media can be distorted for other reasons. For example, each upvote on Hacker News is equal — meaning that the world’s most thoughtful expert on a topic has the same voting power as someone inexperienced. Nearly every social media algorithm favors engagement over any other metric, meaning that what we read is simply a function of what those around us want to read (this echoes my research on what types of deaths are covered in a leading US newspaper).
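To make the ranking point concrete, here’s a minimal sketch of the commonly cited approximation of Hacker News’s front-page formula (the exact production algorithm may differ, and the gravity value of 1.8 is the commonly quoted figure rather than something I can verify). The structural point is what matters: every upvote carries the same weight no matter who casts it, and recency dominates.

```python
# A sketch of the commonly cited approximation of Hacker News's ranking
# formula: rank = (points - 1) / (age_hours + 2) ** gravity.
# Note what's absent: nothing about who voted or their expertise.

def hn_rank(points: int, age_hours: float, gravity: float = 1.8) -> float:
    """Higher values rank higher; gravity controls how fast stories decay."""
    return (points - 1) / ((age_hours + 2) ** gravity)

# A fresh post with a burst of casual upvotes easily outranks an older,
# deeper post, however expert its upvoters were.
print(hn_rank(points=120, age_hours=3))   # ~6.6
print(hn_rank(points=15, age_hours=30))   # ~0.03
```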
Case Study: Cryptocurrency Tribes and Social Media
On social media, the group you surround yourself with can shape your attitudes, even though engineering decisions should be based on empiricism and facts. Social media groups and algorithms pick content that confirms their users’ beliefs and paint the opposite side in the worst light. This tribalism can be seen regularly in cryptocurrency subreddits, meaning that readers consume a distorted reality (see also filter bubbles).
As the cofounder of Coinbase, Fred Ehrsam, explains:
“Cryptocurrencies create strong tribalism. Once you own a currency, your incentives are to make that currency go up in value. Crypto tribalism can be seen on reddit every day. Subreddits generate and report news that support their holdings.
Crypto tribalism plays out in two common ways: 1) people promoting their own currency and 2) people discrediting other currencies. People promoting their own currency is evidenced by the imbalance of positive to negative news about a currency on its own subreddit.”
For now, I’ll suggest three simple points that go far in assessing engineering media:
First, consider a writer’s or speaker’s motivation, both direct (an employee of or investor in the company) and indirect (a public speaker who is a consultant looking for a gig, or an employee looking for a job or advisorship).
When I read a post or hear a speaker, I ask myself a few questions about who is behind the message and what they stand to gain if I act on it.
When we see a lot of press from a company, it’s often a sign that they want something from us. Their employees have consciously set aside time to sit down with a journalist or to write a post.
Second, inculcate a desire to debunk claims.
For example, companies regularly trumpet their (tailor-made) benchmarks that show their products in the best light. It’s often not that difficult to run our own benchmarks that reflect our real-world use cases.
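As a sketch of what that can look like, the snippet below times a stand-in workload and reports rough median and p95 latencies. The JSON round-trip is only a placeholder for your real query mix and documents, and the run count is arbitrary; the point is to measure your own use case rather than trust a vendor’s numbers.

```python
# A minimal benchmark harness: time your own workload instead of trusting
# a vendor's tailor-made numbers. Swap run_workload for your real query mix.

import json
import statistics
import time

def run_workload(docs):
    # Placeholder workload: a JSON round-trip over sample documents.
    for doc in docs:
        json.loads(json.dumps(doc))

def benchmark(fn, docs, runs: int = 20) -> None:
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        fn(docs)
        timings.append(time.perf_counter() - start)
    timings.sort()
    median_ms = statistics.median(timings) * 1000
    p95_ms = timings[int(0.95 * (len(timings) - 1))] * 1000
    print(f"median {median_ms:.2f} ms, ~p95 {p95_ms:.2f} ms over {runs} runs")

if __name__ == "__main__":
    sample_docs = [{"id": i, "tags": ["a", "b"], "n": i * 3} for i in range(10_000)]
    benchmark(run_workload, sample_docs)
```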
Reading codebases directly (where available) is a powerful corrective: it’s a primary source that doesn’t have ‘spin’ built in.
Looking back at past technology trends and where they ended up is another approach, akin to backtesting: it’s easier to tune out hype when we consider how people a few years from now may perceive it, based on how similar trends have played out before.
Say something in online communities when you disagree or see egregious incentive issues (here’s one favorite recent example on AV1 vs HEVC). On HN, this could be a comment or a post of your own, even if it’s under a throwaway account. Future social network designs might let us award karma to people who surface the key incentive issues.
Third, examine the community that you lean on to determine important technical decisions.
It’s important to have a community of people who benefit from the truth, think critically, and have content expertise. This could be former employees, classmates, or your coworkers. You can augment this with the most thoughtful engineers you can find. Online, you can often find some of these groups in the comments of niche HN posts and open source mailing lists/chat rooms. In your early years, finding a thoughtful mentor is critical.
In many ways, this approach mirrors the web of trust in computer security, where we look to trusted friends — and people these trusted friends are connected to.
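As a toy illustration of the web-of-trust idea (the names, endorsement graph, and decay factor below are invented for the example), trust starts at my direct contacts and is discounted with each additional hop:

```python
# A toy web-of-trust sketch: I weight my direct contacts most heavily,
# and people they vouch for at a discount that shrinks with every hop.

from collections import deque

ENDORSEMENTS = {          # who vouches for whom (hypothetical)
    "me": ["alia", "bo"],
    "alia": ["chen"],
    "bo": ["chen", "dara"],
    "chen": ["eve"],
}

def trust_scores(root: str, decay: float = 0.5) -> dict:
    """Breadth-first walk that halves trust at each hop from `root`."""
    scores = {root: 1.0}
    queue = deque([root])
    while queue:
        person = queue.popleft()
        for contact in ENDORSEMENTS.get(person, []):
            proposed = scores[person] * decay
            if proposed > scores.get(contact, 0.0):
                scores[contact] = proposed
                queue.append(contact)
    return scores

print(trust_scores("me"))
# {'me': 1.0, 'alia': 0.5, 'bo': 0.5, 'chen': 0.25, 'dara': 0.25, 'eve': 0.125}
```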
This point underlines the benefits of a thoughtful engineering curriculum taught by dispassionate teachers. I’m unaware of any school that teaches early engineers how best to consume engineering media, or how to choose a technical tool. Often, this means we’re vulnerable for a few years, until we’ve incorporated the lessons from working on a good team.
Having a few thoughtful experts in our community is also critical when assessing deeply technical decisions, like what database to use or how best to architect a system. The danger with some experts (the non-thoughtful variety) is that confidence in their own abilities keeps them from questioning deep-seated assumptions or adjusting when the world changes. Thoughtful experts have both content expertise and strong opinions that are weakly held.
We engineers share some similarities with the machine learning algorithms we design. In both, data is used to make inferences about future decisions. As such, it’s important to have a thoughtfully curated training data set — and to adjust for any errors when our engineering inputs are compromised.
I’ve hardly touched on all the other questions we should be discussing, beyond encouraging critical thinking and finding trusted networks of thoughtful engineers. For example, should we debate alternate Hacker News and Twitter algorithms with the hope of creating better engineers? Should there be conflict disclosures for open source developers? Should popular engineering blogs such as Free Code Camp have a board of engineers who technically assess content and the background of writers? Should we teach engineering media literacy in bootcamps and CS programs?
Still, I’m encouraged by the fact that critical thinking and hypothesis testing are such a natural part of how engineers and scientists approach the world. This mindset has a critical role to play far beyond the day-to-day engineering problems we aim to solve.
Originally published at www.nemil.com. Get a free sticker and sign up for my mailing list here.