
Data Analytics Career Growth

by Abhi Sawhney, April 4th, 2023

Too Long; Didn't Read

A strong technical skill set as a data analyst or data scientist is imperative, but it often falls short in isolation. Combining technical expertise with the right mix of behaviors can be a superpower. Anticipation and proactivity are two sides of the same coin. Empathy can positively impact your work and its actionability.

SQL, Python, and other technical skills are critical, but they are only half the battle


Why are some data analysts more effective than others? I often think about this as I reflect on my experiences leading and learning from a wide range of data analysts. The answer always seems to tie back to specific themes around a top performer’s mindset, approach, and systems, as opposed to their mastery in any one technical domain.


A strong technical skill set as a data analyst or data scientist is imperative, but it often falls short in isolation. Combining technical expertise with the right mix of behaviors across the five themes described below can be a superpower. I will use a few examples to highlight how any data analyst, with a certain amount of planning and care, can incorporate similar behaviors into their own workflow. Some of these themes feed into one another, but we will unpack each separately for clarity. I will write as if you are a data analyst at a music streaming service, but the underlying takeaways should be industry agnostic.


(1) Anticipation & Proactivity

For a group of practitioners so often excited by predictive modeling, we spend surprisingly little time exercising that predictive ability on our own daily or weekly priorities. The ability to anticipate a need and proactively deliver an analysis, a model, or simply an email can go a long way in earning trust and building credibility.


Let us go over a straightforward example of this in practice. Suppose you created a helpful automated email report that multiple teams rely on to track daily growth and engagement metrics. If a metric in this email drops significantly, several teams pause their ongoing work to investigate and course-correct. Now, let us assume that it is right around Christmas, and you remember from a previous analysis that there is a seasonal drop in the number of weekly listeners in the first week of every new year. To avoid anxious colleagues reaching out to you for an explanation after the fact, you could get ahead of this trend and proactively send out an email with the required context before the drop occurs.
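To make this concrete, here is a minimal sketch of what such an early-warning check might look like in Python. Everything here is an illustrative assumption: the `historical_weekly_listeners` table, the baseline figure, and the 10% threshold are made up for the example, not real data.

```python
# Hypothetical history: average weekly listeners (millions) by ISO week,
# averaged over prior years. Week 1 shows the seasonal post-holiday dip.
historical_weekly_listeners = {52: 48.1, 53: 47.5, 1: 41.2, 2: 46.8}


def seasonal_dip_expected(week: int, baseline: float,
                          threshold: float = 0.10) -> bool:
    """Return True if this ISO week historically runs `threshold`
    (10% by default) below the normal baseline."""
    avg = historical_weekly_listeners.get(week)
    if avg is None:
        return False
    return (baseline - avg) / baseline >= threshold


# If a dip is expected next week, send the context email *before* it happens.
upcoming_week = 1   # first week of the new year
baseline = 47.0     # typical weekly listeners outside holiday periods
if seasonal_dip_expected(upcoming_week, baseline):
    print("Heads-up: expect a seasonal drop in weekly listeners next week.")
```

In practice the history lookup would come from your warehouse rather than a hard-coded dict, but the shape of the check, "compare the upcoming period to its historical norm, and communicate before the anomaly lands," stays the same.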


In hindsight, communicating this upcoming data anomaly may seem obvious, but it does not happen enough in practice. Most data analysts do a much better job at reacting to incoming questions or requests than at taking proactive action. A simple shift in the timing of an action can significantly impact how well it is received.


Anticipation and proactivity are two sides of the same coin. Being able to anticipate a need without taking any proactive action will have no external impact. It may help you validate your thought process but will do little else. Similarly, to be proactive in a manner that is useful and relevant, anticipation is key.


(2) Empathy & Level of Detail

Given how much has already been written about the importance of communication and storytelling as a data analyst, I want to focus our attention on a quality upstream of these skills — empathy. The value of being empathetic as a data professional cannot be overstated.


Understanding the business problems that matter to your stakeholders, their pain points, and their strengths and weaknesses will positively impact your work and its actionability.

Imagine you are a data analyst working with the Editorial team. For those unfamiliar with the music streaming domain, the Editorial team is responsible for playlist curation across various themes and genres. The head of this department reaches out to you for a report containing the best and worst performing playlists based on their number of listeners and streams. As a data analyst who empathizes with the team, you dig deeper to understand the underlying objective behind the request. In this case, it turns out that the team is trying to identify playlists driving the most repeat listening and overall engagement on the platform.


Given your deeper understanding of the problem, you decide to go beyond what was asked for by also including listener retention and skip rate metrics for each playlist. Finally, since each Editor within the Editorial team is focused on curation for a specific language, you also provide the ability to filter your report by individual languages.
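A small pandas sketch of that upgraded report might look like the following. The playlist data, column names, and metric definitions are all illustrative assumptions made for this example; they are not a real schema.

```python
from typing import Optional

import pandas as pd

# Hypothetical playlist-level data.
playlists = pd.DataFrame({
    "playlist": ["Indie Hits", "Bollywood Classics", "Lo-fi Focus"],
    "language": ["English", "Hindi", "English"],
    "listeners": [120_000, 95_000, 200_000],
    "streams": [900_000, 780_000, 1_400_000],
    "retained_listeners": [54_000, 52_000, 70_000],
    "skipped_streams": [90_000, 39_000, 280_000],
})

# Go beyond the raw counts that were asked for: add the engagement
# metrics that actually answer the Editors' underlying question.
playlists["retention_rate"] = (
    playlists["retained_listeners"] / playlists["listeners"]
)
playlists["skip_rate"] = playlists["skipped_streams"] / playlists["streams"]


def playlist_report(df: pd.DataFrame,
                    language: Optional[str] = None) -> pd.DataFrame:
    """Rank playlists by streams, optionally filtered to one
    Editor's language, mirroring the per-language filter in the text."""
    if language is not None:
        df = df[df["language"] == language]
    return df.sort_values("streams", ascending=False).reset_index(drop=True)


english = playlist_report(playlists, language="English")
```

The design choice worth noting is that the filter lives in the report function rather than in the data: each Editor gets the same metrics, scoped to their own language.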


In this situation, simply providing what was initially requested may have satisfied the Editorial team. However, by truly putting yourself in the Editors’ shoes and empathizing with them, you significantly improved the quality and actionability of the end product. While doing this, you also made an important decision about the level of detail required to be effective.


Consciously thinking about the appropriate level of detail to get your point across is a critical skill that goes hand in hand with empathy. Standout data analysts can identify situations requiring additional detail, such as in the example above, but are equally adept at knowing when a situation warrants summarization and simplicity.


For example, there may be times when you are working on a very technical project or a project where you explored several approaches before finding a breakthrough. In such cases, when presenting your final analysis to non-technical stakeholders, it is important to resist the urge to share every detail about the process and underlying technical theory. The emphasis should instead be on the key insights and proposed next steps. If the audience asks for more detail on the process and technical theory, definitely have this information handy, but do not make it the core focus of your presentation.

(3) Individual Systems and Processes

Given limited bandwidth and a seemingly endless supply of incoming requests, it is understandable that data analysts often find themselves in a constant cycle of jumping from one operational task to the next.


Standout data analysts are aware of being in these cycles and know when it is time to jump out and focus on increasing overall productivity and effectiveness. They are unafraid to invest time upfront to build systems and processes if it saves them significant time in the future.


For example, let us assume that the executive team regularly asks you to explain why overall business metrics moved up or down. On any given day, you could receive questions along the lines of:


  1. “It looks like our overall streams dropped a lot yesterday. Is that accurate? What happened?”
  2. “Hey, I was just putting together a view of our daily listener trend for the past year. I noticed several outliers in the daily numbers scattered throughout the year. Could you please tell me what happened on each of those dates? Apologies for the short notice, but I need this for an upcoming meeting. Can you please get back to me by the end of the day tomorrow? :) ”


The two questions above are instances of a common theme:

Can you help explain why a specific key metric(s) increased/decreased more or less than expected?


Because you are a savvy data analyst, you recognize this theme and decide to set up a couple of systems to assist you with this recurring question type in the future. First, you create a simple spreadsheet with the following information updated in an automated fashion daily:


  1. Daily values for the core business metrics over the last few years
  2. Day-over-day percentage change in metrics, along with z-scores to indicate significance
  3. A “Notes” column for you to manually add comments to going forward. You can use this to call out any internal or external events that impacted a core metric(s) on a given date. For example, the value in this column on 3/26/23 may read: “Internal engineering issue: the app was down for 4 hours, negatively impacting daily listener and stream counts.”
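The computed columns of that sheet take only a few lines of pandas. The daily numbers below are made up, the 1.5 z-score cutoff is an illustrative choice, and in practice the data would be pulled from your warehouse on a schedule rather than hard-coded:

```python
import pandas as pd

# Hypothetical daily streams (millions); normally refreshed automatically.
daily = pd.DataFrame({
    "date": pd.date_range("2023-03-20", periods=7, freq="D"),
    "streams": [100.0, 101.0, 100.0, 101.0, 100.0, 101.0, 50.0],
})

# (2) Day-over-day percentage change in the metric.
daily["dod_pct_change"] = daily["streams"].pct_change() * 100

# Z-score of each day's change, to indicate how unusual the move is.
changes = daily["dod_pct_change"]
daily["z_score"] = (changes - changes.mean()) / changes.std()

# (3) A manual "Notes" column to annotate known internal/external events.
daily["notes"] = ""
daily.loc[daily["date"] == "2023-03-26", "notes"] = (
    "Internal engineering issue: the app was down for 4 hours."
)

# Flag days whose change is unusually far from the mean
# (the 1.5 cutoff is arbitrary; tune it to your metric's noise level).
outliers = daily[daily["z_score"].abs() > 1.5]
```

With this in place, the next "what happened on these dates?" email becomes a lookup against the `notes` column instead of a from-scratch investigation.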


In addition to the sheet above, you also begin investing time to develop detailed process flows that you can refer to when critical business metrics drop. These process flows help streamline your decision-making and reduce the lag between identifying an issue and the rollout of corrective action. You treat this system as a constant work in progress, iterating and improving it with every instance of a metric drop.


An example of such a process flow for the daily streams metric is shown below:

[Figure: An example process flow to navigate a drop in daily streams | Created by author]


Finally, to conclude this section, I wanted to share a quote I recently came across in James Clear’s book, Atomic Habits. It articulates the importance of the “Individual Systems and Processes” theme better than I ever could:


“You do not rise to the level of your goals. You fall to the level of your systems.”


(4) Knowing the 20 Inside Out

Most of us may be familiar with the Pareto Principle or the 80/20 rule. At its simplest, it highlights how a small number of items can have an outsized impact on the final objective or outcome. For example:


  • 80% of your streams will come from just 20% of your listeners
  • 80% of your revenue will come from just 20% of your clients


The 80 and 20 are approximations, but they help highlight the larger point. This principle is just as appropriate when evaluating the most impactful areas of your work. There will always be a small set of definitions, metrics, queries, dashboards, or other items that are the most critical to know, inside out, at any given point in your career.


Data analysts who quickly identify what this 20 is for them and take the time to learn it deeply, become invaluable to their teams. For example, let us assume that you recently began working with the Paid Subscriptions team. This team is responsible for growing the company’s paying customer base. They rely heavily on your analytical opinion and actively seek your guidance during regular team planning and brainstorming sessions.


To contribute meaningfully to these sessions, you realize that there is a core set of data points you need at your fingertips. These data points are so fundamental to this domain that they are referenced in almost every discussion. So you put together a list of what these data points are and decide to carve out 20–30 minutes of your day, every day, to study them. You do this repeatedly until this data, and the underlying insights, come to you automatically.


The actual list of these data points (“the 20”) will vary based on industry, company, and team context, but here is one example of what this list might look like for a Paid Subscriptions team:


  • Total number of paying subscribers and the 1–2 year historical trend
  • Top 5 markets by paying subscribers and the 1–2 year historical trend
  • Top 5 subscription plans (e.g., Monthly Student Plan, Annual Family Plan, etc.) and the 1–2 year historical trend
  • Number of new, returning, and reactivated paying subscribers
  • Retention rate (or churn) across the different paid subscription plans
  • CAC, LTV, and ARPU of a paying subscriber
  • Conversion rate from free trial to paid subscription


At first glance, this may look like a lot of information to remember, but it becomes very doable with some attention, planning, and repetition. Knowing the exact numbers off the top of your head is less important than having a strong sense of their ballpark levels and relative rankings. Moreover, since the actual values are unlikely to change drastically in short periods, the knowledge acquired will serve you well for an extended period.


This does not mean you should not refer to relevant dashboards, reports, or analyses in real-time when required. On the contrary, doing that is inevitable and sometimes preferred when the level of detail or accuracy needed is high. However, only some situations require that level of specificity.


It is much harder to get ahead of the game and actively participate in conversations if you have to refer to a document for every data point. Knowing your 20 inside out will help you avoid this and be more present.


(5) Consistent Learning

I am sure we have all responded to or heard someone respond to an interview question with something along the lines of, “I am a quick learner” or “I don’t know that, but I can learn it.”


Most data analysts convey the desire to learn a new skill or build expertise in a new area. However, it is often not the desire or curiosity to learn, but the approach to learning that is the key differentiator. Individuals who treat learning as an ongoing journey instead of an occasional ad-hoc task stand out over extended periods as their knowledge and skill set compound.


For many, the act of learning becomes solely tied to when a specific need for it arises at work. Doing this, however, gives your work the power to dictate what you learn and the volume and rate of your learning. This is not a reliable long-term strategy. It may work well in the first few months of a new job where there is much to learn, but this learning rate will inevitably plateau as you become more comfortable in your role.


In contrast, an “always on” learning mindset would mean consistently spending a small amount of time daily or weekly getting better at something. This something may or may not be tied to what you are doing at work at the time. This makes the relationship between work and learning less rigid or transactional and more sustainable over the years. In certain instances, work will dictate what you need to learn, while in other cases, what you learn will influence your roadmap at work.


Conclusion

In summary, most of us do well in staying up to speed on the latest technical skills that may advance our careers. However, we also need to apply the same level of rigor and attention to develop across the non-technical themes highlighted above. This ability to balance the two and effectively grow across both the technical and the non-technical is what elevates the top performers and helps them stand out.
