Part 9 of the series where I interview my heroes.
Today, I’m honored to be talking with Mikel Bober-Irizar (aka @anokas).
In case you’ve been living under a rock: Mikel is 17 years old and is the youngest Kaggle Competitions Grandmaster (ranked #31). He’s also a Kernels Master (ranked #13) and a Discussions Master (ranked #15).
About the Series:
I have recently started making some progress on my self-taught machine learning journey. To be honest, it wouldn’t have been possible at all without the amazing online community and the great people who have helped me.
In this series of blog posts, I talk with people who have really inspired me and whom I look up to as role models.
The motivation behind this series is that you might notice some patterns, and hopefully you’ll be able to learn from the amazing people I have had the chance to learn from.
Sanyam Bhutani: Hi Mikel! Thank you so much for doing this interview.
Congratulations on becoming the Youngest Grandmaster recently!
Mikel Bober-Irizar: Thank you! I’m honored to be interviewed here :)
Sanyam Bhutani: How does it feel to be the youngest grandmaster?
Do you get to bunk off school?
Mikel Bober-Irizar: I’m pretty proud of it — but unfortunately it doesn’t mean I get to bunk off school. I still have to find time outside of school for my ML projects; although I have been known to kaggle in class close to competition deadlines! 😅
Sanyam Bhutani: You’ve had a very inspirational learning path; you started out when you were 14!
Could you tell us what got you interested in AI?
Mikel Bober-Irizar: I actually got into machine learning because of the competitive aspect (on Kaggle, and another website that had just launched called numer.ai) — it seemed like a fun challenge in both a fascinating domain and something I knew nothing about at the time.
Sanyam Bhutani: You were really young when you started on Kaggle. How has the journey been from a noob to a Grandmaster?
Mikel Bober-Irizar: It’s been a lot of fun. When I started I had no idea about how it all actually worked, but I made an account and started having a go at some of the competitions anyway — initially using simple GUI tools like KNIME and copying other people’s publicly shared kernels, while doing parameter tuning and other small modifications. It took a few months for me to start writing my own code (and my Python was very rusty to begin with), but learning through trial and error instead of just taking a course helped give me a good intuition for what techniques work and where they work.
The highlight of my journey has probably been working in teams — it’s a lot more rewarding to work on a competition with other Kagglers, where you can bounce ideas off each other and learn from each other’s skills. Other than that, it’s just been a case of doing more and more competitions — the leaderboard is a great motivator!
Sanyam Bhutani: What competitions do you seek today? How do you pick which ones to go for?
Mikel Bober-Irizar: The first thing I look for is good data. A lot of competitions have little or badly prepared data, or it may be unclear what the public/private leaderboard split is. These are the competitions where it’s obvious there’s going to be a ton of shakeup — and I try to avoid these. My favourite competitions are ones where there’s a lot of feature engineering to be done, or clever ways to combine multiple data sources, instead of just taking the data and sticking it into a neural network (I’m not an expert at making high performance NNs so I’d lose!).
But it also just depends on how much the specific competition interests me. I try out most competitions, but I’ll lose interest in most of them after a day — I tend to work extensively on only one competition at a time.
Sanyam Bhutani: Amazing!
During our chats, you’ve shared your approach to learning, which is different from just taking an online course.
How do you approach a new problem?
What are your go to techniques when approaching a new problem?
Mikel Bober-Irizar: Yeah, I tend not to take online courses but instead try to figure things out for myself — which helps me understand things better. If I have a new problem I haven’t seen before, I do some research first, and then try solving it myself step by step. Whenever I run into an issue (very often), a combination of Googling and Stack Overflow tends to solve it — even if it takes some time to figure out.
Sanyam Bhutani: You’re about to graduate from high school.
What are your future plans?
Mikel Bober-Irizar: I just applied to university, so my hope for next year is to get into a good one! I’m also hoping to get a summer internship next year, but other than that it’s all open. I plan to continue with ML in the long term though :)
Sanyam Bhutani: What would be your best advice to people just starting out on Kaggle?
Mikel Bober-Irizar: One thing I will say is that the competitions have gotten much harder over the last few years, so if you’re just starting out it’s useful to take a look at the playground competitions or previous competitions from the last few years. Kaggle Kernels are probably the best resource for learning, as people share tons of analysis and solutions to all the competitions. The most important thing is just to experiment for yourself and try to improve the score!
Sanyam Bhutani: Before we conclude, legend says you’re a robot from the future.
Can you comment on that statement?
Mikel Bober-Irizar: 4e 6f 20 63 6f 6d 6d 65 6e 74 2c 20 73 6f 72 72 79 2e
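(For the curious: Mikel’s reply is hex-encoded ASCII. A quick Python snippet — my own illustrative sketch, not part of the interview — decodes it:)

```python
# Mikel's hex-encoded reply, copied verbatim from the interview
message = "4e 6f 20 63 6f 6d 6d 65 6e 74 2c 20 73 6f 72 72 79 2e"

# bytes.fromhex() ignores spaces, so we can decode the string directly
decoded = bytes.fromhex(message).decode("ascii")
print(decoded)  # -> No comment, sorry.
```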
Sanyam Bhutani: You’re an inspiration to everyone starting out on Kaggle. Thank you so much for doing this interview.
Mikel Bober-Irizar: Thank you for having me!
If you’re interested in reading about deep learning and computer vision news, you can check out my newsletter here.