You're late to the party! 🎉
Most of the people are gone already and the good food is all out, but please make yourself comfortable. I'm sure we can find some crisps for you. Would you like a drink?
I wrote a small ebook about applying validation techniques to different types of real-world datasets. It walks through short examples of how different data types have to be treated to avoid overfitting.
I touch on the topics of:
Since you made it here, here are some of my all-time favorite links I love to share with people. You'll get them in your inbox too!
I started the newsletter because I love sharing interesting things.
However, it's sometimes rough to keep up with the latest developments, especially in machine learning and tech, so I share a mix of new and old things that I pick up around the internet.
In the tech industry, and especially in machine learning, data science, big data, and deep learning, there is a lot of hype and a lot of banter: endless back and forth about who is better. Wherever you have a lot of people, there are many opinions and a lot of marketing. I try to stay away from that as much as possible.
I just try to stay focused on what is tangible and share it with whoever likes to listen.
One inspiration, and a sort of mantra I live by, is the Lucky 10,000 by XKCD.
Python is one of the most used languages among data scientists. It has many great libraries and is used by many businesses.
Python is a powerful language and, like any powerful tool, it has many features that are not immediately obvious when using it.
If you're a Python developer, you most likely use Stackoverflow.com, the page where almost any question you might have has already been answered.
There's a fantastic Stackoverflow thread that asked for hidden features in Python.
Scrolling through this thread regularly will surely give you some insights into programming and help you up your Parselmouth game.
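To give you a taste of what that thread surfaces, here is a small sketch of a few lesser-known Python features of that kind. These are my own picks as examples, not a quote from the thread itself:

```python
# Chained comparisons read like math:
x = 5
in_range = 1 < x < 10  # True, no need for (1 < x) and (x < 10)

# for/else: the else branch runs only if the loop finished without `break`
def contains_even(numbers):
    for n in numbers:
        if n % 2 == 0:
            break
    else:
        return False  # loop exhausted, no even number found
    return True

# Extended iterable unpacking:
first, *middle, last = [1, 2, 3, 4, 5]
# first == 1, middle == [2, 3, 4], last == 5

# dict.get with a default avoids a KeyError:
counts = {"a": 2}
missing = counts.get("b", 0)  # 0 instead of an exception
```

Each of these is plain standard Python, no imports needed, and each one tends to surprise people the first time they see it.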
Deep learning technology, which uses large neural networks to solve problems and make decisions, is all around us.
Transformers are slowly eating the world.
From language translation and speech recognition to self-driving cars and more, the demand for Transformers is exploding. The field of Transformers is increasingly dominated by research giants like Google, Baidu, and Microsoft, but how did we get here, and what might the future hold?
In my opinion, we should all learn about the concept of attention, and with it the transformer architecture. The Illustrated Guide to Transformers is one of the best guides out there for understanding the architecture.
In case you're still learning, I have a course about data science using Python and a high-level, no-code data science masterclass on Skillshare. The links should give you a free premium trial and access to those classes.
Some people also support me on Patreon but that's of course optional!