Deep learning is the opposite of where I want to be. Here's why: deep learning and neural nets are famous because they just work. The unstated implication is that you have a whole data center's worth of resources to burn. Simply put, I don't have those resources.

What I do have is insight into what might be happening inside those expansive models. I am betting that those models are slower to train, and overall more expensive to build, than optimal by several orders of magnitude. That is my margin.

I've already tested simple heuristics against several OpenAI games. It is amazing how simple methods can accomplish complex tasks: this demo involved only a left/right decision based on a four-element LIDAR array input. This is from raw video, where the concept of LIDAR is not even well defined, yet it still works, sort of. (A rough sketch of such a heuristic appears at the end of this post.)

This is the start of my search for the Neural Platonic Solids. I believe there are recurring patterns in neural networks that, once deduplicated, will lead to at least a 1000x improvement in training time, space, and computational requirements.

It is awesome to see what is coming down the pipeline. Looking at frontier research and products is exciting, no doubt. But I will stick to my methods because I am a spoil-sport like that. I will just continue to clean and comb the nets into their rightful, original, and idyllic form. Hopefully it won't involve too much math, because that might go over my head.
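For a concrete sense of how simple such a policy can be, here is a minimal sketch of a left/right heuristic driven by four LIDAR-style distance readings. The input ordering, the comparison rule, and the `steer` function name are my assumptions for illustration, not the actual demo code.

```python
# Hypothetical sketch of a left/right heuristic over a 4-element
# LIDAR-style input (leftmost reading first). Not the original demo
# code; the ordering and comparison rule are assumptions.

def steer(lidar):
    """Return 'left' or 'right' given four distance readings."""
    left_clearance = lidar[0] + lidar[1]   # total open space on the left
    right_clearance = lidar[2] + lidar[3]  # total open space on the right
    return "left" if left_clearance >= right_clearance else "right"

# Example: an obstacle close on the right side pushes the decision left.
print(steer([5.0, 4.0, 1.5, 0.5]))  # -> left
```

A decision rule this small has no parameters to train at all, which is the point: if a two-line comparison can drive the agent, the margin over a full neural net is enormous.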