Too Long; Didn't Read
GPT-3 will not take your programming job (unless you are a terrible programmer, in which case you would have lost your job anyway). The way such a model is trained is neither magic nor mysterious. Given a sequence of words, the model is asked to predict the next word that makes the most sense in that particular context. And because of its massive number of parameters, it is typically used without any fine-tuning: at inference time it performs no back-propagation or weight updates, adapting to new tasks purely from examples supplied in the prompt. The training text is nothing more than what is publicly available on the Internet: discussion forums, Reddit threads, digitized books, websites, and Wikipedia.
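To make the next-word prediction objective concrete, here is a deliberately tiny sketch. It is not GPT-3 (which uses a neural network over billions of parameters), just a hypothetical bigram counter over a toy corpus that illustrates the same task: given the words so far, guess the most likely next word.

```python
from collections import Counter, defaultdict

# Toy corpus standing in for "text scraped from the Internet".
corpus = (
    "the model predicts the next word . "
    "the model reads the text . "
    "the text trains the model ."
).split()

# Bigram counts: for each word, count which words follow it.
successors = defaultdict(Counter)
for context, nxt in zip(corpus, corpus[1:]):
    successors[context][nxt] += 1

def predict_next(word):
    """Return the most frequent next word after `word`, or None if unseen."""
    counts = successors.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "model": it follows "the" most often here
```

A real language model replaces the raw counts with a learned probability distribution conditioned on the entire preceding context, but the objective being optimized is the same.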