Scientists in basic science (e.g., mathematicians and physicists) try to understand the cause-and-effect relationships in our world. When a certain level of understanding is reached, many cool practical problems can be solved. For example, Newton's discoveries (Newtonian mechanics) allow us to design cars and anthropomorphic robots (hi, Tesla bot!).

At some level of knowledge development, it inevitably turns out that previously accepted axioms are only partially true. Thus, Euclidean geometry stops being suitable for describing all motions of cosmic bodies, since it is true only for linear, non-curved spaces. And, as we know thanks to Einstein, our world is a curved space. As a result, it turns out that Euclidean geometry is only a special case of Lobachevsky geometry, where Euclid's fifth axiom is radically reinterpreted.

The same is true of the relationship between Newtonian and quantum mechanics. Newtonian mechanics applies to most problems in the macrocosm familiar to us, such as calculating the braking time of a car or the force needed to move a body. At the same time, it becomes inaccurate in the microcosm, when describing the motion of particles, where the laws of quantum mechanics come into play. Many scientists are inclined to think that sooner or later the laws of Newtonian and quantum mechanics will be reconciled, with the result that Newtonian mechanics will become a special case of quantum mechanics.

What was my point? You may notice that in our daily lives, to solve most engineering problems, it is possible to use simpler, easier axioms and laws. You don't need to know quantum mechanics to design a transporter.

When designing the architecture of an IT application, we are constantly faced with the problem of choosing between building in extensibility and avoiding over-engineering. Over-engineering is, by definition, evil; none of us wants to do it. But how do we distinguish between over-engineering and necessary extensibility? After all, if we are designing a laser, then using quantum mechanics instead of Newtonian mechanics is not over-engineering, but a forced measure. The answer is both simple and tricky: expand domain knowledge.

Just as great scientists are first and foremost great researchers, great IT engineers are first and foremost people deeply immersed in the domains for which they create applications. In the same way, they are like physicists who try to derive universal laws to describe the real world. Only in the IT environment, our task is not to derive laws to describe the real world but to derive laws to describe the business of our company for many years to come.

So all IT engineers have to be hunters for new domain knowledge. The situation where only a product manager thoroughly understands how, by whom, and for what the application being developed is used is destructive, and it inevitably leads either to over-engineering or to the necessity of radical refactoring at the next stages of product development. And as we know from experience, these two things are the main reasons for slowing down development in the medium and long run.

In a good sense, the domain knowledge of key engineers should be at least 3–6 months ahead of the current development needs, and a year or more for established companies that have been on the market for several years. Such an environment leads to the highest-quality architectural solutions and allows for maximum scalability and minimum over-engineering.

Most experienced engineers are already aware of this, which is why a concept such as DDD (domain-driven design) is more and more on everyone's lips.
However, very few teams realize that it is impossible to reach full-fledged DDD without domain knowledge growing ahead of current development needs.
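To make the distinction between over-engineering and necessary extensibility a bit more concrete, here is a small, deliberately simplified sketch in TypeScript. The pricing domain, the `Money` type, and the "multi-currency sales next quarter" scenario are all hypothetical, invented for illustration: without domain knowledge, the temptation is to build a generic plugin system "for the future"; with domain knowledge, you add only the extensibility the business is actually going to need.

```typescript
// Over-engineering: a speculative plugin system built "just in case",
// with no domain knowledge telling us which cases will ever exist.
interface PricingPlugin {
  applies(order: unknown): boolean;
  price(order: unknown): number;
}

class PricingEngine {
  constructor(private readonly plugins: PricingPlugin[]) {}

  price(order: unknown): number {
    const plugin = this.plugins.find((p) => p.applies(order));
    if (!plugin) throw new Error("No pricing plugin matched the order");
    return plugin.price(order);
  }
}

// Necessary extensibility: domain knowledge says multi-currency sales
// are planned for next quarter, so Money carries a currency today.
// A cheap decision now; a painful refactoring later.
type Currency = "USD" | "EUR";

interface Money {
  amount: number; // minor units (cents) to avoid floating-point surprises
  currency: Currency;
}

interface OrderLine {
  unitPrice: Money;
  qty: number;
}

function orderTotal(lines: OrderLine[]): Money {
  if (lines.length === 0) throw new Error("Empty order");
  const currency = lines[0].unitPrice.currency;
  let amount = 0;
  for (const line of lines) {
    if (line.unitPrice.currency !== currency) {
      throw new Error("Mixed currencies in a single order");
    }
    amount += line.unitPrice.amount * line.qty;
  }
  return { amount, currency };
}

// Usage: a two-line USD order totals 2500 cents.
console.log(
  orderTotal([
    { unitPrice: { amount: 1000, currency: "USD" }, qty: 2 },
    { unitPrice: { amount: 500, currency: "USD" }, qty: 1 },
  ])
);
```

The point of the sketch is not the code itself but where the design decision comes from: the second half is justified only because someone on the team already knows what the business will need in a few months.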