
New AI Speaks Two Languages at Once and Just Might Crack AGI

by Can Mingir, December 10th, 2024

Too Long; Didn't Read

Nucleoid uses an intermediate language as a bridge between Neural Networks and Symbolic Systems. The intermediate language plays a critical role in uniting the two paradigms. This hybrid approach is pivotal for advancing AGI.

Hello everyone! 👋


Over the past few months, we've been working on the ARC benchmark as part of our Neuro-Symbolic AI project, and we've been able to achieve some promising results. It feels incredible to see our approach, combining symbolic reasoning with neural network capabilities, making meaningful progress.


Nucleoid (aka nuc) adopts a Neuro-Symbolic AI architecture but introduces a novel twist: an intermediate language that serves as a universal bridge between Neural Networks and Symbolic Systems.


The intermediate language plays a critical role in uniting the two paradigms. Based on our findings, nuc lang helps the Neural Network abstract patterns, which are eventually used in the Symbolic System, and the Knowledge Graph is built from the logic and data representations expressed in the intermediate language. In addition, LLMs surprisingly behave in a near-deterministic way while running on ARC-AGI.
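
To make the bridge idea concrete, here is a minimal sketch in Python (not nuc lang; every name here is an illustrative assumption, not Nucleoid's actual API): the "neural" side abstracts raw grid cells into declarative statements, and the symbolic side collects them in a small knowledge graph and forward-chains rules over them.

```python
# Hypothetical sketch of an intermediate representation between the two sides.
class KnowledgeGraph:
    def __init__(self):
        self.facts = set()   # (subject, predicate, object) triples
        self.rules = []      # functions that derive new facts from existing ones

    def assert_fact(self, subject, predicate, obj):
        self.facts.add((subject, predicate, obj))

    def add_rule(self, rule):
        self.rules.append(rule)

    def infer(self):
        # Naive forward chaining: apply rules until no new facts appear.
        changed = True
        while changed:
            new_facts = set()
            for rule in self.rules:
                new_facts |= rule(self.facts)
            changed = not new_facts <= self.facts
            self.facts |= new_facts
        return self.facts


def neural_abstraction(grid):
    """Stand-in for the neural side: turn raw cells into symbolic statements."""
    return [(f"cell_{y}_{x}", "has_color", value)
            for y, row in enumerate(grid)
            for x, value in enumerate(row) if value != 0]


kg = KnowledgeGraph()
for subject, predicate, obj in neural_abstraction([[0, 3], [3, 0]]):
    kg.assert_fact(subject, predicate, obj)

# A symbolic rule: every colored cell is part of an "object".
kg.add_rule(lambda facts: {(s, "is_a", "object")
                           for (s, p, o) in facts if p == "has_color"})
print(kg.infer())
```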


Before diving into our approach:

What is the ARC Benchmark?

The Abstraction and Reasoning Corpus (ARC) is a benchmark dataset and challenge designed to test AGI systems on their ability to perform human-like reasoning and abstraction. Developed by François Chollet, ARC is not a typical machine learning dataset—it intentionally avoids tasks solvable by brute-force statistical techniques or large-scale data training.
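
For context, each ARC task is a small JSON file with a few demonstration pairs ("train") and one or more held-out pairs ("test"); every grid is a list of rows whose cells are integer colors from 0 to 9. The sketch below (the filename is only a placeholder) shows the shape of the data a solver works with.

```python
import json

# Load one ARC task; "arc_task.json" is a placeholder path for illustration.
with open("arc_task.json") as f:
    task = json.load(f)

# A handful of demonstration pairs define the hidden transformation.
for pair in task["train"]:
    inp, out = pair["input"], pair["output"]
    print(f"train: {len(inp)}x{len(inp[0])} -> {len(out)}x{len(out[0])}")

# The solver must infer that transformation and apply it to each test input.
test_input = task["test"][0]["input"]
```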


ARC Puzzle

Our Progress 🐋

We were able to get very promising and exciting numbers (still incomplete, though). For example, on this puzzle, our project responded with the result below without any prompt engineering.


More details here 👇
https://github.com/NucleoidAI/Nucleoid/tree/main/arc

nuc Response

...and this is ChatGPT o1's answer

ChatGPT o1 Response

🌱 What is Neuro-Symbolic AI?

Neuro-Symbolic AI combines the pattern-recognition capabilities of neural networks (subsymbolic AI) with the logical reasoning and structured knowledge of symbolic AI to create robust and versatile systems. Neural networks excel at learning from unstructured data, like images or text, while symbolic AI handles explicit rules and reasoning, offering transparency and precision. By integrating these approaches, Neuro-Symbolic AI enables generalization from smaller datasets, improves explainability, and supports tasks requiring both adaptability and logical consistency. This hybrid approach is pivotal for advancing AGI, as it bridges the gap between learning from data and reasoning over structured knowledge.
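
As a deliberately tiny illustration of that division of labor (again in Python, and not Nucleoid's actual pipeline), the sketch below has a "neural" proposer suggest candidate transformations while a symbolic verifier accepts only those that are logically consistent with every demonstration pair.

```python
def flip_horizontal(grid):
    return [list(reversed(row)) for row in grid]

def rotate_180(grid):
    return [list(reversed(row)) for row in reversed(grid)]

CANDIDATES = {"flip_horizontal": flip_horizontal, "rotate_180": rotate_180}

def neural_proposer(train_pairs):
    # Stand-in for a learned model; a real one would rank candidates by likelihood.
    return list(CANDIDATES)

def symbolic_verifier(name, train_pairs):
    # Explicit, rule-based check: the candidate must reproduce every output exactly.
    fn = CANDIDATES[name]
    return all(fn(p["input"]) == p["output"] for p in train_pairs)

train_pairs = [{"input": [[1, 0], [0, 2]], "output": [[0, 1], [2, 0]]}]
solutions = [c for c in neural_proposer(train_pairs)
             if symbolic_verifier(c, train_pairs)]
print(solutions)  # ['flip_horizontal']
```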

🌍 System 1 and System 2

Neuro-Symbolic AI aligns intriguingly with the concepts from Daniel Kahneman’s Thinking, Fast and Slow, which describes two systems of human thought: System 1 (fast, intuitive, and automatic) and System 2 (slow, deliberate, and logical).


  • Neural Networks in Neuro-Symbolic AI parallel System 1, as they excel at processing unstructured data, recognizing patterns, and generating outputs rapidly without explicit reasoning. They mimic intuitive, subconscious processes that are data-driven and reactive.
  • Symbolic AI, on the other hand, mirrors System 2, as it relies on explicit rules, logic, and structured reasoning to solve problems in a deliberate and explainable manner, akin to conscious, rational thought.


By combining these two paradigms, Neuro-Symbolic AI reflects the dual systems of human cognition, enabling it to tackle problems requiring both fast intuition (pattern recognition) and slow reasoning (logic and planning). This hybrid approach not only enhances AI's adaptability but also brings it closer to human-like intelligence by integrating the strengths of both modes of thought.

🦆 Duck Test

Duck Test


"If it looks like a duck, swims like a duck, and quacks like a duck, then it probably is a duck"


Simply, it is System 1 at work. We have seen ducks probably thousands or millions of times throughout our lives, which forms well-defined patterns in our cognition. When we come across something resembling a duck, human cognition doesn't trigger System 2 because identifying the object as a duck is automatic and intuitive. Our brains rely on a mental "duck schema"; even the individual parts of a duck, such as its wings, bill, or webbed feet, are matched to their associated labels within the realm of System 1.

Duck in the Lake

Again, we won't be surprised if we see a duck in a lake, because we have enough labelled patterns to make the call.

Duck in the Lake

Duck in the City

In this case, as our experiences typically associate ducks with natural settings, it is uncommon to see a duck in an urban environment at night. This unfamiliarity triggers System 2 to take over, engaging in more deliberate reasoning. So, System 2 is now responsible for the reasoning: cities can be dangerous after dark, the duck is in the city, so the duck may not be safe. System 2 overrides System 1's instinctive identification of "a duck" and shifts the focus to evaluating the broader circumstances of the duck's well-being.


Duck in the City

CPU vs GPU

CPU vs GPU


The Neuro-Symbolic AI architecture orchestrates a seamless harmony between the precision of CPUs in reasoning and the raw power of GPUs in pattern recognition.


CPUs are specialized in logical branch operations used in decision-making, and GPUs execute parallel arithmetic operations for matrix multiplications in pattern recognition. So, it is important to understand how they are different.


CPUs are designed for versatility, with fewer cores optimized for high-performance single-threaded tasks and low latency, making them well-suited for complex decision-making, sequential processing, and multitasking. In contrast, GPUs have thousands of smaller, energy-efficient cores optimized for massive parallelism, enabling them to handle tasks like matrix computations, image rendering, and deep learning efficiently. GPUs excel in throughput-oriented tasks where large data sets can be processed simultaneously, while CPUs focus on general-purpose computing and running the operating system. Additionally, CPUs often feature larger caches and more sophisticated control logic to handle diverse workloads, whereas GPUs prioritize raw computational power and bandwidth to accelerate specific workloads. This fundamental difference makes GPUs indispensable for tasks requiring high parallelism, while CPUs remain the backbone of general computing and coordination.
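
The contrast is easy to see in code. The first function below is symbolic-style work: data-dependent branching, one decision at a time, exactly what a CPU core is built for. The second is neural-style work: a single branch-free matrix multiplication whose identical multiply-adds a GPU can run in parallel. (This is a rough intuition, not a benchmark.)

```python
import numpy as np

# CPU-friendly: control-flow-heavy, sequential decision-making.
def classify(cell, neighbors):
    if cell == 0:
        return "background"
    if all(n == cell for n in neighbors):
        return "interior"
    return "edge"

# GPU-friendly: one large, branch-free matrix multiplication.
activations = np.random.rand(1024, 512)
weights = np.random.rand(512, 256)
outputs = activations @ weights   # shape (1024, 256)
```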


When building a modular AI system, it is crucial to design with the underlying hardware in mind. In advanced systems like AIs, hardware constraints can sometimes conflict with algorithmic requirements. From a Neuro-Symbolic standpoint, the Neural Network (System 1) needs GPUs for pattern recognition, while the Symbolic System (System 2) needs CPUs for knowledge representation and reasoning. It is worth noting that System 1 and System 2 are just the foundation for building AI; we expect more and more systems like them...
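
In practice, that mapping can be as simple as device placement: put the pattern-recognition module on the GPU when one is available, and keep the rule-based reasoning as ordinary Python on the CPU. The PyTorch sketch below is a hedged illustration of this split, not Nucleoid's code.

```python
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"

# System 1: pattern recognition, placed on the GPU when available.
pattern_model = torch.nn.Sequential(
    torch.nn.Linear(900, 128),   # e.g. a flattened 30x30 grid
    torch.nn.ReLU(),
    torch.nn.Linear(128, 16),    # abstract pattern features
).to(device)

# System 2: explicit, branch-heavy logic that stays on the CPU.
def symbolic_reasoner(features):
    return [i for i, f in enumerate(features) if f > 0.5]

grid = torch.rand(1, 900, device=device)
features = pattern_model(grid).squeeze(0).tolist()  # brought back to CPU-side Python
print(symbolic_reasoner(features))
```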


In short, it is not practical to do efficient reasoning on a GPU, or efficient pattern recognition on a CPU.


🌿 Stay tuned for Recap 02 in the Road_to_AGI series...