GPU owners can now register to earn rewards by contributing spare computing power to FAR AI, a decentralized inference network launched today by FAR Labs. FAR AI connects idle gaming PCs, workstations, and small servers into a unified AI infrastructure designed to be faster, cheaper, and more resilient than centralized cloud providers.
A small number of hyperscale providers control access to the GPUs required for AI inference. That concentration drives up prices, creates single points of failure, and locks out independent builders. FAR AI was built as a structural response to this problem.
Inference requests flow across distributed nodes instead of through a single cloud. A node can be a gaming PC, a workstation, or a small server: essentially, any machine with spare GPU or CPU capacity. If it can run inference, it qualifies. Node operators contribute real resources and earn rewards based on actual usage.
What an operator earns depends on hardware type, compute delivered, network demand, and the size of the monthly reward pool. Stronger hardware earns a larger share, and payouts are tied strictly to each operator's delivered output.
FAR Labs has released the FAR AI Node Reward Calculator to help operators understand potential returns before the network opens. The tool models estimated monthly rewards based on individual hardware specs and operating costs. Users select their GPU, adjust for electricity prices, and see projected earnings under different network scenarios.
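To make the calculator's inputs concrete, here is a minimal sketch of how such an estimate could be computed. FAR Labs has not published the actual reward formula, so this assumes a simple proportional-share model: an operator's slice of the monthly pool scales with compute delivered, weighted by hardware class, with electricity costs subtracted. All names and numbers below are illustrative assumptions, not FAR AI's real parameters.

```python
# Hypothetical node-reward estimate. Assumption: rewards are a
# proportional share of the monthly pool, weighted by hardware class,
# minus the operator's electricity cost.

def estimate_monthly_reward(
    compute_delivered: float,      # operator's delivered compute (arbitrary units)
    hardware_weight: float,        # multiplier for stronger hardware (assumed)
    network_total_compute: float,  # weighted compute across all nodes
    monthly_pool: float,           # total monthly reward pool (USD)
    power_draw_kw: float,          # average node power draw in kilowatts
    hours_online: float,           # hours the node served inference this month
    electricity_price: float,      # USD per kWh
) -> float:
    """Return an estimated net monthly reward in USD."""
    weighted = compute_delivered * hardware_weight
    gross = monthly_pool * weighted / network_total_compute
    power_cost = power_draw_kw * hours_online * electricity_price
    return gross - power_cost

# Example with made-up numbers: a mid-range GPU contributing 0.75% of
# weighted network compute, drawing 300 W for 400 hours at $0.15/kWh.
net = estimate_monthly_reward(
    compute_delivered=500.0,
    hardware_weight=1.5,
    network_total_compute=100_000.0,
    monthly_pool=50_000.0,
    power_draw_kw=0.3,
    hours_online=400.0,
    electricity_price=0.15,
)
print(round(net, 2))  # prints 357.0
```

The sketch mirrors the calculator's workflow: hardware choice sets the weight, electricity price sets the cost side, and varying the network totals models different demand scenarios.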
The early response has been strong. More than 1,200 users have registered through the calculator since it launched, and those who save their calculations will receive priority access when node onboarding goes live.
GPUs excel at the parallel processing that modern AI models require for inference. However, high-performance chips are expensive, and operating them at scale demands specialized facilities.
That cost structure concentrated AI power in a few corporate hands. But the hardware required for inference already exists. More than 100 million consumer-grade GPUs sit idle in homes, offices, and studios around the world. Most are used at less than 20 percent capacity.
FAR AI connects that dormant compute into a distributed grid. Costs drop, energy overhead falls, and geographic resilience improves — with no new hardware required.
"The world already owns the compute required to power AI," said Ilman Shazhaev, Founder and CEO of Dizzaract, the gaming studio behind FAR Labs. "It’s just disconnected and underutilized. FAR AI connects that compute and distributes both access and rewards. We're building infrastructure that is faster, cheaper, and open to everyone."
About FAR Labs
FAR Labs builds decentralized AI infrastructure. The company developed FAR AI, a distributed compute network that runs open-source language models faster and cheaper than traditional cloud providers by connecting idle GPUs into a unified inference layer.
FAR Labs is the deep-tech division of Dizzaract, the largest gaming studio in the Middle East and North Africa, backed by the Abu Dhabi government and employing more than 80 professionals across 25 countries. The studio previously built Farcana, a competitive shooter, and GAMED, an AI-powered player identity platform.
