- NEAR Protocol has reached 46 million monthly active users and processes over $6 billion in cross-chain volume through its Intents protocol.
- Illia Polosukhin co-authored “Attention Is All You Need,” the 2017 paper that introduced the Transformer architecture behind ChatGPT, Gemini, and every major large language model.
- Polosukhin and Alexander Skidanov raised $542 million to build NEAR, a sharded Layer-1 blockchain that hit 1 million transactions per second in public testing.
- NEAR AI is building decentralized, user-owned AI infrastructure — positioning NEAR as the backend for an agent-driven internet.
The blockchain that calls itself “the blockchain for AI” now serves 46 million monthly active users. NEAR Protocol processes over $6 billion in cross-chain volume, runs on nine shards, and ranks among the top Layer-1 networks by developer activity — with 2,500+ contributors building on its infrastructure every month.
Its co-founder, Illia Polosukhin, didn’t come from crypto. He came from the lab that built the architecture powering every major AI model on the planet. His path from a flat in Kharkiv to the center of both the AI and blockchain revolutions is one of the more unlikely origin stories in tech.
A Boy in Kharkiv Who Learned to Code Before He Learned to Drive
Polosukhin grew up in Kharkiv, Ukraine — a post-Soviet industrial city shaped by hyperinflation and repeated bank failures. His mother, an engineer and mathematics specialist, gave him something more valuable than a stable currency: a structured mind. He started coding at ten.
At university, he competed in programming olympiads — what he later called “the Olympics of coding.” He earned a Master’s degree in Applied Mathematics and Computer Science from Kharkiv Polytechnic Institute, working part-time for a software company headquartered in San Diego while still a student.
“I grew up in a place where banks would just fail and your money would be gone. That shapes how you think about systems and trust.” — Illia Polosukhin
From Salford Systems to Google Brain in Six Years
After graduation, Polosukhin moved to the United States and joined Salford Systems in 2008, where he spent six years building machine learning tools. In January 2014, he joined Google Research. Within twelve months, he was an engineering manager.
At Google, Polosukhin contributed to TensorFlow — the open-source machine learning framework that would become the industry standard. But the work that would define his career happened over a single lunch in 2017. During a brainstorming session with seven colleagues, the group sketched out a new approach to sequence modeling that replaced recurrence with self-attention.
Polosukhin returned to his desk and built what may have been the very first Transformer model. The result was “Attention Is All You Need” — a paper now cited over 173,000 times, placing it among the ten most-cited research papers of the 21st century. Every major AI model today, from GPT-4 to Gemini, runs on the architecture laid out in those 15 pages.
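The core idea the paper introduced — replacing recurrence with self-attention — can be sketched in a few lines. This is a minimal, illustrative NumPy version of scaled dot-product self-attention, not the paper’s full multi-head architecture; the dimensions and weight matrices here are made up for demonstration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a token sequence X of shape (seq_len, d_model)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # every token attends to every other token
    return softmax(scores) @ V               # output: attention-weighted mix of values

rng = np.random.default_rng(0)
d = 8
X = rng.normal(size=(5, d))                  # 5 tokens, 8-dim embeddings (toy sizes)
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (5, 8)
```

Because every token attends to every other token in one matrix multiplication, the whole sequence is processed in parallel — the property that let Transformers scale where recurrent networks could not.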
The Lunch That Launched a Trillion Parameters — and Then a Pivot
By the time the paper was published in late 2017, Polosukhin had already left Google. He and Alexander Skidanov, a former Director of Engineering at MemSQL and ICPC gold medalist, founded NEAR AI — a startup that would teach machines to code using crowdsourced data.
The plan was straightforward: recruit computer science students around the world to write code snippets for training data, then pay them for their work. The AI part worked. The payments part didn’t. PayPal failed in China. Wire transfers broke in Ukraine. Ethereum gas fees made small payments absurd.
“We tried to pay contributors through Ethereum and quickly realized it was not fit for the job. We decided to build our own blockchain.” — Illia Polosukhin
What started as a payments headache became the thesis for an entirely new company. If global financial infrastructure couldn’t handle paying a student $50 in Shenzhen, something was fundamentally broken — and fixing it meant building from scratch.
Building a Sharded Blockchain That Actually Scales
NEAR Protocol launched its mainnet in April 2020 with a technical bet few others were willing to make: sharding from day one. While Ethereum spent years debating how to shard, Polosukhin and Skidanov shipped it. Their approach — called Nightshade — splits the network into parallel shards that process transactions simultaneously, then recombine the results into a single chain.
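The split-process-recombine pattern can be sketched as follows. This is a toy illustration of the idea, not NEAR’s actual Nightshade implementation: the account-to-shard hashing, the chunk format, and the shard count are all invented for demonstration.

```python
from concurrent.futures import ThreadPoolExecutor
from hashlib import sha256

NUM_SHARDS = 4  # illustrative only; NEAR ran six, then nine

def shard_of(account: str) -> int:
    # Toy routing: hash the sender's account id into a shard index.
    return int(sha256(account.encode()).hexdigest(), 16) % NUM_SHARDS

def process_shard(txs) -> str:
    # Each shard executes its own transactions independently and
    # emits a "chunk": here, a digest of what it processed.
    payload = ";".join(f"{s}->{r}:{amt}" for s, r, amt in txs)
    return sha256(payload.encode()).hexdigest()[:12]

txs = [("alice", "bob", 10), ("carol", "dave", 5), ("erin", "frank", 2)]
buckets = {i: [] for i in range(NUM_SHARDS)}
for tx in txs:
    buckets[shard_of(tx[0])].append(tx)  # route each transaction to a shard

# Shards run in parallel...
with ThreadPoolExecutor(max_workers=NUM_SHARDS) as pool:
    chunks = list(pool.map(process_shard, buckets.values()))

# ...and their chunks are recombined into a single block on one chain.
block = sha256("".join(chunks).encode()).hexdigest()
print(block[:16])
```

The throughput win is the same as in the sketch: each shard only touches its own slice of the transaction load, so adding shards adds parallel capacity while the recombined chain stays singular.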
The fundraising matched the ambition. NEAR raised $542 million across eight rounds from investors including a16z, Tiger Global, and Dragonfly Capital. By 2022, the last round valued the protocol among the most heavily funded Layer-1 projects in crypto history.
But capital alone doesn’t build networks. The technical milestones did. In 2025, NEAR scaled from six to nine shards, boosting throughput by 50%. Nightshade 2.0 with stateless validation quadrupled transaction capacity. And in a public test using live core code, NEAR hit 1 million transactions per second — a benchmark that put it in a different category from every other Layer-1.
Chain Abstraction and the Bet That AI Agents Will Run the Internet
Polosukhin’s most provocative claim is also his clearest: blockchains won’t be used by humans. They’ll be used by AI agents. In a March 2026 interview, he laid out the vision in blunt terms.
“AI is going to be on the front end, and blockchain is going to be the back end. The goal is to make your AI hide all the blockchain. The fact that we have explorers is effectively a failure, because we don’t abstract the technology.” — Illia Polosukhin
This is what NEAR calls chain abstraction — the idea that users should interact with AI, not wallets or gas fees. The NEAR Intents protocol, which has already processed over $6 billion in volume across 120+ assets, is the infrastructure layer for that vision. In 2026, NEAR is scaling it further with multi-party computation, trusted execution environments, and a decentralized marketplace for open-source AI models through NEAR AI.
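The intent pattern itself is simple to sketch: the user (or their agent) declares an outcome, and competing solvers bid to fulfill it. The code below is a conceptual illustration only — the class, field names, and quote format are invented here and are not NEAR’s Intents API.

```python
from dataclasses import dataclass

@dataclass
class Intent:
    # The user states a desired outcome, not a transaction:
    give_asset: str    # asset the user offers
    give_amount: float
    want_asset: str    # asset the user wants back
    min_receive: float # worst acceptable rate

def best_quote(intent: Intent, solver_quotes: list[dict]):
    """Pick the best solver quote that satisfies the intent's minimum, if any."""
    ok = [q for q in solver_quotes if q["receive"] >= intent.min_receive]
    return max(ok, key=lambda q: q["receive"]) if ok else None

intent = Intent("USDC", 100.0, "BTC", 0.0009)
quotes = [{"solver": "A", "receive": 0.00095},
          {"solver": "B", "receive": 0.00088}]
winner = best_quote(intent, quotes)
print(winner["solver"])  # A
```

The point of the pattern is exactly the abstraction Polosukhin describes: which chain, bridge, or gas token the winning solver uses to deliver the outcome is invisible to the user.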
The Transformer Co-Author Who Wants to Decentralize His Own Invention
Most of the eight “Attention Is All You Need” co-authors went on to build AI companies that concentrate power — bigger models, more data, more compute, fewer hands on the wheel. Polosukhin went the other direction. His argument is that the same Transformer architecture he helped create is now being used to build what he calls a surveillance economy.
He presented new research on decentralized confidential machine learning at NVIDIA’s GTC 2025, demonstrating how users can run AI models without revealing personal data — using zero-knowledge proofs and trusted execution environments built on NEAR’s infrastructure.
The roadmap for 2026 is aggressive: scale the Intents protocol, expand the MPC network, and grow the AI Cloud tools already serving over 100 million users. Polosukhin isn’t building another chatbot. He’s building the operating system underneath all of them — one where the user owns the data and the model runs on infrastructure no single company controls.
There’s a specific kind of founder who sees a broken system and builds a fix, only to realize the fix requires rebuilding the entire foundation. Polosukhin did it twice — once with attention mechanisms that replaced how machines process language, and once with a blockchain designed to replace how they transact. The Transformer paper gave AI its backbone. NEAR is his attempt to make sure no single company owns the spine.