Detailed info about Bignoodle:
Agentic DeFi?
We're reframing "compute" as a DeFi primitive. We believe it's time to abstract AI tokens (and compute in general) in order to measure the importance of compute. We're inspired by other mechanisms found in DeFi, but this is not a financial tool. As you read this, please note that we're being careful about our choice of words, to make clear that this is a wholly unique and new DeFi primitive.
This system enables participants to sponsor and provision compute capacity within a decentralized network, without needing to own or operate physical hardware. Instead of purchasing and maintaining GPUs directly, participants allocate funds that correspond to compute capacity and services running within the network’s infrastructure. These capacity units contribute to real AI workloads and services operating across the network.
Here's an example of the user experience:
Step 1 – A new epoch is announced, including the amount of compute capacity the network is prepared to support, funded by a USDC contribution.
Step 2 – Participants may choose to allocate a portion of their available balance from their agent-controlled wallet toward that epoch.
Step 3 – During the epoch, these allocations enable compute operations across the network, supporting AI workloads and related services.
Step 4 – At the end of the epoch, allocations are reconciled automatically based on actual network usage. Participants may receive a variable outcome reflecting the level of compute activity supported during that period.
Step 5 – A new epoch is announced with updated parameters based on recent network performance.
Note: If an epoch does not reach the required participation level, it is canceled and no allocations are utilized.
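The five steps above can be sketched as a minimal epoch lifecycle. This is an illustrative sketch only: the class names, the cancellation threshold, and the proportional-outcome rule are assumptions for clarity, not the network's actual settlement formula.

```python
from dataclasses import dataclass, field

@dataclass
class Epoch:
    """Illustrative epoch lifecycle: announce -> allocate -> reconcile."""
    capacity_usdc: float      # capacity the network will support this epoch
    min_participation: float  # cancellation threshold (assumed rule)
    allocations: dict = field(default_factory=dict)

    def allocate(self, wallet: str, amount: float) -> None:
        # Participants commit part of their agent-controlled wallet balance.
        self.allocations[wallet] = self.allocations.get(wallet, 0.0) + amount

    def reconcile(self, utilization: float) -> dict:
        """Settle at epoch end. `utilization` in [0, 1] is measured network usage."""
        total = sum(self.allocations.values())
        if total < self.min_participation:
            # Under-subscribed epoch: canceled, allocations returned untouched.
            return dict(self.allocations)
        # Variable outcome: each participant's result scales with actual
        # compute activity (a stand-in for whatever formula the network uses).
        return {w: a * utilization for w, a in self.allocations.items()}

epoch = Epoch(capacity_usdc=10_000, min_participation=1_000)
epoch.allocate("agent-wallet-1", 600)
epoch.allocate("agent-wallet-2", 900)
outcomes = epoch.reconcile(utilization=0.5)
```

A canceled epoch (total allocations below `min_participation`) simply hands every allocation back, matching the note above.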
What does it mean to “sponsor” compute?
Sponsorship refers to enabling compute capacity to exist and operate within the network. When a participant sponsors compute:
-they are assigned attributable capacity units
-those units represent a portion of active GPU infrastructure
-those units are used to process AI workloads and deliver compute services
Ultimately, contributions enable work and are not considered passive income.
Do participants own hardware?
No. Participants do not take physical possession of hardware. Instead, they are assigned virtualized compute capacity slices that:
-are backed by real infrastructure
-are tracked and attributable within the system
-contribute to overall network performance
This allows participants to support and operate compute capacity without the overhead of running machines themselves.
How are outcomes determined?
Outcomes are tied to the utilization of the compute capacity a participant sponsors. When sponsored capacity is utilized, it generates usage-based fees.
All results depend on real demand for compute resources and actual network activity.
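A minimal illustration of the usage-based rule: fees accrue only on the portion of sponsored capacity that is actually used. The function name, unit model, and fee rate here are assumptions for illustration, not the network's actual pricing.

```python
def outcome(sponsored_units: float, utilized_units: float, fee_per_unit: float) -> float:
    """Fees accrue only on sponsored capacity that was actually utilized."""
    # A participant can't earn on more capacity than they sponsored.
    utilized = min(utilized_units, sponsored_units)
    return utilized * fee_per_unit

# 100 units sponsored, 60 actually used by workloads this epoch.
fees = outcome(sponsored_units=100, utilized_units=60, fee_per_unit=0.5)
```

If demand is zero, the outcome is zero: results depend entirely on real network activity.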
What are Epochs?
Epochs are defined time periods during which:
-compute capacity is provisioned and made available
-workloads are processed across the network
-utilization and activity are measured
At the end of each epoch:
-capacity usage is reconciled
-outcomes are calculated based on actual compute activity
Each new epoch reflects updated network conditions, including demand and prior usage.
Is this a financial product or investment?
No. This system is designed as a compute infrastructure coordination layer, not a financial product.
Participants:
-are not lending funds
-are not depositing into a managed pool
-are not relying on discretionary management
Instead, they are enabling and operating compute capacity, with outcomes tied to how that capacity is utilized within the network.
Does the platform earn from participant sponsorship?
No. The platform does not generate revenue from participant allocations or their outcomes.
Platform revenue is derived separately from:
-external AI products
-network service fees
-infrastructure operations
This participation mechanism operates independently of platform revenue.
Company info
We operate as Bignoodle Design, a Delaware S Corp (the web2 arm of the business), and Bignoodle Ltd, based in Hong Kong (digital assets + token issuance). We've gotten the blessing of a China-based attorney specializing in US-China cross-border business and data compliance with regard to our DePin operations. We've also gotten the blessing of two international tax attorneys with regard to our tax treatment relating to IRS tax code Section 179 and our 179node.
Fun fact about this website.
This website was created via heybiggie.Bignoodle.com (not currently active), our deep research agent! It took a total of 4 prompts, with very little manual tweaking. The animated background was created in ComfyUI (Stable Diffusion).
What is a sandbox? What's this 5% vs 95% narrative?
Let's go back to the early days of the depin. I won the TAP hackathon at Bitcoin Amsterdam when I showed the depin in action with a mini-PC with a GPU dock (a one-node depin) and my laptop (with no GPU) generating my Ordinals collection, HIROS. It was a demo as well as a live genAI performance :) At that time, I touted an exciting day one approaching: the release of Nvidia's new RTX 5000 series would level the GPU playing field. We aimed to spin up very quickly via hordes of individuals buying GPUs locally and earning on our depin while bigtech took time to roll out their upgrades. For a time, we'd have the most GPU power anywhere. However, the GPUs never came. At launch, many cities got stock in the single digits, or none at all. A paper launch. Over time, only bigtech got deliveries in meaningful numbers, and I had to rethink the depin...
Introducing SANDBOXES. We learned something key to agentic efficiency: a multi-agent workflow (such as Manus or OpenAI Operator) only requires an LLM (which in turn requires a GPU) about 5% of the time, for orchestration. The other 95% of the workflow is non-GPU tasks performed by agents, usually browser use. When we realized how large this bottleneck was, we started to route the non-GPU tasks to CPU-based sandboxes (a dedicated OS running in a container). Not only did we make deep-research agents super cheap to run, we also found an incredible use case for a depin with a low-cost entry for those who want to contribute and earn. Our 'miners' (in quotes because they're not really mining in a PoW way) run multiple Docker containers, which in turn run multiple agent sessions. Hardware costing under $150 USD can run about 16 agents doing everything from browser use to wallet transactions. There are two commercial services with a similar strategy, Daytona and E2B, and both are reportedly running at max capacity (and both are very pricey, imho). As the world becomes more agentic, the need for sandboxes will grow.
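The 5% vs 95% split above boils down to a routing decision: only orchestration steps need a GPU-backed LLM; everything else can go to a cheap CPU sandbox. Here's a minimal sketch of that dispatch rule. The task names and target labels are illustrative assumptions, not Bignoodle's actual scheduler.

```python
# Steps that need an LLM (and therefore a GPU) — roughly 5% of a workflow.
GPU_TASKS = {"llm_orchestration", "llm_generation"}

def route(task: str) -> str:
    """Send LLM steps to a GPU endpoint; everything else to a CPU sandbox."""
    return "gpu_llm_endpoint" if task in GPU_TASKS else "cpu_sandbox"

# A toy multi-agent workflow: one orchestration step, four non-GPU steps.
workflow = [
    "llm_orchestration",   # plan the next steps (needs an LLM)
    "browser_use",         # navigate and scrape
    "browser_use",
    "wallet_transaction",  # sign and send from the agent's dedicated wallet
    "file_io",
]
targets = [route(t) for t in workflow]
cpu_share = targets.count("cpu_sandbox") / len(targets)
```

With this split, GPU capacity is reserved for the rare orchestration calls while the bulk of the workflow runs on containerized CPU hardware.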
An official project description?
Project Bio - Bignoodle is a DePin of sandboxes made to run AI agents. Our focus is on 'consumer' and 'labor' agents. Every agent is an NFT, with provenance and immutability. All of our agents can handle x402, as Bignoodle is an x402 facilitator. We also dogfood our network with web2 AI webapps: consumers show up with fiat, contributing to our token buy-and-burn. Our agents have superpowers: we give them dedicated wallets and a Windows 11 PC for automation not seen before. Our agents also drive AI glasses. We're changing the game on many fronts.
Built on Bitcoin? Why?
In a word - Inscriptions!!! Bitcoin is hardened and has never been hacked. Add that inscriptions on L1 are a superpower for immutability and provenance. Once Ordinals were introduced, using Bitcoin's programmability was a no-brainer for us. Concepts such as DMT and Bitmap also have built-in communities as passionate and dedicated as any Punk or Ape.
Are we Bitcoin maxis? Sorta, maybe, we try not to be... We do aim to appeal to other ecosystems, so we came up with a simple yet genius solution (if we do say so ourselves). Since every agent on our platform is a Bitcoin inscription, we hide our inscription IDs in the metadata of Solana, Eth, Base, etc. NFTs at mint. When folks from those ecosystems "connect wallet", we mine the metadata for their digital-asset identity. They're all on Bitcoin's rails, whether they care or not. This is also true of our consumer web2 audience. Our version of built-on-Bitcoin is largely utilitarian, making use of BTC's strengths, but there's a bit of degen motivation here too... As we can batch many transactions, we also fold fee-taking and our buy-and-burn strategy into those batches. We aim to accumulate BTC while transacting BTC in order to use BTC for our depin rewards in the future. Most don't know about this hidden utility on the motherchain - possibly the best-kept secret in crypto.
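Mining an inscription ID out of an NFT's metadata on another chain might look like the sketch below. The field name (`bignoodle_inscription_id`) and the metadata shape are assumptions for illustration; real metadata layouts differ per chain and standard. The `<txid>i<index>` pattern is the standard Ordinals inscription-ID format.

```python
import re
from typing import Optional

# Ordinals inscription IDs are "<64-hex-char txid>i<output index>".
INSCRIPTION_ID = re.compile(r"^[0-9a-f]{64}i\d+$")

def find_inscription_id(metadata: dict) -> Optional[str]:
    """Scan one NFT's metadata for an embedded Bitcoin inscription ID."""
    # Check a dedicated field first (hypothetical name), then fall back
    # to scanning the standard attributes array.
    candidate = metadata.get("bignoodle_inscription_id")
    if candidate and INSCRIPTION_ID.match(candidate):
        return candidate
    for attr in metadata.get("attributes", []):
        value = str(attr.get("value", ""))
        if INSCRIPTION_ID.match(value):
            return value
    return None

# Toy Solana/Eth/Base-style NFT metadata with a hidden inscription ID.
nft = {
    "name": "Example Agent #1",
    "attributes": [
        {"trait_type": "inscription", "value": "a" * 64 + "i0"},
    ],
}
found = find_inscription_id(nft)
```

On "connect wallet", running this over each of the user's NFTs would surface their Bitcoin-side identity regardless of which chain they arrived from.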
The founders?
Dr. Mien, PhD, is a computational astrophysicist turned crypto dev. She has authored over a dozen scientific papers, including AI-driven approaches to galaxy classification and information theory in cryptocurrency markets. Her journey into Bitcoin began in 2014 and led her to join Bignoodle as co-founder, where she's bringing on-chain AI agents to the decentralized world. Dr. Mien believes the future of intelligence isn't just artificial: it's trustless and transparent.
Alpay has run San Francisco-based Bignoodle, an experiential design firm focused on UX, since 2004. As a long-time founder, he's worn many hats and has run small and large teams. After being acquired, Alpay did two stints at Google on two teams, as manager of a black-ops dev team and as an XR specialist. He's pivoted full-time into web3 with co-founder Dr. Mien to create on-chain AI agents with superpowers. He's brought the Bignoodle brand back to life, as the name is extremely applicable.
VC investors? OG Pass? Community driven?
We've been completely community supported thus far. We launched an Ordinals collection called HIROS a year ago, which provided Superfan (Alpay) with dev runway and led to the announcement of plans to create Bignoodle. Alpay percolated on this for some time, winning a major hackathon (for the Bignoodle depin, at Bitcoin Amsterdam). Superfan appealed to the HIROS community for a TGE budget by selling OG Passes, rewarding holders with tokens after TGE. We've not pitched or accepted any VC investment. We are seeking investment in the form of liquidity provision to bolster our upcoming token. We will start pitching VCs at some point soon in order to grow our depin rapidly. The need for agentic 'sandboxes' is about to explode. We'll be well positioned to serve in a space currently controlled by E2B and Daytona; however, they have trouble meeting demand. Our own sandbox needs include the hosting of Win11 agentic PCs, an n8n backend, and Linux for agentic tool use (such as browser use).
The Bignoodle name?
Alpay is a hardcore sci-fi fan. Bignoodle is borrowed from an obscure PKD book called The Divine Invasion. In the book, Bignoodle is an AI which is essentially Earth's momma. It takes care of worldwide logistics to ensure humanity is well taken care of.
Your data is valuable - Why?
Large AI companies (such as OpenAI) have already scraped the web and are now scraping their own output. Learning from data which has been distilled by AI introduces nothing new to the dataset, while the costs of training still exist. After multiple generations of these models, we may find diminishing quality in addition to diminishing returns on training investment. As such, new/fresh datasets will become incredibly valuable. Private data which has not previously been in the public domain, in addition to data yielded from new authorship/research, becomes very valuable to aggregators. This is especially true in cases where the data is niche.
Example: A real use case of the Bignoodle DePin is a medical diagnosis project called Ophira, created in association with the University of Rochester. The founders will use this depin for inference over large, publicly available datasets of retinal scans in order to diagnose (or provide early warnings of) several diseases from a photo of one's eye. The results and user behavior are data that don't yet exist. Ophira will create datasets offering unique correlations not found anywhere else, making the data valuable for all manner of research, treatment, and product design.