AI Infrastructure
Darkbloom launches decentralized AI inference network using idle Apple Silicon Macs
Darkbloom has launched a decentralized AI inference network that connects idle Apple Silicon Macs to users seeking lower-cost model execution. The platform claims up to 70 percent cost reductions compared to centralized cloud providers.
The network taps into over 100 million Apple Silicon machines that sit idle for most of each day, according to the company. Mac owners can contribute their hardware as inference nodes, earning revenue while maintaining what Darkbloom describes as verifiable privacy guarantees.
Users access the service through an OpenAI-compatible API supporting chat, image generation and speech-to-text functions. All requests are end-to-end encrypted, with the company implementing four privacy layers: encryption before transmission, hardware-bound keys generated in Apple's secure enclave, a hardened runtime that blocks debugger access, and hardware-attested response signatures.
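Because the API is described as OpenAI-compatible, a client would presumably send standard chat-completion payloads to a Darkbloom base URL. The sketch below builds (but does not send) such a request; the base URL, API-key format, and model name are assumptions for illustration, not documented values.

```python
import json
import urllib.request

# Hypothetical endpoint and key; Darkbloom's actual base URL and
# auth scheme are not specified in the article.
BASE_URL = "https://api.darkbloom.example/v1"
API_KEY = "db-placeholder-key"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request (not sent)."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("example-model", "Hello")
```

OpenAI compatibility means existing SDKs and tooling could be pointed at the network simply by overriding the base URL, which is likely the point of the design.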
Operators retain 95 percent of inference revenue, with electricity costs on Apple Silicon estimated at $0.01 to $0.03 per hour depending on workload. Darkbloom positions the model as analogous to Airbnb for compute capacity, where distributed idle resources undercut centralized providers on price due to near-zero marginal costs.
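The operator economics above reduce to simple arithmetic: net earnings are 95 percent of gross inference revenue minus electricity. The gross figure below is a made-up assumption for illustration; only the 95 percent share and the $0.01 to $0.03 per hour electricity range come from the article.

```python
# Revenue share reported by Darkbloom; electricity range is
# $0.01-$0.03/hour on Apple Silicon per the article.
REVENUE_SHARE = 0.95

def net_hourly(gross_per_hour: float, electricity_per_hour: float) -> float:
    """Operator's net hourly earnings after the platform cut and power."""
    return REVENUE_SHARE * gross_per_hour - electricity_per_hour

# Hypothetical $0.20/hour gross against worst-case electricity:
print(round(net_hourly(0.20, 0.03), 2))  # 0.16
```

Even at the high end of the electricity range, power is a small fraction of the operator's cut, which is what makes the "near-zero marginal cost" framing plausible.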
The company argues that traditional AI compute flows through multiple markup layers, from GPU manufacturers to hyperscalers to API providers, with each taking a cut.
Installation for operators involves a terminal command that downloads a provider binary and configures a launchd service. The network currently offers curated models through its marketplace, with pricing compared against OpenRouter equivalents.
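A launchd service of the kind described is typically defined by a property-list file loaded by macOS at boot or login. The sketch below shows a minimal plist; the service label and binary path are illustrative assumptions, since the article does not document Darkbloom's actual file names or install locations.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <!-- Label and program path are hypothetical examples. -->
  <key>Label</key>
  <string>com.darkbloom.provider</string>
  <key>ProgramArguments</key>
  <array>
    <string>/usr/local/bin/darkbloom-provider</string>
  </array>
  <!-- Start at load and restart if the process exits. -->
  <key>RunAtLoad</key>
  <true/>
  <key>KeepAlive</key>
  <true/>
</dict>
</plist>
```

A launchd service with `KeepAlive` set keeps the provider binary running in the background whenever the Mac is on, which is what lets idle machines serve inference requests without user interaction.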
Sources
Published by Tech & Business, a media brand covering technology and business.
This story was sourced from Darkbloom and reviewed by the T&B editorial agent team.