Is broadband the missing link in capturing small-scale power that would otherwise be wasted? Small-scale power sources are mostly in rural locations, and the long distances make it impossible to justify building capital-intensive, high-maintenance, and potentially hazardous high-voltage power lines to these remote sites.
One solution is to use the power at its source. An off-grid, fiber-fed, distributed data center does the computational work at the power source, and a fiber connection transmits the results where needed.1 Small-scale, off-grid power sources include solar, wind, methane flaring, and hydropower.
Small-scale hydropower alone represents the equivalent of more than 38 Hoover Dams.2 The U.S. Department of Energy estimates that up to 77 gigawatts (GW) of capacity is available from existing dams and environmentally safe, in-stream power generation.
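As a quick sanity check on the Hoover Dam equivalence (using the approximately 2 GW capacity figure cited in the notes):

```python
# Rough equivalence check: untapped small-scale hydro vs. Hoover Dam capacity
doe_estimate_gw = 77   # U.S. DOE estimate: existing dams + in-stream generation
hoover_dam_gw = 2      # Hoover Dam's approximate generation capacity

equivalent_dams = doe_estimate_gw / hoover_dam_gw
print(f"{doe_estimate_gw} GW is roughly {equivalent_dams:.1f} Hoover Dams")
```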
Clarity in the Fog
In the above interview, Nicolas Burley, Head of North America for Fog Hashing, discusses the opportunity for distributed computing, an approach that aligns with small-scale distributed power generation.3 Leveraging its advanced liquid-cooling technology, Fog Hashing aims to make
“blockchain computing infrastructure more efficient, sustainable, and robust.”
Burley indicates their approach also applies to other workloads, including machine learning and artificial intelligence.
Pre-built, shipping-container-size data centers, as seen in the above video, could also serve as remote data collection points.4 These weather-protected outposts could host fiber optic sensing equipment, collect data from IoT sensors, and integrate communications connectivity. Examples of such equipment include:
- WiFi hot spots/Cell sites
- Remote cameras
- Air quality sensors
Beyond power capture, off-grid, fiber-connected data centers could improve understanding of the environment in remote locales. Beneficiaries of this newfound connectivity would include forest managers, public safety agencies, and even the lost hiker.
Stay tuned for a related version of this post at www.viodi.com
Notes:
- A fiber-connected, power-source-adjacent, distributed data center should be much more efficient than distributing electricity to a centralized data center. For instance, according to Lawrence Livermore National Laboratory, approximately 64% of electricity production is “rejected energy”; that is, energy lost to generation and distribution inefficiencies, including electrical resistance, before it reaches end users.
This means a small-scale, one-megawatt power source would deliver only 360 kilowatts through a typical electric distribution system. A power-source-adjacent data center would not be burdened with those distribution losses. The implication is that virtually all the power could be used for computing, since the losses due to fiber connectivity would be de minimis, measured in hundreds of watts.
Further, fiber is impervious to lightning, and, unlike high-voltage lines, it does not represent a fire hazard. From a human-safety perspective, it does not have the separation requirements (e.g., installation on tall towers) of high-voltage lines, making fiber cheaper and faster to install (overhead electrical distribution costs hundreds of thousands of dollars per mile versus tens of thousands of dollars per mile for fiber). ↩︎
- This is based on the Hoover Dam’s approximately 2-gigawatt power generation capacity. ↩︎
- Fog Hashing’s name appears to be a play on the terms fog computing and Bitcoin mining (hashing). Fog computing refers to processing at the edge of the network, as described in this EC-Council article. ↩︎
- Here is the link to Fog Hashing’s video showing their self-contained data center. ↩︎
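The back-of-the-envelope arithmetic in the first note can be sketched as follows. The 64% rejected-energy share is the Lawrence Livermore figure cited above; the one-megawatt source and the ~500 W fiber-equipment loss are hypothetical illustrative values:

```python
# Delivered power for a hypothetical 1 MW small-scale source
source_kw = 1000.0        # 1 megawatt, expressed in kilowatts
rejected_fraction = 0.64  # LLNL "rejected energy" share cited in the notes

# Power reaching a distant, centralized data center via electric distribution
grid_delivered_kw = source_kw * (1 - rejected_fraction)
print(f"Delivered over the grid: {grid_delivered_kw:.0f} kW")  # 360 kW

# A power-source-adjacent data center avoids those losses; fiber-link
# losses are on the order of hundreds of watts, effectively negligible.
fiber_loss_kw = 0.5  # assumed ~500 W for fiber-connectivity equipment
onsite_kw = source_kw - fiber_loss_kw
print(f"Available for on-site computing: ~{onsite_kw:.0f} kW")
```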