Crypto Currency Data Center

– Welcome to Crosstalk Solutions. My name’s Chris, and
recently we set up the network for a massive cryptocurrency data center at an undisclosed location in Georgia. Our own David Barger was on site, and worked with Tim Barbir
from North Georgia Data to configure the network
for this building. Now the network was just one
small piece of the puzzle, as we found out, and Tim was nice enough to explain some of the unique requirements for building a
cryptocurrency-specific data center. Now, I will warn you,
doing impromptu filming at an active data center without proper microphones creates a lot of noise. So I tried to fix that
as much as possible, but this video is still pretty noisy. Because of that, I had
subtitles created for the video, which should help out a lot. So turn those subtitles on now, and enjoy a look at a
cryptocurrency data center built by North Georgia Data. (upbeat music) – Hey, I’m David from Crosstalk Solutions, and today we’re out with
Tim from North Georgia Data. We came here, initially,
to set up the network, but I wanted to take a quick
video, kind of impromptu, of some of the cool things
that are happening here. So there’s a lot of similarities between a normal data center, and a
cryptocurrency data center. And one of the big things here is power, maybe Tim could talk a
little bit about that. – Yeah, so in your traditional data center you can have anywhere from
half a million to a million square feet, depending on
the size of the data center. And the unique part
about what we’re doing is we have a 4,500-square-foot facility that, when it’s all said and done, will have a little over
10 megawatts of power. And so to give you an idea, 10 megawatts of power
is enough power to run over 10 thousand full-sized homes. So it’s a lot of power crammed
in a really small space, and that’s also the reason why we need all this fancy equipment
that you can see behind us. The unique part in what we
do is that we are delivered 480 volt power from the power company, and we step that down just
a little bit to 415 volts. And the reason why we do that is because running and distributing 415-volt
power in the data center costs us less, and is
dramatically more efficient on the equipment end. So we can usually see anywhere
between two and three percent savings in running that higher voltage, as opposed to running something
lower like 208-volt three-phase, which a lot of traditional
data centers use. – Yeah, so one of the
things I’ve been noticing with this data center here is we’re trying to squeeze efficiency out of everything. I mean, look at everything
on top of this building here, the type of airflow system we have. We’re gonna take a look
inside in just a minute here at how that all works, but needless to say it’s one heck of a system. – (laughs) Yeah, so one
of the unique things about the airflow system that we have, well it’s a unique thing
because of the unique problem that we have,
so because we have such a concentrated amount of
power per square foot, the equipment that we run in
there generates a ton of heat. And this is one of the challenges that all cryptocurrency-focused data centers have, and we’ve been able to
overcome that challenge by pumping over a million and a half CFM of air through this facility. And so this solves a lot of problems that a lot of other facilities have, and we’ve successfully overcome that by just moving a ton of air. (door banging) – [David] All right Tim,
this doesn’t look like a normal data center,
it looks like we’re in a shipping container,
can you explain that? – Yeah, I sure can. So with this type of a build,
we have unique requirements that are put on us by our customer, and also by the hardware providers. And so after a lot of time planning, engineering, and architecting, we just discovered that
repurposing shipping containers, which are a dime a dozen,
solved two problems for us. The first was construction time. We were able to drastically
reduce the time that it takes to stand up a unique facility like this by using widely available
shipping containers. And secondly, it was able to save us quite a bit of money, compared to using standard framed walls and floor systems, which
can get very, very costly. – So the other thing that stands out is, this isn’t a normal network rack setup. – No, it’s definitely not. And that’s another one of the differences between a traditional data
center and what we’re doing. So the equipment that we
receive is not rack-friendly. There are adapters to be
able to adapt it into a rack, but just due to the unique nature of the equipment, we had to adapt. And so that’s why we’ve developed our own 100% custom shelving
system, that is able to fit just about any cryptocurrency-focused
device on the market. – And it looks like your rack is specifically designed for airflow. Like everything here is
designed with cooling in mind. Can you explain how the
airflow system works in this pod here? – Absolutely. So as I mentioned
previously, in the facility we have about a million and a half CFM running through the entire building. And so in this pod, this pod is connected to those big
smoke-stack-fan-looking-things that are on the roof, and
so what we were able to do, we were able to set up our racking in a way where the path of least
resistance, naturally, is running through the machines. And that does two things. It ensures that the machines stay cool, and it basically eliminates
the possibility of hot pockets. And so this is one of the
key issues that a lot of crypto-focused data centers run into: how do we avoid having
hot pockets in the facility? And this is how we’ve done it, and it’s working great so far. – So the main reason I’m
here is for the network. A few months ago we talked
about a few requirements for this data center, do
you mind discussing those? – Sure, so in any
crypto-focused data center you really have two main requirements. That’s reliable power
and a reliable network. So when I initially sat down with David, I told him “Listen, I need three things. “I need something that’s cost effective, “I need something that’s reliable, “and I need something that
has excellent uptime.” And so after evaluating
all the different options, we collectively decided
that Ubiquiti’s EdgeMax line was gonna be a great fit for us. And so far it’s been awesome. – Well, some of the more
technical aspects of the network include setting up VRRP routers, so that if a router happens to die, the other one just kicks in immediately and starts taking its place. Of course, we have
multiple WAN connections. Something special about
the WAN connections for a crypto center is that most of the connections are going out. With a typical data
center, you have a lot of special rules for things that come in, because somewhere like Vultr or AWS, they’ve gotta give their clients access to a whole bunch of stuff inside. Whereas here, all of the connections are originating from inside, going out. So one of the benefits of
selecting EdgeMax was UNMS. That made managing the network very easy, as well as helping you out, right? – Yeah, it absolutely does. UNMS has a wonderful interface, it’s very similar to the
UniFi interface, right? And one of the challenges that you have in a facility like this
is the number of hosts. I mean, when we are done with phase two we’ll have close to eight thousand or so networked and active devices inside here. So having really good
visibility into the network, what’s connected, where it’s connected, what VLAN it’s on, et cetera,
et cetera, is very important. Especially from the
perspective that we need to make things easy to find for the employees that we have here on site. So having UNMS, being able to
set up multiple user profiles, different restrictions,
and just being able to find different cryptocurrency
miners in the facility, UNMS just makes it a whole
heck of a lot easier. – Yeah, so it also makes scaling in the future easy on my end. Whenever you need more switches or anything like that, we just ship them to you,
and you plug them in, give me a call, and I make sure everything’s
working all right. I think some of you guys might
have seen on Chris’ Twitter the big box of all the
switches, that was weird. Lots of work, but it
came out very easy to manage. I think it’s made our lives a lot easier. But that’s pretty much it for the network, a high-level overview of it, anyway. Thanks, Tim. We appreciate you letting
us film a little bit here, learning a little bit more. – Absolutely. – How do we find info about you? – Yeah, so the best way
to find out more about us, or get in touch with us, is our website at – All right, thank you. – All right. (upbeat music)
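The figures quoted in the video are easy to sanity-check. Here's a minimal sketch, assuming roughly 1 kW of average continuous draw per US home, unity power factor, and the standard-density airflow rule of thumb (BTU/hr = 1.08 × CFM × ΔT°F); only the 10 MW, 415 V / 208 V, and 1.5 million CFM figures come from the video:

```python
import math

# Figures quoted in the video.
POWER_W = 10_000_000   # 10 MW total facility power
CFM = 1_500_000        # 1.5 million cubic feet per minute of airflow

# Assumption (not from the video): ~1 kW average continuous draw per home.
AVG_HOME_W = 1_000
print(f"Equivalent homes: {POWER_W / AVG_HOME_W:,.0f}")  # 10,000, matching Tim

# Why 415 V beats 208 V: at the same power, higher voltage means lower
# current, and resistive distribution losses scale as I^2 * R.
for volts in (415, 208):
    amps = POWER_W / (math.sqrt(3) * volts)  # three-phase, unity PF assumed
    print(f"{volts} V three-phase: {amps:,.0f} A total")

# Temperature rise of the cooling air if all 10 MW ends up as heat,
# using the rule of thumb BTU/hr = 1.08 * CFM * dT(F).
btu_per_hr = POWER_W * 3.412   # 1 W = 3.412 BTU/hr
delta_t_f = btu_per_hr / (1.08 * CFM)
print(f"Air temperature rise: {delta_t_f:.1f} F")
```

At the quoted airflow, the air warms only about 21 °F end to end, which is consistent with Tim's claim that sheer air volume is what keeps hot pockets away.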

19 thoughts on “Crypto Currency Data Center”

  1. A cool video, but the whole idea of crypto mining just rubs me the wrong way. Let’s run a load of machines for nothing more than some virtual currency. What a waste of carbon.

  2. If I calc correctly he is talking about 418 m² of DC space, so a bit over 23 kW per m². That’s very tight.
    At one of my DC sites we have 120 MW and allow a max of about 1.1-2 kW per m². So we are at about 70,000 m² of IT space at one campus at the moment. A colo customer buys about 3-6 kW per rack normally.
    400 V distribution to the racks is also normal in Europe: 400 V three-phase (L-L), and devices get one phase out of the power strip, so 230 V (L-N). But we have everything on UPS and generator backup, also indirect cooling.
    If I offered my customers cooling with outside air, they would go crazy, because of dirt, humidity, and the need for stable temperatures.
    Also power must be very cheap for that, because normal DC power is 26-32 euro cents per kWh (including tax, cooling, UPS & generator overhead).

  3. So the world is struggling to fill its power requirements… and here they are wasting 10,000 homes’ worth of energy?

  4. What a huge waste of power…
    What is the next use case for this facility? (after cryptocurrencies are banned and scam-listed…)

  5. No air filters. Mining rigs and their fans will get dusty very fast.
    How do you open the container/pod doors if the huge fans are on full power? There’ll always be huge underpressure in the container/pod.

  6. Data centers are incredibly valuable in the right context. This, however, is a waste. Politicians want to regulate the automotive industry in the best interest of the environment, but where are these airheads when these facilities are being built?

  7. Chris, you usually provide some fairly high-quality content. These two guys standing in front of a massive air intake trying to explain stuff, and never actually showing anything tangible, is fairly low effort on your part. Honestly, I am not sure I even get why you posted this video as-is.
    We didn't even get to see the network, just some b-roll of a PDU powering up. Disappointing, mate.

  8. Good idea with the subtitles, but half of the dialogue is missing from them. However, you did well with the audio considering the environment.

  9. I tried UniFi and ended up returning it, as it bricked.

    The AP was permanently disconnected, then after forgetting it, it never showed up again for adoption, and the same thing happened with my USG and 24-port PoE switch.

    Another serious bug was that bandwidth controls never worked, even while setting up and assigning groups.

  10. Would you be able to show the topology layout in UNMS? 8,000 network devices would require 167 48-port switches at the access layer. I'd love to see how it all comes together.
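Two of the back-of-the-envelope numbers in the comments above do check out. A quick sketch, using only the 4,500 sq ft, 10 MW, and ~8,000-device figures from the video (the 48-port switch size is comment 10's own assumption):

```python
import math

# Power-density figure from comment 2.
SQFT_TO_M2 = 0.092903
area_m2 = 4_500 * SQFT_TO_M2       # floor area quoted in the video
density_kw_m2 = 10_000 / area_m2   # 10 MW, expressed as kW per m^2
print(f"{area_m2:.0f} m^2 at {density_kw_m2:.1f} kW/m^2")  # ~418 m^2, ~23.9 kW/m^2

# Access-layer switch estimate from comment 10.
switches = math.ceil(8_000 / 48)
print(f"Access switches needed: {switches}")  # 167
```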
