Overview of the IXcellerate data center (the largest machine hall in the Russian Federation)

The most important thing is that after a tour of the machine hall, hot wet towels are served, as in Japanese restaurants. It has nothing to do with the technical side, but it is humanly nice.
And this is a guitar with Paul McCartney's autograph — yes, it hangs at the entrance to the machine hall:
This data center has a certain mood.
It looks like a huge hangar in an industrial zone near the Otradnoye metro station. The hangar is 14 meters high; inside, the data center itself is 9 meters high. The remaining space acts as thermal insulation, which shapes the local cooling setup. The building was chosen so that none of its walls faces the road directly (certification protocols won't award high tiers if you can get inside by ramming a wall with a truck — apparently there have been cases), so that it has two different access routes, and so that it is relatively close to the metro. An MCC station has also opened nearby, but the road to it runs through some rather colorful places.
Naturally, no matter how good a data center is, you always need a second one in case a meteorite hits the first. A geo-distributed platform is a guarantee of health and a long life.
We chose the second site as carefully as the first. We needed Uptime Institute Tier III (more precisely, operation to its regulations), the ability to work under PCI DSS (that is, our own certificate for physical protection of the data center), and a number of other features. In the end we settled on the IXcellerate data center — this is where Orange's racks fill part of one of the machine halls, where one very large gaming company sits, along with big banks and state companies, Reuters and Nestle. Kommersant wrote that Apple and Booking also chose this place for storing personal data in Russia. In short, if anything of theirs goes down, they are in deep trouble — so the motivation to do everything properly is strong.
The second feature is that the man in charge is no beginner. Guy Wilner is an entrepreneur from England who is building data centers neither for the first time nor for the tenth. Before this project he owned his own network of 14 data centers in Western Europe, which he then sold to Equinix (the number-one data center operator in the world). After that he decided to repeat the adventure and open a huge data center in Brazil, Russia or Turkey. Brazil was vetoed by his wife and children (he was needed at home), and in Russia he met Cliff Gauntlett. Cliff is a legendary figure in Runet telecom: he was once a vice president of the famous Golden Telecom and of ROL — you may well have first gone online with their internet access cards. He, too, wanted to build a huge data center, and I even managed to meet him in person.
The point of telling all this: the data center was built by a British businessman with good European connections, so European customers came in immediately. That means very transparent reporting on all transactions, which gives some assurance that the processes and regulations inside the data center are actually followed.
The data center was built in stages, launching one machine hall after another. The first stage held 220 racks (construction started in 201?, commissioned in mid-2013). The second machine hall holds 515 racks (1,040 square meters — this is where our first servers are), and the third stage (where our core is) has 1,100 rack spaces (2,500 square meters).
The three machine halls are independent of each other, but they are not three different data centers, because some nodes are cross-redundant between them. The first two halls are IBM-certified (L?); the third is certified Uptime TIER III. In short, they are nearly identical, and the differences are at the design level: for example, Uptime requires dedicated parking for engineers and physical barriers for cars, some elements must resist fire for 45 minutes (windows, under Uptime) vs. an hour (IBM), door sizes differ, and so on.
The master plan was made by Arup. The facility holds an Arup certificate for the best solution in cooling and power supply. This matters because the data center is commercial: the lower the PUE, the cheaper it is to operate. So everything was commissioned in stages (no equipment sat idle; engineering gear was purchased as the facility scaled), with a fight for every kilowatt along the way. The PUE is now around 1.15-1.4 depending on the season. There are quite a few Chinese customers in the data center, and given their approach to ecology and electricity use, these numbers are simply out of this world: their usual PUE can reach as high as two.
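To make the PUE numbers concrete: PUE is simply total facility power divided by the power delivered to IT equipment. A minimal sketch — the function and the example figures below are mine, purely illustrative:

```python
# PUE (Power Usage Effectiveness) = total facility power / IT load.
# A PUE of 1.15 means only 15% overhead (cooling, losses) on top of IT power.
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    if it_load_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_load_kw

# Illustrative: a 1,000 kW IT load in a facility drawing 1,150 kW total.
print(pue(total_facility_kw=1150.0, it_load_kw=1000.0))  # 1.15
```

With these numbers every kilowatt of IT load costs an extra 150 W of overhead; at a PUE of 2.0 the overhead doubles the electricity bill, which is exactly why commercial operators fight for every point.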
At launch, the customer balance was roughly 50/50 domestic to foreign. Back then everyone was busy relocating under the personal data law, so on the Russian cloud infrastructure market (which is our business) there was increased demand for data centers (which is where cloud providers had to be located).
Now more and more companies are coming in from abroad, because there is an anchor customer — the foreign provider Orange, which for many serves as a vital reference point and corporate standard. And the third machine hall was financed by the Goldman Sachs Group, which adds credibility to the project in the eyes of foreign executives. Companies don't switch data centers every year, so it is important for them to think 10 years ahead.
Given the abundance of foreign customers, first-line support communicates in writing in Russian, English and Chinese. All signs are duplicated in Chinese:
Thanks to the same Guy, the data center belongs to two data center alliances — one European, one Asian. It works like airline alliances: you can buy a ticket with a transfer between alliance members, and your luggage is transferred without your involvement. Many customers need geo-distributed hosting, which is why such alliances exist. A customer of one alliance data center effectively works with the entire network through a single window. On our market this is still unique.

Physical protection

For PCI DSS facilities, protecting the building itself is very important. The first perimeter is an external fence equipped with sensors, cameras and barbed wire. At the entrance to the grounds there is a checkpoint. Then you need to get into the building itself (opened by a pass or via the camera), then pass security (passes, searches, confiscation of your bottle of kefir), then enter the machine hall (biometric palm scanners, fingerprints or codes). Inside the hall, the racks themselves are most often in cages (locked with fingerprint readers or keys; one bank even had a mantrap: a vestibule you enter with a pass or a fingerprint, while the second door opens only with the guard's permission via camera). Video surveillance is everywhere.
Subjectively, the physical perimeter at DataPro is slightly stronger if you compare without cages. With cages, things are much more interesting here, because, for example, next to us are the Japanese. They have their own cage standard with a rod thickness of 6 mm. They say some have 8 mm, but I didn't measure. Many cages have infrared motion or crossing sensors, and a couple of cages run from floor to ceiling.
There are a lot of cages here — more than in any other commercial data center in Russia, at least among those I know.


The oddities begin right after the entrance. The admins' lounge looks more like a library:
Look around a bit and you'll spot a turntable with real vinyl — Led Zeppelin and Deep Purple records. There is no shortage of them.
All because the investor loves rock even more than data centers. These same people run the Rockin'Russia festival. Cliff (who was the director before commissioning) wrote two songs for Scorpions (for example, One and One Is Three).
There is a shoe rack and a brush:
Next is the machine that moistens and heats the towels:
And food and water (free):

Unloading area

This is where equipment is prepared before being brought into the machine hall. It is perhaps the best-equipped unloading zone among Russian data centers. First, it's big. Second, there's a tennis table. Third, there's dedicated air purification. This is the acclimatization chamber for the first machine hall (we brought our main equipment in through another, similar one):
Pay attention to the height of the gate.
Equipment is brought in here and vacuumed with a Kärcher (an ordinary model, one of the most powerful), then the hardware acclimatizes in its boxes, then it is unpacked (cardboard is prohibited inside — in general, anything combustible is prohibited).
Very convenient door stops that can hold the door open:


In general, a data center needs three things done right: cooling, connectivity and power. On cooling, I'll add that they have a whole zoo: conventional air conditioners, chiller/fan-coil systems and free cooling in different rooms. On connectivity, the important part is the direct link to M-9 and the peering nodes on site. That is, where traffic from a server in one data center to a server in another (possibly another company's) used to run via the "nine" or similar European nodes, local peering now routes it directly over the local network. Beyond that there is regular provider peering. Given that one very large gaming company is hosted here, I can guess who needed this first — so that users wouldn't lag.
At the time of publication there are 43 providers on site. A year ago that was second place in Russia by connectivity. By now IXcellerate may well have reached first place, but there has been no fresh research.
On the roof there is a platform for satellite equipment; you can install any antennas, but so far nothing is there. They say one of the customers planned to put a satellite teleport there, but in the end never did.
Let's just walk around, and I'll show you what I was allowed to photograph.
Lots of cages:
Cooling optimization — containment panels with removable slats:
Thermal sensors:
The PDUs are made to the data center's own design at a Moscow plant. They call it import substitution that actually worked. They are lockable, by the way:
Here is the big machine hall:
Behind doors like these are technical rooms and secret levels:
Sometimes behind them are closed rooms with 25-35 racks for privacy. But it may also turn out to be a battery room. The first machine hall has ordinary batteries; the third already uses lithium-ion. The latter are good for per-battery telemetry and a service life of up to 12 years (lead-acid lasts a maximum of ?, but more realistically 5); they are also more resistant to temperature swings and heat, so an extra two degrees doesn't knock a year off the service life. On horizons longer than 5 years they work out cheaper than lead. Each battery has its own separate cover:
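The "cheaper beyond 5 years" claim is just a replacement-cycle calculation. A back-of-the-envelope sketch — the prices below are invented for illustration; only the lifetimes (up to 12 years for Li-ion, roughly 5 for lead-acid) come from the text:

```python
import math

# Total battery cost over a planning horizon: you pay for a fresh set every
# time the previous one reaches end of life. Unit prices are hypothetical.
def total_cost(unit_price: float, service_life_years: int, horizon_years: int) -> float:
    replacements = math.ceil(horizon_years / service_life_years)
    return unit_price * replacements

for horizon in (5, 10, 12):
    lead_acid = total_cost(unit_price=100, service_life_years=5, horizon_years=horizon)
    li_ion = total_cost(unit_price=180, service_life_years=12, horizon_years=horizon)
    print(f"{horizon} years: lead-acid {lead_acid}, li-ion {li_ion}")
```

With these made-up prices, lead-acid wins at 5 years (100 vs 180) but loses at 10 (200 vs 180) and at 12 (300 vs 180) — the crossover the article describes.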
Here are the optical trays. Look at the care for people: these grommets keep the cable from being cut on the edges of the holder:
This is a "gravitsapa". I'll let you guess what it is:
In some cages customers store stools, ladders and terminals. There are a couple of cabinets with spare parts. The only requirement is non-flammable material. A couple of racks hold cases (these go on rails and are designed specifically for storing things in a rack). And a characteristic detail: metal conduit for the low-voltage cabling.
Our hardware:


Two independent city power feeds at 13 MW come into the site.
Power is currently 2N. Next year it should be 3N — an energy center with three gas turbines is being built.
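For reference, a quick sketch of what those redundancy labels mean in terms of installed units — the helper function and the example count of four units are my own illustration:

```python
# Power-redundancy schemes: "N" units are needed to carry the full load;
# the schemes differ in how many units are actually installed.
def installed_units(n_required: int, scheme: str) -> int:
    schemes = {
        "N": n_required,        # exactly enough, no redundancy
        "N+1": n_required + 1,  # one spare unit
        "2N": 2 * n_required,   # a complete duplicate system
        "3N": 3 * n_required,   # two complete duplicates
    }
    return schemes[scheme]

# Example: if 4 units carry the load, 2N installs 8 and 3N installs 12.
print(installed_units(4, "2N"), installed_units(4, "3N"))  # 8 12
```

The jump from 2N to 3N is expensive precisely because each step adds a full copy of the power plant, which is why it is being done as a separate energy center.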
Here are the diesels:
Concrete blocks, so that no vehicle can reach the diesels even by accident:
Next up, cooling:
And at the end, a lawn awaits us — it hosts music festivals and football, and you can relax there:

Data center specifications

Certificates: PCI DSS, ISO 27000 (27001), ISO 9000 (9001), ISO 14001:2004.
The building has an area of 6,000 square meters, 1,835 rack spaces, and a capacity of 13.7 MW.
Power: 2N, UPS + diesel generators; cooling: chillers with free cooling, N+1 redundancy.
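From the figures above you can derive the average power budget per rack space — a quick sanity check (the arithmetic is mine; note it uses total facility capacity, so the actual IT budget per rack is lower once PUE overhead is subtracted):

```python
# Average power per rack space, straight from the published specs.
capacity_mw = 13.7
rack_spaces = 1835

avg_kw_per_rack = capacity_mw * 1000 / rack_spaces
print(round(avg_kw_per_rack, 2))  # ~7.47 kW per rack space on average
```

Roughly 7.5 kW per rack is a fairly typical budget for a modern commercial hall, which suggests the published capacity and rack counts are internally consistent.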
That's all for now. The data center employs 50 people, including non-engineering staff. We took a long time choosing our second site, and it seems we chose correctly (at least, our equipment has been here for more than a year). But once again: if anything changes, I'll keep you posted.