How Data Centers Have Changed Through the Years

September 28, 2022

As technology continues to evolve, data centers are growing in importance. Innovations build on each other and require ever-increasing computing capacity. In addition, the Internet of Things (IoT) and cloud computing have pushed the industry beyond the technical constraints of conventional data centers.

Today, data centers can host high-density servers with minimal site requirements and costs thanks to innovative cooling technology. The newest advancement, liquid immersion cooling, offers a generational leap over preceding systems.

Let’s look at how data centers have changed over the years, what the future might hold, and how your business can stay ahead.

As technology has evolved, so have data centers.
Source: Shutterstock

Early Beginnings (1940s-1960s)

Computing as we know it started with the military. In 1946, the United States Army unveiled the Electronic Numerical Integrator and Computer (ENIAC), built at the University of Pennsylvania to calculate artillery firing tables and support nuclear weapons research. It was the first general-purpose electronic computer and opened the door to the ever-increasing complexity that has characterized IT in the ensuing decades.

While ENIAC filled an entire room and needed a massive air conditioner just to perform a few thousand calculations per second, today a single GRC ICEraQ can quietly cool far more computing power than ENIAC ever delivered. We’ve come a long way indeed.

By the 1950s, transistors had replaced vacuum tubes, increasing the hardware’s reliability while decreasing size and power consumption. However, it wasn’t until the 1960s that computers became commercially viable.

Mainframes from IBM spread throughout the industry as applications and programming languages made these devices more useful. IBM mainframes not only remain in use today, they also influenced the other systems that would eventually fill data centers.

The First Microprocessor and Other Firsts (1970s)

In 1971, Intel released the 4004, the first commercially available microprocessor. Packing roughly 2,300 transistors onto a single integrated circuit, it combined the core functions of a central processing unit on one chip. The 4004 worked as a reusable module, which engineers could incorporate into a range of electronics.

Today, many servers in data centers use Intel processors with billions of transistors. All of those transistors generate heat that must be removed, driving the development of better cooling solutions like GRC’s liquid immersion.

Also during this time, the Xerox Alto pioneered the graphical user interface for desktop computers, making machines far more user-friendly. This research computer included a display screen and built-in networking, and it set the direction for how everyday users would come to depend on computers. Among other firsts, the Alto gave us visual tools for word processing, email, and graphics.

ARCNET, introduced in the late seventies, was one of the earliest widely used local networking technologies, linking computers so they could share data and multiply processing power. Alongside ARPANET, the forerunner of the Internet, these early networking and computing inventions paved the way for data centers to grow into the workhorses they are today.

The First PC, Dot-Com Revolution, and More (1980s-1990s)

The personal computers of the 1980s and ‘90s overtook the large computers of previous decades. Information technology went mainstream, even more so when the Internet took off with the dot-com boom. Ever-faster connectivity drove the development of massive data centers to serve the now-plentiful computer users.

Enterprises built their own server rooms and facilities, in some cases housing thousands of servers. In addition, businesses sprang up to operate data centers for public clients. However, the increasing reliance on data centers also raised the need for server cooling. Air cooling sufficed during this time, although data centers would later switch to liquid cooling—especially for high-performance computing (HPC).

Several other technologies contributed to the rise of the data center. For example, virtualization enabled each machine to serve more customers and increased compatibility among different operating systems. As a result, a single physical server could host multiple software environments, making applications easier to manage.

Further Innovations (2000s)

As data centers have advanced, so have cooling methods.
Source: Shutterstock

In the first decade of the new millennium, data centers became more flexible and service-oriented. For instance, Amazon Web Services began offering cloud storage and computing capacity that customers could use on demand. This marked the start of a transition from dedicated enterprise hosting to larger shared data centers, with both approaches now coexisting.

Virtualization technologies also advanced, assisting in the transition to the cloud. It became common to run virtual machines on cloud servers to which customers would connect, increasing performance and cutting costs. Database-backed apps became widespread, and mobile apps increasingly connected to cloud data centers.

A useful innovation from this period is the modular data center: a self-contained data center you can deploy anywhere. Modular units are efficient and convenient, and this approach remains valuable. For example, GRC’s ICEtank puts leading-edge liquid immersion cooling in a shipping container that companies can use anywhere.

The Cloud Revolution (2010s)

The cloud grew even bigger over the last decade, with many organizations moving most or all of their data and applications into the cloud. Hyperscalers like Google spent billions on data centers. And with more people working remotely, the trend continues.

A landmark in the move to the cloud is the Open Compute Project. This consortium brings together many of the top companies in technology to produce new data center designs. Members include HP, Dell, IBM, Intel, Google, Meta, Cisco, and Seagate, among others. Its projects share open, efficient designs for servers, storage, and racks.

One key focus of the Open Compute Project is to make data center cooling more efficient. The most effective technique is liquid immersion cooling, which far outperforms the alternatives. As a result, data centers are more profitable and environmentally friendly. With the cloud taking on a larger role in society, these are necessary improvements.

What the Future Holds

Data centers continue to evolve, and we can expect to see more and bigger facilities that adapt to how data is used. For example, many data centers will continue to shift from an ownership-based model to one where users rent the resources they need.

This everything-as-a-service approach results in less waste and more productivity. The automatic allocation of resources keeps equipment working closer to its full capacity. It also becomes easier to use IT infrastructure securely. And since resources are carefully managed, the entire data center, from the processors and storage to the power and cooling, is more sustainable.

Finally, we’ll see the data centers of the future using supremely efficient green cooling technologies. Not only will these systems cost less and run more quietly, they will also give data centers the capacity to run 5G, artificial intelligence (AI), IoT, and other demanding applications. The industry is already moving to cooling solutions like liquid immersion, and we expect the revolution to be well underway in the coming years.

Find the Right Data Center Solution for You With GRC

The future of data center cooling is already here: GRC’s innovative liquid immersion. It has lower upfront and operating costs, it’s far better for the environment, and it supports practically unlimited compute density. With immersion cooling in place, you’ll also eliminate many of the unnecessary carbon emissions produced by conventional air cooling.

Just as computer technology has improved over the years to become faster, data center cooling has improved to become more efficient. The industry is moving toward data centers that use less electricity to produce faster computations. GRC’s immersion cooling helps attain these goals by letting your data center capture and reuse server heat—another technological feat.

The future will ask more of technology at every level, and data centers will need to keep adapting to meet customer demands. Companies that want to stay ahead of the game can save money (and the environment) by deploying GRC’s next-gen liquid immersion cooling now. Contact us to learn more.