Does Big Data Need an HPC Boost?
What are firms doing with HPC?

This post is sponsored by The Business Value Exchange and HP Enterprise Services

When should High-Performance Computing (HPC) be considered an integral part of the so-called 'business transformation' process? This is not a question often posed.

We normally center our fascination with business transformation on any given firm's path toward secure (but productive) enterprise mobile computing, Big Data analytics, cloud computing flexibility and new-age workflow constructs that embrace social media and online intercommunication.

But what about plain old raw power and High-Performance Computing, or HPC as we normally call it? Shouldn't this opportunity to turbo-charge also form part of our current transformation plans?

Advanced and Complex Applications
HPC is defined as a computing environment that employs parallel processing to run what we will call "advanced and complex applications" in an effective and efficient manner. To define the term exactly, it applies to any computing environment where the system can sustain more than a teraflop (10^12 floating-point operations per second) in its operation.
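As a rough illustration of that threshold, a system's theoretical peak can be estimated as nodes × cores per node × clock speed × floating-point operations per cycle. The sketch below uses invented hardware figures purely to show the arithmetic; none of the numbers describe a real machine:

```python
# Back-of-the-envelope estimate of a cluster's theoretical peak FLOPS.
# All hardware figures below are illustrative assumptions, not a real spec.

TERAFLOP = 1e12  # 10^12 floating-point operations per second

def peak_flops(nodes, cores_per_node, clock_hz, flops_per_cycle):
    """Theoretical peak = nodes x cores x clock x FLOPs issued per cycle."""
    return nodes * cores_per_node * clock_hz * flops_per_cycle

# Hypothetical cluster: 64 nodes, 16 cores each, 2.5 GHz, 8 FLOPs per cycle
peak = peak_flops(nodes=64, cores_per_node=16, clock_hz=2.5e9, flops_per_cycle=8)
print(f"Peak: {peak / TERAFLOP:.1f} teraflops")  # ~20.5 teraflops, above the HPC bar
```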

Although it is true that most HPC systems until now have been tasked with compute jobs in fields including scientific research, molecular engineering and (for example) high-grade military uses, things are changing.

HPC is used to perform tasks including data storage (and, of course, analysis) and what we used to call (and sometimes still do) data mining. It is also used to run complex simulation scenarios, to perform deep mathematical calculations and to visualize complex data.
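To give a flavor of the parallel-processing pattern these workloads share, here is a minimal sketch using Python's standard multiprocessing module. The Monte Carlo estimate of pi is a stand-in for a real simulation workload, and the worker and sample counts are arbitrary assumptions:

```python
# Minimal illustration of the parallel pattern behind HPC simulation jobs:
# split a compute-heavy task across workers, then combine partial results.
# Monte Carlo estimation of pi stands in for a real simulation workload.

import random
from multiprocessing import Pool

def hits_in_quarter_circle(samples: int) -> int:
    """Count random points in the unit square that fall inside the quarter circle."""
    rng = random.Random()
    return sum(1 for _ in range(samples)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)

if __name__ == "__main__":
    workers, samples_each = 8, 1_000_000
    with Pool(workers) as pool:
        partials = pool.map(hits_in_quarter_circle, [samples_each] * workers)
    pi_estimate = 4 * sum(partials) / (workers * samples_each)
    print(f"pi ~= {pi_estimate:.4f}")
```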

And so today, with so many firms becoming increasingly heavily digitized, the argument for taking HPC into a wider range of business applications now arises.

In terms of wider usage, HPC can be used to develop, test and redesign products while simultaneously optimizing production and delivery processes. Ultimately, Big Data will need HPC in order to store data, analyze it and produce insight. HPC can also be used (alongside Big Data intelligence) to execute customer trend monitoring, searching and/or profiling.
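As a toy sketch of what such trend monitoring looks like at the code level, the fragment below applies a map-reduce-style tally across chunks of customer event records. The records and field names are invented for illustration; a real deployment would run the same pattern over far larger, distributed datasets:

```python
# Toy sketch of map-reduce-style aggregation, the pattern behind customer
# trend monitoring over large datasets. Records and field names are invented.

from collections import Counter
from multiprocessing import Pool

def count_products(chunk):
    """Map step: tally product views within one chunk of event records."""
    return Counter(event["product"] for event in chunk)

if __name__ == "__main__":
    # Pretend each chunk came from a separate shard of a much larger store.
    chunks = [
        [{"product": "laptop"}, {"product": "phone"}, {"product": "laptop"}],
        [{"product": "phone"}, {"product": "tablet"}, {"product": "phone"}],
    ]
    with Pool(2) as pool:
        partial_counts = pool.map(count_products, chunks)
    totals = sum(partial_counts, Counter())  # Reduce step: merge partial tallies
    print(totals.most_common(3))  # Top trending products
```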

What Are Firms Doing with HPC?
HP has announced that Airbus has boosted its HPC capacity for aircraft development to 1,200 teraflops by deploying a new generation of HP Performance Optimized Datacenters (PODs) in Toulouse and Hamburg.

Each of Airbus' 12-meter-long containerized HP PODs delivers the equivalent of nearly 500 square meters of data center space and contains all the elements of an HP Converged Infrastructure: blade servers, storage, networking, software and management (as well as integrated power and cooling).

"Organizations like Airbus need creative scenarios to cater for future business needs," said Peter Ryan, senior vice president & general manager, HP Enterprise Group EMEA. "HP will continue to provide the newest, most powerful technology and operations to support Airbus' HPC for the next five years."

Big Data needs HPC and HPC needs to work on Big Data - this is a marriage made in heaven.

About Adrian Bridgwater
Adrian Bridgwater is a freelance journalist and corporate content creation specialist focusing on cross-platform software application development as well as all related aspects of software engineering, project management and technology as a whole.
