Does Big Data Need an HPC Boost?
What are firms doing with HPC?

This post is sponsored by The Business Value Exchange and HP Enterprise Services

When should High-Performance Computing (HPC) be considered an integral and essential part of the so-called 'business transformation' process? This is not a question often posed.

We normally center our fascination with the business transformation process on any given firm's path toward secure (but productive) enterprise mobile computing, Big Data analytics, cloud computing flexibility and new-age workflow constructs that embrace social media and online intercommunication.

But what about plain old raw power and High-Performance Computing (or HPC, as we normally call it)? Shouldn't this opportunity to turbo-charge also form part of our current transformation plans?

Advanced and Complex Applications
HPC is defined as a computing environment that employs parallel processing to run what we will call "advanced and complex applications" in an effective and efficient manner. To be more precise and define the term exactly: it applies to any computing environment capable of sustaining more than a teraflop (10^12 floating-point operations per second).
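To make that teraflop threshold concrete, here is a minimal back-of-the-envelope sketch. The 5-gigaflop single-core figure and the job size are illustrative assumptions, not numbers from this article; the 1,200-teraflop figure matches the Airbus deployment discussed later in the piece.

```python
# Illustrative arithmetic around the teraflop definition of HPC.
# 1 teraflop = 10**12 floating-point operations per second.
teraflop = 10**12

operations_needed = 10**15            # assumed size of a compute job
single_core_flops = 5 * 10**9         # assumed ~5 GFLOPS for one core
hpc_system_flops = 1200 * teraflop    # a 1,200-teraflop HPC system

# Wall-clock time at each rate, in seconds:
print(operations_needed / single_core_flops)  # 200000.0 (~2.3 days)
print(operations_needed / hpc_system_flops)   # well under one second
```

The gap of several orders of magnitude is the whole argument for HPC: workloads that are impractical on commodity hardware become routine.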

Although most HPC systems up until now have been tasked with compute jobs in fields such as scientific research, molecular engineering and (for example) high-grade military uses... things are changing.

HPC is used to perform tasks including data storage (and, of course, analysis) and what we used to call (and sometimes still do) data mining. It is also used to run complex simulation scenarios, carry out deep mathematical calculations and visualize complex data.
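The simulation workloads mentioned above rest on the parallel-processing idea at the heart of HPC: splitting independent work units across processors. As a minimal sketch (the pi-estimation workload, function names and worker counts here are all illustrative assumptions, not from the article):

```python
# A toy Monte Carlo simulation split across worker processes,
# sketching the parallelism that HPC systems apply at vastly larger scale.
import random
from multiprocessing import Pool

def sample_batch(n):
    """Count random points in the unit square that land inside the unit circle."""
    hits = 0
    for _ in range(n):
        x, y = random.random(), random.random()
        if x * x + y * y <= 1.0:
            hits += 1
    return hits

if __name__ == "__main__":
    batches = [250_000] * 4            # four independent work units
    with Pool(processes=4) as pool:    # run them in parallel
        hits = sum(pool.map(sample_batch, batches))
    print(4 * hits / sum(batches))     # estimate of pi, roughly 3.14
```

Each batch is independent, so adding processors shortens the run almost linearly; real HPC clusters apply the same decomposition across thousands of nodes.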

... and so today, with so many firms becoming heavily digitized, the argument for taking HPC into a wider range of business applications arises.

In terms of wider usage, HPC can be used to develop, test and redesign products at the same time as optimizing production and delivery processes. Ultimately, Big Data will need HPC in order to be able to store, analyze and produce insight. HPC can also be used (alongside Big Data intelligence) to execute customer trend monitoring, searching and/or profiling.

What Are Firms Doing with HPC?
HP has announced that Airbus has boosted its HPC capacity for aircraft development to 1,200 Teraflops by deploying a new generation of HP Performance Optimized Datacenters (PODs) in Toulouse and Hamburg.

Each of Airbus' 12-meter-long containerized HP PODs delivers the equivalent of nearly 500 square meters of data center space and contains all the elements of an HP Converged Infrastructure: blade servers, storage, networking, software and management (as well as integrated power and cooling).

"Organizations like Airbus need creative scenarios to cater for future business needs," said Peter Ryan, senior vice president & general manager, HP Enterprise Group EMEA. "HP will continue to provide the newest, most powerful technology and operations to support Airbus' HPC for the next five years."

Big Data needs HPC and HPC needs to work on Big Data - this is a marriage made in heaven.

About Adrian Bridgwater
Adrian Bridgwater is a freelance journalist and corporate content creation specialist focusing on cross-platform software application development as well as all related aspects of software engineering, project management and technology as a whole.
