
Your data is at risk

01 Sep 2008

There is no escape: your data is at risk. However, there are opportunities to achieve increased security, reliability and compliance, and to improve business performance significantly. These threats and opportunities are staring you in the face. What I want to explore here is why you may not be seeing them.

We are frequently informed that the ‘digital universe’ is ever increasing, and that you are legally responsible for protecting your data. How does your data get from A to B reliably, securely and traceably? This obvious question is rarely asked.

I am asking it because I believe this is a pressing issue that must be treated as an imperative. In the IT community we seem to have a blind spot when it comes to recognising and solving new challenges like file transfer.

The question raises some provocative insights. Why, in the face of serious threats, do vast quantities of your commercially or personally sensitive information continue to be routinely transported on physical media such as CDs, or held on individuals’ laptops, because these files cannot be transferred in a better way?

Why, in the light of such abject inefficiency, are these large files being transferred with totally inappropriate technology, so that transfers frequently fail to complete and have to be restarted from the beginning?

What process is failing us, or is missing? Do we have difficulty moving from old-generation technology to new? Or is it because we don’t know that a better alternative exists and actually works?

Your core business is at risk in these areas:

1. Security
2. Compliance
3. Competitive advantage
4. Intellectual property protection
5. Business continuity
6. Operational efficiency

Implement a managed file transfer mechanism to address the following pain points:

• Problems transferring information over the WAN
• Email systems inadequate for file transfer
• Receiving files from external parties
• Replicating files between remote locations
• Relying on physical media for file transfer

FTP is dead. As the Wikipedia entry for FTP says, “The original FTP specification is an inherently insecure method of transferring files because…”. One could write an entire article on the reasons FTP is no longer an appropriate method for file transfer. Look for a solution that provides the following:

• Use browser-independent solutions. Web browsers are designed for rendering web pages, not for transferring large, sensitive files securely over the internet.
• Use industry-standard HTTPS encryption.
• Insist on three-way fault tolerance: tolerance of client failure, network failure, and server failure. Expect transfers to survive a network outage or swap, a client PC outage or reboot, and a server outage or reboot.
• Make sure transfers resume from the point of interruption rather than starting again from the beginning (a minimal sketch of this idea follows the list).
• Store files on a separate file server or SAN. This improves file security, reduces the risk of sensitive files being compromised, and conforms to your organisation’s backup and operational procedures.
• Lock down your web server, making it more secure and less susceptible to malicious compromise or misuse.
• Ensure the solution supports clustering and load balancing for high availability and scalability.
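
To make the resume requirement concrete, here is a minimal sketch in Python of a download that resumes over HTTPS using standard HTTP Range requests. The URL and file name are hypothetical, and this is not any particular product’s API; a real managed file transfer solution would layer authentication, integrity checking and server-side state on top of this idea.

    # Minimal sketch: resumable download over HTTPS via HTTP Range requests.
    # Hypothetical URL and file name; assumes the server honours Range headers.
    import os
    import requests

    URL = "https://example.com/files/large-dataset.zip"
    DEST = "large-dataset.zip"

    def resumable_download(url, dest, chunk_size=1 << 20):
        # Resume from however many bytes are already on disk, rather than
        # restarting the transfer from the beginning.
        offset = os.path.getsize(dest) if os.path.exists(dest) else 0
        headers = {"Range": "bytes=%d-" % offset} if offset else {}
        with requests.get(url, headers=headers, stream=True, timeout=30) as resp:
            resp.raise_for_status()
            # 206 Partial Content means the server honoured the Range header;
            # a plain 200 means it did not, so the file must start over.
            mode = "ab" if resp.status_code == 206 else "wb"
            with open(dest, mode) as f:
                for chunk in resp.iter_content(chunk_size=chunk_size):
                    f.write(chunk)

    if __name__ == "__main__":
        resumable_download(URL, DEST)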

So, here is my challenge to you. Rethink the way you transfer information. Help me identify why the IT community appears to have a blind spot regarding file transfer best practice and the use of second-generation technology. What is happening, or has happened, to you and your organisation? How are you transferring large files?
