'The Network Is The Computer'

– John Gage, Sun Microsystems

In our connected world of distributed computing, feeding data into and pulling it from databases and mainframes, sifting through vast oceans of unstructured information (looking at you, “Big Data”), we often take for granted the conduits that allow that data to pass between our processing engines.  The sum of the parts is often far greater than the individual parts themselves.  Extending the analogy, the quality and timeliness of our information depend on the speed and efficiency with which we can move data between those processing engines to produce the answers we seek.

Nowadays, a 10 Gigabit per second switched fabric is standard within our data centres, with 100 Gigabit Ethernet the norm between them, while we copy our information assets around to convince ourselves it’s all protected.  With ever-growing data volumes, consideration of the network that moves them is critical to understanding what to move, when to move it, and why we’re moving it.  Making copies for “backup” and “recovery” is one reason, but do we really need to?  A question worth considering, especially with the “Big Data” reservoirs that storage vendors embrace with such gusto.

At LB we examine the business justification for a network design: whether it is fit for purpose, what it needs to become if not, and how it can better serve the operational goals of the data repositories and processing engines it connects.  Taking this holistic view, LB are experts at explaining to your business why the switch investment is necessary, why the in-line IPsec encryption is necessary, or why the MPLS network isn’t fast enough to move an Accounts Payable ledger to an Oracle EBS instance hosted on AWS EC2.  Our independence means we describe the realities, the warts, and the benefits with equal fervour.  We’re not selling you a licence or a service, only our knowledge and experience.
