Organisations today have many more options when it comes to storing and managing their data and supporting their IT infrastructure. Laurence Baker looks at how organisations can ensure they have a future-ready solution and discusses the rise of the modular solution and the benefits it can bring.
Ten years ago there was no real outsourcing model in our sector and organisations had to build and run their own data centres. They invested heavily, building huge facilities in anticipation of strong predicted growth. In many cases these facilities were large, with highly resilient Tier IV infrastructures, because the organisations believed they needed 100% availability and, above all, absolute protection against any data loss. Then the economic downturn happened.
In some cases these decisions were made without the organisations' requirements being properly considered. As a result many have ended up with, at best, facilities that were expensive to construct and remain operationally complex and costly to run.
At Keysource we find that our customers rarely deploy a full IT load from day one, if ever, so this raises the question of whether the infrastructure to support it needs to be in place from day one. We believe early engagement is key to making the right decisions. Not having all the relevant stakeholders involved from the outset may mean that the team fails to understand the real business and IT requirements, or that the wrong solution is specified and deployed.
This is increasingly important as businesses today are becoming ever more dependent on IT systems and associated data, driven by changes such as the upsurge in ‘Cloud’ services, digitalisation and the internet of things. As a result a key priority is ensuring the availability of these systems, with companies looking for the best solution to meet their requirements as efficiently as possible. For many the biggest challenge is how to keep aligning the IT infrastructure to a fast-moving and ever-changing business environment, ensuring that any solution is future ready whilst also keeping costs to a minimum.
As a result, many organisations are opting for modular data centre solutions which are constantly evolving to address a wider range of business and operational requirements. Traditionally modular solutions were developed to overcome construction and deployment challenges, but now there is an overwhelming demand for these scalable facilities that also deliver high levels of performance, resilience and efficiency…
Read the full article on page 20 of Data Centre News Magazine
Find out more about our modular data centre solutions
It’s great news that we have once again been shortlisted as a finalist in two categories at this year’s Datacentre Dynamics (DCD) EMEA Awards.
We have been recognised in the Public Services Digital Delivery category for our work with the Metropolitan Police Service, having been chosen as a key partner in the transformation of its data centre estate, as well as being responsible for managing IT assets, capacity, efficiency, certification and compliance.
And Tom Blundy, an employee at Keysource since 2013, has been chosen as a finalist for the Young Mission Critical Engineer of the Year Award in recognition of his creative design expertise and hard work on key projects such as Jaguar Land Rover, Sky and Unisys.
The Awards are part of a unique global series that provides worldwide recognition to outstanding individuals, teams and projects. An independent panel of industry experts reviewed all the entries for the awards before selecting the finalists. The award winners will be announced at the DCD Awards Gala Ceremony at the Hilton Hotel on Park Lane on the 7th December.
Mike West, Chairman at Keysource, commented:
“We are absolutely delighted to have been chosen as a finalist in two fantastic categories. It is testament to the hard work and commitment of our team, quality of our people and our ability to manage high profile, complex projects. We thoroughly look forward to finding out the result at the awards evening in December.”
The impact of last month’s unexpected referendum result is yet to be fully defined, however it is likely to affect all of us in the data centre world and possibly exacerbate some of the challenges ahead. The issues around the ‘Safe Harbour’ agreement are just one example of the confusion that we are facing, and are likely to continue to face, in the coming months.
It was last October that The European Court of Justice ruled that the “Safe Harbour” agreement, which was designed to provide a “streamlined and cost-effective” way for US firms to get data from Europe without breaking EU rules, was no longer valid. The result following the ECJ decision was several months of confusion and in some cases panic before the Privacy Shield Pact was introduced instead. The main difference is that US companies can no longer rely on self-certification and must seek to strike “model contract clauses” in each case. These agreements will then authorise the transfer of data outside of Europe.
The UK’s decision to leave the EU means that we will no longer be bound by decisions of the ECJ and we are likely to have to create our own regulations. However I don’t believe we should ignore its findings or the views of European Data Protection Supervisor Giovanni Buttarelli, who criticised the Safe Harbour’s replacement describing it as ‘not robust enough’ and needing ‘significant improvements.’ The UK will need to factor these in to ensure that its citizens’ personal information remains safe.
There is no doubt that the UK’s decision to leave the EU has added instability to an already uncertain market. Before the results of the referendum were known CBRE’s quarterly review of data centre supply and demand in Frankfurt, London, Amsterdam and Paris, reported that the amount of data centre space taken up during the first quarter of 2016 was well above average leaving spare capacity in short supply. In fact the amount of spare data centre capacity in four major European cities is at its lowest level since the end of 2013, as cloud providers respond to user demand for locally hosted services.
The report suggests that this is in part due to the uncertainty surrounding the successor to the Safe Harbour US data transfer agreement which is motivating more IT infrastructure firms to consider data centres in Europe. Whilst this is understandable it is likely to increase costs which will ultimately be passed along the supply chain.
At Keysource we are working closely with our clients who are affected by these issues and helping them to continue to operate cost effectively and remain compliant during this difficult time.
This was originally published on Data Centre Solutions blog on the 6th July 2016.
Keysource, the expert in business critical environments, has been appointed to design and build a new data centre for a leading pharmaceutical company at its production site in the North of England. It will replace an existing, ageing facility and will underpin the critical services being delivered to the business for the next 10 to 15 years, saving over £250k in energy costs.
The location for the new data centre will be an existing IT services office. The Keysource team will strip this room and carry out any health and safety and aesthetic refurbishments before installation commences. The project will be completed under live conditions so that the existing data centre and staff working at the campus are not disturbed while construction takes place.
Designed to be concurrently maintainable, with N+1 critical cooling and power, the new data centre will also be highly efficient, fully utilising the ASHRAE recommended temperature range. In addition, an environmental monitoring system will be deployed, providing real-time insight across the data centre environment and allowing cooling to be further optimised.
Mike West, Managing Director at Keysource, concluded:
“This new data centre will be developed in line with the latest regulations and industry standards. It will not only guarantee long term reliability and availability of critical services to the business but ensure they are delivered in a sustainable and efficient way, thereby maximising the return on investment.”
When customers are looking to build a new data centre, they often lose sight of the operational aspects of the facility. ‘Design for operation’ is an approach which focuses on the long term running of the facility when considering the initial design and one that we, as an industry, should be championing. Richard Clifford, Data Centre Consultant at Keysource explains.
Whilst there are some good standards in our industry around data centre design, including the Uptime Institute’s established Tier system, there is very little guidance in the market around the actual operation of a facility. The Uptime Institute has introduced its Management & Operations (M&O) Stamp of Approval, but it is relatively new and is not compelling for consultants and specialist contractors who are not involved in the ongoing management of the data centre. This means that organisations need to manage their risk carefully or they may end up with a costly facility that is expensive to run and not fit for purpose.
The data centre design tenders and contracts we’ve seen rarely reference any kind of operational or FM standards, and I would go as far as to say that in many cases they are not considered at all. This can be for a number of reasons, the main one being that the team tendering and procuring the data centre design is often not the one that will be responsible for operating and maintaining it, and those teams have not been consulted.
Having over 30 years of experience designing, building and then operating data centres and other business-critical facilities, we always encourage different stakeholders, both internal and external, to be part of the process from the outset, as we feel that this delivers the best results. This early engagement is key, as not having all the stakeholders involved may mean that the team fails to consider the design implications for maintenance requirements and Total Cost of Ownership, or fails to understand the risks around downtime, as they often don’t have specialist subject knowledge. This can be a particular challenge for public sector organisations or SMEs, where cost is the key driver and in-house resources are scarce.
The importance of this cannot be overstated, and companies need to ask themselves operational questions, such as whether the design can continue to support critical business services under maintenance conditions, and how that maintenance will be undertaken. Can the design help to streamline the ongoing operation of the facility, reducing risk and cost? For example, does work need to be delivered out of hours, or can it be done during normal working hours, thereby reducing servicing costs?
The design and build team may not understand the resilience factors but the FM teams will know that the data centre cannot be taken offline, and that concurrent maintainability should be considered as part of the solution.
As an industry we need to put more of a focus on ensuring that data centres are ‘designed for operation’ and the team responsible for maintaining and running the facility is engaged from the outset.
Bringing together multiple stakeholders is always a challenge, and it will need the industry to work together and be more open and engaging in sharing best practice and insight. We should remember that the life of a data centre could be 25 years or more, and by taking a little more time in the early stages organisations can ensure that the design will meet their requirements operationally and provide the best value for money at the lowest risk.
Originally published by Digitalisation Word