Most financial institutions carry a great deal of ingrained complexity, owing to the number of disparate systems spread across internal divisions and constituents.
Many of these systems were inherited through M&A activity and then had to be brought together across different lines of business.
Rather than investing the time in proper integration, institutions have often bolted these systems together with rudimentary, inefficient workarounds, with no strategic approach to systems infrastructure.
Instead of having modern data movement processes in place, many institutions still rely on archaic, spaghetti-scripted, FTP-based file transfer methods that are vulnerable to data breaches and lack the transparency needed to remain compliant.
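The transparency gap is easiest to see by example. The sketch below is purely illustrative (the `transfer_with_audit` helper is hypothetical, and a real deployment would move the file over an encrypted channel such as SFTP or a managed file transfer product rather than a local copy); it shows the kind of structured audit record, with initiator, timestamp and an end-to-end integrity check, that hand-rolled FTP scripts typically never produce:

```python
import hashlib
import json
import shutil
import tempfile
from datetime import datetime, timezone
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Compute a SHA-256 digest so the receiver can verify file integrity."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def transfer_with_audit(src: Path, dst: Path, initiator: str) -> dict:
    """Hypothetical helper: copy a file and emit a structured audit record.

    The copy itself stands in for a secure transfer; the point of the
    sketch is the audit trail, not the transport."""
    digest = sha256_of(src)
    shutil.copy2(src, dst)
    return {
        "file": src.name,
        "bytes": src.stat().st_size,
        "sha256": digest,
        "initiator": initiator,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        # Re-hash the destination: an end-to-end integrity check.
        "verified": sha256_of(dst) == digest,
    }

# Usage: transfer a sample file and print the resulting audit record.
with tempfile.TemporaryDirectory() as d:
    src = Path(d) / "positions.csv"
    src.write_bytes(b"isin,qty\nDE0001,100\n")
    rec = transfer_with_audit(src, Path(d) / "positions_out.csv", "batch-job-07")
    print(json.dumps(rec, indent=2))
```

A record like this, written for every transfer, is what gives compliance teams the answer to "who moved what, where, and when" that opaque FTP scripts cannot provide.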
Regulatory concerns and the risk of data breaches are enough to have banks' compliance departments waking up in a cold sweat. When high-profile regulatory fines seem an almost monthly occurrence, it is little wonder that data security has become one of the sector's most pressing considerations.
In response to the rising number of incidents, regulatory compliance is becoming ever more stringent, aiming to stamp out the high-profile breaches that cost banks both financially and in customer confidence and reputation, and that ultimately damage the image of the financial sector as a whole.
The reality is that the threats to banks' data are not only external, with names as ominous-sounding as Advanced Persistent Threats (APTs) and Distributed Denial of Service (DDoS) attacks; they also come from banks' own employees.
Trends around Bring Your Own Device (BYOD) and an increasingly tech-savvy workforce mean that employees are accessing and sharing business-critical data in unprotected environments.
It is therefore necessary to mitigate the risk of accidental, and sometimes malicious, data breaches caused by people within the business.
On top of this, all businesses, regardless of sector or size, will always look to "do more with less" as the most fundamental approach to operational efficiency. Technology budgets are increasingly constrained, meaning existing and new systems have to work harder and more efficiently without incurring additional cost.
Compound this with competition from other financial institutions, and technology will play an even greater role in how institutions stay ahead.
When it comes to IT, financial institutions have had a reputation (rightly or wrongly) for being almost 'glacial' about change, due in part to a reluctance to move away from perceived 'stability'.
But recently, many financial institutions have woken up to the reality that their outdated legacy computing systems have become ineffective and, in many cases, a hindrance to growth and competitiveness, and that now is the time to modernise and consolidate those complex systems.
Institutions have realised that the financial and reputational cost of a data breach can be catastrophic. Regulation will likely only become more stringent over time, so organisations need to be taking proactive steps now to safeguard against the risk of data breaches.
Migration is one of the most difficult issues financial institutions face: moving large user communities from one platform to another in order to reduce complexity demands a great deal of innovation. In doing so, however, institutions gain far greater agility.
Fortunately, innovative solutions are now available that deliver the required ROI, allowing institutions to decommission old, insecure systems, reduce overall complexity and add much-needed agility. Technology has evolved to the point where systems can be integrated and deployed far more rapidly, meaning the benefits and cost savings can be reaped sooner rather than later.
The tipping point is therefore now.
Derek Schwartz is senior vice-president financial services at SEEBURGER.