
The New Reality: Persistent Automation for Fund Management

Wednesday, May 14, 2014

INTRODUCTION
This is the year for big data.[i] Across industries, firms have unprecedented amounts of both public and private information sets—from user profiles and consumer habits to business outputs and proprietary algorithms. But access to data, or information at large, does not guarantee a valuable yield. Jonathan Shaw, managing editor of Harvard Magazine, notes, “The [data] revolution lies in improved statistical and computational methods, not in the exponential growth of storage or even computational capacity.”[ii] Data is ubiquitous but not intrinsically valuable – it needs to be smartly processed, not just farmed.

For hedge funds, data processing is the quiet, invisible process that moves through the trade lifecycle—accessed from external entities like exchanges and brokers, modified and adjusted in execution, and at times frozen in snapshots for an increasingly complex group of investors and regulators. Funds face greater demands for operational credibility and regulatory compliance than ever before, along with increased scrutiny of the secret buy-side manna that accompanies them.[iii]

Smarter data management can be expensive and time-consuming as funds try to keep up with regulatory, compliance, and transparency requirements while navigating a sea of market opportunities. Good fund management starts and ends with precise, accurate data management. Truly taking advantage of data and smarter computational methods requires not only shedding the skin of outdated models but categorically understanding a whole new data ecosystem, with new methods of processing, through selective automation and augmented observation. Once that new data ecosystem has been embraced, fund managers can spend their time mastering alpha generation and capital-building initiatives.

LIFECYCLE CONVERGENCE
While data management has historically been the purview of three separate functions (front-, middle-, and back-office), funds are now considering data inflows and outflows as simultaneous and holistic activities that not only govern market data and transparency capabilities, but also the capacity to be position-aware. This new viewpoint not only extends to in-house modifications, but will play an increasingly larger role amongst fund/service provider relationships. According to an Aite report from earlier this year, “… regardless of whether firms currently outsource or plan to outsource, the most common impressions of the benefits of using a single front- to back-office vendor for fund operations revolve around the attractiveness of holistic functionality, the expected contribution of a specialized vendor’s experience gained from other firms, and the vendor’s potential to better service clients.”[iv]

Essentially, funds are approaching operations as an ecosystem—rather than a rail line where a single train moves in one direction. The ecosystem houses converging cross-office data functionalities that are near-simultaneous activities, beyond the linear progression of the traditional lifecycle. Risk is moving to the front office. Portfolio management is constant. And compliance is everywhere. No longer do funds hand a piece of paper from their traders to the risk officer, over to compliance for a stamp of approval, call down to the floor to reconcile all activity, and then spend countless hours updating disparate systems, colleagues, and later investors on the impacts to performance and risk. That is the pre-data model of the ’80s and ’90s—non-computational and hindered by actual human movement, where data moves in a single line, waiting its turn to be shuttled in and out of an outdated fund architecture by personnel who may or may not exist in today’s hedge fund reality.

The data map has changed—it’s time for a new hedge fund model.



In this age of data management—this new state of cross-office functionality—operational models must be able to house, curate, and level off information sets as they happen. Funds must not only actively manage a growing universe of market data but also tackle performance reporting, risk projections, disaster planning, and partitioned client data.

To successfully, and simultaneously, manage these activities, funds must have a data operational model that supports automation, where it makes sense:

• Continuous processing, as an underlying system

• Consistent normalization, across the board

• Historical, since inception view

• Defensive measures, to protect the operation

Processing
Real-time, continuous action is the new normal in today’s hedge fund reality. Funds are expected to understand, identify, and take advantage of opportunities as they occur. From a data standpoint, however, “real-time” is only a point on a larger continuum of activity: what a participant observes or captures at a single moment in time. Continuous processing is the underlying current that accepts and captures, or rejects, data inflows and outflows. As pressures increase from both investors and regulators, managers should rely on continuous, automated services, processes, and technology to support their business, not only as a viewable segment, but constantly, throughout the lifespan of the fund.
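As a minimal sketch of the idea, continuous processing can be pictured as a validation gate that runs on every inflow as it arrives, capturing clean records and rejecting malformed ones. The record fields (`qty`, `px`) are illustrative assumptions, not any vendor’s actual message format:

```python
def process(events):
    """Route each inflow as it arrives: capture records that pass
    basic validation, reject the rest for review.

    Field names (qty, px) are hypothetical, chosen for illustration.
    """
    accepted, rejected = [], []
    for ev in events:
        # A record is captured only if it carries a positive
        # quantity and a positive price; everything else is rejected.
        if ev.get("qty", 0) > 0 and ev.get("px", 0) > 0:
            accepted.append(ev)
        else:
            rejected.append(ev)
    return accepted, rejected

inflows = [
    {"qty": 100, "px": 181.25},  # clean fill
    {"qty": -5, "px": 181.25},   # malformed: negative quantity
    {"qty": 200},                # malformed: missing price
]
ok, bad = process(inflows)
```

Because the gate runs on every event rather than on an end-of-day batch, the same check applies constantly throughout the life of the fund, not just at a single snapshot.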

Normalization
While the amount of data has increased, the types of data and their origins have multiplied as well. Systems that previously recognized only one or two sources are now challenged with a more complex ferrying of information sets from counterparties, exchanges, fund administrators, and prime brokers. Normalization is the process that guarantees safe passage of these data packets, regardless of origin, as the data converges with its intended destination(s) within the fund infrastructure. Consistent data, through consistent ongoing normalization, translates into accurate pricing and valuations for real-time and forward-looking portfolio management, as well as precise analysis and reporting for investors.
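A toy sketch of that idea: two sources deliver the same fill in different shapes, and normalization maps both onto one common schema before anything downstream touches them. The field names and the `Trade` schema here are invented for illustration and do not reflect any real feed:

```python
from dataclasses import dataclass

@dataclass
class Trade:
    """Hypothetical common schema every source is mapped onto before
    the record converges with pricing, valuation, and reporting."""
    symbol: str
    quantity: int
    price: float
    source: str

def normalize(raw, source):
    """Translate one source-specific record into the common schema."""
    if source == "exchange":
        return Trade(raw["sym"], int(raw["qty"]), float(raw["px"]), source)
    if source == "prime":
        return Trade(raw["ticker"], int(raw["shares"]), float(raw["price"]), source)
    raise ValueError(f"unknown source: {source}")

# The same fill, arriving in two different shapes:
a = normalize({"sym": "IBM", "qty": 100, "px": 181.25}, "exchange")
b = normalize({"ticker": "IBM", "shares": "100", "price": "181.25"}, "prime")
```

Once both records share one schema, reconciliation reduces to comparing like with like, regardless of which counterparty sent the data.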

Defense
While data trafficking, shaping, and viewing are relatively benign activities, true data management demands a fourth component: the ability to uncover and recover from adverse events, and the broader protection of investor interests.

A solid wall preventing commingling of client data within the underlying architecture keeps critical, proprietary data safe. When it comes to planning for the unplanned, such as adverse events in both the digital and physical worlds, automated services can give a fund a second life, without interruption. Cloud technology provides the best option for housing fund data infrastructures: it offers not only secure, convenient access but also virtual warehouses that act as automated backup systems, shielding the business from physical and environmental risks like earthquakes, floods, or outages. Thus, it matters not only how data is managed but where.
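One way to picture that “solid wall,” as a minimal sketch (the class and method names are invented for illustration, not drawn from any real system): each client’s records live in a separate partition, and a read can only ever touch the partition matching the caller’s client id.

```python
class PartitionedStore:
    """Keep each client's data in its own partition so records
    never commingle in a shared view."""

    def __init__(self):
        self._partitions = {}

    def write(self, client_id, record):
        # Records land only in the partition keyed by this client.
        self._partitions.setdefault(client_id, []).append(record)

    def read(self, client_id):
        # Returns a copy of the caller's own partition, nothing else.
        return list(self._partitions.get(client_id, []))

store = PartitionedStore()
store.write("client_a", {"position": "IBM", "qty": 100})
store.write("client_b", {"position": "AAPL", "qty": 50})
```

The design choice is that isolation is structural, not a filter applied after the fact: there is no code path that returns more than one client’s partition.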




Achieving institutional credibility to attract fresh capital from investors was the main driver behind a technology project at a $100 million, U.S.-based hedge fund. Its legacy infrastructure, consisting of disparate systems and disparate databases, cost the business approximately $150,000 per annum. That figure did not account for the manual processes the portfolio management team used to tie out trading activity each night, or the toll-way fees the incumbent provider charged to the fund’s executing brokers.

In the eyes of the fund manager, credibility equated to repeatable, auditable, real-time controls across trading, portfolio and risk management, and reporting, all driven by an outsourced data warehouse that securely housed the fund’s data and made it available for real-time, custom analysis at any level of granularity.

CONCLUSION
Data management is a differentiator for funds. Organizations with solid, modern infrastructures will be poised to sift through the clutter, take advantage of opportunities, and shore up against unforeseen events. All of this can be achieved through smarter automated services that nurture ongoing processes and maintain precise pricing and valuations, accessed in real time, everywhere and all the time.

Many current market solutions fail to incorporate multi-tenant architecture and other modern capabilities like cloud technology. Saddled with separate databases and outdated legacy systems, many funds will fall short of the data-management automation needed not only to present their story to clients uniquely and securely, but also to meet real-time data requirements for regulatory, compliance, and investor reporting.
