By: Andy Slade
One important distinction drawn by a number of the super funds at the recent My Platform Rules conference in Australia was that between data and information. A huge quantity of data is not valuable unless it can be transformed and aggregated into usable, quality-assured, contextual information.
The post-GFC focus on investment risk is also driving funds’ need for better access to better data. For example, if look-through capabilities are in place, the impact of changing exposures during periods of volatility can be identified in real time and different investment decisions can be made. Whether the decision is made by the fund itself or by its investment manager, small changes in investment decisions can have a significant impact on the retirement savings of tens of thousands of members.
This is a ‘blank sheet’ for even the largest funds. A number of funds want to increase their internal data analysis capabilities without materially increasing costs; in fact, the end goal is significant cost reduction. As some funds evolve their strategies and operating models to improve their data programs, others will wait, watch for proven models, and then act quickly to catch up.
This is understandable: super funds operate in an environment of intense scrutiny and are conservative by nature, given their duty to protect the retirement assets of their members. However, heavy M&A activity and competitive pressure dictate that standing still is not an option. Data analysis capabilities must become a priority. Data isn’t enough. You need information.
To learn about the challenges super funds face when developing data-driven operational models to meet the future needs and expectations of their members, read “It’s Your Data: Super Funds to Meet the Challenge: Part 1”.