Posted by Sharyn Tauro | 14 Jan
Data Aggregation: Gold Panning of the 21st Century
A maxim often attributed to Darwin holds that “it is not the strongest who survive, but the most adaptable”, a statement highly relevant to the world of financial planning and wealth management. The industry as we know it is undergoing a rapid metamorphosis to meet the evolving needs of consumers and of the firms that serve them. Tougher economic conditions have forced a shift towards tighter cost control and greater productivity in the face of shrinking cash flow, all while maintaining a positive customer experience.
In addition, the transition from brick-and-mortar establishments with paper application processes to an increasingly electronic environment has made clients accustomed to having all their data at the click of a button. Australian credit institutions in particular face pressure to reduce turnaround times for loan applications without compromising the responsible lending obligations defined by federal legislation. The result is an industry need for an agile environment that provides a holistic view of a client by aggregating information from multiple accounts, held at a host of different institutions, on a single webpage.
Broadly, collated data falls into two categories: data from a customer’s password-protected personal records (bank account statements, tax returns, insurance, health cover) and data freely available in the public domain (social media sites such as Facebook, and the store and news websites a customer visits). Three distinct aggregation models are currently in use:
User-controlled aggregation: The customer runs software supplied by the aggregation technology provider on a computer of their own choosing, which stores their credentials locally; the customer downloads and collates the data themselves. Though more trustworthy from the customer’s viewpoint, this method assumes the customer’s own environment is secure, a factor the aggregation company cannot control and one often feared in the “era of hacking”. It also means more work for the end user.
Third-party permissive aggregation, or direct feeds: The most secure of the three options. The aggregation technology provider enters into individual contracts with service providers to receive official data feeds directly from the point of data origination. However, the cost of setting up and maintaining each feed, and the time involved, make this option prohibitively expensive.
Third-party non-permissive aggregation, or screen scraping: The most widely used of the three categories owing to its competitive pricing. The aggregation technology provider logs in to a service provider’s website using credentials supplied by the client, without the explicit consent of the service provider. The collected data is stored on the aggregator’s server, then processed, analyzed and presented to the client as a 360° overview report. Though the most up-to-date and accurate in terms of content, non-permissive screen scraping raises security concerns with both clients and financial institutions, a question not adequately addressed by many aggregation companies.
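To make the screen-scraping step concrete, the core mechanic of extracting figures from a provider’s returned HTML can be sketched in a few lines of Python. The markup, the `class="balance"` attribute and the dollar amounts below are entirely hypothetical stand-ins for an account summary page, not any real institution’s site, and a real aggregator would also need to handle authentication, consent and each institution’s own layout:

```python
# Minimal screen-scraping sketch (hypothetical markup): pull account
# balances out of an HTML page using only the standard library.
from html.parser import HTMLParser

class BalanceScraper(HTMLParser):
    """Collects the text of elements marked with class="balance"."""
    def __init__(self):
        super().__init__()
        self._capture = False
        self.balances = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the opening tag.
        if ("class", "balance") in attrs:
            self._capture = True

    def handle_data(self, data):
        if self._capture:
            self.balances.append(data.strip())
            self._capture = False

# Stand-in for the HTML an aggregator would fetch after logging in
# with the client's credentials.
sample_html = """
<table>
  <tr><td>Everyday</td><td class="balance">$1,250.00</td></tr>
  <tr><td>Savings</td><td class="balance">$9,800.50</td></tr>
</table>
"""

scraper = BalanceScraper()
scraper.feed(sample_html)
print(scraper.balances)  # ['$1,250.00', '$9,800.50']
```

The fragility this illustrates is exactly the industry concern: the scraper is coupled to the provider’s markup, so any cosmetic redesign of the page silently breaks the feed.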
The Incredo Difference
Here at Incredo, we make client security and privacy our business while providing users with an easy-to-use application. How Incredo Analytics addresses these security concerns, and why it is so incredible, is covered in our article “The Internet is all about Trust” by Clint Davis, a solutions architect with over 13 years’ experience and Incredo’s Chief Technology Officer.
If you think your business could do with some analytics support, get in touch with us at Incredo Analytics. In the meantime, please leave your thoughts and feedback on our articles in the comment section below, and watch this space for our weekly releases.