The cost of electronic storage has declined by 90% every five years in recent decades. Data storage thus costs only 0.01% of what it cost 20 years ago.1 This dramatic decline in storage costs and advances in computing technology have caused the amount of data stored electronically to explode.
With the advent of the Internet and wireless communications, that data has become widely accessible. Gone are the days of traveling to a public or university library to copy data from microfiche; today, that data is available online.
Epidemiological researchers can now search for disease clusters and investigate their causes at the level of individual census blocks. Financial researchers can analyze trading activity time-stamped to the millisecond. Businesses can target promotional activities to individual street addresses and even IP addresses.
SLCG has state-of-the-art computing technology and data storage capabilities.
Data sitting on a hard drive doesn’t generate insights; it must be analyzed, often starting with traditional statistical tools such as regression analysis.
Exploratory data analysis and data visualization allow researchers to search for relationships in the data that would be missed when rigid, preconceived structures are imposed through regression analysis.
Machine learning comprises data-analysis methods that can extract insights beyond the reach of traditional models. For example, the gradient boosting machine (“GBM”) combines the outputs of many “weak” predictive models (models that predict only marginally better than chance) into an ensemble with superior predictive performance.
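The boosting idea can be sketched in a few lines of Python. The toy implementation below (illustrative only; real applications would use a library such as scikit-learn or XGBoost) uses one-dimensional decision stumps as the weak learners, each fit to the residuals left by the ensemble so far.

```python
def fit_stump(xs, residuals):
    """Find the threshold split on xs that best fits the residuals."""
    best = None
    for t in xs:
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not left or not right:
            continue
        lmean = sum(left) / len(left)
        rmean = sum(right) / len(right)
        sse = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, t, lmean, rmean)
    _, t, lmean, rmean = best
    return lambda x: lmean if x <= t else rmean

def gbm_fit(xs, ys, n_rounds=50, learning_rate=0.1):
    """Each new stump fits the residuals of the ensemble built so far."""
    base = sum(ys) / len(ys)          # start from the mean prediction
    stumps = []
    preds = [base] * len(xs)
    for _ in range(n_rounds):
        residuals = [y - p for y, p in zip(ys, preds)]
        stump = fit_stump(xs, residuals)
        stumps.append(stump)
        preds = [p + learning_rate * stump(x) for p, x in zip(preds, xs)]
    return lambda x: base + learning_rate * sum(s(x) for s in stumps)

# Toy data: a step function the ensemble learns to approximate.
xs = [1, 2, 3, 4, 5, 6, 7, 8]
ys = [1.0, 1.0, 1.0, 1.0, 5.0, 5.0, 5.0, 5.0]
model = gbm_fit(xs, ys)
```

Each stump alone is a poor predictor (a single threshold), but the ensemble of fifty shrunken stumps recovers the step function closely.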
SLCG’s professionals have advanced degrees in applied mathematics and statistics and have published widely in peer-reviewed journals, applying advanced mathematics and statistics to real-world problems.
SLCG applied machine learning techniques to substantially improve on traditional statistical models’ ability to predict professional misconduct. This research resulted in a peer-reviewed publication illustrating the application of this technique to the records of over 1.2 million professionals.
SLCG has developed software to scan and extract data from websites. Our analytics group downloaded, cleaned, and parsed information from over 30,000 424B2 and FWP filings in the SEC’s EDGAR system. This data informed a peer-reviewed publication that would have been infeasible without SLCG’s data-scraping capabilities.
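To give a flavor of the parsing step, the sketch below pulls labeled fields out of a filing fragment with regular expressions. The fragment, field names, and pattern are hypothetical; actual EDGAR filings are far larger and vary widely in layout, so this is not a general-purpose parser.

```python
import re

# Hypothetical fragment of a 424B2 pricing supplement (illustrative only).
filing_text = """
Principal Amount: $1,000 per note
Pricing Date: June 15, 2015
CUSIP: 12345ABC9
"""

def extract_field(text, label):
    """Return the value following 'Label:' on a line, or None if absent."""
    m = re.search(rf"^{re.escape(label)}:\s*(.+)$", text, re.MULTILINE)
    return m.group(1).strip() if m else None

record = {label: extract_field(filing_text, label)
          for label in ("Principal Amount", "Pricing Date", "CUSIP")}
print(record)
```

Run over thousands of filings, records like this accumulate into a structured dataset suitable for statistical analysis.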
Our analytics group analyzed over 220 GB of transaction and order data (over one billion orders) to identify potentially suspicious orders.
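One simple screen of this kind (an illustration, not SLCG’s actual methodology) flags orders canceled within a short window of placement, a pattern that can warrant closer review in layering or spoofing investigations. The field names and the one-second threshold below are assumptions.

```python
from datetime import datetime, timedelta

# Hypothetical order records; real order data would carry many more fields.
orders = [
    {"id": 1, "placed": datetime(2020, 1, 2, 9, 30, 0, 100000),
     "canceled": datetime(2020, 1, 2, 9, 30, 0, 400000)},
    {"id": 2, "placed": datetime(2020, 1, 2, 9, 30, 1),
     "canceled": None},  # filled, never canceled
    {"id": 3, "placed": datetime(2020, 1, 2, 9, 30, 2),
     "canceled": datetime(2020, 1, 2, 9, 30, 10)},
]

THRESHOLD = timedelta(seconds=1)  # assumed cutoff for "rapid" cancellation

def rapidly_canceled(order):
    """True if the order was canceled less than THRESHOLD after placement."""
    return (order["canceled"] is not None
            and order["canceled"] - order["placed"] < THRESHOLD)

flagged = [o["id"] for o in orders if rapidly_canceled(o)]
print(flagged)
```

A filter like this only narrows a billion orders down to a reviewable set; flagged orders still require context (prices, sizes, counterparties) before any conclusion about misconduct.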