This podcast episode features data science expert Alex Antic, who explores the evolution of quantitative techniques in the financial sector, the need for explainability in data science models, and the significance of data literacy in modern decision-making.
Takeaways
• Quantitative techniques rose in popularity before the 2008 financial crisis; after the crisis, the sector shifted toward risk aversion.
• Data-driven approaches are increasingly used in both private and public sectors, accompanied by a demand for data literacy among senior executives.
• Explainability, ethics, and fairness are crucial factors in building trust and utilizing data science models responsibly.
• Data literacy grows more important as more decisions rely on data, and individuals must adapt to increasingly automated systems.
• Data pooling can lead to benefits like enhanced language translation, but concerns regarding privacy and individual rights arise.
• Collective data sharing and privacy-preserving techniques are necessary to address challenges in detecting financial crimes.
• Transfer learning enables algorithms trained in one domain to be applied in another, while confidential computing safeguards data privacy during processing.
• The distinction between competence and will in AI clarifies that a system's ability to perform a task does not imply any intention to perform it.
• The advancement of AI and the integration of tacit knowledge present both opportunities and complexities for the field.
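The transfer-learning idea mentioned in the takeaways can be illustrated with a minimal sketch (not from the episode; all names and the toy task are illustrative): a "pretrained" feature extractor is frozen, and only a small new head is trained on data from the new domain.

```python
# Minimal transfer-learning sketch (illustrative assumption, not the
# episode's example): freeze a pretrained feature extractor and train
# only a new linear head on the target domain's data.

def pretrained_features(x):
    # Stand-in for a frozen, pretrained model: maps raw input to features.
    return [x, x * x]

def train_head(data, lr=0.01, epochs=500):
    # Train only the new head (weights w, bias b) with stochastic
    # gradient descent on squared error; the extractor is never updated.
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            f = pretrained_features(x)
            pred = sum(wi * fi for wi, fi in zip(w, f)) + b
            err = pred - y
            for i in range(len(w)):
                w[i] -= lr * err * f[i]
            b -= lr * err
    return w, b

# Toy target-domain task: y = 2x^2 + 1, learnable from the frozen features.
data = [(x, 2 * x * x + 1) for x in [-2.0, -1.0, 0.0, 1.0, 2.0]]
w, b = train_head(data)
```

The point of the sketch is the division of labour: the expensive representation (here, `pretrained_features`) is reused as-is, and only the cheap task-specific head is fitted to the new domain.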