Using Bayesian Optimization to Tune Machine Learning Models
Summary
Scott Clark introduces Bayesian Global Optimization as an efficient way to optimize ML model parameters, explaining the underlying techniques and comparing it to other standard methods.
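For readers unfamiliar with the approach, the sketch below illustrates the general Bayesian optimization loop the talk covers: fit a Gaussian-process surrogate to the hyperparameter evaluations observed so far, then choose the next point to evaluate by maximizing expected improvement. This is a minimal illustration using scikit-learn with a made-up one-dimensional objective; it is not SigOpt's or MOE's implementation, and the function and parameter names are hypothetical.

```python
# Minimal Bayesian optimization sketch: Gaussian-process surrogate + expected improvement.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):
    # Hypothetical expensive black-box function, standing in for the validation
    # error of a model trained with hyperparameter value x.
    return np.sin(3 * x) + 0.1 * x ** 2

def expected_improvement(candidates, gp, best_y, xi=0.01):
    # Expected improvement over the best observed value (we are minimizing).
    mu, sigma = gp.predict(candidates, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    imp = best_y - mu - xi
    z = imp / sigma
    return imp * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(0)
bounds = (-2.0, 2.0)
X = rng.uniform(*bounds, size=(3, 1))               # a few random initial evaluations
y = np.array([objective(x[0]) for x in X])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
for _ in range(15):
    gp.fit(X, y)                                    # refit surrogate to all observations
    candidates = np.linspace(*bounds, 500).reshape(-1, 1)
    ei = expected_improvement(candidates, gp, y.min())
    x_next = candidates[np.argmax(ei)]              # point with highest expected improvement
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next[0]))

print("best x:", X[np.argmin(y)].round(3), "best objective:", y.min().round(3))
```

In practice the objective would train and validate a real model, so each loop iteration corresponds to one expensive training run; the surrogate and acquisition function keep the number of such runs small.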
Bio
Scott Clark has been applying optimal learning techniques in industry and academia for years, from bioinformatics to production advertising systems. Before SigOpt, he worked on the Ad Targeting team at Yelp, where he led academic research and outreach efforts such as the Yelp Dataset Challenge and the open sourcing of MOE. He was named to Forbes' 30 Under 30 in 2016.
About the conference
Managing Big Data has become a major competitive advantage for many organizations, making a robust analytics platform vital to an organization's survival. This conference offers insights and potential solutions for Big Data challenges from well-known experts and thought leaders through panel sessions and open Q&A sessions.