A Comprehensive Analysis of AutoML
Table of Contents
- Introduction
- What is AutoML?
- Why Is AutoML the Need of the Future?
- Various AutoML Platforms
- Will AutoML replace data scientists?
- Conclusion
- References
Introduction
Automation is an old idea that has changed everything: every tool and method we have ever created involves some level of it. Automated machine learning, or AutoML, is an emerging field that automates the process of building machine learning models from data. With the right AutoML tooling, machine learning projects become far simpler to complete, and both novices and experts can benefit from AutoML libraries, which are designed to automatically identify strong candidate models for a predictive modeling task. Let's explore some of the top AutoML libraries and platforms that have become popular among researchers and practitioners in recent times.
What is AutoML?
The term "AutoML" refers to the automation of some or all of the steps in building a machine learning model, including feature selection and construction, model configuration and hyperparameter tuning against a performance metric, training multiple models, assessing model performance, and selecting the best model.
AutoML considers a variety of machine learning algorithms (random forests, linear models, SVMs, etc.) inside a pipeline with multiple preprocessing steps (missing-value imputation, scaling, PCA, feature selection, and so on). It searches over the hyperparameters of all the models and preprocessing steps, as well as over different ways to ensemble or stack the algorithms within the pipeline.
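To make that search space concrete, here is a minimal scikit-learn sketch of the kind of joint pipeline-plus-hyperparameter search an AutoML system automates. The grid here is deliberately tiny; real AutoML systems search far larger spaces with smarter strategies than grid search.

```python
from sklearn.datasets import load_iris
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = load_iris(return_X_y=True)

# A pipeline chaining preprocessing steps and a model -- the kind of
# object an AutoML system builds and tunes automatically.
pipe = Pipeline([
    ("scale", StandardScaler()),
    ("pca", PCA()),
    ("clf", RandomForestClassifier(random_state=0)),
])

# The search covers hyperparameters of both the preprocessing and the model.
param_grid = {
    "pca__n_components": [2, 3],
    "clf__n_estimators": [50, 100],
}
search = GridSearchCV(pipe, param_grid, cv=3)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```

An AutoML library replaces this hand-written grid with an automatically constructed search over many model families and preprocessing choices.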
The benefit of adopting AutoML is that it automates the least engaging and most time-consuming components of the machine learning process. It allows data scientists to concentrate on more creative and strategic tasks rather than spending time on laborious yet computationally demanding modeling stages.
The drawback of utilizing AutoML is that automated feature engineering and pre-processing might make it challenging to tell whether a model is overfitting. Additionally, a strong performance may not necessarily come from automating the model training.
Why Is AutoML the Need of the Future?
The demand for machine learning systems has grown dramatically over the past few years from an application standpoint. Many different applications have incorporated machine learning. Although it has been demonstrated that machine learning may improve support for some businesses, many businesses are still having difficulty deploying ML models.
The replacement of certain human labor is one of the theoretical goals of AI. In particular, adopting appropriate algorithms can accomplish a significant portion of the design work for AI. Take parameter tuning as an illustration: by exploiting increased computational power, techniques such as Bayesian optimization, neural architecture search (NAS), and evolutionary algorithms can substitute for human labor in the tuning process.
An organization first needs a team of seasoned data scientists, who command significant salaries, before deploying AI models. Even if an enterprise has a top-notch staff, choosing the model that works best for the business frequently requires more domain experience than AI expertise. Machine learning systems that are user-friendly even for non-experts are in growing demand as a result of machine learning's success in a range of applications. With minimal human intervention, AutoML aims to automate as many ML pipeline steps as feasible while maintaining high model performance.
Three main benefits of using AutoML are:
- Run repetitive tasks automatically to increase productivity, letting data scientists concentrate on problems rather than models.
- Automated ML pipelines also aid in preventing potential mistakes brought on by manual labor.
- AutoML makes it possible for anyone to employ ML features, which is a significant step toward the democratization of machine learning.
Various AutoML Platforms
1. Auto-Sklearn
Auto-Sklearn is an open-source Python package created to automate machine learning (AutoML) processes. It automates model selection and hyperparameter tuning, the most time-consuming but least exciting part of machine learning, for a range of classification and regression techniques. Under the hood, Auto-Sklearn draws on scikit-learn estimators such as support vector machines (SVMs), random forests, and gradient boosting machines (GBMs).
2. AutoKeras
Built on top of Keras, AutoKeras automates neural architecture search along with pre-processing steps such as feature extraction and scaling, through a collection of high-level Python APIs. The benefit of using AutoKeras is that it automates the difficult parts of machine learning, including data processing, model selection, and parameter tuning.
3. HyperOpt
HyperOpt is an open-source library for large-scale AutoML, built around distributed hyperparameter optimization. HyperOpt-Sklearn is a wrapper around HyperOpt that supports the popular scikit-learn machine learning library, covering its data preparation, classification, and regression methods.
4. Databricks
You can easily create baseline models and notebooks using Databricks AutoML. It automates machine learning tasks, from pre-processing steps such as feature extraction and scaling through model selection and parameter tuning, so teams can start from a working baseline rather than a blank notebook.
5. TransmogrifAI
TransmogrifAI is a well-known AutoML package for machine learning projects that is built in Scala and works on top of Apache Spark. Through machine learning automation and APIs, it is intended to increase the productivity of machine learning developers in ML projects. It facilitates the efficient construction of modular and tightly typed machine learning workflows as well as the training of high-quality machine learning models with less manual modification.
6. MLBox
MLBox is a well-known AutoML package for machine learning projects, with capabilities like fast reading, distributed data preparation and formatting, robust feature selection and leak detection, precise hyper-parameter optimization, and prediction with model interpretation. It concentrates on hyperparameter optimization, entity embedding, and drift detection.
7. H2O AutoML
One of the best AutoML libraries for machine learning applications is H2O AutoML, which automates iterative modeling, hyperparameter tuning, feature generation, and algorithm selection. It streamlines the training and evaluation of machine learning models and promises to lessen the machine learning expertise required to achieve good project performance.
8. AutoGluon
AutoGluon is an easy-to-use and easy-to-extend AutoML library for machine learning projects. It helps in automating stack ensembling, deep learning, as well as real-world applications spanning texts and images. It allows quick prototyping of deep learning and machine learning models with a few lines of code and leverages automatic hyperparameter tuning.
9. TPOT
TPOT is a well-known AutoML package for automatically finding high-quality machine learning models for predictive modeling tasks. It is an open-source library built on scikit-learn's models and data preparation utilities. TPOT is a Python AutoML tool that optimizes machine learning pipelines using genetic programming: it automates the repetitive and tiresome search by evaluating thousands of candidate pipelines and selecting the most appropriate one.
10. Auto-ViML
Among the many AutoML libraries, Auto-ViML completes machine learning tasks with a focus on building highly effective, interpretable models using fewer variables. With just one line of code, machine-learning pipelines can be built automatically. This AutoML package has appealing features, including SMOTE, Auto-NLP, date-time variable handling, and feature engineering.
11. Ludwig
Ludwig is a declarative machine learning framework that makes it simple to define machine learning pipelines using a straightforward, adaptable, data-driven configuration system. Ludwig is hosted by the Linux Foundation AI & Data and can be used for a wide range of AI tasks.
The input and output features, along with the appropriate data types, are declared in the configuration. Additional parameters can be specified by users to preprocess, encode, and decode features, load data from pre-trained models, build the internal model architecture, adjust training parameters, or perform hyperparameter optimization.
Ludwig will automatically create an end-to-end machine learning pipeline using the parameters made explicit in the configuration, while falling back to smart defaults for the settings that are not specified.
12. Amazon Transcribe
By utilizing a deep learning method known as Automatic Speech Recognition (ASR), Amazon Transcribe makes it simple for developers to add speech-to-text capabilities to their applications.
Additionally, AWS offers Amazon Transcribe Medical, which enables clinical documentation apps to convert medical speech to text.
Major benefits of Amazon Transcribe are:
- Create easy-to-read transcriptions
- Filter specific words
- Increase accuracy with customized transcriptions
Some example use cases for Amazon Transcribe are:
- Customer experience
- Post-call analytics
- Clinical conversation documentation
- Captioning and subtitling workflows
- Cataloging audio archives
13. DataRobot
DataRobot offers automated machine learning on demand for predictive models. Using all of the available data, it automatically performs feature engineering, model selection, and hyperparameter optimization without manual trial and error.
14. Amazon SageMaker Autopilot
Amazon SageMaker Autopilot provides serverless, distributed automation of machine learning model training and scaling. This fully managed solution makes it possible to deploy machine learning models on Amazon SageMaker at any scale.
15. Google Cloud AutoML
AutoML is offered by Google Cloud as a cloud service. It automates model building and hyperparameter tuning for machine learning problems, including sentiment analysis, natural language processing (NLP), image classification, etc.
16. SMAC
SMAC (Sequential Model-based Algorithm Configuration) is an AutoML library written in Python that automates hyperparameter optimization. Rather than exhaustively training every configuration as in grid search, it uses Bayesian optimization to decide which candidate models to train next, and it evaluates their performance on classification or regression problems using a variety of industry-standard metrics, such as accuracy.
17. Azure AutoML
By using its unique algorithms to configure, train, and score models with the most effective machine learning algorithm for your problem, Microsoft Azure's AutoML automates machine learning.
18. PyCaret
PyCaret is a well-known open-source, low-code machine-learning library in Python for automating machine-learning workflows. It is a popular, practical, end-to-end machine learning and model management solution that boosts productivity. This automated machine-learning tool offers many features, including data preparation, model training, hyperparameter tuning, analysis, and interpretability.
19. AutoWeka
AutoWeka is a data mining tool built on the Weka machine learning software. Its usability and powerful capabilities make it suitable for both novices and experts. It supports the rapid construction of predictive data mining models using two machine learning techniques: support vector machines and artificial neural networks.
20. Splunk
The primary selling point of Splunk is real-time processing. You have no doubt noticed that while storage and CPU technology have advanced over time, data transport has not. So, Splunk takes care of this problem. This platform enables you to create knowledge objects for operational intelligence, receive alerts/events at the beginning of a machine state, and accurately anticipate the resources needed for infrastructure expansion.
21. Amazon Lex
This makes it possible for you to develop programs that have a voice- or text-based user interface and are powered by the same technology that powers Amazon Alexa. To design, build, test, and deploy conversational interfaces in apps, Amazon Lex, a fully managed artificial intelligence (AI) service, uses advanced natural language models.
22. BigML
BigML, one of AutoML's best-known solutions, makes it easy for companies to leverage a range of machine-learning models and platforms to advance their operations. A complete platform, quick access, easily understood and exportable models, collaborations, automation, adaptable deployments, and many other features are provided by this automated machine learning software.
23. AutoML JADBio
JADBio AutoML is a well-known AutoML system that provides user-friendly machine learning without scripting. Using this tool, researchers, data scientists, and other users can work with machine learning models productively. Only five steps are needed: preparing the data for analysis, performing predictive analysis, learning new information, analyzing the results, and deploying the trained machine learning model.
24. Akkio
Akkio is a user-friendly visual platform you can use to enhance your sales, marketing, and financial operations. AI models can be trained and put to use in less than five minutes. No software has to be installed, no sales calls are required, and no previous AI experience is necessary.
25. MLJAR
MLJAR is one of the best AutoML tools: you can share Python notebooks with its Mercury framework and obtain strong results with MLJAR AutoML. It offers sophisticated automated machine learning for tabular data, and its thorough feature engineering, algorithm selection and tuning, automatic documentation, and ML explanations make it easier to build an extensive machine learning pipeline. The MLJAR AutoML framework is well known for its four built-in modes.
26. Tazi.ai
A well-known AutoML product for continuous machine learning on real-time data is Tazi.ai. It enables business domain specialists to apply machine learning to produce forecasts, and it makes use of supervised, unsupervised, and semi-supervised machine learning models.
27. Enhencer
Enhencer is an AutoML platform with a strong emphasis on usability and openness. Its cutting-edge user interface makes it possible to quickly develop Machine Learning models. Enhencer provides transparent performance indicators, making it easy to assess and fine-tune model performance. Additionally, the Enhencer interfaces allow for the tracking of model performance over time.
28. Aible
Aible develops AI that has a demonstrable influence on business in a straightforward, quick, and secure manner. Commercial people construct AI based on their actual cost-benefit tradeoffs and resource limitations when AI is trained for business effect, not accuracy. Aible handles the rest, from data to impact, requiring only three business questions to be answered.
29. dotData
Unique among machine learning firms, dotData was founded on the bold notion that anyone could profit from data science if it could be made as simple as possible. DotData was established with this goal in mind under the direction of Dr. Ryohei Fujimaki, a famous data scientist and the youngest research fellow ever appointed in the 119-year history of NEC. The business respects its customers and works hard to give them the best value in automated machine learning (AutoML). DotData was the first company to use machine learning to offer complete data science automation for the business. By speeding, democratizing, and operationalizing the entire data science process through automation, its data science automation platform reduces time to value.
30. ROBO
RoBO is a Python-based Robust Bayesian Optimization framework. Its fundamental building block is a modular architecture that makes it simple to add and swap out Bayesian optimization components, such as different acquisition functions or regression models.
It includes a range of acquisition functions, such as expected improvement, probability of improvement, lower confidence bound, and information gain, as well as various regression models, such as Gaussian processes, random forests, and Bayesian neural networks.
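To make the acquisition-function idea concrete, here is a self-contained NumPy/SciPy sketch of expected improvement, the first acquisition function listed above (written for minimization; RoBO's own implementation differs in details).

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, best_f, xi=0.01):
    """Expected improvement for minimization, given a surrogate's
    predictive mean ``mu`` and standard deviation ``sigma`` at a
    candidate point, and the best objective value ``best_f`` seen so far."""
    sigma = np.maximum(sigma, 1e-12)  # avoid division by zero
    z = (best_f - mu - xi) / sigma
    return (best_f - mu - xi) * norm.cdf(z) + sigma * norm.pdf(z)

# A point predicted to beat the incumbent (mu=0.2 < best_f=0.5) scores
# higher than one predicted to be worse (mu=0.9).
print(expected_improvement(0.2, 0.1, best_f=0.5))
print(expected_improvement(0.9, 0.1, best_f=0.5))
```

Bayesian optimization repeatedly maximizes such an acquisition function over the search space to pick the next configuration to evaluate.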
31. AUTOFOLIO
Algorithm selection (AS) strategies, which choose from a set the algorithm expected to solve a particular problem instance most effectively, have significantly advanced the state of the art in a number of well-known AI challenges. AutoFolio maximizes the performance of algorithm selection systems by choosing the optimal selection strategy and its hyperparameters.
32. FLEXFOLIO
Flexfolio is a portfolio-based modular and open solver architecture that incorporates several portfolio-based algorithm selection methods and strategies. It offers a special framework for contrasting and integrating several portfolio-based algorithm selection methods and approaches into a single, cohesive framework.
33. Dataiku
Dataiku is a platform that systematizes the use of data and AI. Its goal is to integrate AI and data so that they become an integral part of day-to-day operations. It specifically targets businesses, tech professionals (such as engineers, architects, and data scientists), and business experts (such as analysts). Users with little experience in data science might not be the best candidates for Dataiku, as successful use of the platform's capabilities may require considerable technical expertise.
34. CreateML
Apple offers a no-code machine learning tool called CreateML that enables you to develop, train, and deploy models right on your Mac, significantly reducing the time it takes to get a model into use. Model construction is much simpler and more convenient thanks to the tool's drag-and-drop interface. Users can build and use models for tasks including extracting meaning from text, identifying sounds, recognizing activities in a video, and recognizing images.
35. Prevision.io
Prevision.io is an artificial intelligence (AI) platform created for data scientists and developers to rapidly and easily construct, deploy, monitor, and manage models so that more data science projects may be quickly put into production. Users can set up the platform in a matter of minutes because of its capabilities and clear user interface. The platform is available on the Google Cloud Marketplace and has a pay-as-you-go license model.
36. Obviously.ai
Obviously.ai is a no-code AutoML tool that makes it simple to create and maintain predictive machine-learning models. Because of the tool's no-code functionality, business users, citizen data scientists, and essentially anyone else can start making predictions without writing a single line of code. Obviously.ai's solution addresses the shortage of data science skills: companies can still use ML for predictive analytics even if they don't have large data science teams.
37. The AI and Analytics Engine
It is an end-to-end, no-code AutoML platform called the AI and Analytics Engine. Instead of taking days or weeks, the Engine speeds up the process from raw data to model deployment for consumers. With its simple AI-guided suggestions at each stage, the platform enables any user, regardless of their machine learning ability, to be able to construct and deploy models. The Engine targets a wide spectrum of users, from individuals and groups to businesses. As a result, there are subscription pricing plans to accommodate every level of use.
38. RECIPE
Another intriguing AutoML tool built on top of scikit-learn is RECIPE (REsilient ClassifIcation Pipeline Evolution). It stands out from other evolutionary frameworks because it organizes a large number of potentially useful data pre-processing and classification techniques into a grammar and can avoid producing invalid individuals. RECIPE's use of genetic programming to evolve pipelines from context-free grammar definitions enables a new level of flexibility.
39. AutoGOAL
A Python package called AutoGOAL can automatically determine the most effective approach to do a task. It was created primarily for AutoML and is utilized in a variety of situations where the developer has multiple options for how to complete a task. It already has several low-level machine-learning algorithms that can be put together automatically into pipelines to address various issues.
It serves as a framework for program synthesis, which is the process of choosing the optimal programs to address a specific issue. The user must be able to specify the space of all potential programs for this to operate. ML programmers will like this additional AutoML toolbox since it gives versatility not typically found with such tools.
40. RapidMiner
RapidMiner's machine learning technology can significantly reduce the time and work needed to develop predictive models for any association or organization, regardless of sector, assets, or use case.
With Auto Model, predictive models can be produced in about five minutes, and no particular expertise is required: customers simply load their data and specify the outcome they want to predict.
Auto Model then produces high-value insights. End-to-end data science, including analyzing and displaying data, can be completed with RapidMiner Auto Model.
41. Alteryx
Through a range of software solutions, Alteryx provides data science and machine learning functionality. The self-service platform has more than 260 drag-and-drop building components, with Alteryx Designer as its standout feature. Alteryx Designer automates data preparation, data blending, reporting, predictive analytics, and data science. Users of Alteryx can easily choose and compare the performance of various algorithms as well as immediately see variable relationships and distributions. The software can be set up in a hosted environment, in the cloud, behind your own firewall, or both without any coding knowledge.
42. IBM Watson Studio
Users may create, run, and manage AI models at scale on any cloud with IBM Watson Studio. The product is a component of IBM Cloud Pak for Data, the organization's core platform for AI and data. The solution enables you to manage and protect open-source notebooks, deploy and execute models with one-click integration, prepare and construct models visually, manage and monitor models with explainable AI, and automate AI lifecycle management. Users of IBM Watson Studio can use open-source frameworks like PyTorch, TensorFlow, and scikit-learn because of the flexible architecture offered by the software.
43. KNIME
An open-source platform for data science development is KNIME Analytics. It offers a graphical drag-and-drop interface that enables the construction of visual workflows without the need for scripting. To design workflows, model each phase of analysis, regulate the flow of data, and guarantee work is current, users can select from more than 2000 nodes. To generate statistics, clean data, and extract and choose features, KNIME may combine data from any source. The software uses AI and machine learning to visualize data using both conventional and cutting-edge charts.
44. MathWorks MATLAB
MathWorks MATLAB combines a programming language that natively expresses matrix and array mathematics with a desktop environment optimized for iterative analysis and design processes. For writing scripts that mix code, output, and formatted text in an executable notebook, it comes with the Live Editor. MATLAB toolboxes are professionally developed, rigorously reviewed, and tested, and MATLAB apps let you try out different algorithms on your own data.
45. TIBCO
For modern BI, descriptive and predictive analytics, streaming analytics, and data science, TIBCO has a wide range of products available. Users can prepare data, construct models, deploy those models, and monitor them using TIBCO Data Science. Additionally, it has embedded Jupyter Notebooks for sharing reusable modules, drag-and-drop workflows, and AutoML. Users can orchestrate open source using TensorFlow, SageMaker, Rekognition, and Cognitive Services while running workflows on TIBCO's Spotfire Analytics.
46. Auger
Auger.AI bills itself as the most precise AutoML platform. Faster predictive model development is achieved via Auger's proprietary Bayesian-optimization-based search of algorithm/hyperparameter combinations. Through the open-source A2ML project, developers can create predictive models using any of several cloud AutoML providers (Auger, Google Cloud AutoML, or Microsoft Azure AutoML). Auger's Machine Learning Review and Monitoring (MLRAM) tool ensures the continued accuracy of trained predictive models developed on Auger or any other machine learning platform.
47. Amazon Polly
Amazon Polly is a service that synthesizes speech from text. Utilizing the power of deep learning, it helps create new categories of speech-enabled products and aids in the development of applications that talk. It also represents a significant advance in creating inclusive apps for people with disabilities.
Among other languages, Polly supports English, Mandarin Chinese, Brazilian Portuguese, Danish, French, Japanese, and Korean.
Polly's Neural Text-to-Speech (NTTS) supports two speaking styles:
- Newsreader style, for news-narration use cases.
- Conversational style, ideal for two-way applications like telephony.
Additionally, Amazon Polly Brand Voice enables businesses to design their own voice. Companies like FICO, USA Today, ProQuest, CBSi, Whooshkaa, and MapBox use it.
48. Dialogflow
A conversational user interface can be designed and integrated into mobile applications, web applications, and interactive voice response systems using Dialogflow, a platform for creating linguistic and visual bots. The technology can analyze a variety of inputs, including text and audio data.
The terms listed below are used in the Dialogflow environment:
Agents: The virtual agent that manages communications with end users is called an agent.
Intent: An intent reflects the end user's aim for a conversational turn. Each agent may have numerous intents that combine to form a dialogue. Dialogflow performs an intent classification task to match an end-user expression to the best intent defined for the agent.
Entities: Each intent parameter has a type, called an entity type, that dictates how data is extracted from the end-user expression.
Contexts: Conversation flow can be managed by contexts in Dialogflow.
Follow-up Intents: A follow-up intent is a child of the parent intent it is connected to. When a follow-up intent is created, an output context with the same name is automatically added to the parent intent, and an input context with the same name is added to the follow-up intent.
Dialogflow Console: A web-based user interface for managing Dialogflow agents is the Dialogflow Console.
Example use cases for Dialogflow:
- Chatbots — Interfaces that can be configured to receive requests, set appointments, access orders, and respond to inquiries.
- Internet of things (IoT) — It can be used to improve the comprehension of context and the accuracy of responses from IoT devices.
49. Amazon Rekognition
Amazon Rekognition helps recognize objects, people, scenes, text, and activities in photos and videos, and can flag inappropriate content. Additionally, it offers precise facial analysis and face-search capabilities to detect, analyze, and compare faces for user verification tasks.
Using Amazon Rekognition has several advantages:
- Labels to distinguish objects such as bicycles, telephones, and buildings, and scenes such as parking lots, beaches, and cities
- Custom labels to increase the number of objects that can be detected
- Content moderation
- Text recognition
- Face detection and analysis
- Face verification and search
Amazon Rekognition is used by some major corporations, including the NFL, CBS, National Geographic, Marinus Analytics, and SkyNews.
50. Amazon Comprehend
In order to uncover patterns and connections in a text, Amazon Comprehend uses machine learning to do natural language processing (NLP).
The service uses machine learning to find patterns and connections in unstructured data: it recognizes the language of the text and extracts key words, phrases, names, organizations, and events.
Amazon Comprehend Medical is frequently used to extract information from medical text, such as medical conditions, medications, and dosages.
Some use cases of Amazon Comprehend:
- Call center analytics
- Indexing and searching product reviews
- Personalized website content
- Customer support ticket handling
- Clinical trial recruitment
Companies like LexisNexis, TeraDACT, FINRA, and Vidmob use Amazon Comprehend.
Will AutoML Replace Data Scientists?
The answer is no.
Even though AutoML is good at creating models, it still can't handle the majority of a data scientist's tasks. Data scientists are still required to define business problems, and they still need their subject-matter expertise to create more helpful features. Today, AutoML can handle only a small subset of problems, such as classification and regression; it is currently unable to create recommendation and ranking models. Most importantly, using AutoML alone will not yield useful insights from the data; for that, data scientists are still required.
Nonetheless, AutoML remains an effective tool for data scientists to produce value for their stakeholders. The next logical questions, then, are:
- How and when should we use AutoML?
- When may data scientists use AutoML platforms to their advantage?
Here, I'd like to list a few instances that might be worthwhile.
Performance takes precedence over interpretability: In some cases, the stakeholders may simply be interested in the accuracy of the models, and interpretability may not be the most important factor. Based on our tests, AutoML, when combined with proper feature engineering, can produce satisfactory performance. In our cases, however, interpretability was limited to feature importance on both platforms. In other words, if feature importance is sufficient for your situation, AutoML may be the best option for increased precision.
Rapid deployment into production: You may easily deploy your models into production using Google and Azure. For instance, batch prediction and online prediction are both easily accessible through Google Cloud. You can also use their API to deploy your model to your website. These characteristics can help data scientists produce work more quickly and with less effort.
Better time management: Data scientists are faced with a myriad of duties that might be exhausting. Time may be your most limited resource as a data scientist. Your days are filled with several meetings with stakeholders (product managers, employees from business units, and clients), the upkeep of current models, the gathering and cleaning of data, getting ready for the next meeting, and so on. AutoML can be a fantastic time-saving tool because it simply takes a few clicks and a few dollars to train a model that performs well. As a result, you can concentrate on activities that are most beneficial (occasionally, investing time in creating a fantastic presentation is more valuable than increasing the model's accuracy by 1%).
Conclusion
I hope this article gave you an introduction to the ideas behind AutoML. The main goal of AutoML is to free up data scientists' time so they can concentrate on practical business problems by automating repetitive tasks like pipeline creation and hyperparameter tuning. Additionally, AutoML enables everyone, not just a chosen few, to use machine learning technologies. Data scientists who use AutoML to build highly effective machine learning models can accelerate the development of ML.
Success or failure will depend on how AutoML is used and how the area of machine learning develops. However, AutoML will undoubtedly be important in the future of machine learning.
References
- AutoML - Wikipedia
- "Open Source AutoML Tools: AutoGluon, TransmogrifAI, Auto-sklearn, and NNI." Bizety. 2020-06-16.
- Ferreira, Luís, et al. "A comparison of AutoML tools for machine learning, deep learning, and XGBoost." 2021 International Joint Conference on Neural Networks (IJCNN). IEEE, 2021. https://repositorium.sdum.uminho.pt/bitstream/1822/74125/1/automl_ijcnn.pdf
- Erickson, Nick; Mueller, Jonas; Shirkov, Alexander; Zhang, Hang; Larroy, Pedro; Li, Mu; Smola, Alexander (2020-03-13). "AutoGluon-Tabular: Robust and Accurate AutoML for Structured Data." arXiv:2003.06505 [stat.ML].
- Hutter F, Caruana R, Bardenet R, Bilenko M, Guyon I, Kegl B, and Larochelle H. "AutoML 2014 @ ICML". AutoML 2014 Workshop @ ICML. Retrieved 2018-03-28.