Seminar
FIELD Computational Sciences
DATE January 19 (Wed), 2022
TIME 15:00-16:30
PLACE 7323
SPEAKER Hussain, Dildar
HOST Hussain, Dildar
INSTITUTE School of Computational Sciences
TITLE GS_C_GO_Simulated Annealing in Feature Selection (Machine Learning)
ABSTRACT The rapid growth of modern technology and smart systems has led to a massive production of big data. Big data poses not only dimensionality problems but also other emerging problems such as redundancy, irrelevance, and noise among the features. Passing data with irrelevant features can degrade a model's performance, because the model learns those irrelevant features. The central premise of feature selection is that the data contains some features that are redundant or irrelevant and can thus be removed without incurring much loss of information. Feature selection (FS), the search for an optimal subset of features, has therefore become an urgent need.
In traditional regression analysis, the most popular form of feature selection is stepwise regression, a wrapper technique. It is a greedy algorithm that adds the best feature (or deletes the worst feature) at each round. The main control issue is deciding when to stop the algorithm. In machine learning, this is typically done by cross-validation; in statistics, some criterion is optimized, which leads to the inherent problem of nesting. More robust methods have been explored, such as branch and bound and piecewise linear networks.
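As an illustration of this wrapper idea, the following is a minimal sketch of greedy forward selection with a cross-validated stopping rule. The dataset, estimator, and number of folds are illustrative choices, not part of the seminar:

```python
# Minimal sketch: greedy forward selection stopped by cross-validation.
# Dataset and estimator are illustrative assumptions.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)
model = LogisticRegression(max_iter=5000)

selected, remaining = [], list(range(X.shape[1]))
best_score = -np.inf

while remaining:
    # Score every candidate feature added to the current subset.
    scores = {
        j: cross_val_score(model, X[:, selected + [j]], y, cv=5).mean()
        for j in remaining
    }
    j_best = max(scores, key=scores.get)
    # Stop when no candidate improves the cross-validated score.
    if scores[j_best] <= best_score:
        break
    best_score = scores[j_best]
    selected.append(j_best)
    remaining.remove(j_best)

print("selected features:", selected, "CV accuracy:", round(best_score, 4))
```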
Subset selection algorithms can be divided into wrappers, filters, and embedded methods. Wrappers use a search algorithm to explore the space of possible feature subsets and evaluate each subset by running a model on it; they can be computationally expensive and risk overfitting to the model. Filters are similar to wrappers in their search approach, but instead of evaluating against a model, a simpler filter criterion is evaluated. Embedded techniques are embedded in, and specific to, a given model.
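For contrast, a filter method scores features without ever running the downstream model. Below is a minimal sketch using a mutual-information filter; the dataset and the choice of k are illustrative assumptions:

```python
# Minimal sketch of a filter method: rank features by mutual information
# with the target, independent of any downstream model.
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, mutual_info_classif

X, y = load_breast_cancer(return_X_y=True)

# Keep the 10 highest-scoring features; k = 10 is an illustrative choice.
selector = SelectKBest(score_func=mutual_info_classif, k=10)
X_reduced = selector.fit_transform(X, y)
print("kept feature indices:", selector.get_support(indices=True))
```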
Many popular search approaches include:
• Exhaustive
• Best first
• Simulated annealing
• Genetic algorithm
• Greedy forward selection
• Greedy backward elimination
• Particle swarm optimization
• Targeted projection pursuit
• Scatter search
• Variable neighborhood search
This seminar specifically discusses Simulated Annealing (SA) optimization for improving the performance of some FS algorithms. SA boosts the performance of the FS algorithm and helps to free it from local optima.
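To make this concrete, the sketch below (a generic illustration, not the speaker's method) applies SA to FS by treating each feature subset as a state, using 1 minus the cross-validated accuracy as the energy, and flipping one feature's inclusion bit as the neighbour move. Uphill moves are accepted with Boltzmann probability, which is what lets the search escape local optima. All hyperparameters (cooling rate, iteration count, CV folds) are illustrative assumptions:

```python
# Minimal sketch: simulated annealing over feature subsets.
# Energy = 1 - mean cross-validated accuracy; neighbour move flips one bit.
# Dataset, estimator, and annealing schedule are illustrative assumptions.
import math
import random
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)
model = LogisticRegression(max_iter=5000)
rng = random.Random(0)

def energy(mask):
    # An empty subset is the worst possible state.
    if not mask.any():
        return 1.0
    return 1.0 - cross_val_score(model, X[:, mask], y, cv=3).mean()

n = X.shape[1]
mask = np.array([rng.random() < 0.5 for _ in range(n)])  # random start
e = energy(mask)
best_mask, best_e = mask.copy(), e

T = 1.0
for step in range(200):
    # Neighbour: flip the inclusion bit of one random feature.
    cand = mask.copy()
    cand[rng.randrange(n)] ^= True
    e_cand = energy(cand)
    # Accept downhill moves always; uphill moves with Boltzmann
    # probability exp(-dE / T), so the search can leave local optima.
    if e_cand < e or rng.random() < math.exp((e - e_cand) / T):
        mask, e = cand, e_cand
        if e < best_e:
            best_mask, best_e = mask.copy(), e
    T *= 0.97  # geometric cooling schedule

print("best CV accuracy:", round(1 - best_e, 4),
      "features kept:", int(best_mask.sum()))
```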