These notes summarize the lectures of Professor Pilsung Kang of the DSBA Lab, Department of Industrial and Management Engineering, Korea University. AdaBoost. A weak model can be boosted into an arbitrarily accurate strong model. New classifiers should focus on the difficult cases; here a weak model means one that is only slightly better than random guessing. Repeat for the ensemble size T: get some rule of thumb (= weak model), reweight the examples of the training set to concentrate on the hard cases the previous rule got wrong, then derive the next..
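The loop above (rule of thumb, reweight hard cases, repeat T times) can be sketched with decision stumps as the weak learners. This is a minimal illustration, not code from the lecture; names like `fit_stump` are my own.

```python
import numpy as np

def stump_predict(X, feature, threshold, polarity):
    """A 'rule of thumb': predict +1/-1 from a single-feature threshold."""
    return np.where(polarity * X[:, feature] < polarity * threshold, 1, -1)

def fit_stump(X, y, w):
    """Exhaustively find the stump minimizing the WEIGHTED error."""
    best, best_err = None, np.inf
    for f in range(X.shape[1]):
        for thr in np.unique(X[:, f]):
            for pol in (1, -1):
                err = w[stump_predict(X, f, thr, pol) != y].sum()
                if err < best_err:
                    best_err, best = err, (f, thr, pol)
    return best, best_err

def adaboost(X, y, T=10):
    """AdaBoost: reweight examples so each new stump focuses on hard cases."""
    n = len(y)
    w = np.full(n, 1.0 / n)                      # start with uniform weights
    stumps, alphas = [], []
    for _ in range(T):
        (f, thr, pol), err = fit_stump(X, y, w)
        err = max(err, 1e-10)                    # avoid division by zero
        alpha = 0.5 * np.log((1 - err) / err)    # model weight
        pred = stump_predict(X, f, thr, pol)
        w *= np.exp(-alpha * y * pred)           # upweight misclassified points
        w /= w.sum()
        stumps.append((f, thr, pol))
        alphas.append(alpha)
    return stumps, alphas

def predict(X, stumps, alphas):
    """Strong model: alpha-weighted vote of all T weak models."""
    agg = sum(a * stump_predict(X, f, t, p) for a, (f, t, p) in zip(alphas, stumps))
    return np.sign(agg)
```

The key line is the weight update `w *= np.exp(-alpha * y * pred)`: correctly classified examples shrink and misclassified ones grow, so the next rule of thumb concentrates on the hard cases.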
Definition - an ensemble method consisting of ➀ a bagging of un-pruned decision tree learners, with ➁ a randomized selection of features at each split. Difference from Bagging - Bagging: utilizing the same full set of predictors to determine each split while constructing a tree. Random Forests: selecting the best feature from a randomly selected subset of features at each split (additional randomness). Additiona..
                                      Decision Tree   kNN   SVM
Intrinsically multiclass                   🟢          🟢    🟠
Handles apples-and-oranges features        🟢          🔴    🔴
Scalability (large data sets)              🟢          🔴    🔴
Prediction accuracy                        🔴          🔴    🟢
Parameter tuning                           🟢          🟠    🔴
Handles apples-and-oranges features - can the input contain categorical values? Is there a way to raise prediction accuracy while keeping the advantages of the decision tree? Ensemble - accuracy of each model should be high enough - correlation of each model should be..
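The two ensemble conditions can be made quantitative with a classic back-of-the-envelope calculation: if each of n classifiers is correct with probability p > 0.5 and their errors are independent (i.e., zero correlation, the idealized limit of the second condition), a majority vote is correct more often than any single model. A small sketch, with `majority_accuracy` being my own helper name:

```python
from math import comb

def majority_accuracy(p, n):
    """P(majority of n INDEPENDENT classifiers, each with accuracy p, is correct)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# Five independent 70%-accurate models voting together:
print(round(majority_accuracy(0.7, 5), 3))   # 0.837 — better than any single model
```

If the models were perfectly correlated, the vote would be no better than one model alone, which is why low correlation between ensemble members matters as much as individual accuracy.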
Data Preprocessing 1. Step 1: Preprocess the data. For images it is common to go only as far as zero-centering, because further normalization smears the spatial information in the features. In practice for images - zero-center only. Consider [32,32,3] images: 1. Subtract the mean image (e.g. AlexNet) - mean image = a [32,32,3] array. 2. Subtract the per-channel mean (e.g. VGGNet). Weight Initializat https://reniew.github.io/13/ Weight Initialization An Ed ..
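The two zero-centering variants differ only in the shape of the mean that is subtracted. A minimal numpy sketch on synthetic data (the array names are my own):

```python
import numpy as np

# Synthetic batch of N CIFAR-sized images: shape (N, 32, 32, 3)
rng = np.random.default_rng(0)
images = rng.uniform(0, 255, size=(50, 32, 32, 3))

# AlexNet-style: subtract one mean IMAGE, a full (32, 32, 3) array
mean_image = images.mean(axis=0)                  # shape (32, 32, 3)
centered_a = images - mean_image

# VGGNet-style: subtract one mean PER CHANNEL, just 3 numbers
per_channel_mean = images.mean(axis=(0, 1, 2))    # shape (3,)
centered_b = images - per_channel_mean            # broadcasts over H, W
```

Both leave the spatial structure of each image intact, which is why image pipelines typically stop here rather than fully normalizing or whitening the features.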