Darron's Devlog

cs231n

cs231n - Lecture 9. CNN Architectures

Review LeCun et al., 1998 $5\times 5$ Conv filters applied at stride 1 $2\times 2$ Subsampling (Pooling) layers applied at stride 2 i.e. architecture is [CONV-POOL-CONV-POOL-FC-FC] Stride: Downsample output activations Padding: Preserve input spatial dimensions in output activations Filter: Each conv filter outputs a “slice”...
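
As a rough illustration of the [CONV-POOL-CONV-POOL-FC-FC] layout above, here is a minimal PyTorch sketch; the channel counts follow the classic LeNet-5 and the nonlinearities are omitted for brevity, so treat it as an assumption rather than the post's own code.

```python
import torch
import torch.nn as nn

# Minimal LeNet-style stack: [CONV-POOL-CONV-POOL-FC-FC]
# 5x5 conv filters at stride 1, 2x2 pooling at stride 2 (channel counts assumed).
class LeNetLike(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 6, kernel_size=5, stride=1),   # 1x32x32 -> 6x28x28
            nn.AvgPool2d(kernel_size=2, stride=2),       # -> 6x14x14
            nn.Conv2d(6, 16, kernel_size=5, stride=1),   # -> 16x10x10
            nn.AvgPool2d(kernel_size=2, stride=2),       # -> 16x5x5
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * 5 * 5, 120),   # FC
            nn.Linear(120, num_classes),  # FC
        )

    def forward(self, x):
        return self.classifier(self.features(x))

print(LeNetLike()(torch.randn(1, 1, 32, 32)).shape)  # torch.Size([1, 10])
```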

cs231n

cs231n - Lecture 8. Training Neural Networks II

Optimization Problems with SGD What if loss changes quickly in one direction and slowly in another? What does gradient descent do? Very slow progress along the shallow dimension, jitter along the steep direction What if the loss function has a local minimum or saddle point? Zero gradient,...
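
To see the behavior described above, here is a small hypothetical numpy experiment (the quadratic loss and learning rate are my own choices, not from the post): plain gradient descent on an ill-conditioned loss crawls along the shallow direction and oscillates along the steep one.

```python
import numpy as np

# Toy ill-conditioned loss L(w) = 0.5 * (w1^2 + 100 * w2^2).
# Its gradient is steep in w2 and shallow in w1.
def grad(w):
    return np.array([w[0], 100.0 * w[1]])

w = np.array([1.0, 1.0])
lr = 0.019  # just under the stability limit (2/100) for the steep direction
for step in range(6):
    w = w - lr * grad(w)
    print(step, w)  # w[0] barely moves, w[1] flips sign every step (jitter)
```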

cs231n

cs231n - Lecture 7. Training Neural Networks I

Activation Functions Sigmoid $\sigma(x)=1/(1+e^{-x})$ Squashes numbers to range [0,1] Historically popular since it has a nice interpretation as a saturating “firing rate” of a neuron. Problem: Vanishing gradients: Saturated neurons “kill” the gradients; if a neuron saturates, the gradients flowing back will be zero and the weights will never...
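
A quick numpy check of the saturation problem (illustrative only): the local gradient $\sigma'(x)=\sigma(x)(1-\sigma(x))$ is essentially zero for large $|x|$, so whatever gradient flows back through a saturated sigmoid is killed.

```python
import numpy as np

# Sigmoid and its local gradient sigma'(x) = sigma(x) * (1 - sigma(x)).
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for x in (-10.0, 0.0, 10.0):
    s = sigmoid(x)
    print(f"x={x:+5.1f}  sigma={s:.5f}  local grad={s * (1 - s):.5f}")
# At x = +-10 the local gradient is ~4.5e-5: upstream gradients are "killed".
```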

cs231n

cs231n - Lecture 6. Hardware and Software

Deep Learning Software The point of deep learning frameworks: (1) Quick to develop and test new ideas (2) Automatically compute gradients (3) Run it all efficiently on GPU (wrapping cuDNN, cuBLAS, OpenCL, etc.) Computational graph example import numpy as np np.random.seed(0) N, D = 3, 4...
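
The excerpt's numpy snippet is cut off; a sketch of how such a computational graph example typically continues is below, assuming a simple elementwise $c=\sum (x*y+z)$ graph (the post's actual graph may differ): the forward pass builds intermediates and the backward pass applies the chain rule node by node.

```python
import numpy as np

np.random.seed(0)
N, D = 3, 4
x = np.random.randn(N, D)
y = np.random.randn(N, D)
z = np.random.randn(N, D)

# Forward pass: a = x * y, b = a + z, c = sum(b)
a = x * y
b = a + z
c = np.sum(b)

# Backward pass: chain rule through each node
grad_c = 1.0
grad_b = grad_c * np.ones((N, D))  # d(sum)/db = 1
grad_a = grad_b.copy()             # d(a + z)/da = 1
grad_z = grad_b.copy()             # d(a + z)/dz = 1
grad_x = grad_a * y                # d(x * y)/dx = y
grad_y = grad_a * x                # d(x * y)/dy = x
print(c, grad_x.shape, grad_y.shape, grad_z.shape)
```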

cs231n

cs231n - Lecture 5. Convolutional Neural Networks

Convolutional Neural Networks ConvNets are everywhere Classification, Retrieval, Detection, Segmentation, Image Captioning, etc. Recap: Fully Connected Layer $32\times 32\times 3$ image $\rightarrow$ stretch to $3072\times 1$ Then a dot product of the $3072\times 1$ input x and the scoring weights W, where W is $10 \times 3072$....
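
A short numpy sketch of that recap, with random weights standing in for a trained W:

```python
import numpy as np

image = np.random.rand(32, 32, 3)      # 32x32x3 input image
x = image.reshape(3072, 1)             # stretch to 3072x1
W = 0.01 * np.random.randn(10, 3072)   # scoring weights, one row per class
scores = W.dot(x)                      # 10x1 class scores
print(scores.shape)                    # (10, 1)
```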

cs231n

cs231n - Lecture 4. Neural Networks and Backpropagation

Image Features Problem: Linear Classifiers are not very powerful Visual Viewpoint: Linear classifiers learn one template per class Geometric Viewpoint: Linear classifiers can only draw linear decision boundaries Image Features: Motivation After applying feature transform, points can be separated by linear classifier $f(x,y) = (r(x,y),...
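
A small illustrative example of such a feature transform (my own toy data, not the post's): two concentric rings are not linearly separable in $(x, y)$, but after mapping to $(r, \theta)$ they can be split by a threshold on $r$ alone.

```python
import numpy as np

# Polar feature transform f(x, y) = (r(x, y), theta(x, y)).
def polar_features(x, y):
    r = np.sqrt(x ** 2 + y ** 2)
    theta = np.arctan2(y, x)
    return r, theta

angles = np.linspace(0, 2 * np.pi, 8, endpoint=False)
inner_r, _ = polar_features(np.cos(angles), np.sin(angles))          # class 0, r = 1
outer_r, _ = polar_features(3 * np.cos(angles), 3 * np.sin(angles))  # class 1, r = 3
print(inner_r.round(2), outer_r.round(2))  # separable by r > 2
```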

cs231n

cs231n - Lecture 3. Loss Functions and Optimization

Linear Classifier (cont.) Todo: Define a loss function: how good the classifier is Optimization: efficient way of finding the parameters that minimize the loss function Loss function given a dataset of examples $\left\{ (x_i,y_i) \right\}_{i=1}^N$ where $x_i$ is an image and $y_i$ is an (integer) label Average...
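
As a concrete (hypothetical) instance of averaging a per-example loss over the dataset, here is the multiclass SVM (hinge) loss on random scores; the scores and labels below are placeholders, not the post's data.

```python
import numpy as np

# Multiclass SVM loss for one example: sum_j max(0, s_j - s_y + 1), j != y
def svm_loss_single(scores, y):
    margins = np.maximum(0.0, scores - scores[y] + 1.0)
    margins[y] = 0.0
    return margins.sum()

np.random.seed(0)
scores_batch = np.random.randn(5, 10)        # 5 examples, 10 classes
labels = np.random.randint(0, 10, size=5)    # integer labels y_i
L = np.mean([svm_loss_single(s, y) for s, y in zip(scores_batch, labels)])
print(L)  # average loss over the dataset
```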

cs231n

cs231n - Lecture 2. Image Classification

Image Classification: A Core Task in Computer Vision The Problem: Semantic Gap considering an image as a tensor of integers in [0, 255] with 3 channels (RGB) Challenges: Viewpoint variation Background Clutter Illumination Occlusion Deformation Intraclass variation An image classifier def classify_image(image): # Some magic here? return...
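
In the data-driven spirit of the lecture, a minimal nearest-neighbor classifier can stand in for the "magic": memorize the training set and predict the label of the closest training image under L1 distance. This is a sketch with the distance metric and toy data chosen for illustration, not the post's code.

```python
import numpy as np

class NearestNeighbor:
    def train(self, X, y):
        # "Training" is just memorizing the data
        self.Xtr, self.ytr = X, y

    def predict(self, X):
        preds = np.empty(X.shape[0], dtype=self.ytr.dtype)
        for i, x in enumerate(X):
            distances = np.sum(np.abs(self.Xtr - x), axis=1)  # L1 distance
            preds[i] = self.ytr[np.argmin(distances)]
        return preds

Xtr = np.random.rand(50, 3072)            # 50 flattened 32x32x3 images
ytr = np.random.randint(0, 10, size=50)
clf = NearestNeighbor()
clf.train(Xtr, ytr)
print(clf.predict(np.random.rand(2, 3072)))
```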

Projects

GNN-based Fashion Coordinator

1. About 1.1. Project Goal 1.2. Model Architecture 2. Load Data and Preprocess 3. Initialize a Graph Model 3.1. Generate graphs and edge data for each item category 4. Model Training 4.1. Load Evaluation Data 4.2. Train with HinSAGE and Link Prediction Error 1. About...

Projects

Stock Market Portfolio Modeling with R

Projects

K-POP Fandom Data Analysis with networkX

1. Introduction 2. Supporting Activities 2.1. Fandom Supporting Ratio (Total Supporting activities in Total activities) 2.2. Case by Gender Type 2.3. Case by Agents (Sum of artists data in a company) 3. Correlation between supporting and supported activities 3.1. by Pearson correlation 3.2. by Spearman...

Projects

Stock Market Cluster Analysis with NetworkX

1. Data Load and Preprocessing 2. Data Visualization 2.1. Stock Price Volatility 2.2. Rolling Average of Stock Price Correlation 3. Network Analysis 3.1. Build Graph with Correlation table 3.2. Setting threshold on weights 3.3. Community Detection 3.4. Visualization with Gephi 1. Data Load and Preprocessing...

Projects

R - Air Pollution Data Analysis

1. About 2. Load Data and Preprocess 2.1. Set Attributes 2.2. Handling NA values 3. EDA 3.1. Computing the Comprehensive Air-quality Index (CAI) 3.2. Timestep-wise Visualization 3.2.1. Hourly 3.2.2. Daily 3.2.3. Monthly 3.2.4. Quarterly 3.2.5. Half-yearly 4. Summary 1. About 2020-2, Data Science and R, Final...

Projects

NLP - Korean Language Text Analysis with RNN

1. Data Load 2. Preprocessing 2.1. Remove duplicates 2.2. Regexp on Korean Language 3. Tokenizing with konlpy-Okt 4. Train-test Data 5. LSTM Model # Korean language data import sys import os import numpy as np import nltk import konlpy import pandas as pd import re import...
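
A minimal tokenizing sketch with konlpy's Okt, in the spirit of step 3 above; the sample sentence and stopword list are illustrative assumptions, not the post's data.

```python
from konlpy.tag import Okt

okt = Okt()
stopwords = ['의', '가', '은', '는', '이', '를', '을']  # illustrative stopword list
sentence = '이 영화는 정말 재미있었어요'                  # "This movie was really fun"
tokens = [t for t in okt.morphs(sentence, stem=True) if t not in stopwords]
print(tokens)
```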

ISLR

ISLR - Chapter 10. Deep Learning

Chapter 10. Deep Learning 10.1. Single Layer Neural Networks 10.2. Multilayer Neural Networks 10.3. Convolutional Neural Networks 10.3.1. Convolution Layers 10.3.2. Pooling Layers 10.3.3. Architecture of a Convolutional Neural Network 10.3.4. Data Augmentation 10.4. Document Classification 10.5. Recurrent Neural Networks 10.5.1. Sequential Models for Document...

ISLR

ISLR - Chapter 9. Support Vector Machines

Chapter 9. Support Vector Machines 9.1. Maximal Margin Classifier 9.1.1. What Is a Hyperplane? 9.1.2. Classification Using a Separating Hyperplane 9.1.3. The Maximal Margin Classifier 9.1.4. Construction of the Maximal Margin Classifier 9.1.5. The Non-separable Case 9.2. Support Vector Classifiers 9.2.1. Overview of the Support...

ISLR

ISLR - Chapter 8. Tree-Based Methods

Chapter 8. Tree-Based Methods 8.1. The Basics of Decision Trees 8.1.1. Regression Trees Prediction via Stratification of the Feature Space Tree Pruning 8.1.2. Classification Trees 8.1.3. Trees Versus Linear Models 8.1.4. Advantages and Disadvantages of Trees 8.2. Bagging, Random Forests, Boosting, and Bayesian Additive Regression...

Projects

NLP - Text Analysis with ML algorithms

1. Data Info 2. Preprocessing 2.1. duplicated data found in train_data 3. Comparing between classification models 3.1 With Tf-idf vectorizer 3.1.1 MultinomialNB 3.1.2 LogisticRegression 3.2 With CountVectorizer 3.2.1 MultinomialNB 4. Balanced sampling approach - imblearn 5. Result: Best Model import sys import pandas as pd...
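
A minimal sklearn sketch of one of the compared configurations (Tf-idf features + MultinomialNB); the tiny inline dataset is purely illustrative, the post uses its own train_data.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

texts = ["good movie", "great film", "bad movie", "terrible film"]  # toy corpus
labels = [1, 1, 0, 0]                                               # toy labels
model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(texts, labels)
print(model.predict(["great movie", "terrible acting"]))
```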

ISLR

ISLR - Chapter 7. Moving Beyond Linearity

Chapter 7. Moving Beyond Linearity 7.1. Polynomial Regression 7.2. Step Functions 7.3. Basis Functions 7.4. Regression Splines 7.4.1. Piecewise Polynomials 7.4.2. Constraints and Splines 7.4.3. The Spline Basis Representation 7.4.4. Choosing the Number and Locations of the Knots 7.4.5. Comparison to Polynomial Regression 7.5. Smoothing...

ISLR

ISLR - Chapter 6. Linear Model Selection and Regularization

Chapter 6. Linear Model Selection and Regularization 6.1. Subset Selection 6.1.1. Best Subset Selection 6.1.2. Stepwise Selection Forward Stepwise Selection Backward Stepwise Selection Hybrid Approaches 6.1.3. Choosing the Optimal Model Validation and Cross-Validation 6.2. Shrinkage Methods 6.2.1. Ridge Regression in Singular Value Decomposition 6.2.2. The...

Page 2 of 3
Darron's Devlog © 2022