pytorch_day01
A note before starting
A classmate roped me into learning this together, so I might as well study PyTorch properly. No better day than today.
* The code itself lives on Colab; here I only summarize the main points plus a bit of extended knowledge.
Day 1 Structured Data Modeling Example
- Using the Titanic dataset; the goal is to predict whether a passenger survived.
- The dataset contains 10 features, among them:
    - 4 numerical features
    - 4 categorical features
    - 2 other features (ticket number & name)
- Some of the features have missing values.
- The tutorial then does the data preprocessing, builds an MLP with one hidden layer, and writes the training function.
- For the optimizer, loss function, and evaluation metrics, please see the summary notes.
- Pre-processing (a minimal sketch follows after this list):
    - Some meaningless features are dropped directly
    - Categorical features are one-hot encoded
    - Whether a value is missing is itself kept as an auxiliary feature
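A minimal sketch of these three steps with pandas, assuming the usual Kaggle Titanic column names (`Name`, `Ticket`, `Pclass`, `Sex`, `Embarked`, `Age`, `Cabin`); the exact columns and fill values are illustrative, not the tutorial's code.

```python
import pandas as pd

def preprocess(dfdata: pd.DataFrame) -> pd.DataFrame:
    # 1) drop features with no obvious predictive value
    df = dfdata.drop(columns=["Name", "Ticket"])

    # 2) one-hot encode the categorical features
    df = pd.get_dummies(df, columns=["Pclass", "Sex", "Embarked"],
                        dummy_na=True, dtype="float32")

    # 3) keep "is this value missing?" as an auxiliary feature, then fill the gap
    df["Age_null"] = pd.isna(df["Age"]).astype("float32")
    df["Age"] = df["Age"].fillna(0)

    # Cabin is mostly missing, so only its presence/absence is kept
    df["Cabin_null"] = pd.isna(df["Cabin"]).astype("float32")
    df = df.drop(columns=["Cabin"])
    return df
```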
Some functions used in the program
```python
pd.get_dummies(data)  # data can be a Series or a DataFrame
```
- Converts categorical features to one-hot encoding.
- Parameters:
    - `columns`: when the input is a DataFrame, selects which columns to encode; the original column names are used as prefixes for the new columns
    - `dummy_na=True`: treats NaN as an extra class and adds a column for it
    - `drop_first=True`: drops the first class (useful for linear models, to avoid collinearity)
    - `dtype`: sets the data type of the generated columns
- A short example follows below.
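A quick illustration of these parameters (the Series and its values are made up for the example):

```python
import pandas as pd

s = pd.Series(["S", "C", None, "Q"], name="Embarked")

# one column per class; dummy_na=True adds an extra column for NaN
pd.get_dummies(s, prefix="Embarked", dummy_na=True, dtype="float32")
#    Embarked_C  Embarked_Q  Embarked_S  Embarked_nan
# 0         0.0         0.0         1.0           0.0
# 1         1.0         0.0         0.0           0.0
# 2         0.0         0.0         0.0           1.0
# 3         0.0         1.0         0.0           0.0

# drop_first=True removes the first class to avoid collinearity in linear models
pd.get_dummies(s, prefix="Embarked", drop_first=True)
```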
```python
pd.isna()  # alias of pd.isnull()
```
- Detects missing values; returns a boolean object of the same shape as the input: True where a value is missing, False otherwise.
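For instance (made-up values):

```python
import numpy as np
import pandas as pd

age = pd.Series([22.0, np.nan, 38.0])
mask = pd.isna(age)       # Series([False, True, False]), same length as the input
n_missing = mask.sum()    # 1; the boolean mask can be summed or used to filter rows
```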
Training code
- The exact shape depends on coding style; here is one skeleton:
```python
import sys
from tqdm import tqdm

def train(net, optimizer, train_loader, epochs, loss_func):
    for epoch in range(epochs):
        net.train()
        total_loss, step = 0, 0
        # tqdm wraps an iterable into an iterator with a progress bar,
        # i.e. it visualizes the training progress
        loop = tqdm(enumerate(train_loader), total=len(train_loader), file=sys.stdout)
        for i, batch in loop:
            # forward pass
            x, y = batch
            pred_y = net(x)
            loss = loss_func(pred_y, y)
            # backward pass: the three lines below are always here
            loss.backward()
            optimizer.step()
            optimizer.zero_grad()
            total_loss += loss.item()
            step += 1
        # code for logging / tracking can be added below
```
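To make the skeleton concrete, here is one way the pieces could be wired up for the Titanic task: a one-hidden-layer MLP, binary cross-entropy, and Adam. The feature count (15) and the hyper-parameters are placeholders, not necessarily the tutorial's values.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# x_train: float32 array (n_samples, n_features); y_train: float32 labels (n_samples, 1)
ds_train = TensorDataset(torch.tensor(x_train), torch.tensor(y_train))
dl_train = DataLoader(ds_train, batch_size=8, shuffle=True)

net = nn.Sequential(
    nn.Linear(15, 20),   # 15 = number of features after preprocessing (placeholder)
    nn.ReLU(),
    nn.Linear(20, 1),
    nn.Sigmoid(),        # output a survival probability in [0, 1]
)

loss_func = nn.BCELoss()
optimizer = torch.optim.Adam(net.parameters(), lr=0.01)

train(net, optimizer, dl_train, epochs=10, loss_func=loss_func)
```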
Evaluation code
```python
def eval():
    ...
```
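The notes leave the body empty; below is a hedged sketch using accuracy as the metric (and avoiding the name `eval`, which shadows both the Python builtin and `nn.Module.eval`):

```python
import torch

@torch.no_grad()
def evaluate(net, dl_val):
    net.eval()                      # switch off dropout / batch-norm updates
    correct, total = 0, 0
    for x, y in dl_val:
        pred_y = net(x)             # probabilities in [0, 1]
        pred_label = (pred_y > 0.5).float()
        correct += (pred_label == y).sum().item()
        total += y.numel()
    return correct / total
```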
Save the model
```python
# Save params only (the state_dict holds all learnable parameters)
torch.save(net.state_dict(), "model_params.pt")  # file name is arbitrary
```
```python
# Save the whole model (pickles the entire object; less portable across code changes)
torch.save(net, "model.pt")
```
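The matching load calls, for completeness (file names mirror the hypothetical ones above):

```python
import torch
from torch import nn

# Load params only: rebuild the architecture first, then restore the weights
net_clone = nn.Sequential(nn.Linear(15, 20), nn.ReLU(), nn.Linear(20, 1), nn.Sigmoid())
net_clone.load_state_dict(torch.load("model_params.pt"))

# Load the whole model; on recent PyTorch versions pass weights_only=False
# because unpickling a full module is no longer the default
net_loaded = torch.load("model.pt")
```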