Author: taylor0607 (加菲貓星人)
Board: DataScience
Title: [Question] LSTM prediction problem
Time: Fri Jun 1 00:17:12 2018
Hi everyone,
I've learned some data-mining (DM) methods and am now trying to do prediction with Keras.
The problem I'm running into: I want to predict a Y value from 25 columns.
With traditional DM tools, you feed new data into the trained model and it predicts a new Y.
In Keras, though, the predicted y comes back as an array.
So my question is: how do I convert new data into a form the trained model can consume? Thanks!
Here is my code:
import numpy
import pandas as pd
numpy.random.seed(10)

all_df = pd.read_csv("/Users/mac/Desktop/123.csv")
cols = ['x%d' % i for i in range(1, 26)]  # column names x1..x25
all_df = all_df[cols]

# 80/20 random train/test split
msk = numpy.random.rand(len(all_df)) < 0.8
train_df = all_df[msk]
test_df = all_df[~msk]

# Features must come from the split frames (not all_df), and the label
# column x25 must be excluded from the features to avoid leakage.
feature_cols = cols[:-1]  # x1..x24
train_Features = train_df[feature_cols]
train_Label = train_df['x25']
test_Features = test_df[feature_cols]
test_Label = test_df['x25']

print(len(train_df))
print(len(test_df))
print(len(all_df))

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(units=40, input_dim=24,
                kernel_initializer='uniform',
                activation='relu'))
model.add(Dense(units=30,
                kernel_initializer='uniform',
                activation='relu'))
model.add(Dense(units=1,
                kernel_initializer='uniform',
                activation='sigmoid'))

# Note: binary_crossentropy assumes the label is 0/1; if x25 is not binary,
# the loss can go negative, as in the training log below.
model.compile(loss='binary_crossentropy',
              optimizer='adam', metrics=['accuracy'])

train_history = model.fit(x=train_Features,
                          y=train_Label,
                          validation_split=0.1,
                          epochs=30,
                          batch_size=30, verbose=2)

scores = model.evaluate(x=test_Features, y=test_Label)
print(scores[1])
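The "feed new data, get a new Y column" step I'm after can be sketched like this with Keras: model.predict returns a NumPy array, which can be flattened and attached back as a DataFrame column. The model and data below are hypothetical stand-ins, not the actual ones from this post.

```python
import numpy as np
import pandas as pd
from keras.models import Sequential
from keras.layers import Dense

feature_cols = ['x%d' % i for i in range(1, 25)]  # x1..x24

# A small untrained model with the same shape as in the post.
model = Sequential()
model.add(Dense(40, activation='relu'))
model.add(Dense(30, activation='relu'))
model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='adam')

# "New data" with the same feature columns the model expects.
new_df = pd.DataFrame(np.random.rand(5, 24), columns=feature_cols)

# model.predict returns a NumPy array of shape (n_rows, 1) ...
pred = model.predict(new_df.values, verbose=0)
# ... which can be flattened and attached back as a regular column.
new_df['y_pred'] = pred.flatten()
```

The key point is only that new rows must have the same columns, in the same order, as the training features.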
Below is my model output:
runfile('/Users/mac/.spyder-py3/temp.py', wdir='/Users/mac/.spyder-py3')
74
16
90
Train on 81 samples, validate on 9 samples
Epoch 1/30
- 1s - loss: 0.6929 - acc: 0.4198 - val_loss: 0.6937 - val_acc: 0.1111
Epoch 2/30
- 0s - loss: 0.6902 - acc: 0.1852 - val_loss: 0.6944 - val_acc: 0.1111
Epoch 3/30
- 0s - loss: 0.6877 - acc: 0.1605 - val_loss: 0.6951 - val_acc: 0.1111
Epoch 4/30
- 0s - loss: 0.6851 - acc: 0.1605 - val_loss: 0.6957 - val_acc: 0.1111
Epoch 5/30
- 0s - loss: 0.6813 - acc: 0.1605 - val_loss: 0.6963 - val_acc: 0.1111
Epoch 6/30
- 0s - loss: 0.6767 - acc: 0.1852 - val_loss: 0.6970 - val_acc: 0.1111
Epoch 7/30
- 0s - loss: 0.6708 - acc: 0.2099 - val_loss: 0.6975 - val_acc: 0.1111
Epoch 8/30
- 0s - loss: 0.6628 - acc: 0.2222 - val_loss: 0.6979 - val_acc: 0.1111
Epoch 9/30
- 0s - loss: 0.6534 - acc: 0.3210 - val_loss: 0.6984 - val_acc: 0.1111
Epoch 10/30
- 0s - loss: 0.6397 - acc: 0.3580 - val_loss: 0.6986 - val_acc: 0.2222
Epoch 11/30
- 0s - loss: 0.6244 - acc: 0.4321 - val_loss: 0.6990 - val_acc: 0.2222
Epoch 12/30
- 0s - loss: 0.6039 - acc: 0.4815 - val_loss: 0.6990 - val_acc: 0.2222
Epoch 13/30
- 0s - loss: 0.5758 - acc: 0.5309 - val_loss: 0.6988 - val_acc: 0.2222
Epoch 14/30
- 0s - loss: 0.5467 - acc: 0.5432 - val_loss: 0.6990 - val_acc: 0.2222
Epoch 15/30
- 0s - loss: 0.5088 - acc: 0.5432 - val_loss: 0.6991 - val_acc: 0.2222
Epoch 16/30
- 0s - loss: 0.4600 - acc: 0.5432 - val_loss: 0.6986 - val_acc: 0.3333
Epoch 17/30
- 0s - loss: 0.4149 - acc: 0.5556 - val_loss: 0.6988 - val_acc: 0.3333
Epoch 18/30
- 0s - loss: 0.3513 - acc: 0.5679 - val_loss: 0.6993 - val_acc: 0.4444
Epoch 19/30
- 0s - loss: 0.2774 - acc: 0.5556 - val_loss: 0.6992 - val_acc: 0.4444
Epoch 20/30
- 0s - loss: 0.2010 - acc: 0.5556 - val_loss: 0.7004 - val_acc: 0.4444
Epoch 21/30
- 0s - loss: 0.1163 - acc: 0.5556 - val_loss: 0.7034 - val_acc: 0.4444
Epoch 22/30
- 0s - loss: 0.0139 - acc: 0.5556 - val_loss: 0.7056 - val_acc: 0.4444
Epoch 23/30
- 0s - loss: -8.1930e-02 - acc: 0.5679 - val_loss: 0.7121 - val_acc: 0.4444
Epoch 24/30
- 0s - loss: -1.9559e-01 - acc: 0.5679 - val_loss: 0.7214 - val_acc: 0.4444
Epoch 25/30
- 0s - loss: -3.2348e-01 - acc: 0.5679 - val_loss: 0.7327 - val_acc: 0.4444
Epoch 26/30
- 0s - loss: -4.4836e-01 - acc: 0.5802 - val_loss: 0.7467 - val_acc: 0.4444
Epoch 27/30
- 0s - loss: -5.7915e-01 - acc: 0.5802 - val_loss: 0.7694 - val_acc: 0.4444
Epoch 28/30
- 0s - loss: -7.3865e-01 - acc: 0.5802 - val_loss: 0.7944 - val_acc: 0.4444
Epoch 29/30
- 0s - loss: -8.9148e-01 - acc: 0.5802 - val_loss: 0.8236 - val_acc: 0.4444
Epoch 30/30
- 0s - loss: -1.0620e+00 - acc: 0.5802 - val_loss: 0.8666 - val_acc: 0.4444
90/90 [==============================] - 0s 49us/step
--
※ Posted from: PTT (ptt.cc), from: 119.14.41.117
※ Article URL: https://www.ptt.cc/bbs/DataScience/M.1527783435.A.9A9.html
推 HYDE1986: Is your model already trained, and you want to predict on new data? 06/01 09:18
推 ax61316: What does your LSTM's prediction look like? Can you show the code? 06/01 11:02
推 tsoahans: Is the prediction you want to do many-to-one or many-to-many? 06/01 13:56
※ Edited: taylor0607 (27.246.68.149), 06/01/2018 15:04:55
→ taylor0607: OK, I've added it. 06/01 15:05
→ taylor0607: Yes, like in DM, I want to predict a new column. 06/01 15:05
→ taylor0607: Reply to t: mine is many-to-one. 06/01 15:06
→ Kazimir: Where is the LSTM? What you imported is Dense, isn't it? 06/01 16:42
※ Edited: taylor0607 (27.246.68.149), 06/01/2018 17:04:09
→ taylor0607: Ah, sorry, it's just plain Keras. 06/01 17:04
推 Kazimir: Feed new data in the same form you used when testing during training. Does scores have nothing in it? 06/01 17:18
→ tsoahans: Are you asking how to predict on the test data? 06/01 17:54
推 HYDE1986: In Keras you can just use model.predict; the official docs describe the parameters. 06/01 17:58
→ taylor0607: OK, thanks~ 06/01 19:15
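Since the title mentions LSTM and the OP confirmed a many-to-one setup, here is a minimal hedged sketch of a many-to-one Keras LSTM. The shapes are hypothetical placeholders (100 sequences of 5 timesteps × 24 features), not from this post's data.

```python
import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

# LSTM input must be 3-D: (samples, timesteps, features).
X = np.random.rand(100, 5, 24).astype('float32')
y = np.random.rand(100, 1).astype('float32')  # one target per sequence

model = Sequential()
model.add(LSTM(32))   # return_sequences=False (the default) -> many-to-one
model.add(Dense(1))   # single output value per sequence
model.compile(loss='mse', optimizer='adam')
model.fit(X, y, epochs=2, batch_size=16, verbose=0)

pred = model.predict(X[:3], verbose=0)  # array of shape (3, 1)
```

With return_sequences=False the LSTM emits only its final hidden state, so the Dense layer produces one value per input sequence rather than one per timestep.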