RNN LSTM basic
RNN model
- Recurrent neural network model
* NotImplementedError fix : replace the array_ops.py file (use the copy in the data folder) -> restart
C:\Users\ITWILL\anaconda3\envs\tensorflow\Lib\site-packages\tensorflow\python\ops
C:\ProgramData\Anaconda3\envs\tensorflow\Lib\site-packages\tensorflow\python\ops
replace the array_ops.py file in the path above
import tensorflow as tf                               # seed value
import numpy as np                                    # ndarray
from tensorflow.keras import Sequential               # model
from tensorflow.keras.layers import SimpleRNN, Dense  # RNN layer

tf.random.set_seed(123)  # set seed value
many-to-one : 4 input words -> 1 output
X = [[[0.0], [0.1], [0.2], [0.3]],
     [[0.1], [0.2], [0.3], [0.4]],
     [[0.2], [0.3], [0.4], [0.5]],
     [[0.3], [0.4], [0.5], [0.6]],
     [[0.4], [0.5], [0.6], [0.7]],
     [[0.5], [0.6], [0.7], [0.8]]]
Y = [0.4, 0.5, 0.6, 0.7, 0.8, 0.9]

X = np.array(X, dtype=np.float32)
Y = np.array(Y, dtype=np.float32)
X.shape  # (6, 4, 1) - (batch_size, time_steps, features)

input_shape = (4, 1)
model = Sequential()
add RNN layer
model.add(SimpleRNN(units=30, input_shape=input_shape, activation='tanh'))
add DNN layer
model.add(Dense(units=1))  # output layer : regression model
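For intuition, SimpleRNN applies the same tanh cell to each of the 4 time steps and only the final hidden state is passed to the Dense layer. A minimal numpy sketch of that recurrence (the weights W, U, b below are random placeholders for illustration, not the trained Keras weights):

import numpy as np

def simple_rnn_forward(x_seq, W, U, b):
    """Run a tanh RNN cell over one sequence and return the last hidden state."""
    h = np.zeros(U.shape[0])                 # initial hidden state h_0
    for x_t in x_seq:                        # x_seq : (time_steps, features)
        h = np.tanh(x_t @ W + h @ U + b)     # h_t = tanh(x_t W + h_{t-1} U + b)
    return h                                 # only h_T feeds the Dense output layer

# toy shapes matching the model above : 1 feature -> 30 hidden units
rng = np.random.default_rng(0)
W, U, b = rng.normal(size=(1, 30)), rng.normal(size=(30, 30)), np.zeros(30)
h_last = simple_rnn_forward(np.array([[0.0], [0.1], [0.2], [0.3]]), W, U, b)
print(h_last.shape)  # (30,)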
model compile settings
model.compile(optimizer='adam', loss='mse', metrics=['mae'])
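For reference, the chosen loss and metric are just means over the errors; a quick numpy check with made-up values:

y_true = np.array([0.4, 0.5])
y_hat  = np.array([0.42, 0.47])
mse = np.mean((y_true - y_hat) ** 2)    # loss='mse'
mae = np.mean(np.abs(y_true - y_hat))   # metrics=['mae']
print(mse, mae)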
model training
model.fit(X, Y, epochs=100, verbose=1)
model prediction
y_pred = model.predict(X)
print(y_pred)
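A simple sanity check is to put the targets and predictions side by side (a sketch reusing X, Y and y_pred from above):

for true, pred in zip(Y, y_pred.flatten()):
    print(f'true={true:.1f}  pred={pred:.3f}')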
timeSeries RNN
- time series data + RNN model = time series analysis
import pandas as pd                                   # csv file read
import matplotlib.pyplot as plt                       # time series visualization
import numpy as np                                    # ndarray
import tensorflow as tf                               # seed value
from tensorflow.keras import Sequential               # model
from tensorflow.keras.layers import SimpleRNN, Dense  # RNN layer

tf.random.set_seed(12)  # set seed value
1. csv file read : stock price data
path = r'C:\ITWILL\5_Tensorflow\workspace\chap08_TextVectorizing_RNN\data'
timeSeries = pd.read_csv(path + '/timeSeries.csv')
timeSeries.info()
RangeIndex: 100 entries, 0 to 99
Data columns (total 2 columns):
# Column Non-Null Count Dtype
--- ------ -------------- -----
0 no 100 non-null int64
1 data 100 non-null float64
data = timeSeries['data']
print(data)  # normalized stock prices

plt.plot(data, 'r--', label='time Series')
plt.legend()
plt.show()
2. create a dataset suitable for the RNN
x_data = []
for i in range(len(data)-10) :    # 0~89
    for j in range(10) :          # 0~9
        x_data.append(data[i+j])  # 90 * 10 = 900 values

len(x_data)  # 900
list -> array
x_data = np.array(x_data)
x_data.shape  # (900,)
'''
window indices
0~9   : 0+0~9
1~10  : 1+0~9
2~11  : 2+0~9
:
89~98 : 89+0~9
'''

y_data = []
for i in range(len(data)-10) :  # 0~89
    y_data.append(data[i+10])   # 90 labels
list -> array
y_data = np.array(y_data)
y_data.shape  # (90,)
'''
label indices
10
11
12
:
99
'''
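The nested loops above first build a flat array of 900 values and reshape it later. An equivalent way (a sketch, not the course code) builds the (90, 10, 1) window array and the 90 labels directly:

values = data.to_numpy(dtype=np.float32)  # 100 normalized prices
x_win = np.array([values[i:i+10] for i in range(len(values) - 10)])  # (90, 10)
y_win = values[10:]                        # (90,)
x_win = x_win.reshape(-1, 10, 1)           # (90, 10, 1)
print(x_win.shape, y_win.shape)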
train(700)/val(200) split
x_train = x_data[:700].reshape(70, 10, 1)
x_val = x_data[700:].reshape(-1, 10, 1)
x_train.shape  # (70, 10, 1) - (batch_size, time_steps, features)
x_val.shape    # (20, 10, 1)
train(70)/val(20) split
y_train = y_data[:70].reshape(70, 1)
y_val = y_data[70:].reshape(20, 1)
3. create model
model = Sequential()
input_shape = (10, 1)
add RNN layer
model.add(SimpleRNN(units=8, input_shape=input_shape, activation ='tanh'))
add DNN layer
model.add(Dense(units=1))  # output layer - regression model - mse, mae
model compile settings
model.compile(optimizer='sgd', loss='mse', metrics=['mae'])
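The post title also mentions LSTM; a minimal variant swaps the SimpleRNN layer for an LSTM layer with the same input shape and compile settings (a sketch, the hyperparameters are assumptions, not the course code):

from tensorflow.keras.layers import LSTM

lstm_model = Sequential()
lstm_model.add(LSTM(units=8, input_shape=(10, 1), activation='tanh'))
lstm_model.add(Dense(units=1))
lstm_model.compile(optimizer='sgd', loss='mse', metrics=['mae'])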
model training
model.fit(x=x_train, y=y_train, epochs=400, verbose=1)
model prediction
y_pred = model.predict(x_val)
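Besides the plot below, a numeric score on the validation windows can be computed directly (a small sketch using the arrays above):

val_mae = np.mean(np.abs(y_val - y_pred))
print('validation MAE:', val_mae)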
y_true vs y_pred
plt.plot(y_val, 'y--', label='real value')
plt.plot(y_pred, 'r--', label='predicted value')
plt.legend(loc='best')
plt.show()