
[Data Competition] A Kaggle trick: using the sigmoid function for regression problems!

2021-02-06 16:36

Authors: 尘沙樱落, 杰少


Designing a Sigmoid-Based Loss for Regression


Background


This is a very interesting loss design. Whenever you face a regression problem, it is worth a try: it is not guaranteed to help on every problem, but on certain problems it can bring a huge improvement, and at worst it gives you one more model for later stacking.

The scheme was designed by data scientist danzel, who describes why it works as follows:

I used a sigmoid-output and scaled its range afterwards (to look like the target). Training like this helps the model to converge faster and gives better results.

Design Approach


Suppose our regression problem is to minimize the squared loss, where the label of the $i$-th sample is $y_i$ and the model prediction is $\hat{y}_i$:

$$L = \sum_{i=1}^{N} (y_i - \hat{y}_i)^2$$

  • $i = 1, 2, \dots, N$, where $N$ is the number of samples;

1. Baseline Loss

  • The output layer generally takes the form Dense(1, activation = 'linear'), i.e. $\hat{y}_i = f(x_i)$.

2. Sigmoid-based Loss

  • The output takes the form Dense(1, activation = 'sigmoid') * (max_val - min_val) + min_val;
  • that is, $\hat{y}_i = \sigma(f(x_i)) \cdot (\text{max\_val} - \text{min\_val}) + \text{min\_val}$, with $\text{min\_val} = \min_i y_i$ and $\text{max\_val} = \max_i y_i$, so predictions are constrained to the observed target range; see the sketch below.
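A minimal Keras sketch of the two heads (illustrative; build_regressor, n_features, y_min and y_max are names introduced here, not from the original post):

import tensorflow as tf
from tensorflow.keras import layers, Model

def build_regressor(n_features, y_min, y_max, sigmoid_head=True):
    """Toy regressor contrasting the two output heads described above."""
    inp = layers.Input(shape=(n_features,))
    h   = layers.Dense(64, activation='relu')(inp)
    if sigmoid_head:
        # sigmoid squashes the output into (0, 1); rescale onto the target range
        out = layers.Dense(1, activation='sigmoid')(h)
        out = layers.Lambda(lambda s: s * (y_max - y_min) + y_min)(out)
    else:
        # baseline: unconstrained linear output
        out = layers.Dense(1, activation='linear')(h)
    return Model(inp, out)

# y_min / y_max should come from the training labels, never from the test set.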
Case Study


Does the claim above actually hold up? Seeing is believing, so we take a Kaggle dataset and run an experiment. Interested readers can download it from the links at the end of the article.

1. Import packages

1.1 Import the packages used

import numpy  as np
import pandas as pd
import tensorflow as tf
import xgboost as xgb
import seaborn as sns
from tqdm import tqdm
from lightgbm import LGBMRegressor
from sklearn.model_selection import KFold
from sklearn.metrics import mean_squared_error

def RMSE(y_true, y_pred):
    return tf.sqrt(tf.reduce_mean(tf.square(y_true - y_pred)))
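As a quick illustration (assuming TF 2.x eager execution, where tensors evaluate directly), this loss matches the square root of sklearn's mean_squared_error:

# Toy check: the TF loss equals sklearn's RMSE on the same arrays.
y_a = np.array([1.0, 2.0, 3.0])
y_b = np.array([1.5, 2.0, 2.0])
print(float(RMSE(y_a, y_b)))                # ~0.6455
print(mean_squared_error(y_a, y_b) ** 0.5)  # same value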

1.2 Read the data

train = pd.read_csv('./data/train.csv')
test  = pd.read_csv('./data/test.csv')
sub   = pd.read_csv('./data/sample_submission.csv')

2. Data preprocessing

2.1 Concatenate train and test

train_test = pd.concat([train, test], axis=0, ignore_index=True)
train_test.head()

   id     cont1     cont2     cont3     cont4     cont5     cont6     cont7     cont8     cont9    cont10    cont11    cont12    cont13    cont14    target
0   1  0.670390  0.811300  0.643968  0.291791  0.284117  0.855953  0.890700  0.285542  0.558245  0.779418  0.921832  0.866772  0.878733  0.305411  7.243043
1   3  0.388053  0.621104  0.686102  0.501149  0.643790  0.449805  0.510824  0.580748  0.418335  0.432632  0.439872  0.434971  0.369957  0.369484  8.203331
2   4  0.834950  0.227436  0.301584  0.293408  0.606839  0.829175  0.506143  0.558771  0.587603  0.823312  0.567007  0.677708  0.882938  0.303047  7.776091
3   5  0.820708  0.160155  0.546887  0.726104  0.282444  0.785108  0.752758  0.823267  0.574466  0.580843  0.769594  0.818143  0.914281  0.279528  6.957716
4   8  0.935278  0.421235  0.303801  0.880214  0.665610  0.830131  0.487113  0.604157  0.874658  0.863427  0.983575  0.900464  0.935918  0.435772  7.951046

2.2 GaussRank preprocessing for the neural network

  • For the details, see the RankGaussian part of our earlier post; a miniature NumPy sketch also follows below.
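The core idea in miniature (a conceptual sketch, not the full class below): rank the values, squash the ranks into (-1, 1), then map them through the inverse error function to get a bell shape.

import numpy as np
from scipy.special import erfinv

def gauss_rank_1d(x, eps=1e-4):
    # rank -> (-1, 1) -> inverse error function
    ranks  = np.argsort(np.argsort(x))
    scaled = 2.0 * ranks / ranks.max() - 1.0
    return erfinv(np.clip(scaled, -1 + eps, 1 - eps))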
import numpy as np
from joblib import Parallel, delayed
from scipy.interpolate import interp1d
from scipy.special import erf, erfinv
from sklearn.preprocessing import QuantileTransformer, PowerTransformer
from sklearn.base import BaseEstimator, TransformerMixin
from sklearn.utils.validation import FLOAT_DTYPES, check_array, check_is_fitted

class GaussRankScaler(BaseEstimator, TransformerMixin):
    """Transform features by scaling each feature to a normal distribution.

    Parameters
    ----------
    epsilon : float, optional, default 1e-4
        A small amount added to the lower bound or subtracted
        from the upper bound. This value prevents infinite numbers
        from occurring when applying the inverse error function.
    copy : boolean, optional, default True
        If False, try to avoid a copy and do inplace scaling instead.
        This is not guaranteed to always work inplace; e.g. if the data is
        not a NumPy array, a copy may still be returned.
    n_jobs : int or None, optional, default None
        Number of jobs to run in parallel.
        ``None`` means 1 and ``-1`` means using all processors.
    interp_kind : str or int, optional, default 'linear'
        Specifies the kind of interpolation as a string
        ('linear', 'nearest', 'zero', 'slinear', 'quadratic', 'cubic',
        'previous', 'next', where 'zero', 'slinear', 'quadratic' and 'cubic'
        refer to a spline interpolation of zeroth, first, second or third
        order; 'previous' and 'next' simply return the previous or next value
        of the point) or as an integer specifying the order of the spline
        interpolator to use.
    interp_copy : bool, optional, default False
        If True, the interpolation function makes internal copies of x and y.
        If False, references to `x` and `y` are used.

    Attributes
    ----------
    interp_func_ : list
        The interpolation function for each feature in the training set.
    """

    def __init__(self, epsilon=1e-4, copy=True, n_jobs=None, interp_kind='linear', interp_copy=False):
        self.epsilon     = epsilon
        self.copy        = copy
        self.interp_kind = interp_kind
        self.interp_copy = interp_copy
        self.fill_value  = 'extrapolate'
        self.n_jobs      = n_jobs

    def fit(self, X, y=None):
        """Fit an interpolation function linking rank to the original data for future scaling.

        Parameters
        ----------
        X : array-like, shape (n_samples, n_features)
            The data used to fit the interpolation functions for later scaling along the features axis.
        y
            Ignored
        """
        X = check_array(X, copy=self.copy, estimator=self, dtype=FLOAT_DTYPES, force_all_finite=True)

        self.interp_func_ = Parallel(n_jobs=self.n_jobs)(delayed(self._fit)(x) for x in X.T)
        return self

    def _fit(self, x):
        # Rank the unique values, then rescale the ranks into (-1, 1) so that
        # erfinv stays finite at the boundaries.
        x = self.drop_duplicates(x)
        rank = np.argsort(np.argsort(x))
        bound = 1.0 - self.epsilon
        factor = np.max(rank) / 2.0 * bound
        scaled_rank = np.clip(rank / factor - bound, -bound, bound)
        return interp1d(
            x, scaled_rank, kind=self.interp_kind, copy=self.interp_copy, fill_value=self.fill_value)

    def transform(self, X, copy=None):
        """Scale the data with the Gauss Rank algorithm.

        Parameters
        ----------
        X : array-like, shape (n_samples, n_features)
            The data used to scale along the features axis.
        copy : bool, optional (default: None)
            Copy the input X or not.
        """
        check_is_fitted(self, 'interp_func_')

        copy = copy if copy is not None else self.copy
        X = check_array(X, copy=copy, estimator=self, dtype=FLOAT_DTYPES, force_all_finite=True)

        X = np.array(Parallel(n_jobs=self.n_jobs)(delayed(self._transform)(i, x) for i, x in enumerate(X.T))).T
        return X

    def _transform(self, i, x):
        return erfinv(self.interp_func_[i](x))

    def inverse_transform(self, X, copy=None):
        """Scale back the data to the original representation.

        Parameters
        ----------
        X : array-like, shape [n_samples, n_features]
            The data used to scale along the features axis.
        copy : bool, optional (default: None)
            Copy the input X or not.
        """
        check_is_fitted(self, 'interp_func_')

        copy = copy if copy is not None else self.copy
        X = check_array(X, copy=copy, estimator=self, dtype=FLOAT_DTYPES, force_all_finite=True)

        X = np.array(Parallel(n_jobs=self.n_jobs)(delayed(self._inverse_transform)(i, x) for i, x in enumerate(X.T))).T
        return X

    def _inverse_transform(self, i, x):
        inv_interp_func = interp1d(self.interp_func_[i].y, self.interp_func_[i].x, kind=self.interp_kind,
                                   copy=self.interp_copy, fill_value=self.fill_value)
        return inv_interp_func(erf(x))

    @staticmethod
    def drop_duplicates(x):
        is_unique = np.zeros_like(x, dtype=bool)
        is_unique[np.unique(x, return_index=True)[1]] = True
        return x[is_unique]
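Before applying it to the competition data, a quick sanity check on synthetic data (illustrative; the exponential input is chosen arbitrarily as a heavily skewed distribution):

# A skewed column should come out roughly bell-shaped after the transform.
rng = np.random.RandomState(0)
x   = rng.exponential(size=(10000, 1))
z   = GaussRankScaler().fit_transform(x)
print(round(float(z.mean()), 3), round(float(z.std()), 3))  # mean near 0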

2.3 Apply the RankGaussian transform

feature_names = ['cont1', 'cont2', 'cont3', 'cont4', 'cont5', 'cont6', 'cont7',
                 'cont8', 'cont9', 'cont10', 'cont11', 'cont12', 'cont13', 'cont14']
scaler_linear = GaussRankScaler(interp_kind='linear')
for c in feature_names:
    train_test[c + '_linear_grank'] = scaler_linear.fit_transform(train_test[c].values.reshape(-1, 1))

gaussian_linear_feature_names = [c + '_linear_grank' for c in feature_names]
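To eyeball the effect, compare the summary statistics of a raw column against its transformed version (illustrative; assumes the cells above have run):

# A raw column versus its roughly Gaussian transformed version.
print(train_test[['cont1', 'cont1_linear_grank']].describe())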

3. NN modeling

import os
import tensorflow as tf
# import tensorflow_addons as tfa
import tensorflow.keras.backend as K
from tensorflow.keras import regularizers
from tensorflow.keras.layers import *
from tensorflow.keras.models import *
from tensorflow.keras.optimizers import *
from tensorflow.keras.callbacks import *
from tensorflow.keras.layers import Input
from sklearn.model_selection import KFold, StratifiedKFold

3.1 Train & validation split

  • Split off a training set and a validation set
tr = train_test.iloc[:train.shape[0], :].copy()
te = train_test.iloc[train.shape[0]:, :].copy()
kf  = KFold(n_splits=5, random_state=48, shuffle=False)
cnt = 0
for trn_idx, test_idx in kf.split(tr, tr['target']):
    if cnt == 0:          # skip the first fold, keep the second
        cnt += 1
        continue
    X_tr_gbdt, X_val_gbdt = tr[feature_names].iloc[trn_idx], tr[feature_names].iloc[test_idx]
    X_tr_dnn_linear_gaussian, X_val_dnn_linear_gaussian = tr[gaussian_linear_feature_names].iloc[trn_idx], tr[gaussian_linear_feature_names].iloc[test_idx]
    y_tr, y_val = tr['target'].iloc[trn_idx], tr['target'].iloc[test_idx]
    break
/home/inf/anaconda3/lib/python3.7/site-packages/sklearn/model_selection/_split.py:297: FutureWarning: Setting a random_state has no effect since shuffle is False. This will raise an error in 0.24. You should leave random_state to its default (None), or set shuffle=True.
FutureWarning
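As the warning notes, random_state has no effect when shuffle=False. If a shuffled split is actually intended, one flag fixes it (a variant, not the original setup):

# Shuffled 5-fold split; silences the FutureWarning above, but fold
# membership changes, so scores are not directly comparable.
kf = KFold(n_splits=5, random_state=48, shuffle=True)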

3.2 MLP model (sigmoid): 0.7108

  • Sigmoid-based regression
class MLP_Model(tf.keras.Model):
    def __init__(self):
        super(MLP_Model, self).__init__()
        self.dense1    = Dense(1000, activation='relu')
        self.drop1     = Dropout(0.25)
        self.dense2    = Dense(500, activation='relu')
        self.drop2     = Dropout(0.25)
        self.dense_out = Dense(1, activation='sigmoid')

    def call(self, inputs):
        # Observed range of the training target; the sigmoid output in (0, 1)
        # is rescaled onto this range.
        min_target = 0
        max_target = 10.26757
        x1 = self.dense1(inputs)
        x1 = self.drop1(x1)
        x2 = self.dense2(x1)
        x2 = self.drop2(x2)
        outputs = self.dense_out(x2)
        outputs = outputs * (max_target - min_target) + min_target
        return outputs
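The hard-coded min_target/max_target above are the observed target range. A reusable variant (a sketch; ScaledSigmoidHead is an illustrative name, and the range would be derived from the training labels, e.g. y_tr.min() and y_tr.max()) packages the rescaled sigmoid as a layer:

class ScaledSigmoidHead(tf.keras.layers.Layer):
    """Dense(1, sigmoid) output rescaled from (0, 1) onto [min_val, max_val]."""
    def __init__(self, min_val, max_val, **kwargs):
        super(ScaledSigmoidHead, self).__init__(**kwargs)
        self.min_val = float(min_val)
        self.max_val = float(max_val)
        self.dense   = Dense(1, activation='sigmoid')

    def call(self, inputs):
        return self.dense(inputs) * (self.max_val - self.min_val) + self.min_val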

import time

def RMSE(y_true, y_pred):
    return tf.sqrt(tf.reduce_mean(tf.square(y_true - y_pred)))

K.clear_session()
model = MLP_Model()
adam  = tf.optimizers.Adam(lr=1e-3)
model.compile(optimizer=adam, loss=RMSE)

model_weights = './models/model_gauss_mlp_mlp.h5'
checkpoint = ModelCheckpoint(model_weights, monitor='loss', verbose=0, save_best_only=True, mode='min',
                             save_weights_only=True)
plateau        = ReduceLROnPlateau(monitor='val_loss', factor=0.5, patience=10, verbose=1, min_delta=1e-4, mode='min')
early_stopping = EarlyStopping(monitor="val_loss", patience=25)
history = model.fit(X_tr_dnn_linear_gaussian.values, y_tr.values,
                    validation_data=(X_val_dnn_linear_gaussian.values, y_val.values),
                    batch_size=1024, epochs=100,
                    callbacks=[plateau, checkpoint, early_stopping],
                    verbose=2)
Train on 240000 samples, validate on 60000 samples
Epoch 1/100
240000/240000 - 1s - loss: 0.8020 - val_loss: 0.7203
Epoch 2/100
240000/240000 - 0s - loss: 0.7345 - val_loss: 0.7225
Epoch 3/100
240000/240000 - 0s - loss: 0.7290 - val_loss: 0.7183
Epoch 4/100
240000/240000 - 0s - loss: 0.7270 - val_loss: 0.7197
Epoch 5/100
240000/240000 - 0s - loss: 0.7247 - val_loss: 0.7170
Epoch 6/100
240000/240000 - 0s - loss: 0.7232 - val_loss: 0.7190
Epoch 7/100
240000/240000 - 0s - loss: 0.7227 - val_loss: 0.7157
Epoch 8/100
240000/240000 - 0s - loss: 0.7205 - val_loss: 0.7215
Epoch 9/100
240000/240000 - 0s - loss: 0.7199 - val_loss: 0.7144
Epoch 10/100
240000/240000 - 0s - loss: 0.7185 - val_loss: 0.7148
Epoch 11/100
240000/240000 - 0s - loss: 0.7175 - val_loss: 0.7176
Epoch 12/100
240000/240000 - 0s - loss: 0.7170 - val_loss: 0.7147
Epoch 13/100
240000/240000 - 0s - loss: 0.7165 - val_loss: 0.7142
Epoch 14/100
240000/240000 - 0s - loss: 0.7157 - val_loss: 0.7140
Epoch 15/100
240000/240000 - 0s - loss: 0.7150 - val_loss: 0.7132
Epoch 16/100
240000/240000 - 0s - loss: 0.7145 - val_loss: 0.7127
Epoch 17/100
240000/240000 - 0s - loss: 0.7136 - val_loss: 0.7127
Epoch 18/100
240000/240000 - 0s - loss: 0.7131 - val_loss: 0.7124
Epoch 19/100
240000/240000 - 0s - loss: 0.7126 - val_loss: 0.7165
Epoch 20/100
240000/240000 - 0s - loss: 0.7120 - val_loss: 0.7130
Epoch 21/100
240000/240000 - 0s - loss: 0.7116 - val_loss: 0.7119
Epoch 22/100
240000/240000 - 0s - loss: 0.7111 - val_loss: 0.7129
Epoch 23/100
240000/240000 - 0s - loss: 0.7104 - val_loss: 0.7129
Epoch 24/100
240000/240000 - 0s - loss: 0.7102 - val_loss: 0.7136
Epoch 25/100
240000/240000 - 0s - loss: 0.7097 - val_loss: 0.7120
Epoch 26/100
240000/240000 - 0s - loss: 0.7089 - val_loss: 0.7126
Epoch 27/100
240000/240000 - 0s - loss: 0.7084 - val_loss: 0.7154
Epoch 28/100
240000/240000 - 0s - loss: 0.7078 - val_loss: 0.7111
Epoch 29/100
240000/240000 - 0s - loss: 0.7075 - val_loss: 0.7132
Epoch 30/100
240000/240000 - 0s - loss: 0.7074 - val_loss: 0.7126
Epoch 31/100
240000/240000 - 0s - loss: 0.7062 - val_loss: 0.7129
Epoch 32/100
240000/240000 - 0s - loss: 0.7059 - val_loss: 0.7119
Epoch 33/100
240000/240000 - 0s - loss: 0.7054 - val_loss: 0.7135
Epoch 34/100
240000/240000 - 0s - loss: 0.7048 - val_loss: 0.7108
Epoch 35/100
240000/240000 - 0s - loss: 0.7048 - val_loss: 0.7116
Epoch 36/100
240000/240000 - 0s - loss: 0.7037 - val_loss: 0.7161
Epoch 37/100
240000/240000 - 0s - loss: 0.7034 - val_loss: 0.7131
Epoch 38/100
240000/240000 - 0s - loss: 0.7031 - val_loss: 0.7148
Epoch 39/100
240000/240000 - 0s - loss: 0.7022 - val_loss: 0.7113
Epoch 40/100
240000/240000 - 0s - loss: 0.7013 - val_loss: 0.7117
Epoch 41/100
240000/240000 - 0s - loss: 0.7012 - val_loss: 0.7124
Epoch 42/100
240000/240000 - 0s - loss: 0.7008 - val_loss: 0.7116
Epoch 43/100
240000/240000 - 0s - loss: 0.7001 - val_loss: 0.7124
Epoch 44/100

Epoch 00044: ReduceLROnPlateau reducing learning rate to 0.0005000000237487257.
240000/240000 - 0s - loss: 0.6995 - val_loss: 0.7113
Epoch 45/100
240000/240000 - 0s - loss: 0.6962 - val_loss: 0.7116
Epoch 46/100
240000/240000 - 0s - loss: 0.6954 - val_loss: 0.7118
Epoch 47/100
240000/240000 - 0s - loss: 0.6940 - val_loss: 0.7116
Epoch 48/100
240000/240000 - 0s - loss: 0.6938 - val_loss: 0.7120
Epoch 49/100
240000/240000 - 0s - loss: 0.6930 - val_loss: 0.7118
Epoch 50/100
240000/240000 - 0s - loss: 0.6927 - val_loss: 0.7123
Epoch 51/100
240000/240000 - 0s - loss: 0.6920 - val_loss: 0.7123
Epoch 52/100
240000/240000 - 0s - loss: 0.6915 - val_loss: 0.7125
Epoch 53/100
240000/240000 - 0s - loss: 0.6912 - val_loss: 0.7144
Epoch 54/100

Epoch 00054: ReduceLROnPlateau reducing learning rate to 0.0002500000118743628.
240000/240000 - 0s - loss: 0.6905 - val_loss: 0.7146
Epoch 55/100
240000/240000 - 0s - loss: 0.6885 - val_loss: 0.7123
Epoch 56/100
240000/240000 - 0s - loss: 0.6874 - val_loss: 0.7135
Epoch 57/100
240000/240000 - 0s - loss: 0.6872 - val_loss: 0.7136
Epoch 58/100
240000/240000 - 0s - loss: 0.6868 - val_loss: 0.7138
Epoch 59/100
240000/240000 - 0s - loss: 0.6863 - val_loss: 0.7134
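The best checkpoint can now be scored on the held-out fold (a sketch; note the checkpoint above monitors the training loss, not val_loss):

# Reload the best sigmoid checkpoint and score the validation fold.
model.load_weights(model_weights)
val_pred = model.predict(X_val_dnn_linear_gaussian.values).ravel()
print('sigmoid head val RMSE:', mean_squared_error(y_val, val_pred) ** 0.5)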

3.3 MLP model (linear): 0.7137

class MLP_Model(tf.keras.Model):
    def __init__(self):
        super(MLP_Model, self).__init__()
        self.dense1    = Dense(1000, activation='relu')
        self.drop1     = Dropout(0.25)
        self.dense2    = Dense(500, activation='relu')
        self.drop2     = Dropout(0.25)
        self.dense_out = Dense(1)   # plain linear output, no rescaling

    def call(self, inputs):
        x1 = self.dense1(inputs)
        x1 = self.drop1(x1)
        x2 = self.dense2(x1)
        x2 = self.drop2(x2)
        outputs = self.dense_out(x2)
        return outputs

K.clear_session()
model = MLP_Model()
adam  = tf.optimizers.Adam(lr=1e-3)
model.compile(optimizer=adam, loss=RMSE)

model_weights = './models/model_gauss_mlp_linear.h5'   # separate checkpoint file so the sigmoid weights are kept
checkpoint = ModelCheckpoint(model_weights, monitor='loss', verbose=0, save_best_only=True, mode='min',
                             save_weights_only=True)
plateau        = ReduceLROnPlateau(monitor='val_loss', factor=0.5, patience=10, verbose=1, min_delta=1e-4, mode='min')
early_stopping = EarlyStopping(monitor="val_loss", patience=25)
history = model.fit(X_tr_dnn_linear_gaussian.values, y_tr.values,
                    validation_data=(X_val_dnn_linear_gaussian.values, y_val.values),
                    batch_size=1024, epochs=100,
                    callbacks=[plateau, checkpoint, early_stopping],
                    verbose=2)
Train on 240000 samples, validate on 60000 samples
Epoch 1/100
240000/240000 - 1s - loss: 1.3292 - val_loss: 0.7767
Epoch 2/100
240000/240000 - 0s - loss: 0.8163 - val_loss: 0.7251
Epoch 3/100
240000/240000 - 0s - loss: 0.8072 - val_loss: 0.7251
Epoch 4/100
240000/240000 - 0s - loss: 0.8040 - val_loss: 0.7496
Epoch 5/100
240000/240000 - 0s - loss: 0.7997 - val_loss: 0.7324
Epoch 6/100
240000/240000 - 0s - loss: 0.7982 - val_loss: 0.7271
Epoch 7/100
240000/240000 - 0s - loss: 0.7936 - val_loss: 0.7202
Epoch 8/100
240000/240000 - 0s - loss: 0.7950 - val_loss: 0.7249
Epoch 9/100
240000/240000 - 0s - loss: 0.7914 - val_loss: 0.7284
Epoch 10/100
240000/240000 - 0s - loss: 0.7882 - val_loss: 0.7313
Epoch 11/100
240000/240000 - 0s - loss: 0.7886 - val_loss: 0.7303
Epoch 12/100
240000/240000 - 0s - loss: 0.7857 - val_loss: 0.7292
Epoch 13/100
240000/240000 - 0s - loss: 0.7855 - val_loss: 0.7257
Epoch 14/100
240000/240000 - 0s - loss: 0.7847 - val_loss: 0.7204
Epoch 15/100
240000/240000 - 0s - loss: 0.7825 - val_loss: 0.7224
Epoch 16/100
240000/240000 - 0s - loss: 0.7813 - val_loss: 0.7220
Epoch 17/100

Epoch 00017: ReduceLROnPlateau reducing learning rate to 0.0005000000237487257.
240000/240000 - 0s - loss: 0.7808 - val_loss: 0.7208
Epoch 18/100
240000/240000 - 0s - loss: 0.7752 - val_loss: 0.7187
Epoch 19/100
240000/240000 - 0s - loss: 0.7743 - val_loss: 0.7234
Epoch 20/100
240000/240000 - 0s - loss: 0.7730 - val_loss: 0.7190
Epoch 21/100
240000/240000 - 0s - loss: 0.7750 - val_loss: 0.7196
Epoch 22/100
240000/240000 - 0s - loss: 0.7742 - val_loss: 0.7286
Epoch 23/100
240000/240000 - 0s - loss: 0.7722 - val_loss: 0.7198
Epoch 24/100
240000/240000 - 0s - loss: 0.7720 - val_loss: 0.7227
Epoch 25/100
240000/240000 - 0s - loss: 0.7724 - val_loss: 0.7176
Epoch 26/100
240000/240000 - 0s - loss: 0.7705 - val_loss: 0.7194
Epoch 27/100
240000/240000 - 0s - loss: 0.7689 - val_loss: 0.7206
Epoch 28/100
240000/240000 - 0s - loss: 0.7696 - val_loss: 0.7168
Epoch 29/100
240000/240000 - 0s - loss: 0.7695 - val_loss: 0.7171
Epoch 30/100
240000/240000 - 0s - loss: 0.7681 - val_loss: 0.7164
Epoch 31/100
240000/240000 - 0s - loss: 0.7676 - val_loss: 0.7225
Epoch 32/100
240000/240000 - 0s - loss: 0.7681 - val_loss: 0.7177
Epoch 33/100
240000/240000 - 0s - loss: 0.7660 - val_loss: 0.7198
Epoch 34/100
240000/240000 - 0s - loss: 0.7668 - val_loss: 0.7202
Epoch 35/100
240000/240000 - 0s - loss: 0.7653 - val_loss: 0.7160
Epoch 36/100
240000/240000 - 0s - loss: 0.7647 - val_loss: 0.7248
Epoch 37/100
240000/240000 - 0s - loss: 0.7638 - val_loss: 0.7173
Epoch 38/100
240000/240000 - 0s - loss: 0.7626 - val_loss: 0.7197
Epoch 39/100
240000/240000 - 0s - loss: 0.7624 - val_loss: 0.7182
Epoch 40/100
240000/240000 - 0s - loss: 0.7615 - val_loss: 0.7195
Epoch 41/100
240000/240000 - 0s - loss: 0.7621 - val_loss: 0.7195
Epoch 42/100
240000/240000 - 0s - loss: 0.7616 - val_loss: 0.7192
Epoch 43/100
240000/240000 - 0s - loss: 0.7604 - val_loss: 0.7162
Epoch 44/100
240000/240000 - 0s - loss: 0.7592 - val_loss: 0.7152
Epoch 45/100
240000/240000 - 0s - loss: 0.7600 - val_loss: 0.7193
Epoch 46/100
240000/240000 - 0s - loss: 0.7594 - val_loss: 0.7206
Epoch 47/100
240000/240000 - 0s - loss: 0.7578 - val_loss: 0.7201
Epoch 48/100
240000/240000 - 0s - loss: 0.7583 - val_loss: 0.7164
Epoch 49/100
240000/240000 - 0s - loss: 0.7581 - val_loss: 0.7163
Epoch 50/100
240000/240000 - 0s - loss: 0.7572 - val_loss: 0.7163
Epoch 51/100
240000/240000 - 0s - loss: 0.7554 - val_loss: 0.7166
Epoch 52/100
240000/240000 - 0s - loss: 0.7564 - val_loss: 0.7212
Epoch 53/100
240000/240000 - 0s - loss: 0.7560 - val_loss: 0.7156
Epoch 54/100

Epoch 00054: ReduceLROnPlateau reducing learning rate to 0.0002500000118743628.
240000/240000 - 0s - loss: 0.7547 - val_loss: 0.7180
Epoch 55/100
240000/240000 - 0s - loss: 0.7530 - val_loss: 0.7154
Epoch 56/100
240000/240000 - 0s - loss: 0.7534 - val_loss: 0.7150
Epoch 57/100
240000/240000 - 0s - loss: 0.7531 - val_loss: 0.7148
Epoch 58/100
240000/240000 - 0s - loss: 0.7530 - val_loss: 0.7156
Epoch 59/100
240000/240000 - 0s - loss: 0.7523 - val_loss: 0.7166
Epoch 60/100
240000/240000 - 0s - loss: 0.7522 - val_loss: 0.7152
Epoch 61/100
240000/240000 - 0s - loss: 0.7520 - val_loss: 0.7155
Epoch 62/100
240000/240000 - 0s - loss: 0.7514 - val_loss: 0.7148
Epoch 63/100
240000/240000 - 0s - loss: 0.7514 - val_loss: 0.7149
Epoch 64/100
240000/240000 - 0s - loss: 0.7506 - val_loss: 0.7156
Epoch 65/100
240000/240000 - 0s - loss: 0.7508 - val_loss: 0.7150
Epoch 66/100
240000/240000 - 0s - loss: 0.7516 - val_loss: 0.7154
Epoch 67/100

Epoch 00067: ReduceLROnPlateau reducing learning rate to 0.0001250000059371814.
240000/240000 - 0s - loss: 0.7507 - val_loss: 0.7153
Epoch 68/100
240000/240000 - 0s - loss: 0.7502 - val_loss: 0.7149
Epoch 69/100
240000/240000 - 0s - loss: 0.7497 - val_loss: 0.7147
Epoch 70/100
240000/240000 - 0s - loss: 0.7496 - val_loss: 0.7148
Epoch 71/100
240000/240000 - 0s - loss: 0.7502 - val_loss: 0.7142
Epoch 72/100
240000/240000 - 0s - loss: 0.7492 - val_loss: 0.7148
Epoch 73/100
240000/240000 - 0s - loss: 0.7487 - val_loss: 0.7148
Epoch 74/100
240000/240000 - 0s - loss: 0.7485 - val_loss: 0.7143
Epoch 75/100
240000/240000 - 0s - loss: 0.7496 - val_loss: 0.7154
Epoch 76/100
240000/240000 - 0s - loss: 0.7482 - val_loss: 0.7144
Epoch 77/100
240000/240000 - 0s - loss: 0.7488 - val_loss: 0.7142
Epoch 78/100
240000/240000 - 0s - loss: 0.7492 - val_loss: 0.7145
Epoch 79/100
240000/240000 - 0s - loss: 0.7483 - val_loss: 0.7143
Epoch 80/100
240000/240000 - 0s - loss: 0.7478 - val_loss: 0.7143
Epoch 81/100

Epoch 00081: ReduceLROnPlateau reducing learning rate to 6.25000029685907e-05.
240000/240000 - 0s - loss: 0.7481 - val_loss: 0.7143
Epoch 82/100
240000/240000 - 0s - loss: 0.7480 - val_loss: 0.7146
Epoch 83/100
240000/240000 - 0s - loss: 0.7477 - val_loss: 0.7141
Epoch 84/100
240000/240000 - 0s - loss: 0.7471 - val_loss: 0.7139
Epoch 85/100
240000/240000 - 0s - loss: 0.7475 - val_loss: 0.7140
Epoch 86/100
240000/240000 - 0s - loss: 0.7473 - val_loss: 0.7141
Epoch 87/100
240000/240000 - 0s - loss: 0.7469 - val_loss: 0.7141
Epoch 88/100
240000/240000 - 0s - loss: 0.7474 - val_loss: 0.7148
Epoch 89/100
240000/240000 - 0s - loss: 0.7467 - val_loss: 0.7138
Epoch 90/100
240000/240000 - 0s - loss: 0.7466 - val_loss: 0.7142
Epoch 91/100
240000/240000 - 0s - loss: 0.7460 - val_loss: 0.7141
Epoch 92/100
240000/240000 - 0s - loss: 0.7465 - val_loss: 0.7138
Epoch 93/100
240000/240000 - 0s - loss: 0.7469 - val_loss: 0.7142
Epoch 94/100
240000/240000 - 0s - loss: 0.7467 - val_loss: 0.7141
Epoch 95/100
240000/240000 - 0s - loss: 0.7465 - val_loss: 0.7148
Epoch 96/100
240000/240000 - 0s - loss: 0.7465 - val_loss: 0.7138
Epoch 97/100
240000/240000 - 0s - loss: 0.7461 - val_loss: 0.7138
Epoch 98/100
240000/240000 - 0s - loss: 0.7456 - val_loss: 0.7140
Epoch 99/100

Epoch 00099: ReduceLROnPlateau reducing learning rate to 3.125000148429535e-05.
240000/240000 - 0s - loss: 0.7463 - val_loss: 0.7139
Epoch 100/100
240000/240000 - 0s - loss: 0.7461 - val_loss: 0.7137
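On this fold the sigmoid head bottoms out around a validation RMSE of 0.711 versus roughly 0.714 for the linear head, matching the section headings. The linear model can be scored the same way (a sketch, as above):

# Reload the best linear checkpoint and score the same validation fold.
model.load_weights(model_weights)
val_pred = model.predict(X_val_dnn_linear_gaussian.values).ravel()
print('linear head val RMSE:', mean_squared_error(y_val, val_pred) ** 0.5)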
References
  1. https://www.kaggle.com/c/tabular-playground-series-jan-2021/data
  2. https://www.kaggle.com/c/tabular-playground-series-jan-2021/discussion/216037
