
1. Changing only the activation function to SELU

"""Classification problem: the SELU activation alleviates vanishing gradients."""

import tensorflow as tf
import matplotlib as mpl
import matplotlib.pyplot as plt
import numpy as np
import sklearn
import pandas as pd
import os
from tensorflow import keras

# Load the data
fashion_mnist = keras.datasets.fashion_mnist
(x_train_all, y_train_all), (x_test, y_test) = fashion_mnist.load_data()
# The first 5000 images form the validation set; everything after that is the training set
x_valid, x_train = x_train_all[:5000], x_train_all[5000:]
y_valid, y_train = y_train_all[:5000], y_train_all[5000:]

# Standardize the data
from sklearn.preprocessing import StandardScaler
scaler = StandardScaler()
x_train_scaled = scaler.fit_transform(
    x_train.astype(np.float32).reshape(-1, 1)).reshape(-1, 28, 28)
x_valid_scaled = scaler.transform(
    x_valid.astype(np.float32).reshape(-1, 1)).reshape(-1, 28, 28)
x_test_scaled = scaler.transform(
    x_test.astype(np.float32).reshape(-1, 1)).reshape(-1, 28, 28)

# Build the model
model = keras.models.Sequential()
model.add(keras.layers.Flatten(input_shape=[28, 28]))
# 20 hidden layers
for _ in range(20):
    # SELU is an activation function with built-in normalization
    model.add(keras.layers.Dense(100, activation='selu'))
model.add(keras.layers.Dense(10, activation='softmax'))
model.compile(loss='sparse_categorical_crossentropy',
              # plain SGD gradient descent; may settle at a local minimum
              optimizer='sgd',
              metrics=['accuracy'])

# Callbacks
logdir = './dnn-selu-callbacks'
if not os.path.exists(logdir):
    os.mkdir(logdir)
output_model_file = os.path.join(logdir, 'fashion_mnist_model.h5')
callbacks = [
    keras.callbacks.TensorBoard(logdir),
    keras.callbacks.ModelCheckpoint(output_model_file, save_best_only=True),
    keras.callbacks.EarlyStopping(patience=5, min_delta=1e-3),
]

# Start training
history = model.fit(x_train_scaled, y_train, epochs=10,
                    validation_data=(x_valid_scaled, y_valid),
                    callbacks=callbacks)

# Plot the learning curves
def plot_learning_curves(history):
    pd.DataFrame(history.history).plot(figsize=(8, 3))
    plt.grid(True)
    plt.gca().set_ylim(0, 1)
    plt.show()

plot_learning_curves(history)
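The comment above calls SELU "an activation function with built-in normalization". Below is a minimal sketch, not part of the original post, that illustrates what that means: a standard-normal batch is pushed through a 20-layer SELU stack (mirroring the depth and width of the model above) and the activation statistics are printed. The kernel_initializer='lecun_normal' setting is an assumption added here because SELU's self-normalization is designed around that initializer.

import tensorflow as tf
from tensorflow import keras

# Fake batch of standard-normal features, like the scaled inputs above
x = tf.random.normal([1000, 100])

h = x
for i in range(20):
    # lecun_normal is the initialization SELU's self-normalization assumes
    h = keras.layers.Dense(100, activation='selu',
                           kernel_initializer='lecun_normal')(h)
    if (i + 1) % 5 == 0:
        print("layer %2d: mean=%.3f  std=%.3f"
              % (i + 1, float(tf.reduce_mean(h)), float(tf.math.reduce_std(h))))

# The printed mean stays near 0 and the std near 1 through all 20 layers;
# with a saturating activation such as 'sigmoid' the spread shrinks layer by
# layer, which is the vanishing-gradient symptom SELU is meant to avoid.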

2. Adding dropout on top of SELU

"""Classification problem: SELU alleviates vanishing gradients; dropout is usually added
only to the last few layers, to prevent overfitting."""

import tensorflow as tf
import matplotlib as mpl
import matplotlib.pyplot as plt
import numpy as np
import sklearn
import pandas as pd
import os
from tensorflow import keras

# Load the data
fashion_mnist = keras.datasets.fashion_mnist
(x_train_all, y_train_all), (x_test, y_test) = fashion_mnist.load_data()
# The first 5000 images form the validation set; everything after that is the training set
x_valid, x_train = x_train_all[:5000], x_train_all[5000:]
y_valid, y_train = y_train_all[:5000], y_train_all[5000:]

# Standardize the data
from sklearn.preprocessing import StandardScaler
scaler = StandardScaler()
x_train_scaled = scaler.fit_transform(
    x_train.astype(np.float32).reshape(-1, 1)).reshape(-1, 28, 28)
x_valid_scaled = scaler.transform(
    x_valid.astype(np.float32).reshape(-1, 1)).reshape(-1, 28, 28)
x_test_scaled = scaler.transform(
    x_test.astype(np.float32).reshape(-1, 1)).reshape(-1, 28, 28)

# Build the model
model = keras.models.Sequential()
model.add(keras.layers.Flatten(input_shape=[28, 28]))
# 20 hidden layers
for _ in range(20):
    model.add(keras.layers.Dense(100, activation='selu'))
# AlphaDropout: 1. keeps the mean and variance unchanged
#               2. therefore also preserves the normalization property
model.add(keras.layers.AlphaDropout(rate=0.5))
model.add(keras.layers.Dense(10, activation='softmax'))
model.compile(loss='sparse_categorical_crossentropy',
              # plain SGD gradient descent; may settle at a local minimum
              optimizer='sgd',
              metrics=['accuracy'])

# Callbacks
logdir = './dnn-selu-callbacks'
if not os.path.exists(logdir):
    os.mkdir(logdir)
output_model_file = os.path.join(logdir, 'fashion_mnist_model.h5')
callbacks = [
    keras.callbacks.TensorBoard(logdir),
    keras.callbacks.ModelCheckpoint(output_model_file, save_best_only=True),
    keras.callbacks.EarlyStopping(patience=5, min_delta=1e-3),
]

# Start training
history = model.fit(x_train_scaled, y_train, epochs=10,
                    validation_data=(x_valid_scaled, y_valid),
                    callbacks=callbacks)

# Plot the learning curves
def plot_learning_curves(history):
    pd.DataFrame(history.history).plot(figsize=(8, 3))
    plt.grid(True)
    plt.gca().set_ylim(0, 1)
    plt.show()

plot_learning_curves(history)
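The comment above claims AlphaDropout keeps the mean and variance (and hence SELU's normalization property) unchanged. The short sketch below is illustrative only and not from the original post: it applies keras.layers.AlphaDropout and plain Dropout to the same standard-normal tensor in training mode and prints the resulting statistics; the rate of 0.5 simply matches the model above.

import tensorflow as tf
from tensorflow import keras

x = tf.random.normal([10000, 100])   # standard-normal activations

alpha_drop = keras.layers.AlphaDropout(rate=0.5)
plain_drop = keras.layers.Dropout(rate=0.5)

a = alpha_drop(x, training=True)     # training=True so units are actually dropped
d = plain_drop(x, training=True)

print("AlphaDropout: mean=%.3f  std=%.3f"
      % (float(tf.reduce_mean(a)), float(tf.math.reduce_std(a))))
print("Dropout:      mean=%.3f  std=%.3f"
      % (float(tf.reduce_mean(d)), float(tf.math.reduce_std(d))))

# AlphaDropout stays close to mean 0 / std 1, so the self-normalization that
# SELU relies on is preserved; plain Dropout keeps the mean near 0 but inflates
# the variance, which would undo the normalization in the layers that follow.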
