Displaying top-N predictions in Keras: a hand-written code example
For pretrained models such as VGG and ResNet, Keras ships with a built-in helper, keras.applications.imagenet_utils.decode_predictions, but it is hard-wired to ImageNet: it only accepts a 2-D batch of exactly 1000 class scores, as its source shows:
def decode_predictions(preds, top=5):
    """Decodes the prediction of an ImageNet model.

    # Arguments
        preds: Numpy tensor encoding a batch of predictions.
        top: Integer, how many top-guesses to return.

    # Returns
        A list of lists of top class prediction tuples
        `(class_name, class_description, score)`.
        One list of tuples per sample in batch input.

    # Raises
        ValueError: In case of invalid shape of the `pred` array
            (must be 2D).
    """
    global CLASS_INDEX
    if len(preds.shape) != 2 or preds.shape[1] != 1000:
        raise ValueError('`decode_predictions` expects '
                         'a batch of predictions '
                         '(i.e. a 2D array of shape (samples, 1000)). '
                         'Found array with shape: ' + str(preds.shape))
    if CLASS_INDEX is None:
        fpath = get_file('imagenet_class_index.json',
                         CLASS_INDEX_PATH,
                         cache_subdir='models',
                         file_hash='c2c37ea517e94d9795004a39431a14cb')
        with open(fpath) as f:
            CLASS_INDEX = json.load(f)
    results = []
    for pred in preds:
        top_indices = pred.argsort()[-top:][::-1]
        result = [tuple(CLASS_INDEX[str(i)]) + (pred[i],) for i in top_indices]
        result.sort(key=lambda x: x[2], reverse=True)
        results.append(result)
    return results
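The part worth extracting is the top-k selection idiom `pred.argsort()[-top:][::-1]`: sort ascending, take the last `top` indices, then reverse to get descending order. A minimal, framework-free sketch of just that idiom (the array values here are invented for illustration):

```python
import numpy as np

def top_k_indices(pred, top=3):
    # Same idiom as decode_predictions: sort ascending,
    # keep the last `top` indices, reverse to descending order
    return pred.argsort()[-top:][::-1]

# A made-up score vector over 5 classes
pred = np.array([0.1, 0.6, 0.05, 0.2, 0.05])
print(top_k_indices(pred))        # indices of the 3 largest scores
print(top_k_indices(pred, top=1)) # index of the single largest score
```

The same trick works unchanged on any 1-D score vector, which is why the custom MNIST decoder below can reuse it verbatim.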
Dig out the important parts and type up your own version, and you are all set. The example below uses the MNIST dataset:
import keras
from keras.models import Sequential
from keras.layers import Dense
import numpy as np
import tflearn
import tflearn.datasets.mnist as mnist

def decode_predictions_custom(preds, top=5):
    CLASS_CUSTOM = ["0", "1", "2", "3", "4", "5", "6", "7", "8", "9"]
    results = []
    for pred in preds:
        top_indices = pred.argsort()[-top:][::-1]
        result = [tuple(CLASS_CUSTOM[i]) + (pred[i] * 100,) for i in top_indices]
        results.append(result)
    return results
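Before plugging the helper into a trained model, it can be smoke-tested on a hand-made probability row; the values below are invented, not real model output (note that `tuple("2")` yields `("2",)`, which is how the original one-liner builds its `(label, score)` tuples):

```python
import numpy as np

CLASS_CUSTOM = ["0", "1", "2", "3", "4", "5", "6", "7", "8", "9"]

def decode_predictions_custom(preds, top=5):
    results = []
    for pred in preds:
        top_indices = pred.argsort()[-top:][::-1]
        result = [tuple(CLASS_CUSTOM[i]) + (pred[i] * 100,) for i in top_indices]
        results.append(result)
    return results

# A made-up softmax-like output for a single sample: class "2" dominates
fake = np.array([[0.01, 0.02, 0.9, 0.01, 0.01, 0.01, 0.01, 0.01, 0.01, 0.01]])
p = decode_predictions_custom(fake, top=3)
for i, (label, prob) in enumerate(p[0]):
    print("{}. {}: {:.2f}%".format(i + 1, label, prob))
```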
x_train, y_train, x_test, y_test = mnist.load_data(one_hot=True)

model = Sequential()
model.add(Dense(units=64, activation='relu', input_dim=784))
model.add(Dense(units=10, activation='softmax'))
model.compile(loss='categorical_crossentropy',
              optimizer='sgd',
              metrics=['accuracy'])
model.fit(x_train, y_train, epochs=10, batch_size=128)
# score = model.evaluate(x_test, y_test, batch_size=128)
# print(score)
preds = model.predict(x_test[0:1, :])
p = decode_predictions_custom(preds)
for (i, (label, prob)) in enumerate(p[0]):
    print("{}. {}: {:.2f}%".format(i + 1, label, prob))
# 1. 7: 99.43%
# 2. 9: 0.24%
# 3. 3: 0.23%
# 4. 0: 0.05%
# 5. 2: 0.03%
Bonus: a simple Keras denoising autoencoder, plus notes on other autoencoder variants
Without further ado, let's go straight to the code:
from time import time  # needed for the timing below
start = time()

from keras.models import Sequential
from keras.layers import Dense, Dropout, Input
from keras.layers import Embedding
from keras.layers import Conv1D, GlobalAveragePooling1D, MaxPooling1D
from keras import layers
from keras.models import Model

# Parameters for denoising autoencoder
nb_visible = 120
nb_hidden = 64
batch_size = 16

# Build autoencoder model
input_img = Input(shape=(nb_visible,))
encoded = Dense(nb_hidden, activation='relu')(input_img)
decoded = Dense(nb_visible, activation='sigmoid')(encoded)

autoencoder = Model(inputs=input_img, outputs=decoded)  # Keras 2 API (was input=/output=)
autoencoder.compile(loss='mean_squared_error', optimizer='adam', metrics=['mae'])
autoencoder.summary()

# Train
### add an early_stopping callback
import keras
early_stopping = keras.callbacks.EarlyStopping(
    monitor='val_loss',
    min_delta=0.0001,
    patience=5,
    verbose=0,
    mode='auto'
)
# X_train_np / y_train_np and X_test_np / y_test_np are the (noisy, clean)
# data pairs prepared by the caller; they are not defined in this snippet.
autoencoder.fit(X_train_np, y_train_np, epochs=50, batch_size=batch_size, shuffle=True,
                callbacks=[early_stopping], verbose=1,
                validation_data=(X_test_np, y_test_np))

# Evaluate
evaluation = autoencoder.evaluate(X_test_np, y_test_np, batch_size=batch_size, verbose=1)
print('val_loss: %.6f, val_mean_absolute_error: %.6f' % (evaluation[0], evaluation[1]))

end = time()
print('Elapsed minutes: ' + str((end - start) / 60))
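The script above trains on whatever (X_train_np, y_train_np) pairs the caller supplies; the "denoising" part lives in the data preparation, where the input is a corrupted copy of the clean target. A minimal numpy sketch of one common corruption scheme (the function name, noise level, and mask fraction are my own choices, not from the original):

```python
import numpy as np

rng = np.random.default_rng(0)

def corrupt(x, noise_std=0.1, mask_frac=0.3):
    # Additive Gaussian noise plus random masking (zeroing) of a fraction
    # of features -- two common corruptions for denoising autoencoders
    noisy = x + rng.normal(0.0, noise_std, size=x.shape)
    mask = rng.random(x.shape) > mask_frac  # keep roughly 70% of entries
    return noisy * mask

clean = rng.random((4, 120))  # e.g. a batch of 120-dim samples (nb_visible above)
noisy = corrupt(clean)
# The training call would then be: autoencoder.fit(noisy, clean, ...)
print(noisy.shape)
```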
(Image: code for various Keras autoencoder types)
That's everything in this article on displaying top-N predictions in Keras with hand-written code. I hope it serves as a useful reference, and please continue to support 毛票票.