Deep Learning from Scratch # deeplearning from scratch #neural-network #lossfunction #gradientboosting


Deep Learning from Scratch - Chapter 4: Neural Network Training I

Loss functions

1. Sum of squared errors (SSE)

E = (1/2) * Σ_k (y_k − t_k)^2

where y_k is the network's output, t_k is the true label, and k is the number of dimensions of the data. Take the difference between the output and the true label for each element, sum the squares of those differences, and divide by 2.

```python
import numpy as np

def sum_squares_error(y, t):
    return 0.5 * np.sum((y - t) ** 2)

t = [0, 0, 1, 0, 0, 0, 0, 0, 0, 0]  # true label, one-hot encoded (class 2)

# Example 1: the network puts the highest score on the correct class (index 2)
y = [0.1, 0.05, 0.6, 0.0, 0.05, 0.1, 0.0, 0.1, 0.0, 0.0]
print(sum_squares_error(np.array(y), np.array(t)))  # ≈ 0.0975

# Example 2: the highest score is on a wrong class (index 7)
y = [0.1, 0.05, 0.1, 0.0, 0.05, 0.1, 0.0, 0.6, 0.0, 0.0]
print(sum_squares_error(np.array(y), np.array(t)))  # ≈ 0.5975
```

The loss is smaller in the first case, where the network's most confident prediction matches the true label.
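As a side note on the 1/2 factor (this check is my addition, not from the book's text): halving the squared sum makes the gradient of the SSE with respect to each output come out as simply y_k − t_k, which can be verified numerically. The helper `sse_gradient` below is a hypothetical name introduced for illustration.

```python
import numpy as np

def sum_squares_error(y, t):
    return 0.5 * np.sum((y - t) ** 2)

def sse_gradient(y, t):
    # d/dy_k [ (1/2) * Σ_k (y_k - t_k)^2 ] = y_k - t_k
    # (illustrative helper, not part of the original post)
    return y - t

y = np.array([0.1, 0.05, 0.6, 0.0, 0.05, 0.1, 0.0, 0.1, 0.0, 0.0])
t = np.array([0, 0, 1, 0, 0, 0, 0, 0, 0, 0])

# Central-difference numerical gradient to compare against the analytic one.
eps = 1e-6
num_grad = np.zeros_like(y)
for k in range(y.size):
    y_plus = y.copy();  y_plus[k] += eps
    y_minus = y.copy(); y_minus[k] -= eps
    num_grad[k] = (sum_squares_error(y_plus, t) - sum_squares_error(y_minus, t)) / (2 * eps)

print(np.allclose(num_grad, sse_gradient(y, t)))  # True
```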
