Graph of the history of layer counts and accuracy in deep learning
A graph of the progress of deep learning
As an illustration of the progress of deep learning, you often see a graph showing the change in error rate and the number of layers of the ILSVRC winners, so I made one myself.
Python code
import matplotlib.pyplot as plt


def plot_res():
    plt.rcParams["font.family"] = "Arial"
    plt.rcParams["font.size"] = 18
    fig = plt.figure()
    c = ["skyblue", "blue"]  # bar color, line color

    # ILSVRC winners, 2012-2016: network depth and error rate [%]
    years = list(range(2012, 2017))
    depths = [8, 8, 22, 152, 205]
    errors = [16.4, 11.7, 7.3, 6.7, 2.9]
    a = 0.2  # horizontal offset for the error-rate labels

    # Left axis: network depth as bars
    ax1 = fig.add_subplot(111)
    ax1.bar(years, depths, color=c[0], tick_label=years)
    for year, depth in zip(years, depths):
        # Label each bar; depths over 200 are shown as ">200"
        ax1.text(year, depth, depth if depth < 200 else ">200",
                 ha="center", va="bottom", color="black", fontsize=14)
    ax1.set_ylim(0, 230)
    ax1.set_ylabel("Depth of Neural Networks")

    # Right axis: error rate as a line, sharing the x axis
    ax2 = ax1.twinx()
    ax2.plot(years, errors, color=c[1], linewidth=2)
    for year, error in zip(years, errors):
        ax2.text(year - a, error + 1, error, color="black", fontsize=14)
    ax2.set_ylim(0, 20)
    ax2.set_ylabel("Error Rate [%]")

    plt.title("Winner in ILSVRC")
    plt.tight_layout()
    plt.show()


plot_res()
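Since plt.show() only displays the figure in an interactive window, you may want to write it to an image file instead, for example to embed it in a blog post. A minimal sketch of the same two-axis layout saved with fig.savefig (the Agg backend, the simplified styling, and the filename ilsvrc_winners.png are my assumptions, not part of the original script):

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend, so no display is needed
import matplotlib.pyplot as plt

# Same data as the script above
years = list(range(2012, 2017))
depths = [8, 8, 22, 152, 205]
errors = [16.4, 11.7, 7.3, 6.7, 2.9]

fig, ax1 = plt.subplots()
ax1.bar(years, depths, color="skyblue")      # depth on the left axis
ax1.set_ylabel("Depth of Neural Networks")
ax2 = ax1.twinx()                            # error rate on the right axis
ax2.plot(years, errors, color="blue")
ax2.set_ylabel("Error Rate [%]")
fig.tight_layout()
fig.savefig("ilsvrc_winners.png", dpi=150)   # hypothetical output filename
```

With savefig, the dpi argument controls the resolution of the written PNG; calling it instead of plt.show() lets the script run unattended.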
Author and Source
For more on this topic (Graph of the history of layer counts and accuracy in deep learning), see https://qiita.com/nabenabe0928/items/282fa24bfc93f622de26
Author attribution: the original author's information is available at the URL above. Copyright belongs to the original author.