Carl Shen #2

Open
wants to merge 15 commits into gh-pages
Changes from 3 commits
45 changes: 45 additions & 0 deletions projects/1-explications/Shen.ipynb
@@ -0,0 +1,45 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Comparison between Deep Blue and AlphaGo\n",
"========\n",
"Deep Blue is the chess playing AI developed in the mid 1990 by IBM. It had two versions, Deep Blue I and Deep Blue II(DB), in witch Deep Blue II has stronger chips and better algorisms that it beat all human players. The speed of the AI has linear positive correlation with the calculation speed of the chips, however, good algorisms could could increase the efficiency exponentially. The makers of DB did make good algorisms such as Hybrid software/hardware search and Massively parallel search \\[1\\], but these algorisms are totally *knowledge-based techniques* that DB can’t grow stronger as it plays more games. \n",
"\n",
"AlphaGo(AG) is a go AI developed by Google. Go was such very complex game: the number of possible boards for chess is 10^47, and for go is 10^171. For both AG and DB beat the very top human players, people should expect AG with much more and stronger chips thank DB, but in fact, AG (in the match) has about the same calculation power as DB. This is mainly because of the different type of algorism AG uses. AG uses a combination of tree search techniques and machine learning \\[2\\], which is *statistic-based technique*. In this way it studies from previous matches played by people and the matches it played with itself. It is notable that AG has 176 GPUs: the major innovation by AG is that it analysis the graphic of the games and find the most possible step a professional player would play next and do calculations on those steps. This largely reduced the total calculation, and could only realized by machine learning. \n",
"\n",
"As a conclusion, I believe the future of AI would use more statistic-based technique. Let’s take an extreme example: if people try to build a human-like AI, it is impossible to pre-code every situation it could face, but give it the capability to learn everything based on some algorisms (maybe quantum physics based chips are need for that).\n",
"\n",
"Reference:\n",
"\n",
"\\[1\\] Murray Campbell, A. Joseph Hoane Jr., Feng-hsiung Hsu, Deep Blue, Artificial Intelligence 134 (2002) 57–83 [link](http://ac.els-cdn.com/S0004370201001291/1-s2.0-S0004370201001291-main.pdf?_tid=3e0d7242-4875-11e6-9aeb-00000aacb361&acdnat=1468358087_0d298efddcf6f972b6c181c24ad96491)\n",
"\n",
"\\[2\\] David Silver etc., Mastering the game of Go with deep neural networks and tree search, Nature. Retrieved 27 January 2016 [link](http://www.nature.com/nature/journal/v529/n7587/full/nature16961.html)"
]
}
],
"metadata": {
"anaconda-cloud": {},
"kernelspec": {
"display_name": "Python [Root]",
"language": "python",
"name": "Python [Root]"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.5.2"
}
},
"nbformat": 4,
"nbformat_minor": 0
}
15 changes: 15 additions & 0 deletions projects/1-explications/Shen.md
@@ -0,0 +1,15 @@
Comparison between Deep Blue and AlphaGo
========
Deep Blue is a chess-playing AI developed by IBM in the mid-1990s. It had two versions, Deep Blue I and Deep Blue II (DB); Deep Blue II had faster chips and better algorithms, strong enough to beat the top human players. The strength of such an AI scales roughly linearly with the calculation speed of its chips; good algorithms, however, can improve its efficiency exponentially. The makers of DB did develop good algorithms, such as hybrid software/hardware search and massively parallel search \[1\], but these are entirely *knowledge-based techniques*, so DB could not grow stronger as it played more games.
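
As a rough illustration of the knowledge-based style (this is not Deep Blue's actual code; the toy game tree and its evaluations are invented for the example), here is a minimal alpha-beta search in Python. All of the "knowledge" sits in the hand-coded leaf values, so an engine built this way cannot get better by playing more games:

```python
# Minimal sketch of knowledge-based game-tree search with alpha-beta pruning.
# A node is either a number (a hand-coded evaluation of a leaf position)
# or a list of child nodes (positions reachable in one move).

def alphabeta(node, alpha=float("-inf"), beta=float("inf"), maximizing=True):
    if isinstance(node, (int, float)):
        return node
    if maximizing:
        value = float("-inf")
        for child in node:
            value = max(value, alphabeta(child, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:        # prune: the opponent would never allow this branch
                break
        return value
    else:
        value = float("inf")
        for child in node:
            value = min(value, alphabeta(child, alpha, beta, True))
            beta = min(beta, value)
            if alpha >= beta:
                break
        return value

# Toy game tree: inner lists are positions, numbers are hand-written evaluations.
tree = [[3, 5], [6, [9, 1]], [2, 8]]
print(alphabeta(tree))   # -> 6 for this tree
```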

AlphaGo (AG) is a Go AI developed by Google. Go is a far more complex game: the number of possible board positions in chess is about 10^47, while in Go it is about 10^171. Since both AG and DB beat the very top human players, one might expect AG to need far more and stronger chips than DB, but in fact AG (in its match) had about the same calculation power as DB. This is mainly because of the different kind of algorithm AG uses: a combination of tree search and machine learning \[2\], which is a *statistics-based technique*. In this way it learns from previous matches played by humans and from matches it plays against itself. It is notable that AG ran on 176 GPUs: its major innovation is that it analyzes the board position like an image, predicts the moves a professional player would most likely play next, and concentrates its calculation on those moves. This greatly reduces the total computation, and it could only be realized with machine learning.
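
To make that pruning idea concrete, here is a loose sketch (not AlphaGo's real code; the move probabilities below are made up) of how a learned policy can shrink the search. The policy network scores every legal move by how likely a strong player is to choose it, and the search only expands the top few:

```python
# Loose sketch of statistics-based move selection (hypothetical numbers, not AlphaGo's API).

def top_moves(move_probs, k=3):
    """Keep the k moves a strong player would most likely choose, per the policy net."""
    return sorted(move_probs, key=move_probs.get, reverse=True)[:k]

# Hypothetical policy-network output for one Go position (move -> probability);
# a real position would have ~250 legal moves, most with near-zero probability.
move_probs = {"D4": 0.35, "Q16": 0.30, "C3": 0.12, "K10": 0.08, "R14": 0.05,
              "A1": 0.001, "T19": 0.0005}

print(top_moves(move_probs))              # ['D4', 'Q16', 'C3']

# Why this reduces the total calculation: looking 8 moves ahead with ~250 candidate
# moves per turn means 250**8 ~ 1.5e19 positions, but keeping only the top 5 moves
# per turn means 5**8 = 390625 positions.
print(f"{250**8:.2e} vs {5**8}")
```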

In conclusion, I believe the future of AI will rely more on statistics-based techniques. To take an extreme example: if people try to build a human-like AI, it is impossible to pre-code every situation it could face; instead, it must be given the capability to learn everything from some algorithms (maybe quantum-physics-based chips will be needed for that).

References:

\[1\] Murray Campbell, A. Joseph Hoane Jr., Feng-hsiung Hsu, Deep Blue, Artificial Intelligence 134 (2002) 57–83 [link](http://ac.els-cdn.com/S0004370201001291/1-s2.0-S0004370201001291-main.pdf?_tid=3e0d7242-4875-11e6-9aeb-00000aacb361&acdnat=1468358087_0d298efddcf6f972b6c181c24ad96491)

\[2\] David Silver et al., Mastering the game of Go with deep neural networks and tree search, Nature 529 (2016). Retrieved 27 January 2016 [link](http://www.nature.com/nature/journal/v529/n7587/full/nature16961.html)

PS: I don't know why the Markdown doesn't render in this file, but it does in notebook format, so I have included a copy as a notebook.
2 changes: 1 addition & 1 deletion projects/1-hello-world/README.md
@@ -4,4 +4,4 @@ Submit an IPython notebook with a Markdown cell briefly introducing yourself: yo

Also include a Python cell which prints out your name using the `print()` function, or does some other thing you find interesting or entertaining.

Put your notebooks here: `lastname.ipynb`.
Put your notebooks here: `Shen.ipynb`.
Owner


Could you please put lastname.ipynb back the way it was?

57 changes: 57 additions & 0 deletions projects/1-hello-world/Shen.ipynb
@@ -0,0 +1,57 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"---\n",
"![awsome markdown user](http://m.tiebaimg.com/timg?wapp&quality=80&size=b150_150&subsize=20480&cut_x=0&cut_w=0&cut_y=0&cut_h=0&sec=1369815402&srctrace&di=a2adb2dc832c56c159b5c7e15bd5f774&wh_rate=null&src=http%3A%2F%2Fimgsrc.baidu.com%2Fforum%2Fpic%2Fitem%2F500fd9f9d72a60597904390c2f34349b033bba94.jpg)\n",
"---"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"#this is Carl, borned in China and lived in LA in past 3 years, learnt AP CS and Data Structure in sophomore and junior year. \n",
"#I learnt Java and know some about Python. My dream is to make an AI that thinks like real people and that's why I'm here.\n",
"\n",
"print( \"Carl Shen, add my battlenet ID for hearthstone and overwatch\")\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n"
]
}
],
"metadata": {
"anaconda-cloud": {},
"kernelspec": {
"display_name": "Python [Root]",
"language": "python",
"name": "Python [Root]"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.5.2"
}
},
"nbformat": 4,
"nbformat_minor": 0
}