Tutorial 2-Search Algorithm
 Breadth-first search (BFS)
 [images]
 The open list is implemented as a queue (FIFO), so new nodes are added at the tail.
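 A minimal Python sketch of this queue-based open list (the graph, node names, and dict-of-lists representation are invented for illustration):

```python
from collections import deque

def bfs(graph, start, goal):
    """Breadth-first search: the open list is a FIFO queue,
    so newly generated nodes are appended at the tail."""
    open_list = deque([start])              # the open "table" as a queue
    closed = set()
    while open_list:
        node = open_list.popleft()          # take from the head
        if node == goal:
            return True
        if node in closed:
            continue
        closed.add(node)
        open_list.extend(graph.get(node, []))   # children go to the tail
    return False
```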
 [image]
 Depth-first search (DFS)
 [images]
 The open list is implemented as a stack (LIFO), so new nodes are added at the head.
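 The stack-based variant differs only in where nodes are taken from; a sketch under the same made-up graph representation as above:

```python
def dfs(graph, start, goal):
    """Depth-first search: the open list is a LIFO stack,
    so the most recently added node is expanded first."""
    open_list = [start]                     # the open "table" as a stack
    closed = set()
    while open_list:
        node = open_list.pop()              # take from the head (top)
        if node == goal:
            return True
        if node in closed:
            continue
        closed.add(node)
        # reversed() so the first-listed child is expanded first
        open_list.extend(reversed(graph.get(node, [])))
    return False
```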
 [image]
 Hill climbing (a heuristic search algorithm)
 [image]
 The hill climbing algorithm goes uphill along the steepest possible path until it can go no further up, so it may return a state that is only a local maximum.
 Advantages
- Avoids traversing the entire solution space.
- Improves search efficiency.
 Disadvantages
- Not guaranteed to find the global maximum; it may converge on a local maximum.
- On a plateau (a flat region), the hill climber may be unable to determine which direction to step, and may wander in a direction that never leads to improvement.
- Ridge problem: if the target function creates a narrow ridge that ascends in a non-axis-aligned direction, the hill climber can only ascend the ridge by zig-zagging.
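 The local-maximum behaviour is easy to demonstrate; a sketch of steepest-ascent hill climbing with a made-up 1-D objective that has a local maximum at x=2 and a global one at x=8:

```python
def hill_climb(f, neighbours, state):
    """Steepest-ascent hill climbing: move to the best neighbour
    until no neighbour improves on the current state. May stop at
    a local maximum, a plateau, or a ridge."""
    while True:
        best = max(neighbours(state), key=f, default=state)
        if f(best) <= f(state):
            return state                # no uphill step left
        state = best

# Toy objective: local maximum at x=2 (f=0), global maximum at x=8 (f=4).
f = lambda x: -(x - 2) ** 2 if x < 5 else 4 - (x - 8) ** 2
nbrs = lambda x: [x - 1, x + 1]
```

 Starting left of the valley the climber gets stuck on the local peak at 2; starting at 6 it reaches the global peak at 8 — which peak you get depends entirely on the start state.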
Best-first Search (Greedy Search)
 The node with the lowest evaluation is expanded first, i.e., argmin f(n), where
 f(n) = h(n) = estimated cost of the cheapest path from the state at node n to a goal state.
 If n is a goal node, then h(n) = 0.
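 A sketch of greedy best-first search with the open list kept as a plain list and the argmin taken over h (graph and heuristic values are invented for illustration):

```python
def greedy_search(graph, h, start, goal):
    """Greedy best-first search: always expand the open node with the
    lowest heuristic h(n); the cost already paid is ignored, so the
    result is not guaranteed to be optimal."""
    open_list = [start]
    closed = set()
    while open_list:
        node = min(open_list, key=h)        # argmin over f(n) = h(n)
        open_list.remove(node)
        if node == goal:
            return True
        closed.add(node)
        for child in graph.get(node, []):
            if child not in closed and child not in open_list:
                open_list.append(child)
    return False
```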
 [image]
 Limitations of Greedy Search
- not optimal
 [image]
 A* Search
 [images]
 Exercise
 [image]
 Breadth-first search: A B C D E F
 [image]
 Depth-first search: A D F
 [images]
 openList, closeList = [start], []
 while openList:
     currentNode = the node with the lowest f cost in openList
     openList.remove(currentNode)
     closeList.append(currentNode)
     if currentNode == end:
         return  # goal reached
     for neighbour in currentNode.neighbours:
         if neighbour in closeList:
             continue
         new_f = currentNode.g + cost(currentNode, neighbour) + h(neighbour)
         if new_f < neighbour.f or neighbour not in openList:
             neighbour.f = new_f
             if neighbour not in openList:
                 openList.append(neighbour)
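 A runnable version of that loop, using `heapq` as the open list; the graph (neighbour, step-cost) pairs and the heuristic table are invented for illustration:

```python
import heapq

def a_star(graph, h, start, goal):
    """A* search with f(n) = g(n) + h(n). `graph[n]` is a list of
    (neighbour, step_cost) pairs; `h` gives the heuristic estimate."""
    g = {start: 0}                          # cheapest known cost to each node
    open_heap = [(h(start), start)]         # heap entries are (f, node)
    closed = set()
    while open_heap:
        f, node = heapq.heappop(open_heap)  # lowest f cost in the open list
        if node == goal:
            return g[node]                  # cost of the cheapest path
        if node in closed:
            continue                        # stale heap entry
        closed.add(node)
        for nbr, cost in graph.get(node, []):
            new_g = g[node] + cost
            if nbr not in g or new_g < g[nbr]:
                g[nbr] = new_g
                heapq.heappush(open_heap, (new_g + h(nbr), nbr))
    return None                             # no path
```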
 [images]
Tutorial 3-Genetic Algorithm
 [images]
Tutorial 4-Multi-objective Optimization
 [images]
 Exercise 1 is the same as Exercise 4 in Tutorial 3.
 [images]
Tutorial 5-Regression and Gradient Descent
 [images]
Tutorial 6-Scaling, Overfitting and Kmeans
 [images]
Tutorial 7-Building a Perceptron
 [images]
Tutorial 8-Building a Neural Network
 [images]
Tutorial 9-Attention and Transformer
 [images]
Tutorial 10-Ensemble Learning
 [images]
Tutorial 11-Fuzzy
 [images]
Tutorial 12-Fuzzy2
 [images]
 Take the pairwise minimum down each column, then take the maximum over the whole column.
 [image]
 Take the element-wise minimum of each row of P with each column of Q, then take the maximum of those minima.
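 That max–min composition can be sketched in plain Python; the 2×2 matrices P and Q below are made-up examples, not the ones from the tutorial:

```python
def max_min_compose(P, Q):
    """Max-min composition R = P o Q of two fuzzy relations:
    R[i][j] = max over k of min(P[i][k], Q[k][j])."""
    rows, inner, cols = len(P), len(Q), len(Q[0])
    return [[max(min(P[i][k], Q[k][j]) for k in range(inner))
             for j in range(cols)]
            for i in range(rows)]

P = [[0.2, 0.8],
     [1.0, 0.4]]
Q = [[0.5, 0.9],
     [0.3, 0.6]]
```

 For example R[0][0] = max(min(0.2, 0.5), min(0.8, 0.3)) = max(0.2, 0.3) = 0.3.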
 [images]
Midterm exam:
- Graph search × 2
- Logic algebra × 2
- BFS / DFS
- A*
- Bayes' theorem
- Kmeans
- Backpropagation