  1. Cognitive biases, such as confirmation bias, and a lack of self-awareness of one's own thoughts can make introspection inaccurate; it is hard to be objective about one's own reasoning.
  2. Since systems should become more knowledgeable over time, an AI could learn to correct past mistakes and no longer act like a human making errors. If a goal cannot be achieved, a system may attempt to improve its intelligence in order to reach that goal, optimising its performance measure much as evolution does.
  3. The latter statement is true in most cases: external factors outside our control can cause a computer to deviate from its program, but in general computers do only what their programmers tell them to. However, this does not imply that a computer cannot be intelligent; a programmer could tell a computer to learn and evolve.
  4. Considering the previous point, this is a direct comparison and the same reasoning applies: an animal's intelligence may not be constrained solely by its genes, but rather shaped by its experiences and environment.
  5. Unless a law of physics has been misunderstood, animals, humans and machines must all abide by the laws of physics; however, this alone does not define intelligence.
  6. To what extent:
    1. Bar-code scanners utilise perception
    2. Search engines recognise language
    3. Telephone menus recognise language and can "hear"
    4. Dynamic routing algorithms adapt routes automatically, a limited form of machine learning (see the sketch after this list)
  7. Following the principles defined by the Turing test, AI should be classified as both science and engineering: the concepts and programming are scientific, whereas the robotics and hardware are engineering.
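
A minimal sketch of what the "adaptation" in point 6.4 typically amounts to: a distance-vector (Bellman-Ford) style cost update that recomputes routes whenever measured link costs change. The graph, node names and costs below are illustrative assumptions, not part of the tutorial.

```python
# Minimal distance-vector routing sketch (Bellman-Ford relaxation).
# Illustrative only: the graph, node names and link costs are assumptions.

def shortest_costs(links, source):
    """Return the cheapest known cost from `source` to every node."""
    nodes = {n for link in links for n in link[:2]}
    cost = {n: float("inf") for n in nodes}
    cost[source] = 0
    # Relax every link repeatedly until no estimate can improve.
    for _ in range(len(nodes) - 1):
        for u, v, w in links:
            if cost[u] + w < cost[v]:
                cost[v] = cost[u] + w
            if cost[v] + w < cost[u]:  # links treated as bidirectional
                cost[u] = cost[v] + w
    return cost

links = [("A", "B", 1), ("B", "C", 2), ("A", "C", 10)]
print(shortest_costs(links, "A"))  # costs: A=0, B=1, C=3 (via B)

# A link degrades (e.g. congestion); rerunning the same fixed procedure
# "adapts" the routes by reacting to the newly measured costs.
links = [("A", "B", 1), ("B", "C", 20), ("A", "C", 10)]
print(shortest_costs(links, "A"))  # costs: A=0, B=1, C=10 (direct)
```

Whether reacting to measured costs with a fixed update rule counts as machine learning depends on how broadly "learning" is defined, which is exactly the "to what extent" the question asks about.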