Basic Knowledge: Logical AI and the Knowledge Challenge
How Much Knowledge Must a Logical AI System Have?
Tau bets on Logical AI, but what is AI, and what does logic bring to it that is special? Today, AI is everywhere, from washing machines to humanoid robots, and just about everyone knows that the main idea behind these AI-based machines is to make them behave intelligently, or as a human would.
To be sure, it is not immediately obvious how a washing machine can behave intelligently, but it can in the sense that, for instance, it will accept 38.7 degrees Celsius as being almost as hot as 40 degrees Celsius without further ado. Note that this is a form of logical reasoning, though not a bivalent or classical one. Humans, too, are quite flexible as far as intelligence is concerned, with different agents doing different things when confronted with the same problems. For instance, humans use different strategies when asked to perform the (usually unpleasant) task of mental arithmetic, and not all chess champions play the game in the same way.
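The washing machine's non-bivalent reasoning can be sketched as a graded truth value: instead of "hot" being simply true or false, it takes a degree between 0 and 1. This is a minimal illustrative sketch; the function name, target temperature, and linear fall-off are assumptions for the example, not a description of any real appliance's firmware:

```python
def hotness(temp_c, target=40.0, tolerance=5.0):
    """Graded (non-bivalent) truth value for 'the water is hot enough':
    1.0 at or above the target temperature, falling off linearly to 0.0
    once the reading is `tolerance` degrees below it."""
    if temp_c >= target:
        return 1.0
    return max(0.0, 1.0 - (target - temp_c) / tolerance)

# 38.7 degrees C is "almost as hot" as 40 degrees C:
# its truth value is close to, but not exactly, 1.
print(round(hotness(38.7), 2))  # 0.74
print(hotness(40.0))            # 1.0
print(hotness(30.0))            # 0.0
```

A classical, bivalent system would have to answer only "hot" or "not hot"; the graded value is what lets the machine accept 38.7 degrees "without further ado".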
But intelligence is not of much help without knowledge. Allen Newell, the very first president of the American Association for Artificial Intelligence (AAAI) and one of the fathers of symbolic AI with a penchant for Logical AI, defined intelligence as our ability to attain goals with the knowledge we have. As early as 1980, in the very first Presidential Address to the AAAI, he defended the view that AI is all about creating knowledge systems, or artificial systems that are characterized by possessing knowledge. This possession would allow these systems to act with everyday goals in view, from being able to fix a car (or finding someone who knows how to do it) to being capable of accounting for someone’s not meeting you at the appointed time and place.
Newell (1982) actually wrote:
To treat a system at the knowledge level is to treat it as having some knowledge and some goals, and believing it will do whatever is within its power to attain its goal, in so far as its knowledge indicates.
Think of how often knowledge is invoked in everyday explanations and expectations:
- “She knows where this restaurant is and said she’d meet me here. I don’t know why she hasn’t arrived.”
- “Sure, he’ll fix it. He knows about cars.”
- “If you know that 2 + 2 = 4, why did you write 5?”
If you think about it, it takes an impressively large number of facts to solve these problems. To solve the first problem above, for instance, a knowledge system would have to know who is being mentioned — a fact like isFemale(brenda) in a logical knowledge-representation language, where “isFemale” is a unary predicate symbol and “brenda” is a constant in its argument, the part between parentheses — who that person actually is (e.g., the binary predicates isYearsOld(brenda,32), isTeacherOf(brenda,physics), and livesIn(brenda,london)), the time and place of the appointment (e.g., the ternary predicate appointmentIs(friday09.09.22,8p.m.,caféHarry’s)), the current time (e.g., timeNowIs(8:30p.m.)), and so on. For the last facts, the system would have to be connected to other knowledge systems (e.g., a clock providing Coordinated Universal Time and a GPS system), which means that this could not be a closed system, but rather one open to the environment and the world at large.

Moreover, the system would have to infer from the above facts, together with isAbsentAtTime(brenda,8:30p.m.), that Brenda is absent (rather than merely “isn’t present”) at the agreed time and place, and this is where logic proper comes in. Is Brenda herself also in possession of the fact appointmentIs(friday09.09.22,8p.m.,caféHarry’s)? Does she know that today is September 9th, 2022? Maybe she didn’t want to go to the meeting? If the system adopts the Open World Assumption (OWA) — as it should, because, as we saw, it is connected to parts of the outer world about which it may have little or no knowledge at all — then it will conclude that Brenda is absent, but it will not know why from the above facts alone.
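The first scenario can be sketched as a tiny fact base of ground atoms. All names here — brenda, the predicate symbols, the helper functions — are illustrative stand-ins for a real knowledge-representation language, not part of any actual Tau implementation:

```python
# A ground atom is represented as a tuple: predicate symbol first, arguments after.
# Every name in this fact base is illustrative, not taken from any real system.
facts = {
    ("isFemale", "brenda"),
    ("isYearsOld", "brenda", 32),
    ("isTeacherOf", "brenda", "physics"),
    ("livesIn", "brenda", "london"),
    ("appointmentIs", "friday09.09.22", "8p.m.", "cafeHarrys"),
    ("timeNowIs", "8:30p.m."),
    ("isAbsentAtTime", "brenda", "8:30p.m."),
}

def holds(*atom):
    """An atom counts as known iff it is in the fact base."""
    return atom in facts

def missed_appointment(person):
    """Combine known facts: there was an appointment at 8 p.m.,
    the current time is past it, and the person is absent now."""
    return (holds("appointmentIs", "friday09.09.22", "8p.m.", "cafeHarrys")
            and holds("timeNowIs", "8:30p.m.")
            and holds("isAbsentAtTime", person, "8:30p.m."))

print(missed_appointment("brenda"))  # True
```

Note that the conclusion follows only because every needed fact is explicitly present; remove any one of them and the system can no longer draw it.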
Of course, in a situation like this a human will use their intelligence to speculate, but a knowledge system usually sticks to the facts alone. Though it may at first appear rather limiting, the OWA is a powerful tool in Logical AI: the system infers only truths from the facts it possesses, and it never infers negated facts. The bad news is that it typically takes a large number of facts to make even obvious inferences. But since we sometimes forget the obvious, Logical AI is a great tool of assurance. As we like to say at Tau, it gives you the assurance of truth.
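The difference between never inferring negated facts (OWA) and denying anything unproven — the Closed World Assumption (CWA) — can be sketched as follows. The fact base and predicate names are again purely illustrative:

```python
# What the system actually knows (illustrative names).
known = {
    ("isAbsentAtTime", "brenda", "8:30p.m."),
}

def query_owa(*atom):
    # Open world: an unsupported query is merely unknown, never false.
    return "true" if atom in known else "unknown"

def query_cwa(*atom):
    # Closed world: whatever cannot be proven is taken to be false.
    return "true" if atom in known else "false"

print(query_owa("isAbsentAtTime", "brenda", "8:30p.m."))  # true
print(query_owa("forgotAppointment", "brenda"))           # unknown
print(query_cwa("forgotAppointment", "brenda"))           # false
```

Under OWA the system can assert that Brenda is absent, but it refuses to speculate about why: the absence of a fact like forgotAppointment(brenda) yields "unknown", not a negation.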
Newell, A. (1982). The knowledge level. Artificial Intelligence, 18(1), 87–127.