Knowledge Mapping and Management

The knowledge revolution has just begun!

After the revolution of the mechanical machine that began in the 18th century, companies turned at the end of the 20th century to the revolution of information. Ever larger masses of data are collected about our economy and the world in general, compiled and processed by computers, and then absorbed by information workers.

Now begins the revolution of knowledge.

In the Sistine Chapel decorated by Michelangelo, God, in his brain-shaped cloud, brings knowledge to Adam.

Distinguishing between data, information, knowledge and cognizance

To clarify the discourse and what follows, while avoiding sinking into eternal philosophical debates on the nature of things, we define the following:

  • Data: any sequence of numbers, letters or graphics, expressible as a series of symbols.
    • Examples: “123”, “Jean Dupont”, “red”, “α² > β − γ”
    • We can also consider sets of data (e.g. a set of first names, …).
  • Meaning: the relation between a datum and an object of the real world or of the world of human ideas. We then speak of the interpretation of a datum. Examples:
    • “Red” is the color of a strawberry, or “red” refers to the (bad) state of a company’s accounts.
    • “Soft” refers to the hardness of an object, or to the character of a person.
    • “Dsklqdsfjdq_RTYAzklfbdjb” has no meaning.
    • etc.
  • Information: a sequence of data whose interpretation by a human has a meaning:
    • Examples: [‘dog’, ‘medor’, ‘bad’], [‘car’, ‘color’, ‘red’].
    • Information can be assembled to produce new information.
    • The processing of data and information can be mechanized (by computer).
  • Fact: a fact is an event in the real world to which one can refer with at least one piece of information. Examples:
    • “The end of the Second World War in 1945”: [‘World War II’, ‘end’, ‘1945’].
    • “The duration of the Earth’s revolution around the Sun is 365 days.”
  • Axioms: these elements belong to the pure world of ideas and are truths admitted a priori, in a given context.
  • Inference and inference rules: inference rules are used to generate inferences from information, axioms, or other inferences. Whereas plane geometry can be deduced from Euclid’s axioms, non-Euclidean geometries can be deduced from the first four axioms alone.
  • Knowledge: a collection of facts, axioms and inferences. Knowledge is coded, and can be collected and processed by computer.
  • Cognizance: the subset of knowledge available to a person or group of people at a given time. Cognizance resides in the brains of individuals; its transmission and appropriation is the challenge of training. Cognizance is linked to cognitive abilities and to cognition, the processes that enable us to think (perception, memory, planning, reasoning, learning, problem solving, etc.).
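The definitions above can be made concrete in a few lines of code. The following Python sketch (purely illustrative; the names and the sample rule are our own assumptions, not part of any standard formalism) represents information as the article’s own triples and shows an inference rule generating new information from existing information:

```python
# Data: bare symbols, with no interpretation attached.
data = ["123", "Jean Dupont", "red"]

# Information: sequences of data whose interpretation has a meaning,
# written here as (subject, attribute, value) triples.
information = [
    ("dog", "name", "medor"),
    ("car", "color", "red"),
]

# Facts: pieces of information referring to events in the real world.
facts = [("World War II", "end", "1945")]

# An inference rule: from existing triples, mechanically generate new ones.
# Illustrative rule (an assumption for this sketch): anything that has a
# color is a visible object.
def infer(triples):
    new = []
    for subject, attribute, value in triples:
        if attribute == "color":
            new.append((subject, "is", "visible object"))
    return new

inferences = infer(information)
print(inferences)  # [('car', 'is', 'visible object')]
```

The knowledge of this toy system is the collection `information + facts + inferences`; the interpretation of the symbols (what “red” or “medor” refers to) remains outside the machine, in the human reader.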

Computers are machines built to process data. Coding cognizance (as knowledge) allows it to be processed by information algorithms, which produce new information and inferences from existing information. In general (following Gödel’s theorem), an algorithm cannot mechanize the processing of the inference rules specific to that algorithm.

The brain does not work like a computer. Its capacity for processing coded information is very limited. The connection to reality and the ability to interpret remain intrinsically human capacities, for the (long…) moment.

A simple way to understand the difference between cognizance and knowledge is to consider a quantum mechanics course (which is quite complex…):

  • One can easily know where the book is, have the book in one’s library, or have leafed through the book to look at a few pages: this is knowledge.
  • Knowing the content of the course, being able to explain its points differently from the presentation that was given, and solving problems using its contents: this is cognizance.

Cognizance: Key to Performance

A business is not a simple exchange of products/services for money. The company exchanges a significant flow of information with its customers (notices, instructions, contracts, various settings, etc.), while internally generating a great deal of information related to the functioning of its organization. The company also transmits various documents (advertising, institutional communication, etc.) to its environment, and absorbs numerous data (interest rates, VAT rates, etc.).

Effective companies have implemented data-processing processes that optimize their operations. However, freezing these processes in a constantly evolving world (legislation, competition, regulation, taxation, raw materials, etc.) leads to a progressive and certain maladjustment.

Processes must therefore be evaluated and adapted continuously, and this is precisely the role of the human being: as we have seen earlier, processes cannot modify themselves (except through marginal parameterization). The adaptation of processes requires the acquisition of new cognizance and the generation of new inference rules.

The paradox and incompatibility of accelerating cognizance processes

Knowledge and information are stored in books and computers, but cognizance is particular to each individual, to his understanding of the world, and to the knowledge and cognizance that have been transmitted to him. We are thus confronted with a paradox and a physical incompatibility: we want to accelerate the management of cognizance, which resides in the brains of individuals, whereas the high-performance tool, the computer, works in another universe, that of data and their transformation.

There is only one solution to speed up the processing of cognizance: to find effective bijections between cognizance and information, and between reasoning and inference rules, in order first to transform cognizance into knowledge, then to compute on this knowledge using computers, and finally to transform the knowledge back into cognizance.
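This three-step round trip can be sketched in code. The toy example below is our own illustration (not the author’s system): a statement is encoded as triples (cognizance → knowledge), a mechanical inference rule computes on them, and the result is decoded back into sentences a person can read and appropriate (knowledge → cognizance). The transitivity rule for `is_a` is an assumption chosen for the sketch:

```python
# Step 1: encode statements as machine-processable triples (knowledge).
knowledge = [
    ("Socrates", "is_a", "human"),
    ("human", "is_a", "mortal"),
]

# Step 2: compute on the knowledge, here with a simple transitivity rule
# for "is_a", applied until no new triple appears (forward chaining).
def transitive_closure(triples):
    result = set(triples)
    changed = True
    while changed:
        changed = False
        for a, r1, b in list(result):
            for c, r2, d in list(result):
                if r1 == r2 == "is_a" and b == c and (a, "is_a", d) not in result:
                    result.add((a, "is_a", d))
                    changed = True
    return result

# Step 3: decode the enriched knowledge back into human-readable sentences.
for subject, _, obj in sorted(transitive_closure(knowledge)):
    print(f"{subject} is a {obj}.")
```

The computer never understands what “mortal” means; it only rewrites symbols. The meaning is restored at step 3, when a human interprets the sentences, which is exactly the asymmetry the paragraph above describes.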

A major difficulty arises: the whole of human cognizance is very vast and very varied!

Notation, the key to capitalization and the transmission of cognizance

Faced with this variety of cognizance, the first point to study is how humans manage to communicate knowledge among themselves. Human communication is an immense subject, which we will restrict in two ways in our analysis:

  • First, by restricting ourselves to the communication of cognizance.
  • Second, by considering the limiting situation, closest to communication with a computer: timeless and non-spatial communication, in which the two individuals can communicate only one-way, through symbols.

The key phrase is therefore “notation system”: a set of symbols and conventions that allow the internal representations of the brain to be reproduced in a more or less faithful bijection. Over time, humans have developed effective notation systems for each type of knowledge and problem. Notations cover stories (books), diagrams in physics, equations in mathematics, musical scores, Benesh notation in dance, etc.

Cognizance notation systems are thus fundamental for capturing, preserving and transmitting knowledge: knowledge is noted in heterogeneous formats linked to the specificity of the information it contains or to the history of its creation.

We have carried out very detailed studies of writing systems and knowledge-transfer mechanisms. We came to the conclusion that non-linear representations (drawings and graphs) are much more effective than the “linear discourse” mode of books for storing knowledge. They are also, above all, much more modern.

The SGH notations offer a non-linear, semi-graphical system for representing knowledge. The associated KM2 methodology shows that this approach is not only effective but also yields significant savings in energy and time in knowledge-management activities, for students, researchers and, more generally, for all professionals in their many daily knowledge-processing and knowledge-management activities.