An internal regularity of an information system is often referred to as a goal of the system. When we say a system has a goal, we usually mean that there are changes in the system that cannot be fully explained as being caused by changes in the environment, since the regularity can be observed across various states of the environment.
When the system goes through state changes to achieve its goals, some basic state changes can be considered the system's actions, that is, what the system can do, internally or externally. In a computer, the primary actions are defined by its instruction set; in a human, by one's physical and mental competence.
Relations among goals and actions are abstracted as the knowledge of the system. In the simplest form, if the appearance of a goal usually triggers the execution of an action, we can say that the system has the knowledge that the action will achieve the goal.
For information systems in general, goals, actions, and knowledge are all described abstractly, without being bound to concrete underlying processes described in a discipline-specific way. Furthermore, the system does not necessarily have any degree of self-awareness of its goals, actions, and knowledge. Instead, they are concepts adopted by an observer as a natural way to describe the system.
A goal can be either a state or a group of states that the system attempts to reach or stay in, or a "direction" in state change, meaning that we can define a numerical function on states whose value the system attempts to minimize or maximize. We can call them "reachable" and "unreachable" goals, respectively.
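To make the distinction concrete, here is a minimal Python sketch (all names are illustrative, not drawn from any existing system) that represents a reachable goal as a predicate over states and an unreachable goal as a numerical function the system tries to maximize:

```python
from dataclasses import dataclass
from typing import Callable

State = dict  # a state as a simple attribute mapping, for illustration

@dataclass
class ReachableGoal:
    """A goal satisfied by reaching (or staying in) a set of states."""
    satisfied: Callable[[State], bool]

@dataclass
class UnreachableGoal:
    """A goal that is a direction of change: a value to keep increasing."""
    utility: Callable[[State], float]  # the system tries to maximize this

# A reachable goal ("the door is closed") versus an unreachable one
# ("keep the battery level as high as possible"):
door_closed = ReachableGoal(lambda s: s.get("door") == "closed")
charge = UnreachableGoal(lambda s: s.get("battery", 0.0))
```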
It is not by accident that "information system" and "teleological system" happen to refer to the same group of systems. Often a goal is specified abstractly, and the system may have multiple ways to achieve or maintain the goal by changing itself, the environment, or both. Describing such a system as an information system can greatly reduce the complexity of the description without losing the essence of the process. For similar considerations, Wiener (1948) took cybernetics to denote the study of "teleological mechanisms".
In general, at any moment a system can have multiple goals, and its goals may change over time. A system has initial goals that are built in during its formation. In biological information systems, such as human beings and some animals, the initial goals are survival and reproduction, which were formed in evolution. In artificial information systems, such as computers and control systems, the initial goals are certain human needs that the system is designed to satisfy.
Given the general nature of the initial goals, usually they cannot be achieved directly. Instead, they need to be further specified and extended into more concrete subgoals, or derived goals, and this process may repeat multiple times before reaching goals that can be directly satisfied by an action, such as "pick up the apple on the table".
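The derivation process can be pictured as a recursion that bottoms out at directly executable actions. The following sketch assumes a deliberately simplified knowledge store that maps each goal either to an action or to a list of more concrete subgoals; everything here is hypothetical:

```python
def achieve(goal, knowledge, actions):
    """Recursively refine a goal into subgoals until every branch
    ends in a directly executable action."""
    plan = knowledge[goal]
    if isinstance(plan, str) and plan in actions:
        return [plan]                 # base case: a direct action
    steps = []
    for subgoal in plan:              # otherwise, a list of derived subgoals
        steps.extend(achieve(subgoal, knowledge, actions))
    return steps

actions = {"move-hand", "close-hand"}
knowledge = {
    "pick up the apple on the table": ["reach the apple", "grasp the apple"],
    "reach the apple": "move-hand",
    "grasp the apple": "close-hand",
}
print(achieve("pick up the apple on the table", knowledge, actions))
# ['move-hand', 'close-hand']
```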
The system's goals may change over time for several reasons. For biological systems, some goals are driven by underlying biological mechanisms, and any satisfaction of them lasts only for a limited time, so the system has to deal with them again and again. Some new goals are triggered by changes in the environment, and others are derived from existing goals. Therefore, in general the system's activities are guided by a group of goals, and this group changes over time.
When the system has multiple goals, they are not necessarily consistent with each other, and furthermore, they usually compete for the system's resources. The system usually has some control mechanism to handle the various types of conflict and competition among its goals. Consequently, at a given moment the system's activities may be determined by only some, but not all, of its existing goals.
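One such control mechanism, sketched here purely as an illustration (the priority and cost numbers are made up), is to rank the current goals and admit them until the available processing capacity runs out, so that only a subset of goals drives the activity at any moment:

```python
def select_active_goals(goals, capacity):
    """Admit goals in priority order until capacity is exhausted;
    the goals left out do not affect the current activity."""
    active, used = [], 0.0
    for goal in sorted(goals, key=lambda g: g["priority"], reverse=True):
        if used + goal["cost"] <= capacity:
            active.append(goal["name"])
            used += goal["cost"]
    return active

goals = [
    {"name": "recharge", "priority": 0.9, "cost": 0.5},
    {"name": "explore",  "priority": 0.6, "cost": 0.7},
    {"name": "report",   "priority": 0.4, "cost": 0.2},
]
print(select_active_goals(goals, capacity=1.0))  # ['recharge', 'report']
```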
Like goals, the initial actions available to a system are usually determined by the system's heritage or design. In a biological system, there is usually a growth period in which the system's actions develop in scope and complexity. However, for a mature system, a new action typically arises only as a combination of existing actions into a compound action.
Even with the same basic action set, the system's actual competence can be amplified by using "tools", which are other systems (not necessarily information systems) and objects in the environment. For example, the same action of pressing a button can achieve many different results, depending on where the button is.
Usually there are basic actions whose internal structure and process cannot be further analyzed within the current information system framework, because they must be specified in the language of physics, chemistry, biology, etc. As far as the information system is concerned, it is usually enough to study the conditions and consequences of an action, specified as state changes internal and external to the system.
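The text leaves the exact description of actions open; as one concrete possibility, an action can be recorded purely by its conditions and consequences in the familiar precondition/effect style of STRIPS-like planners, treating its internal process as a black box. A hedged sketch:

```python
from dataclasses import dataclass, field

@dataclass
class Action:
    """An action described only by its conditions (preconditions) and
    consequences (added and deleted state facts)."""
    name: str
    preconditions: set = field(default_factory=set)
    effects_add: set = field(default_factory=set)
    effects_del: set = field(default_factory=set)

    def applicable(self, state: set) -> bool:
        return self.preconditions <= state

    def apply(self, state: set) -> set:
        return (state - self.effects_del) | self.effects_add

press = Action("press-button",
               preconditions={"at-button"},
               effects_add={"button-pressed"})
state = {"at-button"}
if press.applicable(state):
    state = press.apply(state)
print(state)  # {'at-button', 'button-pressed'}
```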
The primary function of knowledge is to link the system's actions to its goals, so that the system can use the former to achieve the latter. In the simplest form, a piece of knowledge directly links a goal to an action that can be used to achieve the goal.
Knowledge can describe the relations between compound goals or actions and their components, showing how a compound is composed of its components; this composing relation can be recursive.
Knowledge can describe the conditions and consequences of an action.
Knowledge can describe the relations among other pieces of knowledge.
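These kinds of knowledge can be pictured as entries in a single store. The toy representation below (all relation names and items are invented for illustration) holds one example of each kind:

```python
# A toy knowledge base with one example of each kind of relation
# described above (all names are illustrative):
knowledge = [
    # a goal directly linked to an action that can achieve it
    ("achieves", "flip-switch", "light-on"),
    # a compound action composed of components (possibly recursive)
    ("composed-of", "make-tea", ["boil-water", "steep-leaves"]),
    # the conditions and consequences of an action
    ("requires", "flip-switch", "at-switch"),
    ("results-in", "flip-switch", "switch-flipped"),
    # a relation among other pieces of knowledge, e.g. one link being
    # judged more reliable than another
    ("more-reliable-than",
     ("achieves", "flip-switch", "light-on"),
     ("achieves", "clap-hands", "light-on")),
]
```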
Altogether, knowledge provides direct and indirect pathways for the system's goals to be achieved by executing available actions. This view differs from the popular opinion that knowledge is the system's internal representation of the environment, or a "model of the world".
These pathways determine the system's actual capability, which should be distinguished from its potential capability, defined by the space of possible states that can be reached through all valid combinations of actions. Though in some systems some of this potential can become reality, at the moment it is still beyond the system's reach, due to the lack of the necessary knowledge.
Another restricting factor is the system's resource supply. Here "resources" mainly include the system's processing time and storage space. Since every concrete system at a given moment has a fixed information processing capacity, and every practical problem has a time requirement attached, a "solution" that demands more resources than the system can afford is not a valid solution, and therefore is not included in the system's capability.
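In other words, capability is bounded by a budget check of roughly the following shape (the parameter names and numbers are hypothetical):

```python
def within_capability(time_needed, space_needed, deadline, memory):
    """A candidate solution counts toward the system's capability only
    if it fits both the time and the space the system can afford."""
    return time_needed <= deadline and space_needed <= memory

# A solution that is correct in principle but misses its deadline is
# not a valid solution for this system:
print(within_capability(time_needed=2.0, space_needed=128,
                        deadline=1.5, memory=256))  # False
```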