Abstract

This paper presents a formalized communicating process for dealing with information asymmetry between agents. A proactive process can improve the efficiency of dealing with asymmetry by allowing agents to take the initiative of communication in a goal-oriented way. In the process, by reasoning on belief and intention about the world and figuring out the information needed, the agent proactively requests information from another agent when asymmetry exists between them. Considering that agents may take advantage of information asymmetry by hiding information, the process also includes a model based on game theory to restrict the hiding behaviour. The work presented here not only introduces a definition of information asymmetry from cognitive perspective but also proposes a way to deal with it by communication in MAS. In addition, this paper presents some basic ideas on designing proactive mechanisms in cooperation between agents.

1. Introduction

Information asymmetry exists when a party or parties possess greater informational awareness relative to other participating parties, and this information is pertinent to effective participation in a given situation [1]. In Multiagent System (MAS), agents represent entities with different interests, so information asymmetry could bring benefits to some agents in team work and poor results for some other agents. This paper presents a formalized communicating process, where agents can deal with information asymmetry by reasoning on their knowledge about the world and figure out the information needed when facing asymmetry.

To deal with issues caused by information asymmetry, probability and statistical mechanisms are usually employed [2]. Such mechanisms usually relay on history of interaction between agents. But in some situations, like at beginning of the interaction, such historical information is unavailable. From the cognitive view, if an agent can take proactive action to figure out what information is lacking in cooperation, the agent can request that information directly from agents who possess such information, and it can also take strategies to restrict information hiding (even cheating) by considering the context. Then information asymmetry can be solved in a proactive manner.

Proactive behaviour is considered as one of the key characteristics in software agents. Proactivity usually refers to the ability of agent to make conscious decision without being told to [3, 4], which means that agents will take actions to help each other according to some common goals without being instructed. On such willing to help assumption, researches for proactive behaviour usually ignore the issue of information asymmetry, such as share plan or joint intention [510]. However, we can treat “eliminating information asymmetry” as a common goal in teamwork and the communication process can then be modelled from the cognitive view.

Researches have also been conducted on proactive behaviour in human organisations including in the area of feedback seeking and issue selling [1123]. These researches show that proactive communication between people is helpful to resolve the information asymmetry. From the view of information economics [2], researchers also have proposed several models to analyse the problem of how to get optimum contract under information asymmetry. In this paper, we intend to combine the work about proactive behaviour modelling in MAS, proactive communication, and game theory together and then to provide an efficient way of dealing with information asymmetry between agents in teamwork.

The work described here firstly introduces a formalized description about the communication process of dealing with information asymmetry from the cognitive point of view. Secondly, by combining a game theory based model with the communication process, information hiding is restricted according to context. Finally, the work proposed here provides some basic ideas of designing proactive communication process between agents. In a scenario of information asymmetry, the agent that needs information takes the initiative to identify the information and requests information from the agent that owns it. Such a proactive manner can be used to deal with some other problems in communication, such as trust establishment. During trust establishment the trustor can proactively collect information from the trustee, rather than just waiting to observe the behaviour of the trustee or waiting for information from the third party.

2. The Proactive Communicating Process for Dealing with Information Asymmetry

To facilitate the next discussion, a simple scenario of information asymmetry is introduced first. During the development of software, requirements of a customer change over time. Generally speaking, a customer is usually not familiar with the technologies, while the developer is not familiar with business requirements. Thus information asymmetry exists between the customer and the developer. The developer may take the advantage of information asymmetry to refuse a new requirement in order to gain unreasonable benefits.

In this section, the formal description of communication for dealing with information asymmetry in a proactive manner is presented. Information asymmetry and related processes are expressed with mental attitudes of agents. These mental attitudes are described with modal operators like Bel, Int. To/Th, Attempt, Inform, and Request, which are proposed in Joint intention, SharedPlans and the work of proactive information exchange [38, 2426].

Sometimes information asymmetry exists in some scenarios which participants do not even realise. The process presented here focuses on how to deal with information asymmetry in a proactive manner. So we assume that communicating participants realise that information asymmetry exists in their cooperation. In the previous scenario, the customer should consider how to deal with information asymmetry proactively, in order to add the new requirement without paying unreasonable cost. Meanwhile the developer should consider how to deal with request of customer and take advantage of information asymmetry.

2.1. The Definition Information Asymmetry in the Communication Process

First different roles that two agents play in asymmetry are deserved to be discussed. Without special statement, this paper uses to present the agent that owns information and to present the agent short of information. Here information asymmetry means that for certain proposition , there is an agent that does not believe being true or false, and meanwhile there is another agent that does believe being true or false, or forms such belief by reasoning on its mental attitudes and knowledge base. Suppose that prop () and prop () are the set of propositions that and own in their mental attitudes and knowledge base, respectively, and Rules is the set of rules of two agents, and rules are all written with horn clauses. Then we define Rules () as the set of propositions appeared in the rules like  or.

We assume that in cooperation needs to form the belief about proposition based on ’s belief on . Here is the agent short of information and is the agent with information. However, needs to get some information from to complete the reasoning process of forming belief about . If cannot provide all information needs, but needs to provide some information to help get information needed by , for these parts of information needed by , there is role exchange between and . then becomes the agent short of information and becomes the agent with information. Such a process can be described as in Figure 1.

Now we introduce the presentation of information asymmetry in the communication process. First, two participants of communication should be included in the representation of information asymmetry, as well as the role of each participant: who needs information and who provides.

Second, information asymmetry relates certain propositions which form intentions, beliefs, and other mental attitudes of agents. For instance, in the software scenario, the proposition in the representation of information asymmetry should be “the customer intends the developer to implement a new requirement.”

Third, information asymmetry exists under certain context. For instance, in the scenario, if the developer and the customer belong to the same company and the developer is subordinate of the customer, the developer should tell the customer the feasibility of a new requirement. Then it is not necessary to deal with information asymmetry in such situation.

With the previous discussion, information asymmetry can be presented as the following:,,.

In the definition, and are agents involved in information asymmetry; _Role and _Role are the roles that two agents take in the asymmetry. For the sake of convenience, we use poor to represent 0, and rich to present 1.

inputVar (inputVar′) is the set of input variables of . outputVar (outputVar′) is the set of output variables of . is the context constraints of information asymmetry. The semantics of AsymInfo operator is illustrated with the following axiom.

Axiom 1. , ,. ,,.

The axiom says that at current time , if (or ) believes that information asymmetry about proposition exists between it and (or ), there must exist some proposition in Rules() (of each agent’s own) that believes being true but does not believe that believes being true, or believes being false but does not believe that believes being false. Table 1 lists the references about the Bel and MB operators. With this axiom, it can be assumed that has ability to provide with information related with asymmetry.

In the following section, the formal description of communication process is presented in detail, and many problems will be discussed. The notations to be used are listed in Table 1.

2.2. The Communication Process from the General View

In modern control theory, state space equation is a common tool to model and analyze dynamic characteristics of systems. A state space equation can be expressed as

The equation is composed of following components: a set of state variables to describe behaviours of system and a set of input variables and a set of output variables. These variables make up the state equation and the output equation. Consider that agents communicate with each other to deal with information asymmetry. Each agent has its own internal states composed by its mental attitudes. An agent sends some variables to another agent and requests for answers. These variables indicate what information needs. The target agent receives the variables and finally gives out answers with reasoning process. This situation is similar to state space equation discussed in control theory.

Based on the previous analysis, the process of dealing with information asymmetry can be described with following state space equations

Here, poor and rich in subscript of the equation represent two agents involved in asymmetry, with different roles. is a set which includes mental attitudes of an agent, like beliefs and intentions. and are sets of input and output variables, which correspond to inputVar and outputVar in the operator of AsymInfo. Here and mean that the states of agent are updated for the next round of communication. means that after poor gets the output variables from rich, input variables of poor are updated. In the equation, , and correspond to the reasoning process as follows.: For the agent that is short of information, represents the establishment of reasoning process for dealing with information asymmetry and identifying input variables in . : For the agent that is short of information, represents the process of identifying output variables in with input variables and internal state. : For the agent that is with information, represents the process of updating ’s internal states after receives input variables from : For the agent that is with information, represents the establishment of reasoning process for finding true values for input variables in : In the process of , represents the process of hiding information in the (such a process can be included in the process of . Here we use a separate operator in order to emphasize such a process).

Equation (2) shows that in communication process, input and output variables define what information needs to be exchanged between two agents and they are closely related to asymmetry between two agents. Definition 1 gives out the formal definition of input and output variables of both agents.

Definition 1. For agent that is short of information, input variable is defined as,;for agent that is short of information, output variable is defined as,,;for agent that is with information, output variable is defined as,,,.

Then input variables of are related to what information wants from and are defined with ’s belief about ’s belief about a proposition prop. Input variable (prop, unknown) means that does not believe that believes prop is true and also does not believe believes prop is false. Output variables of are some beliefs of its own that wants to tell in communication. These variables may help to get information that needs and they are helpful to avoid to request them from again.

As for input variables of , they are elements in union set of input and output variables sending by . For output variables of , they are beliefs of which are requested by as what defined as input variables. Output variable (prop, true) of means that believes prop is true, and output variable (prop, false) means that believes prop is false. Output variable (prop, unknown) means that does not believe prop is true or false.

The whole process is presented by Figure 2. First, agent short of information () finds out that information asymmetry would bring negative influence on cooperation between itself and agent with information (). First identifies some input variables related to information asymmetry. In the process of identifying input variables, a process of reasoning is established (the tree in second part of the Figure 2). The reasoning process and related mental attitudes of agent correspond to and in (2). At the same time, the output variables () with initial value are also identified (with input variables included). Then output variables are sent to to begin a communicating process to deal with information asymmetry (Rule 2 in Figure 2).

After output variables received, begins to construct its own state space. The input and the output variables of are constrained by the output and input variables of (as shown in (2)). Mental attitudes of are updated with input variables at first. In order to get the true_values of variables in , will start a reasoning process (the tree in third part of the Figure 3). After true_values of variables are gotten, puts these variables into the set of output variables and hides information as needed. Then output variables of are sent to (CommuResponse in the Figure 2), and analyzes whether true values of variables are hidden and updates its mental attitudes.

The information hiding during communication is constrained by the method based on game theory. can have a set of strategies by defining to hide a set of variable in its output. After gets inputVar from (output variables of ), it can have a set of strategies by judging each variable as hidden or not. can also define payoffs for strategies. If proper mechanisms for the game are designed, and can get equilibrium about their game on information hiding.

2.3. The Communication Process for Dealing with Information Asymmetry in Detail
2.3.1. Start of the Process

To facilitate introduction in following section, formal definitions of modal operators Attempt, Inform, and Request are listed as follows [3]. The semantics of Inform and Request are given by choosing appropriate formulas to substitute in the definition of Attempt [3, 28]. Related operators and predications are listed in Table 1. Here we use “=” to present “defined as.”

Definition 2. , .
?; , , , .
?; , , , .

In the definition of , represents some ultimate goal that may or may not be achieved by the attempt and represents what it takes to make an honest effort. The definition of says that at current time , wants to believe that is true with event before time . The definition of says that at the current time , wants to execute with event before time .

With these definitions, the process of dealing with information asymmetry will be discussed in detail. The first question is how to start and who will initiate the process of communication. According to Axiom 1 in Section 2.1, two agents believe that information asymmetry about proposition exists between them. If both of them have reached an agreement on true value of , it may be unnecessary to deal with asymmetry between them. They just need to act as what they both agree. But when some conflicts about appear as Axiom 1 suggests that inconsistency between both agents’ belief exists, both agents should consider finding out the relation between conflicts and information asymmetry. And the process of dealing asymmetry should be taken into consideration.

Another question is who will initiate the process. It seems that both the agent short of information and the agent with information may be aware of conflicts caused by asymmetry, and both of them can start the process of dealing. But the agent who initiates the process should identify input variables, which constrains the choices of output variables of another agent and the reasoning process of both agents. This paper assumes that agent short of information () should initiate the process, as it can find out propositions whose true values it is not aware of in reasoning process. These propositions are candidates of input variables. When information asymmetry on exists between and , we define a rule to make sure that the agent with information () will inform about a contradiction of beliefs or intentions on proposition between it and .

Rule 1. , , , ;, .

Suppose that at time information asymmetry about proposition exists between and . Rule 1 includes two situations. First, at time if believes that intends that will believe is true at time , but at time ,   believes that it won’t believe is true at time , should intend to inform that will not believe is true at time . Second, if believes intends that will intend to do at some time before under context , but believes that it won’t intends to do under context at any time before , should intend to inform such belief. Inform should be finished before . is certain time after , and it can be defined by according to the requirement of concrete scenario.

In Rule 1 we use beliefs about and , because such beliefs can be gotten when informs about its intentions, but it is hard for to get beliefs like directly.

As described in part 1 of Figure 2, if uses Rule 1 to inform about some conflicts, can be aware of conflicts between and . Here assumption 1 is defined to show how forms belief of conflict between it and .

Assumption 3. believes that there exist conflicts between its intention that should perform some action or intend some proposition to hold and ’s unwillingness of performing or intending be hold as, , where, , , , ,.

prop1 stands for that intends proposition “ believes being hold at time ” being hold. prop2 stands for that intends proposition “at some time ” before ,  and believes that does not believe that holds at time ”. prop3 stands for that intends that “ intends to do at time ” being hold. prop4 stands for that intends proposition “at some time ” before , and believes that believes that will not intend to do at any time before ” being hold. Meta-predicate represents situations in which actions or propositions conflict with each other [6]. Function constr () denotes the constraints components of the context [6].

Theorem 4. In Rule 1, if ’s belief about ’s intention is consistent with ’s intention, successful performance of Inform in Rule 1 will make believe that conflicts exist between it and .

Proof. As ’s belief about ’s intention is consistent with ’s intention, in Rule 1, if or, holds at current time , also has beliefs as follows: or.Then at some time , and, hold (time is decided by as necessary).
With Rule 1, forms the intention of Inform. If the performance of Inform is successful, with Definition 2 there exists some time that and reach a mutual belief like, or.With Assumption 3, gets a conflict between prop1 and prop3, or prop2 and prop4.

After is aware of conflicts, should consider initiating a process of dealing with information asymmetry in the following situations, as we define in Rule 2.

Rule 2. , , , , ,;, , , , , , ,;, , , , , , ,;, , , , , , , ,.

Here, Int.Tx stands for Int.Th or Int.To. Rule 2 says that at time , suppose that believes that at time ,   intends that will believe that some proposition is true or will intend to do . At this time also believes that some proposition is true or it will intend to do . However, conflicts between and , as well as information asymmetry, exist between and in ’s opinion. Then at time , should form potential intention to execute CommuAct at time under context . includes and ’s belief about conflict.

Here we use potential intention because should reconcile intention on CommuAct with other intentions that has already adopted. We define CommuAct as follows.

Definition 5. , ; ;, ,.

At time , execution of CommuAct means that before time , executes action ConstructSpaceP. If ConstructSpaceP is done successfully, requests to execute CommuResponse at and makes response before time . ConstructSpaceP is responsible for establishing reasoning process for dealing with information asymmetry. This definition will be discussed in detail later.

2.3.2. Dealing Process

In Definition 5, CommuAct is executed at time defined as an act like this: before agent executes ConstructSpaceP, which constructs state space according to the proposition and identifies input and output variables. If ConstructSpaceP is executed successfully, requests to execute CommuResponse at time , which should be finished before . inputVar is a set of input variables of , and outputVar is a set of output variables of . outputVar will be sent to with action . inputVar′ and outputVar′ are input and output variables of . says that when sends outputVar to with action Request, inputVar′ should include outputVar; when sends outputVar′ to with action CommuResponse, outputVar′ should include input variables. Request and CommuResponse can be implemented with Agent Communication Language.

First input and output variables need further discussion. As what we have discussed earlier, will hide information when it uses CommuResponse to send outputVar′ to . Then gets true value of , true or false, and information hiding can be defined as the following:,.Here we do not consider cheating between two agents, and we define cheating as follows:,.So we define an assumption about information hiding as follows.

Assumption 6. ,,, , ;, , ;,,, , ;,, , .

The assumption says that after received the set of output variables, for each variable var in outputVar, if true_value of var is true (or false) and believes that no conflict will appear between ) and other propositions that believes being true or intends to (or intends that being true), will believe believes that is true (or false).

According to Definition 1, for each output variable (prop, true_value) in outputVar which is sent by to , (prop, true) stands for and (prop, false) stands for . For each output variable (prop, true_value) in outputVar′ which is sent by to , (prop, true) stands for and (prop, false) stands for . Take first part of Assumption 6 as example, it says that after receives output variables from , for each output variable (prop, true) believes that its beliefs have no conflict with chooses to believe that . In other words, holds. Here and stand for context constraints related with and , respectively. As for the situation that of var is unknown, it gets involved with information hiding, which will be discussed later.

Then the process of CommuAct will be presented in detail. According to Rule 2, a process of dealing with information asymmetry is initiated because conflicts occur between and . Such conflicts happen because in the process of deducing , beliefs and intentions of both agents have conflicts. Consider that has more information related to . Before gets the information that it needs, should establish reasoning trees about with its own mental attitudes and rules and find out beliefs and intentions in the tree that considers inconsistent with . These beliefs and intentions are candidates of input variables. And these reasoning trees are state spaces of for the process of dealing with information asymmetry. They also correspond to of poor in (2) (in Section 2.2). can also choose some beliefs and intentions from reasoning trees as output variables. In ’s opinion, these variables can help to get true values of input variables.

We assume that rules of and are written with horn clause. That is to say rules follow the schema like or . Then the reasoning tree can be presented with Figure 3. In a reasoning tree, each node is a proposition of the . Suppose that information asymmetry that is related to proposition prop exists between and . In the rule and , prop should be a belief of . For rule , prop is the root of the tree, and each is a child node of prop. For rule ,   is the root of the tree, and each is child node. Every node in the tree, except the leaf nodes, will have a rule like , and each proposition in the left part of the rule will be the child node of o. prop may have many reasoning trees at one time.

In the reasoning process presented in the previous tree, for the prop in the information asymmetry, some propositions in the tree of prop may be inconsistent with beliefs or intentions of . And the inconsistency of these propositions may hinder and in getting consistent result of prop. So in ’s opinion, ’s beliefs about these propositions are what needs. These propositions are candidates of input variables.

Assume that a conflict of proposition prop appears between and . intends that will believe at time , while believes that it won’t believe at . If has some rules like or , such rules can be used to establish the reasoning tree of the process of dealing with information asymmetry. For each in the rules, can also find out rules like and add them into the reasoning tree. Repeating with such a recursion process until no more rules added, the state space of the process of dealing with asymmetry is established. However, there may be rules like for in the tree. Although such rules do not appear in the reasoning tree, propositions in these rules can also be considered as input variables. can also choose some propositions, or even rules in the reasoning tree as output variables. Such a process can be implemented with backward chaining algorithm [29], and this paper takes ConstructSpace as a basic action here.

In the part 2 of Figure 2, if action CommuAct is successful, will be aware of the input variables of with Request in CommuAct.

Assumption 7. As propositions in the previous reasoning tree, propositions that fulfill the following conditions are taken as input variables and should be put into inputVar of :,.

The assumption says that for proposition in Rules(), believes that does not believe believes is true and also does not believe that believes is false, and then believes variable var = (, unknown) is input variable.

With ConstructSpaceP, will have input and output variables in inputVar and outputVar, respectively, with the state space being established. Then will request to execute act, CommuResponse, and expect for reply for each input variables.

Axiom 2. gets output variables of when believes that intends to execute CommuResponse:, , , = .

Axiom 2 says that when believes that intends that “at certain time before ,   intends to execute CommuResponse at time ,” gets input variables in inputVar and puts them into inputVar′. According to Rule 2, requests to execute CommuResponse and sends input variables with the request. If ’s request is received by successfully, is aware of ’s intention of expecting to intend to execute CommuResponse.

Theorem 8. In Rule 2, successful performance of CommuAct will make get the input variables of .

Proof. If the performance of CommuAct in Rule 2 is successful, as earlier introduced, the state space of dealing with information asymmetry is set up. With Assumption 7, gets input variables and puts them into inputVar.
As CommuAct is successfully accomplished, Request in CommuAct is also successful. According to the definition of Request, if is success, there exists a time such that holds, where, . Then according to the , .
In Definition 5, in the is actually CommuResponse, and and can get the following mutual beliefs at some time :, , , = ,.With Axiom 2, finally gets input variables in inputVar and put them into inputVar′.

The mutual belief means that after request and both believe that at some time before , wants “ intends to execute CommuResponse at some time ” to hold before . The definition of CommuResponse is shown as Definition 9.

Definition 9. , ,, ,, .

Executing CommuResponse at time means that for each output variable var in outputVar′, informs about its belief in the proposition in var before time under context constraint . As for variables with true_value being unknown, after outputVar′ are sent to , they are still with true_value being unknown.

Before executes CommuResponse, there is still some work which needs to be done, including constructing a process of reasoning for input variables in inputVar′ and finding out true value for each input variable. Similar to , an act also needs to be defined for (Definition 10).

Definition 10. , ; ;, .

Definition 10 says that before agent constructs its state space for the process of dealing with information asymmetry with action ConstructSpaceR, then if ConstructSpaceR is successful, will execute CommuResponse at time , which should be finished before . input and output are sets of input and output variables of , respectively. The action ConstructSpaceR has some similarity to ConstructSpaceP and will be discussed later.

Rule 3 requires to form a potential intention on CommuRes after the request about CommuRes received from .

Rule 3. , , , , , , .

The rule says that at current time believes that some time earlier (at time ), intended that at time would intend to execute CommuResponse at time , and also believes that information asymmetry exists between itself and . Then should have a potential intention to execute action CommuRes at time . This time is a certain time after . It is defined by according to the concrete scenario.

If the Request for CommuResponse is successful, will have belief at some time as follows:,,, .

Then if believes information asymmetry about exists between and , should have a potential intention on CommuRes.

Similar to action ConstructSpaceP of , uses action ConstructSpaceR in Definition 9 to establish the state space for the process of dealing with information asymmetry. ConstructSpaceR is mainly responsible for the following tasks: updates the mental attitudes with input variables in input, gets true values for each input variables, and puts these input variables in to output after the true_value of each input variables is got by reasoning. It is also responsible for deciding strategies of hiding for output variables.

First, input variables in inputVar′ from contain beliefs of about the proposition in each variables. ConstructSpaceR will update mental attitudes of with such beliefs. With Assumption 6, for each input variable (that is output variable of ) with its true_value being true (or false), believes that believes the proposition in the input variable being true (or false). In the process that updates its own mental attitudes with input variables, the problem of belief revision is involved. Many works have been done on this problem, and some algorithms have been proposed [3033]. As belief revision is a complex topic, it will be discussed in our future work. ConstructSpaceR is also responsible for judging whether hides information in CommuAct. However, since is the agent short of information, it seems that is short of motivation to hide information in the process of requiring information from . For simplicity, this paper does not discuss the situation that hides information in communication with .

Second, after belief revision finished, finds out the true value for each input variable. The process is similar to ConstructSpaceP; for variables whose true values can not be gotten directly from beliefs of , the reasoning tree for each variable is established to see whether the true value of variables can be gotten from ’s own mental attitudes and rules. Backward chaining algorithm can also be employed here. For those variables which can not be gotten by the reasoning process, their true_value are set to be unknown. Then these input variables are put into outputVar′, waiting for sending back to .

In parts 3 and 4 of Figure 2, if action CommuRes is successful, will get responses of the input variables from .

Axiom 3. When believes that intends to believe ’s belief about the true value of input variables, puts such beliefs in inputVar as input variables with true value:,, , .

The first part says that if believes that intends that “at some time , believes that believes ” holds, and (, unknown) belongs to inputVar, updates (, unknown) to (, true) in inputVar. The second part says that if believes that intends that “at some time believes that believes ” holds, and (, unknown) belongs to inputVar, updates (, unknown) to (, true) in inputVar.

Theorem 11. In Rule 3, successful performance of CommuRes will make get the output variables of .

Proof. According to Rule 3, forms intention on CommuRes when the proposition, , holds. Then with Axiom 2, gets input variables of and puts them into inputVar′.
With Assumption 6, the input variables in inputVar′ are changed to beliefs of . And with necessary process of beliefs revision, updates its mental attitudes. At the same time the state space of dealing with information asymmetry is set up.
As CommuRes is successfully accomplished, ConstructSpaceR and CommResponse are successful. As ConstructSpaceR is successful, gets true values of input variables in inputVar′.
According to Definition 2, if Inform is successful, the following proposition holds:, .
Then if Inform in CommuResponse is successful, and can get the following mutual beliefs at some time :.
With Axiom 3, for each input variables in inputVar′, updates true values of input variables with input variables in inputVar′.

We leave one step in CommuRes for further discussion. Before sending inputVar′ to , will consider the hiding strategies for these variables and a game between and will begin. Before such a step, we should notice that after receives inputVar′ from , the true value of some variables may be “unknown.” Then can also choose to establish a reasoning process about these variables with some other rules rather than used in current process of establishing state space. It means that establishes a new state space for variables and can also start a communicating process as aforementioned. Such a situation is much similar with decomposition in Hierarchical Task Network (HTN) [29]. At first the process of dealing with information asymmetry can be regarded as some high level tasks. Then receives input variables from and decides to set up new state space for certain variables and start a communicating process. This new process can be regarded as the high level task decomposing to low level task.

2.3.3. Game in the Process of Dealing with Information Asymmetry in a Proactive Manner

After the second step of ConstructSpaceR, has output variables in the set outputVar′. As information asymmetry exists between and , will consider taking advantage of asymmetry by hiding true values of output variables. Suppose that elements in outputVar′ are as follows:.

According to the strategy of hiding introduced in Section 2.2, can adopt the following strategies for the outputVar′:, , , , , , , , , , .

The decision of on whether to hide information depends on the result of game on information hiding between and . With Assumption 6, after receives outputVar′ (that is input variables of ) from , for each variable with true_value being true (or false), believes that believes the proposition in the input variable being true (or false). And these variables will be directly used to revise beliefs of . But for the variables with true_value to be unknown, will wonder whether the true_values of these variables are hidden, or really has no idea about the true_values of these variables. If considers some true values of variables are hidden, will punish in order to force to eliminate hiding behaviours and promote cooperation. In such situation, game theory is employed by both and to make decisions in the game. Here, first decides how to hide information, and then analyzes what hiding strategy may be adopted by , so it is reasonable for and to use dynamic game model [34] to support their decision. Besides, according to whether and are willing to publish payoff in the game, and can choose to use complete or incomplete information game model [34]. Here we use dynamic game with complete information.

As the previous example, ’s complete strategy set is {strategy1, strategy2, strategy3, strategy4}, and correspondingly ’s strategy set is , , ,  . If adopts strategy4 and then gets inputVar′ like , can possibly adopt strategies like or . Generally speaking, the game between and can be presented with the game tree in Figure 4.

In Figure 4, the dotted line presents information set with many nodes [34], which means that knows its turn to make decision, but it does not know which node in the set it resides; in other words, it does not know the decision of . At the terminal nodes [34] of the tree, the payoffs for each strategy are labelled with (payoff of ) and (payoffs of ). Such payoffs depend on the concrete scenario. And they influence result of the game.

First, should take the punishment from into consideration, as may find out that hides information in the process of dealing with asymmetry. Second, should consider the possibility that will act as what expects if hides some variables. For instance, in the scenario of Section 2.1, after requests for the true value of “new requirement will bring big change on what has been finished,” decides to tell that “it is hard to tell whether or not” (which means unknown) instead of the truth that “we have already defined some interfaces to guarantee no big change will be brought about” (which means false). Before such decision, may consider whether will choose to give up new requirement or give more time and more money. The different choice of will bring with different payoff.

As for , different strategies will lead to take different actions. For instance, should consider whether to add new requirement when it is short of information about whether new requirement will bring big change on the overall work. So the payoff for each strategy depends on action will take, and it also depends on its punishment on . Here should be careful because that mistaken punishment on will cause to punish back.

After the game is over, revises its beliefs with input variables, and a process of dealing with information asymmetry in a proactive manner is finished. Here we define the term “the game between and effectively constrains information hiding between and ” as follows. In the game between and , finally chooses not to hide information in communication, and considers that does not hide information in communication. As different game mechanisms can be employed when facing different scenarios, in communication between and the situations like hides information regardless of punishment or has wrong judgment about information hiding may appear. In such situations the process of dealing with information asymmetry will be influenced by information hiding. This problem needs to be discussed in game mechanism design, and we do not take it into consideration here.

Theorem 12. If gets response of input variables from successfully and the game between and effectively constrains information hiding between and , ’s belief about information asymmetry no longer holds.

Proof. After successful performance of CommuAct and CommuRes, gets responses for input variables from . According to Assumption 6, believes that is not cheating for input variables. As the game between and effectively constrains information hiding between and , believes that does not hide information in communication. Then each variable in inputVar falls into the following categories.
when var = (prop, true), with the definition of input/output variables, gets beliefs like.
when var = (prop, false), with the definition of input/output variables, gets beliefs like.
when var = (prop, unknown), with the definition of input/output variables, gets beliefs like.
Consider that operator Bel follows KD45 axioms of model logic, with also getting beliefs as follows:,, .
According to Assumption 7, belief. no longer holds.

Then with Axiom 1, ’s belief about information asymmetry no longer holds. With information in inputVar and the process of belief revision, can complete the reasoning process that it sets up for dealing with information asymmetry.

3. Summary

Information asymmetry brings about problems like adverse selection and negative influence on cooperation between agents. This paper presents a proactive communication process for dealing with information asymmetry in MAS. The main contributions of this paper are as follows. First, a formal description of the communication process for dealing with asymmetry is presented. Previous works pay less attention to the process of dealing with information asymmetry, such as how to start the process, or what the communicating process like. In the work presented here, agent short of information takes initiative to identify and request for the needed information, and detailed steps of communication are also defined.

Second, by combining the communication process with a game theory based model, the work presented here provides a more flexible and effective way to deal with asymmetry between two agents. The game between two agents guarantees that information hiding can be constrained. On the one hand, the game between agents allows agents to take advantage of information asymmetry by hiding information in communication; on the other hand, it also constrains the decisions of each agent according to their interests, respectively.

At last, the process proposed here provides some basic ideas of designing proactive communication process between agents. In the situation of information asymmetry, the agent short of information takes initiative to identify the information it needs and requests information from the agent with information. Such a proactive manner can be used to deal with some other problems in communication, such as in the situation of trust establishment.

Still several important issues deserve further studies. First, more concrete semantic should be considered for information asymmetry. Information asymmetry may be caused by many factors in organization. These factors should be taken into consideration. As the study of information asymmetry has been conducted in information economics, some models will be helpful when agents construct state space and identify input and output variables.

Second, in the process of establishing state space, the controllability and observability of the state space should be taken into consideration. On the one hand, an agent expects another agent who it is communicating with to act as expected, and on the other hand, an agent does not expect to be controlled by other agents. So the choice of the input and the output variables is critical. Too much information exposed will break autonomous of the agent. Too less information exposed will make cooperation less effective.

Third, we mainly focus on information asymmetry between two agents. If three or more agents are involved in information asymmetry, the organization of these agents may influence the process of dealing with information asymmetry. Such situations will be considered in our future work.

Acknowledgments

This work is partially supported by the National 973 Foundation of China (no. 2013CB329304), the National 985 Foundation of China, National Science Foundation of China (nos. 61070202 and 61222210), and the National 863 Program of China (no. 2013AA013204).