IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, VOL. 23, NO. 1, JANUARY 2012


Existence and Uniqueness of Pseudo Almost-Periodic Solutions of Recurrent Neural Networks with Time-Varying Coefficients and Mixed Delays

Boudour Ammar, Student Member, IEEE, Farouk Chérif, and Adel M. Alimi, Senior Member, IEEE

Manuscript received January 28, 2011; revised October 10, 2011; accepted October 16, 2011. Date of publication December 15, 2011; date of current version January 5, 2012. This work was supported by grants from the General Direction of Scientific Research, Tunisia, under Program ARUB. B. Ammar and A. M. Alimi are with the Research Group on Intelligent Machines, Department of Electrical and Computer Engineering, National Engineering School of Sfax, University of Sfax, Sfax 3038, Tunisia. F. Chérif is with the Department of Mathematics, EPAM, Sousse 4011, Tunisia. Color versions of one or more of the figures in this paper are available online at http://ieeexplore.ieee.org. Digital Object Identifier 10.1109/TNNLS.2011.2178444

Abstract—This paper is concerned with the existence and uniqueness of pseudo almost-periodic solutions to recurrent delayed neural networks. Several conditions guaranteeing the existence and uniqueness of such solutions are obtained in a suitable convex domain. Furthermore, several methods are applied to establish sufficient criteria for the global exponential stability of this system. The approaches are based on constructing suitable Lyapunov functionals and the well-known Banach contraction mapping principle. Moreover, the attractivity and exponential stability of the pseudo almost-periodic solution are also considered for the system. A numerical example is given to illustrate the effectiveness of our results.

Index Terms—Banach fixed point, exponential stability, pseudo almost-periodic functions, recurrent neural network.

I. INTRODUCTION

Many scientific studies have proven that an animal continuously senses its environment via different perceptual means and integrates the sensory information to adapt its behavior. The temporal aspect of this integration is fundamental for sensory perception. A population of neurons achieves this dynamic integration through an intricate combination of action-potential synchronization and recurrent connections. Inspired by this biological mechanism, recurrent neural networks (RNNs) are believed to be a powerful sequence-processing method. Recurrent interactions among large populations of neurons are expected to yield collective phenomena adapted to dealing with temporal behavior. Thus, RNNs, especially Hopfield neural networks and cellular neural networks, have found potential applications such as signal processing, pattern recognition, optimization, and associative memories. Some important results have been reported ([1]–[4] and the references therein). Consequently, there has been rapidly growing research interest in the mathematical properties of RNNs, including the nature of solutions, stability, and oscillation properties. Thereby, a great number of results are available in the literature [5]–[10]. In particular, theoretical considerations have even shown that RNNs are universal approximators of dynamical systems (see [11], [12]).

Many phenomena exhibit great regularity without being periodic. This is modeled using the notion of pseudo almost-periodic and related functions, which allow complex repetitive phenomena to be represented as an almost-periodic process plus an ergodic component. Besides, it is well known that there exist time delays in the information processing of neurons, due to various reasons. For instance, time delays can be caused by the finite switching speed of amplifier circuits in neural networks, or they can be deliberately introduced to achieve tasks dealing with motion-related problems, such as moving image processing, pattern recognition, robotics, etc. Time delays in neural networks make the dynamic behaviors more complex, and may destabilize stable equilibria and admit almost-periodic oscillation, pseudo almost-periodic motion, bifurcation, and chaos [13], [14]. In this paper, motivated by the above discussions, we are concerned with the following RNNs with time-varying coefficients and mixed delays:

    x_i'(t) = -a_i x_i(t) + \sum_{j=1}^{n} \Big( c_{ij}(t) f_j(x_j(t)) + d_{ij}(t) g_j(x_j(t-\tau)) + p_{ij}(t) \int_{-\infty}^{t} k_{ij}(t-s) h_j(x_j(s)) \, ds \Big) + J_i(t), \quad 1 \le i \le n.    (1)
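As a concrete illustration of (1) (added here for convenience, not part of the original analysis), the sketch below integrates a small two-neuron instance with a forward-Euler scheme. The network size, coefficient functions, activation, and the exponential kernel k_{ij}(s) = e^{-s} are illustrative assumptions chosen to fit the standing hypotheses introduced in Section III; the distributed-delay term is tracked through the auxiliary variable u_{ij}(t) = \int_{-\infty}^{t} e^{-(t-s)} h_j(x_j(s)) ds, which obeys u_{ij}' = h_j(x_j) - u_{ij}.

```python
import numpy as np

# Hypothetical instance of system (1): n = 2 neurons, constant delay tau,
# exponential kernel k(s) = exp(-s) (an assumption; any kernel with
# integral 1 over [0, +inf) fits the hypotheses of Section III).
n, tau, dt, T = 2, 2.0, 0.01, 60.0
a = np.array([3.0, 6.0])                                   # decay rates a_i > 0
C = lambda t: 0.2 * np.cos(t) * np.ones((n, n))            # c_ij(t)
D = lambda t: 0.1 * np.cos(np.sqrt(2) * t) * np.ones((n, n))   # d_ij(t)
P = lambda t: 0.1 * np.ones((n, n))                        # p_ij(t)
J = lambda t: np.array([np.cos(t), np.sin(t)])             # external inputs J_i(t)
f = g = h = lambda x: (np.abs(x + 1) - np.abs(x - 1)) / 2  # 1-Lipschitz activation

steps, delay_steps = int(T / dt), int(tau / dt)
x = np.zeros((steps + 1, n))          # state history, zero initial history
u = np.zeros((n, n))                  # u[i, j] ~ int_{-inf}^t e^{-(t-s)} h(x_j(s)) ds

for k in range(steps):
    t = k * dt
    x_now = x[k]
    x_del = x[k - delay_steps] if k >= delay_steps else np.zeros(n)
    rhs = (-a * x_now
           + (C(t) * f(x_now)[None, :]).sum(axis=1)        # sum_j c_ij(t) f_j(x_j(t))
           + (D(t) * g(x_del)[None, :]).sum(axis=1)        # sum_j d_ij(t) g_j(x_j(t - tau))
           + (P(t) * u).sum(axis=1)                        # sum_j p_ij(t) * distributed term
           + J(t))
    x[k + 1] = x_now + dt * rhs
    u = u + dt * (h(x_now)[None, :] - u)                   # u_ij' = h_j(x_j) - u_ij

print("state after transient:", x[-1])
```

The printout is only a sanity check that the trajectory stays bounded; the delay, the grid, and the coefficient functions can be varied freely.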

This model has been the subject of intensive analysis by numerous authors in recent years. In particular, there have been extensive results on the problem of the existence and stability of periodic and almost-periodic solutions of RNNs in the literature ([2], [3], [15]–[21] and the references therein). Since the space of pseudo almost-periodic functions strictly contains the spaces of almost-periodic and of periodic functions, the criteria obtained in this paper extend or improve the results given in [15] and [17]–[20]. Notice that in [22] and [23] the delay τ(t) is a continuous almost-periodic function on R, while in this paper we consider a constant delay. Moreover, our approach and the techniques used to prove the existence and uniqueness theorem differ from those used in [23], because our results are mainly based on the fixed point method and the properties of the space of pseudo almost-periodic functions.

The rest of this paper is organized as follows. In Section II, we recall the basic properties of pseudo almost-periodic functions. In Section III, we introduce some necessary notations, definitions, and preliminaries that will be used later. The existence and uniqueness of pseudo almost-periodic solutions of (1) in a suitable convex subset of PAP(R, R^n) are discussed in Section IV. A numerical example is given in Section V to illustrate the effectiveness of our results. Finally, we draw conclusions in Section VI.

II. ALMOST-PERIODIC AND PSEUDO ALMOST-PERIODIC FUNCTIONS

Throughout this paper, we will use the following concepts and notations. BC(R, R^n) denotes the set of bounded continuous functions from R to R^n. Note that (BC(R, R^n), \|\cdot\|_\infty) is a Banach space, where \|\cdot\|_\infty denotes the sup norm

    \|f\|_\infty := \sup_{t \in \mathbb{R}} \|f(t)\|.

Definition 1: Let f ∈ BC(R, R^n). We say that f is almost-periodic (Bohr a.p.), or uniformly almost-periodic, when the following property is satisfied: for all ε > 0 there exists l_ε > 0 such that

    \forall \alpha \in \mathbb{R}, \; \exists \delta \in [\alpha, \alpha + l_\varepsilon], \quad \|f(\cdot + \delta) - f(\cdot)\|_\infty \le \varepsilon.

A subset D of R is called relatively dense in R when

    \exists l > 0, \; \forall \alpha \in \mathbb{R}, \quad D \cap [\alpha, \alpha + l] \ne \emptyset.

And so, by introducing the sets E(f, ε) := {r ∈ R : \|f(\cdot + r) - f(\cdot)\|_\infty < ε}, we can formulate the definition of the Bohr almost periodicity of f ∈ C(R, R^n) in the following manner: for each ε > 0, the set E(f, ε) is relatively dense in R. An element of E(f, ε) is called an ε-period of f. Consequently, a Bohr almost-periodic function is a continuous function that possesses very many almost-periods. We denote by AP(R, R^n) the set of the Bohr a.p. functions from R to R^n. It is well known that AP(R, R^n) is a Banach space with the supremum norm. We refer the reader to [24]–[27] for the basic theory of almost-periodic functions and their applications. Besides, the concept of pseudo almost periodicity (pap) was introduced by Zhang (see [28], [29]) in the early 1990s. It is a natural generalization of classical almost periodicity. Define the class of functions PAP_0(R, R^n) as follows:

    PAP_0(\mathbb{R}, \mathbb{R}^n) := \Big\{ f \in BC(\mathbb{R}, \mathbb{R}^n) : \lim_{T \to +\infty} \frac{1}{2T} \int_{-T}^{T} \|f(t)\| \, dt = 0 \Big\}.
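As a quick numerical illustration of membership in PAP_0(R, R) (a sketch added here, not from the paper), one can estimate the sliding mean (1/2T) \int_{-T}^{T} |f(t)| dt and watch it vanish as T grows. The candidate f(t) = 1/(1 + t^2) below is an illustrative choice; the ergodic part of the example in Remark 2 below can be tested the same way.

```python
import numpy as np

def ergodic_mean(f, T, num=200_001):
    """Approximate (1/(2T)) * integral_{-T}^{T} |f(t)| dt by averaging over a uniform grid."""
    t = np.linspace(-T, T, num)
    return np.abs(f(t)).mean()

f = lambda t: 1.0 / (1.0 + t**2)   # bounded, continuous, and its sliding mean vanishes

for T in (10, 100, 1_000, 10_000):
    print(f"T = {T:6d}   (1/2T) int_-T^T |f| dt  ~  {ergodic_mean(f, T):.6f}")
```

The printed means decay like arctan(T)/T, so this f belongs to PAP_0(R, R); by contrast, a nonzero almost-periodic function has a strictly positive mean of its modulus and can never play the role of the ergodic component.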

Definition 2: A function f ∈ BC(R, R^n) is called pseudo almost-periodic if it can be expressed as

    f = h + \varphi

where h ∈ AP(R, R^n) and ϕ ∈ PAP_0(R, R^n). The collection of such functions will be denoted by PAP(R, R^n).

Remark 1: The functions h and ϕ in the above definition are called, respectively, the almost-periodic component and the ergodic perturbation of the pseudo almost-periodic function f. Besides, the decomposition given in the definition above is unique.

Remark 2: Note that (PAP(R, R^n), \|\cdot\|_\infty) is a Banach space and AP(R, R^n) is a proper subspace of PAP(R, R^n); for example, the function φ(t) = \cos t + \cos\sqrt{3}\,t + e^{-t^2\cos^2 t} is pseudo almost-periodic but not almost-periodic.

III. DESCRIPTION OF THE SYSTEM AND PRELIMINARIES

The model of the delayed neural network considered in this paper is described by the following state equations:

    x_i'(t) = -a_i x_i(t) + \sum_{j=1}^{n} \Big( c_{ij}(t) f_j(x_j(t)) + d_{ij}(t) g_j(x_j(t-\tau)) + p_{ij}(t) \int_{-\infty}^{t} k_{ij}(t-s) h_j(x_j(s)) \, ds \Big) + J_i(t), \quad 1 \le i \le n

where n is the number of neurons in the network; x_i(t) denotes the state of the ith neuron at time t; and f_j(x_j(t)), g_j(x_j(t)), and h_j(x_j(t)) are the activation functions of the jth neuron at time t. The functions c_{ij}(t), d_{ij}(t), and p_{ij}(t) denote, respectively, the connection weights, the discretely delayed connection weights, and the distributively delayed connection weights of the jth neuron on the ith neuron. J_i(t) is the external bias on the ith neuron, a_i denotes the rate with which the ith neuron resets its potential to the resting state in isolation when disconnected from the network and external inputs, and τ is the constant discrete time delay.

Throughout this paper, we make the following assumptions.

(H1) For all 1 ≤ j ≤ n, there exist positive constants L_j^f, L_j^g, and L_j^h such that, for all x, y ∈ R,

    |f_j(x) - f_j(y)| \le L_j^f |x - y|, \quad |g_j(x) - g_j(y)| \le L_j^g |x - y|, \quad |h_j(x) - h_j(y)| \le L_j^h |x - y|.    (2)

Furthermore, we suppose that, for all 1 ≤ j ≤ n, f_j(0) = g_j(0) = h_j(0) = 0 and \|h_j\|_\infty < +\infty.

(H2) For all 1 ≤ i, j ≤ n, the functions t ↦ c_{ij}(t), t ↦ d_{ij}(t), t ↦ p_{ij}(t), and t ↦ J_i(t) are pseudo almost-periodic on R.

(H3) For all 1 ≤ i ≤ n, a_i > 0 and τ > 0.

(H4) For all 1 ≤ i, j ≤ n, the kernel k_{ij} : [0, +∞) → [0, +∞) is a pseudo almost-periodic function satisfying

    \int_{0}^{+\infty} k_{ij}(s) \, ds = 1.


(H5) For all 1 ≤ i, j ≤ n, we shall use the notations

    \bar{c}_{ij} = \sup_{t \in \mathbb{R}} |c_{ij}(t)|, \quad \bar{d}_{ij} = \sup_{t \in \mathbb{R}} |d_{ij}(t)|, \quad \bar{p}_{ij} = \sup_{t \in \mathbb{R}} |p_{ij}(t)|, \quad \bar{J}_i = \sup_{t \in \mathbb{R}} |J_i(t)|

and suppose that

    r = \max_{1 \le i \le n} \frac{\sum_{j=1}^{n} \big( \bar{c}_{ij} L_j^f + \bar{d}_{ij} L_j^g + \bar{p}_{ij} L_j^h \big)}{a_i} < 1.

IV. MAIN RESULTS

In this section, we establish some results for the existence, uniqueness, and global exponential stability of the pseudo almost-periodic solution of (1).

Lemma 1: If ϕ ∈ PAP(R, R^n), then ϕ(· − h) ∈ PAP(R, R^n).

Proof: By definition, we can write ϕ = ϕ_1 + ϕ_2, where ϕ_1 ∈ AP(R, R^n) and ϕ_2 ∈ PAP_0(R, R^n). Obviously

    \varphi(\cdot - h) = \varphi_1(\cdot - h) + \varphi_2(\cdot - h).

Observe that ϕ_1(· − h) ∈ AP(R, R^n) and

    \lim_{T \to +\infty} \frac{1}{2T} \int_{-T}^{T} \|\varphi_2(t - h)\| \, dt = \lim_{T \to +\infty} \frac{1}{2T} \int_{-T-h}^{T-h} \|\varphi_2(t)\| \, dt = 0    (3)

which implies that ϕ_2(· − h) ∈ PAP_0(R, R^n). So ϕ(· − h) ∈ PAP(R, R^n).

Lemma 2: If ϕ, ψ ∈ PAP(R, R), then ϕ × ψ ∈ PAP(R, R).

Proof: By definition, we can write ϕ = ϕ_1 + ϕ_2 and ψ = ψ_1 + ψ_2, where ϕ_1, ψ_1 ∈ AP(R, R) and ϕ_2, ψ_2 ∈ PAP_0(R, R). Obviously

    \varphi\psi = \varphi_1\psi_1 + \varphi_1\psi_2 + \varphi_2\psi_1 + \psi_2\varphi_2.    (4)

Note that ϕ_1ψ_1 ∈ AP(R, R) and

    \frac{1}{2T} \int_{-T}^{T} |\varphi_1\psi_2 + \varphi_2\psi_1 + \psi_2\varphi_2| \, dt
    \le \frac{1}{2T} \int_{-T}^{T} \big( |\varphi_1\psi_2| + |\varphi_2\psi_1| + |\psi_2\varphi_2| \big) \, dt
    \le \frac{\|\varphi_1\|_\infty}{2T} \int_{-T}^{T} |\psi_2| \, dt + \frac{\|\psi_1\|_\infty}{2T} \int_{-T}^{T} |\varphi_2| \, dt + \frac{\|\varphi_2\|_\infty}{2T} \int_{-T}^{T} |\psi_2| \, dt \longrightarrow 0

as T → +∞, which implies that ϕ_1ψ_2 + ϕ_2ψ_1 + ψ_2ϕ_2 ∈ PAP_0(R, R). So ϕ × ψ ∈ PAP(R, R).

Theorem 1: Assume that assumptions (H1) and (H4) hold and that, for all 1 ≤ j ≤ n, x_j(·) ∈ PAP(R, R). Then, for all 1 ≤ i ≤ n, the function φ_i : t ↦ \int_{-\infty}^{t} k_{ij}(t - s) h_j(x_j(s)) \, ds belongs to PAP(R, R).

Proof: First, the function φ_i satisfies

    |\phi_i(t)| \le \|h_j\|_\infty \int_{-\infty}^{t} k_{ij}(t - s) \, ds = \|h_j\|_\infty \int_{0}^{+\infty} k_{ij}(s) \, ds = \|h_j\|_\infty

which proves that the integral \int_{-\infty}^{t} k_{ij}(t - s) h_j(x_j(s)) \, ds is absolutely convergent and the function φ_i is bounded. Now, we have to prove the continuity of the function φ_i. Let (h_n)_n be a sequence of real numbers such that \lim_{n \to \infty} h_n = 0. The continuity of the function x_j implies that, for all ε > 0, there exists N ∈ N such that

    |x_j(s + h_n) - x_j(s)| \le \varepsilon \quad \text{for all } n \ge N.

Thus, for all n ≥ N, one has

    |\phi_i(t + h_n) - \phi_i(t)| \le L_j^h \int_{-\infty}^{t} k_{ij}(t - s) |x_j(s + h_n) - x_j(s)| \, ds \le \varepsilon.

It remains to be proven that the function φ_i belongs to PAP(R, R). First, note that by using, respectively, the composition theorem of pseudo almost-periodic functions [30] and Lemma 1, we immediately obtain that, for all 1 ≤ i, j ≤ n, the functions s ↦ k_{ij}(t - s) and s ↦ h_j(x_j(s)) belong to PAP(R, R). Lemma 2 then implies that ψ : s ↦ k_{ij}(t - s) h_j(x_j(s)) belongs to PAP(R, R). Furthermore, for all 1 ≤ j ≤ n, one has

    h_j(x_j(\cdot)) = u_j(\cdot) + v_j(\cdot)

where u_j ∈ AP(R, R) and v_j ∈ PAP_0(R, R). Consequently

    \phi_i(t) = \int_{-\infty}^{t} k_{ij}(t - s) u_j(s) \, ds + \int_{-\infty}^{t} k_{ij}(t - s) v_j(s) \, ds = \phi_i^1(t) + \phi_i^2(t).

Let us prove the almost periodicity of the function t ↦ φ_i^1(t). For ε > 0 we consider, in view of the almost periodicity of u_j, a number L_ε such that in any interval [α, α + L_ε] one finds a number δ with the property that

    \sup_{\xi \in \mathbb{R}} |u_j(\xi + \delta) - u_j(\xi)| < \varepsilon.

Afterwards, we can write

    |\phi_i^1(t + \delta) - \phi_i^1(t)| = \Big| \int_{-\infty}^{t+\delta} k_{ij}(t + \delta - s) u_j(s) \, ds - \int_{-\infty}^{t} k_{ij}(t - s) u_j(s) \, ds \Big|
    \le \int_{-\infty}^{t} k_{ij}(t - s) |u_j(s + \delta) - u_j(s)| \, ds \le \varepsilon \int_{0}^{+\infty} k_{ij}(s) \, ds = \varepsilon

which implies that φ_i^1(·) ∈ AP(R, R). Now, we turn our attention to φ_i^2(·). We have to prove that, for all 1 ≤ i ≤ n,

    \lim_{T \to +\infty} \frac{1}{2T} \int_{-T}^{T} |\phi_i^2(s)| \, ds = 0.

One has

    \lim_{T \to +\infty} \frac{1}{2T} \int_{-T}^{T} \Big| \int_{-\infty}^{t} k_{ij}(t - s) v_j(s) \, ds \Big| \, dt
    = \lim_{T \to +\infty} \frac{1}{2T} \int_{-T}^{T} \Big| \int_{0}^{\infty} k_{ij}(\rho) v_j(t - \rho) \, d\rho \Big| \, dt
    \le \lim_{T \to +\infty} \frac{1}{2T} \int_{-T}^{T} \int_{0}^{\infty} k_{ij}(\rho) |v_j(t - \rho)| \, d\rho \, dt
    \le \int_{0}^{\infty} k_{ij}(\rho) \Big( \lim_{T \to +\infty} \frac{1}{2T} \int_{-T}^{T} |v_j(t - \rho)| \, dt \Big) d\rho = 0

which implies that φ_i^2(·) ∈ PAP_0(R, R). Consequently, φ_i(t) = φ_i^1(t) + φ_i^2(t) is a pseudo almost-periodic function.

Lemma 3: Suppose that assumptions (H1)–(H4) hold. Define the nonlinear operator Γ as follows: for each ϕ = (ϕ_1, . . . , ϕ_n) ∈ PAP(R, R^n), set (Γϕ)(t) := x_ϕ(t), where

    x_\varphi(t) = \Big( \int_{-\infty}^{t} e^{-(t-s)a_1} F_1(s) \, ds, \; \ldots, \; \int_{-\infty}^{t} e^{-(t-s)a_n} F_n(s) \, ds \Big)^{T}

and

    F_i(s) = \sum_{j=1}^{n} \big( c_{ij}(s) f_j(\varphi_j(s)) + d_{ij}(s) g_j(\varphi_j(s - \tau)) \big) + \sum_{j=1}^{n} \Big( p_{ij}(s) \int_{-\infty}^{s} k_{ij}(s - \rho) h_j(\varphi_j(\rho)) \, d\rho \Big) + J_i(s).

Then Γ maps PAP(R, R^n) into itself.

Proof: First, note that, for all 1 ≤ i ≤ n, the function

    F_i : s \mapsto \sum_{j=1}^{n} \big( c_{ij}(s) f_j(\varphi_j(s)) + d_{ij}(s) g_j(\varphi_j(s - \tau)) \big) + \sum_{j=1}^{n} \Big( p_{ij}(s) \int_{-\infty}^{s} k_{ij}(s - \rho) h_j(\varphi_j(\rho)) \, d\rho \Big) + J_i(s)

is pseudo almost-periodic, since PAP(R, R) is a vector space, by using Lemma 2, Theorem 1, and the composition theorem of pseudo almost-periodic functions ([30], [31]). Consequently, for all 1 ≤ i ≤ n, F_i can be expressed as F_i = F_i^1 + F_i^2, where F_i^1 ∈ AP(R, R) and F_i^2 ∈ PAP_0(R, R). So

    (\Gamma_i \varphi)(t) = \int_{-\infty}^{t} e^{-(t-s)a_i} F_i^1(s) \, ds + \int_{-\infty}^{t} e^{-(t-s)a_i} F_i^2(s) \, ds = (\Gamma_i F_i^1)(t) + (\Gamma_i F_i^2)(t).

Let us prove the almost periodicity of (Γ_i F_i^1) : t ↦ \int_{-\infty}^{t} e^{-(t-s)a_i} F_i^1(s) \, ds. For ε > 0 we consider, in view of the almost periodicity of F_i^1, a number L_ε such that in any interval [α, α + L_ε] one finds a number δ with the property that

    \sup_{t \in \mathbb{R}} |F_i^1(t + \delta) - F_i^1(t)| < \varepsilon.

Afterwards, we can write

    |(\Gamma_i F_i^1)(t + \delta) - (\Gamma_i F_i^1)(t)| = \Big| \int_{-\infty}^{t+\delta} e^{-(t+\delta-s)a_i} F_i^1(s) \, ds - \int_{-\infty}^{t} e^{-(t-s)a_i} F_i^1(s) \, ds \Big|
    \le \int_{-\infty}^{t} e^{-(t-s)a_i} |F_i^1(s + \delta) - F_i^1(s)| \, ds \le \frac{\varepsilon}{a_i}

which implies that (Γ_i F_i^1) ∈ AP(R, R). Now, we turn our attention to (Γ_i F_i^2). We have to prove that

    \lim_{T \to +\infty} \frac{1}{2T} \int_{-T}^{T} \Big| \int_{-\infty}^{t} e^{-(t-s)a_i} F_i^2(s) \, ds \Big| \, dt = 0.

Clearly

    \lim_{T \to +\infty} \frac{1}{2T} \int_{-T}^{T} \Big| \int_{-\infty}^{t} e^{-(t-s)a_i} F_i^2(s) \, ds \Big| \, dt \le I_1 + I_2

where

    I_1 = \lim_{T \to +\infty} \frac{1}{2T} \int_{-T}^{T} \Big( \int_{-T}^{t} e^{-(t-s)a_i} |F_i^2(s)| \, ds \Big) dt \quad \text{and} \quad I_2 = \lim_{T \to +\infty} \frac{1}{2T} \int_{-T}^{T} \Big( \int_{-\infty}^{-T} e^{-(t-s)a_i} |F_i^2(s)| \, ds \Big) dt.

Pose ξ = t − s; then, by Fubini's theorem, one has

    \frac{1}{2T} \int_{-T}^{T} \Big( \int_{-T}^{t} e^{-(t-s)a_i} |F_i^2(s)| \, ds \Big) dt
    = \frac{1}{2T} \int_{-T}^{T} \Big( \int_{0}^{t+T} e^{-\xi a_i} |F_i^2(t - \xi)| \, d\xi \Big) dt
    \le \frac{1}{2T} \int_{-T}^{T} \Big( \int_{0}^{+\infty} e^{-\xi a_i} |F_i^2(t - \xi)| \, d\xi \Big) dt
    = \int_{0}^{+\infty} e^{-\xi a_i} \Big( \frac{1}{2T} \int_{-T}^{T} |F_i^2(t - \xi)| \, dt \Big) d\xi
    = \int_{0}^{+\infty} e^{-\xi a_i} \Big( \frac{1}{2T} \int_{-T-\xi}^{T-\xi} |F_i^2(u)| \, du \Big) d\xi
    \le \int_{0}^{+\infty} e^{-\xi a_i} \, \frac{T + \xi}{T} \Big( \frac{1}{2(T + \xi)} \int_{-T-\xi}^{T+\xi} |F_i^2(u)| \, du \Big) d\xi.

Since F_i^2(·) ∈ PAP_0(R, R), the function defined by

    \Lambda_T(\xi) = \frac{1}{2T} \int_{-T-\xi}^{T+\xi} |F_i^2(u)| \, du

is bounded and satisfies \lim_{T \to +\infty} \Lambda_T(\xi) = 0. Consequently, by the Lebesgue dominated convergence theorem, we obtain

    I_1 = \lim_{T \to +\infty} \frac{1}{2T} \int_{-T}^{T} \Big( \int_{-T}^{t} e^{-(t-s)a_i} |F_i^2(s)| \, ds \Big) dt = 0.

On the other hand, notice that \|F_i^2\|_\infty = \sup_{t \in \mathbb{R}} |F_i^2(t)| < \infty; then

    I_2 = \lim_{T \to +\infty} \frac{1}{2T} \int_{-T}^{T} \Big( \int_{-\infty}^{-T} e^{-(t-s)a_i} |F_i^2(s)| \, ds \Big) dt
    \le \lim_{T \to +\infty} \frac{\|F_i^2\|_\infty}{2T} \int_{-T}^{T} \Big( \int_{t+T}^{+\infty} e^{-\xi a_i} \, d\xi \Big) dt
    = \lim_{T \to +\infty} \frac{\|F_i^2\|_\infty}{2T a_i} \int_{-T}^{T} e^{-(t+T)a_i} \, dt
    \le \lim_{T \to +\infty} \frac{\|F_i^2\|_\infty}{2T a_i^2} = 0.

Consequently, the function (Γ_i F_i^2) belongs to PAP_0(R, R). So, for all 1 ≤ i ≤ n, Γ_i F_i belongs to PAP(R, R), and consequently (Γϕ) belongs to PAP(R, R^n).

Remark 3: Note that, for proving the almost periodicity of the operator Γ, our approach is different from that in [23].

Theorem 2: Suppose that assumptions (H1)–(H5) hold. Then the delayed RNN (1) has a unique pseudo almost-periodic solution in the region

    B = B(\varphi_0, r) = \Big\{ \varphi \in PAP(\mathbb{R}, \mathbb{R}^n) : \|\varphi - \varphi_0\| \le \frac{r\beta}{1 - r} \Big\}

where

    \varphi_0(t) = \Big( \int_{-\infty}^{t} e^{-(t-s)a_1} J_1(s) \, ds, \; \ldots, \; \int_{-\infty}^{t} e^{-(t-s)a_n} J_n(s) \, ds \Big)^{T}

and

    r = \max_{1 \le i \le n} \frac{\sum_{j=1}^{n} \big( \bar{c}_{ij} L_j^f + \bar{d}_{ij} L_j^g + \bar{p}_{ij} L_j^h \big)}{a_i}.

Proof: One has

    \|\varphi_0(t)\| = \sup_{t \in \mathbb{R}} \max_{1 \le i \le n} \Big| \int_{-\infty}^{t} e^{-(t-s)a_i} J_i(s) \, ds \Big| \le \max_{1 \le i \le n} \frac{\bar{J}_i}{a_i} =: \beta.

Therefore

    \|\varphi(t)\| \le \|\varphi(t) - \varphi_0(t)\| + \|\varphi_0(t)\| \le \|\varphi(t) - \varphi_0(t)\| + \beta.

Set B = B(ϕ_0, r) = {ϕ ∈ PAP(R, R^n) : \|ϕ − ϕ_0\| ≤ rβ/(1 − r)}. Clearly, B is a closed convex subset of PAP(R, R^n) and, therefore, for any ϕ ∈ B, by using the estimate just obtained, we see that

    \|\varphi\| \le \|\varphi - \varphi_0\| + \|\varphi_0\| \le \frac{r\beta}{1 - r} + \beta = \frac{\beta}{1 - r}.

Let us prove that the operator Γ is a self-mapping from B to B. In fact, for any ϕ ∈ B, we have

    \|(\Gamma\varphi)(t) - \varphi_0(t)\| \le \sup_{t \in \mathbb{R}} \max_{1 \le i \le n} \int_{-\infty}^{t} e^{-(t-s)a_i} \sum_{j=1}^{n} \Big( |c_{ij}(s)| \, |f_j(\varphi_j(s))| + |d_{ij}(s)| \, |g_j(\varphi_j(s - \tau))| + |p_{ij}(s)| \int_{-\infty}^{s} k_{ij}(s - \rho) |h_j(\varphi_j(\rho))| \, d\rho \Big) ds
    \le \|\varphi\| \sup_{t \in \mathbb{R}} \max_{1 \le i \le n} \int_{-\infty}^{t} e^{-(t-s)a_i} \sum_{j=1}^{n} \big( |c_{ij}(s)| L_j^f + |d_{ij}(s)| L_j^g + |p_{ij}(s)| L_j^h \big) ds
    \le \|\varphi\| \max_{1 \le i \le n} \frac{\sum_{j=1}^{n} \big( \bar{c}_{ij} L_j^f + \bar{d}_{ij} L_j^g + \bar{p}_{ij} L_j^h \big)}{a_i}
    \le \frac{r\beta}{1 - r}.

In view of (H1), for any ϕ, ψ ∈ B, we have

    \|(\Gamma\varphi)(t) - (\Gamma\psi)(t)\| \le \max_{1 \le i \le n} \int_{-\infty}^{t} e^{-(t-s)a_i} \sum_{j=1}^{n} \Big( |c_{ij}(s)| \, |f_j(\varphi_j(s)) - f_j(\psi_j(s))| + |d_{ij}(s)| \, |g_j(\varphi_j(s - \tau)) - g_j(\psi_j(s - \tau))| + |p_{ij}(s)| \int_{-\infty}^{s} k_{ij}(s - l) |h_j(\varphi_j(l)) - h_j(\psi_j(l))| \, dl \Big) ds
    \le \max_{1 \le i \le n} \int_{-\infty}^{t} e^{-(t-s)a_i} \sum_{j=1}^{n} \Big( |c_{ij}(s)| L_j^f |\varphi_j(s) - \psi_j(s)| + |d_{ij}(s)| L_j^g |\varphi_j(s - \tau) - \psi_j(s - \tau)| + |p_{ij}(s)| \int_{-\infty}^{s} k_{ij}(s - l) L_j^h |\varphi_j(l) - \psi_j(l)| \, dl \Big) ds
    \le \max_{1 \le i \le n} \frac{\sum_{j=1}^{n} \big( \bar{c}_{ij} L_j^f + \bar{d}_{ij} L_j^g + \bar{p}_{ij} L_j^h \big)}{a_i} \, \|\varphi - \psi\|_\infty
    \le r \|\varphi - \psi\|_\infty.

By (H5), Γ is a contraction mapping. Then, by virtue of the Banach fixed point theorem, Γ has a unique fixed point, which corresponds to the unique pseudo almost-periodic solution of (1) in B ⊂ PAP(R, R^n).
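The contraction estimate can also be checked numerically. The sketch below discretizes a truncated version of the operator Γ on a finite grid for a hypothetical scalar instance (n = 1); the coefficient functions, the tanh activation, the kernel k(s) = e^{-s}, and the grid are all illustrative assumptions, the history before t = 0 is simply ignored, and the external input is omitted because it cancels in the difference. For two randomly drawn bounded inputs it verifies that ||Γϕ − Γψ||_∞ ≤ r ||ϕ − ψ||_∞.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical scalar instance (n = 1); every constant below is an illustrative assumption.
a, tau, dt, T = 2.0, 1.0, 0.01, 40.0
t = np.arange(0.0, T, dt)
c = 0.4 * np.cos(t)                    # c(t)
d = 0.3 * np.cos(np.sqrt(2.0) * t)     # d(t)
p = 0.2 * np.ones_like(t)              # p(t)
Lf = Lg = Lh = 1.0                     # tanh is 1-Lipschitz with tanh(0) = 0, so (H1) holds
f = g = h = np.tanh
r = (np.abs(c).max() * Lf + np.abs(d).max() * Lg + np.abs(p).max() * Lh) / a   # here r = 0.45 < 1

def causal_conv(rate, signal):
    """y(t) = int_0^t exp(-rate*(t-s)) signal(s) ds, computed by forward Euler on the grid."""
    y = np.zeros_like(signal)
    for k in range(1, signal.size):
        y[k] = y[k - 1] + dt * (signal[k - 1] - rate * y[k - 1])
    return y

def Gamma(phi):
    """Truncated discretization of the fixed-point operator (history before t = 0 ignored).
    The external input J(s) is left out: it cancels in Gamma(phi) - Gamma(psi)."""
    shift = int(tau / dt)
    phi_delayed = np.concatenate([np.zeros(shift), phi[:-shift]])
    distributed = causal_conv(1.0, h(phi))            # assumed kernel k(s) = exp(-s)
    return causal_conv(a, c * f(phi) + d * g(phi_delayed) + p * distributed)

phi = rng.uniform(-1.0, 1.0, size=t.size)
psi = rng.uniform(-1.0, 1.0, size=t.size)
lhs = np.abs(Gamma(phi) - Gamma(psi)).max()
rhs = r * np.abs(phi - psi).max()
print(f"r = {r:.2f}   ||Gamma(phi)-Gamma(psi)|| = {lhs:.4f}  <=  r*||phi-psi|| = {rhs:.4f}")
```

Iterating Gamma from any starting guess contracts successive differences by at least the factor r per step, which is exactly the mechanism behind the Banach fixed point argument above.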

Theorem 3: Suppose that assumptions (H1)–(H5) hold, and let x = (x_1, . . . , x_n) be the unique pseudo almost-periodic solution of (1) in B. If, for every sufficiently small t > 0,

    (H6) \quad a_i - \sum_{j=1}^{n} \big( L_j^f \bar{c}_{ij} + e^{\tau t} \bar{d}_{ij} L_j^g + \bar{p}_{ij} L_j^h \big) > 0

then there exists a constant μ > 0 such that, for any solution x = (x_1, . . . , x_n) of (1) in B and for all t > 0, we have

    \|x(t) - \varphi(t)\| \le M e^{-\mu t}

where M = \sup_{-\tau \le t \le 0} \|x(t) - \varphi(t)\|.

Proof: For 1 ≤ i ≤ n, set

    \psi_i(t) = t - a_i + \sum_{j=1}^{n} \Big( L_j^f \bar{c}_{ij} + e^{\tau t} \bar{d}_{ij} L_j^g + \bar{p}_{ij} L_j^h \int_{0}^{+\infty} k_{ij}(\rho) e^{t\rho} \, d\rho \Big).

It is clear that the functions t ↦ ψ_i(t) are continuous on R^+ and, by hypothesis (H6), ψ_i(0) < 0. Thus, there exists a sufficiently small constant μ > 0 such that ψ_i(μ) < 0 for all 1 ≤ i ≤ n.

Take an arbitrary ε > 0 and set, for all 1 ≤ i ≤ n,

    z_i(t) = |x_i(t) - \varphi_i(t)| e^{\mu t}.

Then, for all 1 ≤ i ≤ n and for all −τ ≤ t ≤ 0, one has z_i(t) ≤ M < M + ε. In the following, we shall prove that, for all t > 0,

    z_i(t) \le M + \varepsilon.

Suppose the contrary and denote A_i = {t > 0 : z_i(t) > M + ε}. It follows that there exists 1 ≤ i_0 ≤ n such that A_{i_0} ≠ ∅. Let

    t_i = \inf A_i \text{ if } A_i \ne \emptyset, \qquad t_i = +\infty \text{ if } A_i = \emptyset.

Then t_i > 0 and, for all −τ ≤ t < t_i, one has z_i(t) ≤ M + ε. Let us denote t_s = \min_{1 \le i \le n} t_i. It follows that 0 < t_s < +∞ and, for all −τ ≤ t ≤ t_s, one has z_i(t) ≤ M + ε. Note that

    z_s(t_s) = M + \varepsilon \quad \text{and} \quad D^+ z_s(t_s) \ge 0.

Now, since x_i(·) and ϕ_i(·) are solutions of (1), we get

    0 \le D^+ z_s(t_s) = D^+ \big( |x_s(t) - \varphi_s(t)| e^{\mu t} \big) \big|_{t = t_s}
    = e^{\mu t_s} \, D^+ |x_s(t) - \varphi_s(t)| \big|_{t = t_s} + \mu |x_s(t_s) - \varphi_s(t_s)| e^{\mu t_s}
    \le |x_s(t_s) - \varphi_s(t_s)| \mu e^{\mu t_s} + e^{\mu t_s} \Big( -a_s |x_s(t_s) - \varphi_s(t_s)| + \sum_{j=1}^{n} \bar{c}_{sj} L_j^f |x_j(t_s) - \varphi_j(t_s)| + \sum_{j=1}^{n} \bar{d}_{sj} L_j^g |x_j(t_s - \tau) - \varphi_j(t_s - \tau)| + \sum_{j=1}^{n} \bar{p}_{sj} L_j^h \int_{-\infty}^{t_s} k_{sj}(t_s - \rho) |x_j(\rho) - \varphi_j(\rho)| \, d\rho \Big)
    \le (M + \varepsilon)(\mu - a_s) + \sum_{j=1}^{n} \bar{c}_{sj} L_j^f z_j(t_s) + e^{\mu\tau} \sum_{j=1}^{n} \bar{d}_{sj} L_j^g z_j(t_s - \tau) + \sum_{j=1}^{n} \bar{p}_{sj} L_j^h \int_{0}^{+\infty} k_{sj}(u) z_j(t_s - u) e^{\mu u} \, du
    \le (M + \varepsilon) \Big( \mu - a_s + \sum_{j=1}^{n} \big( \bar{c}_{sj} L_j^f + e^{\mu\tau} \bar{d}_{sj} L_j^g + \bar{p}_{sj} L_j^h \int_{0}^{+\infty} k_{sj}(u) e^{\mu u} \, du \big) \Big).

It follows that

    \mu - a_s + \sum_{j=1}^{n} \big( \bar{c}_{sj} L_j^f + e^{\mu\tau} \bar{d}_{sj} L_j^g + \bar{p}_{sj} L_j^h \int_{0}^{+\infty} k_{sj}(u) e^{\mu u} \, du \big) \ge 0

that is, ψ_s(μ) ≥ 0, which contradicts the fact that ψ_s(μ) < 0. Thus we obtain that, for all t > 0,

    z_i(t) = |x_i(t) - \varphi_i(t)| e^{\mu t} \le M + \varepsilon, \quad \text{i.e.,} \quad |x_i(t) - \varphi_i(t)| \le (M + \varepsilon) e^{-\mu t}.

Note that \|x(t) - ϕ(t)\| = \max_{1 \le i \le n} |x_i(t) - ϕ_i(t)|; then, passing to the limit as ε → 0^+, we obtain, for all t > 0,

    \|x(t) - \varphi(t)\| \le M e^{-\mu t}.

The proof of this theorem is completed.
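Hypothesis (H6) only guarantees that a suitable decay rate μ exists; in practice a usable μ can be computed by locating the sign change of ψ_i. The sketch below does this by bisection for a hypothetical single-neuron instance with an assumed exponential kernel k(ρ) = e^{-ρ}, for which \int_0^{+\infty} k(\rho) e^{\mu\rho} d\rho = 1/(1 - \mu) when μ < 1; all numbers are illustrative, not taken from the paper.

```python
import numpy as np

# Illustrative constants for one neuron (assumptions, not from the paper).
a, tau = 3.0, 2.0
c_bar, d_bar, p_bar = 0.5, 0.3, 0.4
Lf = Lg = Lh = 1.0

def psi(mu):
    """psi(mu) = mu - a + c_bar*Lf + exp(mu*tau)*d_bar*Lg + p_bar*Lh * int_0^inf k(rho)e^{mu rho} drho,
    with the assumed kernel k(rho) = exp(-rho), whose integral equals 1/(1 - mu) for mu < 1."""
    return mu - a + c_bar * Lf + np.exp(mu * tau) * d_bar * Lg + p_bar * Lh / (1.0 - mu)

# psi(0) = -a + c_bar + d_bar + p_bar < 0 here, so an (H6)-type condition holds at t = 0.
lo, hi = 0.0, 0.99           # stay below mu = 1, where the kernel integral diverges
for _ in range(60):          # bisection for the sign change of the increasing function psi
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if psi(mid) < 0 else (lo, mid)

print(f"psi(0) = {psi(0.0):.3f};  any mu in (0, {lo:.3f}] keeps psi(mu) < 0")
```

Since ψ is strictly increasing in μ for this kernel, every μ below the computed sign change yields the exponential bound of Theorem 3.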

Theorem 4: Suppose that assumptions (H1)–(H5) hold. If, for all 1 ≤ i ≤ n,

    a_i - \sum_{j=1}^{n} \big( L_j^f \bar{c}_{ij} + \bar{d}_{ij} L_j^g + \bar{p}_{ij} L_j^h \big) > 0

then all solutions of (1) in the convex set B converge to its unique pseudo almost-periodic solution.

Proof: Let x_i(·) be a solution of (1) and ϕ_i(·) be the pseudo almost-periodic solution of (1). Then

    \frac{d}{dt}(x_i - \varphi_i) = -a_i (x_i(t) - \varphi_i(t)) + \sum_{j=1}^{n} c_{ij}(t) \big( f_j(x_j(t)) - f_j(\varphi_j(t)) \big) + \sum_{j=1}^{n} d_{ij}(t) \big( g_j(x_j(t - \tau)) - g_j(\varphi_j(t - \tau)) \big) + \sum_{j=1}^{n} p_{ij}(t) \Big( \int_{-\infty}^{t} k_{ij}(t - s) h_j(x_j(s)) \, ds - \int_{-\infty}^{t} k_{ij}(t - s) h_j(\varphi_j(s)) \, ds \Big).

Consider the Lyapunov function V = V_1 + V_2 + V_3, where

    V_1(t) = \sum_{i=1}^{n} |x_i(t) - \varphi_i(t)|,
    V_2(t) = \sum_{i=1}^{n} \sum_{j=1}^{n} L_j^g \bar{d}_{ij} \int_{t-\tau}^{t} |x_i(s) - \varphi_i(s)| \, ds,
    V_3(t) = \sum_{j=1}^{n} \sum_{i=1}^{n} L_j^h \bar{p}_{ij} \int_{0}^{+\infty} k_{ij}(s) \int_{t-s}^{t} |x_i(u) - \varphi_i(u)| \, du \, ds.

Let us calculate the upper right Dini derivative D^+ V(t) of V along the solution of the equation above. We get

    D^+ V_1(t) \le -\sum_{i=1}^{n} a_i |x_i(t) - \varphi_i(t)| + \sum_{i=1}^{n} \sum_{j=1}^{n} |c_{ij}(t)| \, |f_j(x_j(t)) - f_j(\varphi_j(t))| + \sum_{i=1}^{n} \sum_{j=1}^{n} |d_{ij}(t)| \, |g_j(x_j(t - \tau)) - g_j(\varphi_j(t - \tau))| + \sum_{i=1}^{n} \sum_{j=1}^{n} |p_{ij}(t)| \int_{-\infty}^{t} k_{ij}(t - s) |h_j(x_j(s)) - h_j(\varphi_j(s))| \, ds
    \le -\sum_{i=1}^{n} a_i |x_i(t) - \varphi_i(t)| + \sum_{i=1}^{n} \sum_{j=1}^{n} L_j^f \bar{c}_{ij} |x_i(t) - \varphi_i(t)| + \sum_{i=1}^{n} \sum_{j=1}^{n} \bar{d}_{ij} L_j^g |x_i(t - \tau) - \varphi_i(t - \tau)| + \sum_{j=1}^{n} \sum_{i=1}^{n} L_j^h \bar{p}_{ij} \int_{0}^{+\infty} k_{ij}(s) |x_i(t - s) - \varphi_i(t - s)| \, ds.

Otherwise,

    D^+ V_2(t) \le \sum_{i=1}^{n} \sum_{j=1}^{n} L_j^g \bar{d}_{ij} \big( |x_i(t) - \varphi_i(t)| - |x_i(t - \tau) - \varphi_i(t - \tau)| \big)

and

    D^+ V_3(t) \le \sum_{j=1}^{n} \sum_{i=1}^{n} L_j^h \bar{p}_{ij} \int_{0}^{+\infty} k_{ij}(s) \big( |x_i(t) - \varphi_i(t)| - |x_i(t - s) - \varphi_i(t - s)| \big) ds
    = \sum_{j=1}^{n} \sum_{i=1}^{n} L_j^h \bar{p}_{ij} |x_i(t) - \varphi_i(t)| - \sum_{j=1}^{n} \sum_{i=1}^{n} L_j^h \bar{p}_{ij} \int_{0}^{+\infty} k_{ij}(s) |x_i(t - s) - \varphi_i(t - s)| \, ds.

By using the inequality D^+(F_1 + F_2) ≤ D^+(F_1) + D^+(F_2), we get

    D^+ V(t) \le D^+ V_1(t) + D^+ V_2(t) + D^+ V_3(t)
    \le -\sum_{i=1}^{n} \Big( a_i - \sum_{j=1}^{n} \big( L_j^f \bar{c}_{ij} + \bar{d}_{ij} L_j^g + \bar{p}_{ij} L_j^h \big) \Big) |x_i(t) - \varphi_i(t)|
    = -\sum_{i=1}^{n} \alpha_i |x_i(t) - \varphi_i(t)| < 0

where α_i := a_i − \sum_{j=1}^{n} (L_j^f \bar{c}_{ij} + \bar{d}_{ij} L_j^g + \bar{p}_{ij} L_j^h) > 0 by assumption. By integrating the above inequality from t_0 to t, we get

    V(t) + \sum_{i=1}^{n} \alpha_i \int_{t_0}^{t} |x_i(s) - \varphi_i(s)| \, ds < V(t_0) < +\infty.

Clearly V(t) > 0; it follows that

    \limsup_{t \to +\infty} \int_{t_0}^{t} \alpha_i |x_i(s) - \varphi_i(s)| \, ds < V(t_0) < +\infty.

Note that x_i(·) is bounded on R^+. Therefore

    \lim_{t \to +\infty} |x_i(t) - \varphi_i(t)| = 0.

The proof of this theorem is completed.

Remark 4: The proof is similar to [6, Th. 5]. Notice that the pseudo almost periodicity is without importance in the proof of the above theorem; one only has to replace the time-varying delay τ(t) by a constant delay τ.

Remark 5: If we let f_j = g_j and p_{ij} = 0, (1) reduces to the model of [32]. Besides, in [33] the authors have analyzed the global stability of the system

    x_i'(t) = -a_i x_i(t) + \sum_{j=1}^{n} p_{ij}(t) \int_{-\infty}^{t} k_{ij}(t - s) h_j(x_j(s)) \, ds + J_i, \quad 1 \le i \le n

as a model for neural networks involving distributed time delays arising from signal propagation. Due to the difference in the methods discussed, the results in this paper and those in the above references are different. Therefore, our results are novel and have some significance in theory as well as in applications of almost-periodic oscillatory neural networks. On the other hand, a different approach is used in [6] to obtain several sufficient conditions for the existence of an almost-periodic solution for a class of RNNs similar to (1). Note that there the kernel k_{ij} is a piecewise continuous integrable function satisfying

    \int_0^{+\infty} k_{ij}(s) \, ds = 1 \quad \text{and} \quad \int_0^{+\infty} s \, k_{ij}(s) \, ds = 1.

It turns out that the last condition is not necessary for our study.

V. APPLICATION

In order to illustrate some features of our main results, in this section we apply them to a special three-dimensional system and demonstrate the efficiency of our criteria. Let us consider the following RNN:

    x_i'(t) = -a_i x_i(t) + \sum_{j=1}^{3} \Big( c_{ij}(t) f_j(x_j(t)) + d_{ij}(t) g_j(x_j(t - \tau)) + p_{ij}(t) \int_{-\infty}^{t} k_{ij}(t - s) h_j(x_j(s)) \, ds \Big) + J_i(t), \quad 1 \le i \le 3    (5)

where a_1 = 3, a_2 = 6, a_3 = 7 and, for all x ∈ R,

    f_j(x) = g_j(x) = h_j(x) = \frac{|x + 1| - |x - 1|}{2}.

The coefficient functions are

    (c_{ij}(t))_{1 \le i,j \le 3} = \begin{pmatrix} \frac{1}{5}\cos t + \frac{3}{10}\cos\sqrt{2}\,t & \frac{1}{5} & \frac{1}{10} \\ \frac{1}{5}\cos t + e^{-t^2\cos^2 t} & \frac{1}{5}\cos t + \frac{1}{10}\cos\sqrt{2}\,t & \frac{1}{10}\cos\sqrt{2}\,t \\ \frac{1}{5}\sin t + \sin\sqrt{2}\,t & \frac{1}{5}\cos t + \frac{1}{10}e^{-t^2\cos^2 t} & \frac{1}{5}\cos t \end{pmatrix}
    \Rightarrow (\bar{c}_{ij})_{1 \le i,j \le 3} = \begin{pmatrix} 0.5 & 0.2 & 0.1 \\ 1.2 & 0.3 & 0.1 \\ 1.2 & 0.3 & 0.2 \end{pmatrix}

    (d_{ij}(t))_{1 \le i,j \le 3} = \begin{pmatrix} \frac{1}{5}\cos t + \frac{1}{10}\cos\sqrt{2}\,t & \frac{1}{5}\cos t + \frac{1}{10}\cos\sqrt{2}\,t & \frac{1}{5}\cos t \\ \frac{1}{5}\cos t + \cos\sqrt{2}\,t & \frac{1}{5}\cos t + \frac{1}{10}\cos\sqrt{2}\,t & \frac{1}{10}\cos\sqrt{2}\,t \\ \frac{1}{5}\sin t + \sin\sqrt{2}\,t & \frac{1}{5}\cos t + \frac{1}{10}e^{-t^2\cos^2 t} & \frac{1}{10}\cos t \end{pmatrix}
    \Rightarrow (\bar{d}_{ij})_{1 \le i,j \le 3} = \begin{pmatrix} 0.3 & 0.3 & 0.2 \\ 1.2 & 0.3 & 0.1 \\ 1.2 & 0.3 & 0.1 \end{pmatrix}

    (p_{ij}(t))_{1 \le i,j \le 3} = \begin{pmatrix} \frac{1}{5}\cos t + \frac{3}{10}\cos\sqrt{2}\,t & \frac{1}{10}\cos\sqrt{2}\,t & \frac{3}{10}\cos\sqrt{2}\,t \\ \frac{1}{5}\cos t + \cos\sqrt{2}\,t & \frac{1}{5}\cos t + \frac{1}{10}\cos\sqrt{2}\,t & \frac{3}{10}\cos\sqrt{2}\,t \\ \frac{1}{2}\sin t + \sin\sqrt{2}\,t & \frac{1}{5}\cos t & \frac{1}{5}\cos t \end{pmatrix}
    \Rightarrow (\bar{p}_{ij})_{1 \le i,j \le 3} = \begin{pmatrix} 0.5 & 0.1 & 0.3 \\ 1.2 & 0.3 & 0.3 \\ 1.5 & 0.2 & 0.2 \end{pmatrix}

and the external inputs are

    J_i(t) = \begin{pmatrix} \frac{1}{5}\cos t + \frac{3}{10}\cos\sqrt{2}\,t + \frac{1}{2}e^{-t^2\cos^2 t} \\ \frac{1}{5}\cos t + \cos\sqrt{2}\,t \\ \frac{1}{5}\cos t + \frac{1}{2}e^{-t^2\cos^2 t} \end{pmatrix}
    \Rightarrow (\bar{J}_i)_{1 \le i \le 3} = \begin{pmatrix} 1 \\ 1.2 \\ 0.7 \end{pmatrix} \Rightarrow \beta = \frac{1}{3}.

[Fig. 1. Curve of the pseudo almost-periodic solution for RNNs with time-varying coefficients and mixed delays (τ = 2, a_1 = 3, a_2 = 6, and a_3 = 7); the components x_1, x_2, x_3 are plotted against time t over [0, 1500].]

Since the chosen activation is 1-Lipschitz, L_j^f = L_j^g = L_j^h = 1 and

    r = \max_{1 \le i \le 3} \frac{\sum_{j=1}^{3} \big( \bar{c}_{ij} L_j^f + \bar{d}_{ij} L_j^g + \bar{p}_{ij} L_j^h \big)}{a_i} = \max_{1 \le i \le 3} \frac{\sum_{j=1}^{3} \big( \bar{c}_{ij} + \bar{d}_{ij} + \bar{p}_{ij} \big)}{a_i} = \max(0.833, 0.833, 0.742) < 1.
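The constants r and β can be double-checked directly from the sup matrices above. The short script below is a convenience check added here (with L_j^f = L_j^g = L_j^h = 1, since the chosen activation is 1-Lipschitz); it reproduces the reported values up to rounding.

```python
import numpy as np

a = np.array([3.0, 6.0, 7.0])
c_bar = np.array([[0.5, 0.2, 0.1], [1.2, 0.3, 0.1], [1.2, 0.3, 0.2]])
d_bar = np.array([[0.3, 0.3, 0.2], [1.2, 0.3, 0.1], [1.2, 0.3, 0.1]])
p_bar = np.array([[0.5, 0.1, 0.3], [1.2, 0.3, 0.3], [1.5, 0.2, 0.2]])
J_bar = np.array([1.0, 1.2, 0.7])
Lf = Lg = Lh = 1.0   # Lipschitz constants of f(x) = (|x+1| - |x-1|)/2

ratios = (c_bar * Lf + d_bar * Lg + p_bar * Lh).sum(axis=1) / a
r = ratios.max()
beta = (J_bar / a).max()

print("per-neuron ratios:", np.round(ratios, 3))   # ~ [0.833, 0.833, 0.743]
print("r =", round(r, 3), "< 1;  beta =", round(beta, 3),
      ";  ball radius r*beta/(1-r) =", round(r * beta / (1 - r), 3))
```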

Therefore, all conditions of Theorem 2 are satisfied, so the delayed RNN (5) has a unique pseudo almost-periodic solution in the region (see Fig. 1)

    B = B(\varphi_0, r) = \Big\{ \varphi \in PAP(\mathbb{R}, \mathbb{R}^3) : \|\varphi - \varphi_0\| \le \frac{r\beta}{1 - r} \Big\}.

VI. CONCLUSION

We remark that, in nature, there is no phenomenon that is purely periodic, and this motivates considering almost-periodic oscillation and pseudo almost-periodic situations. So, in this paper, some novel sufficient conditions were presented ensuring the existence and uniqueness of the pseudo almost-periodic solution for RNNs with time-varying coefficients and mixed delays. All criteria were found without assuming that the networks have almost-periodic or pseudo almost-periodic activation functions; the only restriction on the activation functions is the Lipschitz property. We claim that many results in the literature dealing with periodic or almost-periodic solutions of RNNs are special cases of the results in this paper. To the best of our knowledge, this is the first paper considering pseudo almost-periodic RNNs. Notice that the method of this paper may be extended to study some other systems. Finally, an illustrative example was given to demonstrate the effectiveness of the obtained results.

REFERENCES

[1] B. Cannas, S. Cincotti, M. Marchesi, and F. Pilo, "Learning of Chua's circuit attractors by locally recurrent neural networks," Chaos, Solitons Fractals, vol. 12, no. 11, pp. 2109–2115, Sep. 2001.
[2] Q. Dong, K. Matsui, and X. Huang, "Existence and stability of periodic solutions for Hopfield neural network equations with periodic input," Nonlin. Anal.: Theory Methods Appl., vol. 49, no. 4, pp. 471–479, May 2002.
[3] Z. Gui, W. Ge, and X. Yang, "Periodic oscillation for a Hopfield neural networks with neutral delays," Phys. Lett., vol. 364, nos. 3–4, pp. 267–273, Apr. 2007.
[4] J. Hopfield, "Neurons with graded response have collective computational properties like those of two-state neurons," Proc. Nat. Acad. Sci. United States Amer., vol. 81, no. 10, pp. 3088–3092, May 1984.
[5] W. Allegretto, D. Papini, and M. Forti, "Common asymptotic behavior of solutions and almost periodicity for discontinuous, delayed, and impulsive neural networks," IEEE Trans. Neural Netw., vol. 21, no. 7, pp. 1110–1125, Jul. 2010.
[6] X. Huang, J. Cao, and D. W. C. Ho, "Existence and attractivity of almost periodic solution for recurrent neural networks with unbounded delays and variable coefficients," Nonlin. Dyn., vol. 45, nos. 3–4, pp. 337–351, 2006.
[7] C. Li and X. Liao, "New algebraic conditions for global exponential stability of delayed recurrent neural networks," Neurocomputing, vol. 64, pp. 319–333, Mar. 2005.
[8] Y. Liu, Z. Wang, and X. Liu, "Global exponential stability of generalized recurrent neural networks with discrete and distributed delays," Neural Netw., vol. 19, no. 5, pp. 667–675, Jun. 2006.
[9] Q. Song, "Novel criteria for global exponential periodicity and stability of recurrent neural networks with time-varying delays," Chaos Solitons Fractals, vol. 36, no. 3, pp. 720–728, May 2008.
[10] Q. Song, "Exponential stability of recurrent neural networks with both time-varying delays and general activation functions via LMI approach," Neurocomputing, vol. 71, nos. 13–15, pp. 2823–2830, Aug. 2008.
[11] A. M. Schäfer and H. Zimmermann, "Recurrent neural networks are universal approximators," in Artificial Neural Networks (Lecture Notes in Computer Science), vol. 4131. New York: Springer-Verlag, 2006, pp. 632–640.
[12] P. E. Sottas, "Temporal sequence learning with non-equilibrium recurrent neural networks," M.S. thesis, Faculté Inf. Commun., École Polytechnique Fédérale de Lausanne, Ecublens, Switzerland, 2002.
[13] J. Ge and J. Xu, "Computation of synchronized periodic solution in a BAM network with two delays," IEEE Trans. Neural Netw., vol. 21, no. 3, pp. 439–450, Mar. 2010.
[14] Y. Liu, Z. Wang, J. Liang, and X. Liu, "Stability and synchronization of discrete-time Markovian jumping neural networks with mixed mode-dependent time delays," IEEE Trans. Neural Netw., vol. 20, no. 7, pp. 1102–1116, Jul. 2009.
[15] J. Cao, H. Hang, and J. Wang, "Global exponential stability and periodic solutions of recurrent neural networks with delays," Phys. Lett. A, vol. 298, nos. 5–6, pp. 393–404, 2002.
[16] J. Cao, A. Chen, and X. Huang, "Almost periodic attraction of delayed neural networks with variable coefficients," Phys. Lett., vol. 340, nos. 1–4, pp. 104–120, 2005.
[17] H. Huang, J. Cao, and J. Wang, "Global exponential stability and periodic solutions of recurrent neural networks with delays," Phys. Lett. A, vol. 298, nos. 5–6, pp. 393–404, Jun. 2002.
[18] B. Liu and L. Huang, "Existence and exponential stability of almost periodic solutions for Hopfield neural networks with delays," Neurocomputing, vol. 68, pp. 196–207, Oct. 2005.
[19] Z. Liu, A. Chen, J. Cao, and L. Huang, "Existence and global exponential stability of almost periodic solutions of BAM neural networks with continuously distributed delays," Phys. Lett. A, vol. 319, nos. 3–4, pp. 305–316, Dec. 2003.
[20] B. Liu, "Almost periodic solutions for Hopfield neural networks with continuously distributed delays," Math. Comput. Simul., vol. 73, no. 5, pp. 327–335, Jan. 2007.
[21] H. Zhao, "Existence and global attractivity of almost periodic solution for cellular neural network with distributed delays," Appl. Math. Comput., vol. 154, no. 3, pp. 683–695, Jul. 2004.
[22] L. Wang, W. Lu, and T. Chen, "Multistability and new attraction basins of almost-periodic solutions of delayed neural networks," IEEE Trans. Neural Netw., vol. 20, no. 10, pp. 1581–1593, Oct. 2009.
[23] H. Xiang and J. Cao, "Almost periodic solutions of recurrent neural networks with continuously distributed delays," Nonlin. Anal.: Theory, Methods Appl., vol. 71, no. 12, pp. 6097–6108, Dec. 2009.
[24] L. Amerio and G. Prouse, Almost-Periodic Functions and Functional Equations. New York: Van Nostrand, 1971.
[25] F. Chérif, "A various types of almost periodic functions on Banach spaces: Part I," Int. Math. Forum, vol. 6, no. 19, pp. 921–952, 2011.
[26] F. Chérif, "A various types of almost periodic functions on Banach spaces: Part II," Int. Math. Forum, vol. 6, no. 20, pp. 953–985, 2011.
[27] A. M. Fink, Almost Periodic Differential Equations (Lecture Notes in Mathematics), vol. 377. New York: Springer-Verlag, 1974.
[28] C. Zhang, "Pseudo almost periodic functions and their applications," Ph.D. thesis, Dept. Math., Univ. Western Ontario, London, ON, Canada, 1992.
[29] C. Zhang, Almost Periodic Type Functions and Ergodicity. Beijing, China: Kluwer, 2003.
[30] B. Amir and L. Maniar, "Composition of pseudo almost periodic functions and Cauchy problems with operator of non dense domain," Ann. Math. Blaise Pascal, vol. 6, no. 1, pp. 1–11, 1999.
[31] H. Li, F. Huang, and J.-Y. Li, "Composition of pseudo almost-periodic functions and semilinear differential equations," J. Math. Anal. Appl., vol. 255, no. 2, pp. 436–446, Mar. 2001.
[32] J. Cao, "New results concerning exponential stability and periodic solutions of delayed cellular neural networks," Phys. Lett. A, vol. 307, nos. 2–3, pp. 136–147, 2003.
[33] K. Gopalsamy and X. Z. He, "Delay-independent stability in bidirectional associative memory networks," IEEE Trans. Neural Netw., vol. 5, no. 6, pp. 998–1002, Nov. 1994.

Boudour Ammar (S'08) was born in Sfax, Tunisia, in 1982. She received the Graduate degree in computer science and the Masters degree in automatic and industrial computing from the National School of Engineers of Sfax, University of Sfax, Sfax, in 2005 and 2006, respectively. She has been pursuing the Ph.D. degree in learning systems and robotics with the Research Group on Intelligent Machines, University of Sfax, since 2008.

Farouk Chérif received the Ph.D. degree in mathematics from University Paris 1 Panthéon-Sorbonne, Paris, France, in 1995. He was a Research Associate at CERMCEM, University Paris 1 Panthéon-Sorbonne, from August 1992 to December 1998. He is currently an Associate Professor with the Department of Mathematics and Scientific Computing, Military Academy, Sousse, Tunisia. His current research interests include neural network-based controls, oscillatory phenomena, Lyapunov exponents, and Hamiltonian systems.


Adel M. Alimi (SM'00) was born in Sfax, Tunisia, in 1966. He received the Graduate degree in electrical engineering in 1990 and the Ph.D. and H.D.R. degrees in electrical and computer engineering in 1995 and 2000, respectively. He is currently a Professor with the Department of Electrical and Computer Engineering, University of Sfax, Sfax. His current research interests include applications of intelligent methods, neural networks, fuzzy logic, evolutionary algorithms, pattern recognition, robotic systems, vision systems, industrial processes, intelligent pattern recognition, learning, analysis, and intelligent control of large-scale complex systems.

Prof. Alimi is an Associate Editor and Editorial Board Member of many international scientific journals, including the IEEE Transactions on Fuzzy Systems, Pattern Recognition Letters, Neurocomputing, Neural Processing Letters, the International Journal of Image and Graphics, Neural Computing and Applications, the International Journal of Robotics and Automation, and the International Journal of Systems Science. He was a Guest Editor of several special issues of international journals, including Fuzzy Sets and Systems, Soft Computing, the Journal of Decision Systems, Integrated Computer Aided Engineering, and Systems Analysis Modeling and Simulations. He is the Founder and the Chair of many IEEE Chapters in the Tunisia Section, the IEEE Sfax Subsection Chair in 2011, the IEEE ENIS Student Branch Counselor in 2011, the IEEE Systems, Man, and Cybernetics Society Tunisia Chapter Chair in 2011, and the IEEE Computer Society Tunisia Chapter Chair in 2011. He is an Expert Evaluator for the European Agency for Research. He was the General Chairman of the International Conference on Machine Intelligence ACIDCA-ICMI'2005 and 2000.
