The Key to AGI: Aspect-Oriented

TagtalLabs
23 min read · Sep 30, 2023


Aspect-Oriented: a Candidate for the Biologically Inspired Programming Paradigm for Neural Networks and Evolvable Software

Abstract

We observe that biological nervous systems handle “non-orthogonal concerns” more effectively than the Object-Oriented paradigm does. This natural phenomenon suggests a programming paradigm for handling “cross-cutting”. We argue that the Aspect-Oriented paradigm is a candidate for such a biologically inspired programming paradigm. To support this point, we use the Aspect-Oriented paradigm to implement a simple Artificial Neural Network (ANN), and the preliminary experiment shows good results. In addition, we propose a biologically inspired framework for evolvable software that could be implemented using the Aspect-Oriented paradigm combined with neurocomputing and genetic computing.

Key words: biologically-inspired computing, programming paradigm, aspect-oriented programming, object-oriented, evolvable software

1 Introduction

Neural networks are massively parallel processing systems that require expensive and usually unavailable hardware in order to be realized. Instead of hardware implementation, software simulation is widely used in research and industry for its low cost and flexibility, and Object-Oriented Programming (OOP) languages are the usual choice for implementing artificial neural networks. Artificial neural networks implemented with Object-Oriented (OO) technology, however, provide little support for changing the size and topology of the network at runtime [1,2]. Furthermore, the parallelism of artificial neural networks is not friendly to programmers, who must manually transform the parallel neural network into serially executed code, since an OO program is in essence a collection of dialogues among a group of objects. This manual transformation is a complex process, and the resulting implementations are often hard to reuse. We argue that these problems in the implementation of artificial neural networks can be solved by using Aspect-Oriented Programming (AOP).

Our research reveals that biological nervous systems handle “non-orthogonal concerns” (cross-cutting) effectively. This natural phenomenon inspires a programming paradigm for neural networks and evolvable software that can handle non-orthogonal concerns as effectively as biological nervous systems do. As far as we know, there has been no discussion of this phenomenon.

Moreover, we believe that the AO paradigm is a candidate for the desired paradigm, since it partly satisfies the considerations required to implement the desired paradigm and bears a natural analogy to nervous systems. To verify this idea, we implement a simple artificial neural network that solves the MONK’s problem using AOP.

Nevertheless, the simulation of neural networks is not the only topic here. The more interesting topic in this paper is how to apply artificial neural networks in evolvable software development, although this idea is still in its infancy.

The next section gives brief reviews of separation of concerns and AOP. The biologically-inspired programming paradigm is discussed in detail in section 3. The implementation of an ANN using the AO paradigm is presented in section 4. We propose a biologically-inspired framework for evolvable software in section 5. Section 6 concludes.

2 Reviews on the Principle of Separation of Concerns and AOP

One of the approaches to solving complex problems is to divide the problem into smaller, simpler, and loosely coupled sub-problems; this is called the separation of concerns principle [3]. The principle acknowledges that we cannot handle many problems at once and should deal with them one by one, and that each important problem should be represented intentionally (clearly and declaratively) and should be localized. In this way we gain intelligibility, adaptability, reusability, and many other important qualities of software systems, since the degree to which software requirements are satisfied can be conveniently verified with such intentional representation and localization. Nevertheless, the separation of concerns principle gives no guidance on how to separate concerns. Many methods have been proposed over the past decades, such as the procedure-oriented and OO methods, to effectively handle orthogonal concerns. Unfortunately, many concerns are non-orthogonal. As a result, there is massive redundancy when we try to represent all of these problems intentionally and locally with these methods.

Aspect-Oriented Programming (AOP) [4] provides a mechanism for handling non-orthogonal concerns by modularizing cross-cutting, augmenting the Object-Oriented programming paradigm with Aspects, join points, Pointcuts, and Advice [5]. AOP is based on the idea that computer systems are better programmed by separately specifying the various concerns and properties, or areas of interest, of a system and some description of their relationships, and then relying on instruments in the AOP environment to weave or compose them together into a coherent program [6]. AOP does what OOP cannot do effectively: it clearly and cleanly modularizes functional system code by separating concerns into well-localized units, called aspects, to eliminate code tangling.
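The mechanics of advice can be sketched even without an AspectJ weaver. The following plain-Java fragment (the names Greeter, SimpleGreeter, and withLogging are illustrative, not from any cited system) uses a dynamic proxy to attach “before” and “after” advice to every method of a target object, which is the essence of modularizing a cross-cutting concern such as logging:

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;
import java.util.ArrayList;
import java.util.List;

public class AdviceSketch {
    // The core concern: a plain interface and implementation.
    interface Greeter { String greet(String name); }
    static class SimpleGreeter implements Greeter {
        public String greet(String name) { return "hello " + name; }
    }

    // The cross-cutting concern (logging) lives in one place,
    // instead of being tangled into every method body.
    static List<String> log = new ArrayList<>();

    static Greeter withLogging(Greeter target) {
        InvocationHandler h = (proxy, method, args) -> {
            log.add("before " + method.getName());   // "before" advice
            Object result = method.invoke(target, args);
            log.add("after " + method.getName());    // "after" advice
            return result;
        };
        return (Greeter) Proxy.newProxyInstance(
            Greeter.class.getClassLoader(), new Class<?>[]{Greeter.class}, h);
    }

    public static void main(String[] args) {
        Greeter g = withLogging(new SimpleGreeter());
        System.out.println(g.greet("world"));
        System.out.println(log);
    }
}
```

Note that SimpleGreeter contains no logging code at all; the advice is attached from outside, which is the property AOP generalizes with pointcuts and weaving.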

OO technology became popular because it synthesizes three important factors of software development: the computer platform, human ways of thinking, and the nature of the problems at hand. The basic elements of the OO paradigm are objects and the interactions between them. This paradigm is in harmony with the human thought process and the rules of the natural world, so we can map the problem space directly to the design space, and then to the programming space [7]. It should be pointed out, however, that the rules of nature that inspired the OO paradigm are the rules of mechanistic nature. We can create new software technology, inspired by organic nature, that handles non-orthogonal concerns more effectively than OO does [12].

3 The Biologically-inspired Programming Paradigm and its Candidate

3.1 The biologically-inspired paradigm to handle cross-cutting

We observe that the nervous systems handle “cross-cutting” in their body, which inspires us with a programming paradigm to handle cross-cutting in software systems.

3.1.1 How does nature handle “cross-cutting”?

Communication between objects in the Object-Oriented paradigm is very close to intercellular communication in primitive multicellular animals without a nervous system, such as sponges, in which cells communicate directly via gap junctions, each knowing exactly which cell it is communicating with. Sponges have no true nervous system (only a kind of neuron without synapses). Their responses to stimulation are regional and slow, and their strength depends on the strength of the stimulus. Information is transmitted by diffusion of gelatinous materials, by dissociated amoebocytes, and by contact between fixed cells. With this mechanism, sponges can perform only very simple behaviors.

Although, from the viewpoint of nature, Object-Oriented technology is quite primitive, we have built with it some of the most complex systems, which are often the hardest for humans to understand and maintain. Complexity becomes a prominent problem as the scale of software systems increases, so we must find new methods to address it. One contributor to this problem is non-orthogonal concerns (i.e. cross-cutting), which the OO paradigm does not handle effectively.

Nature handles this effectively. Natural evolution did not stop at the sponge; more complex animals with nervous systems appeared. The first nervous system is found in coelenterates (such as Hydra). Neurons partially separated from effectors and formed a neural net without a nerve center, in which nerve impulses are broadcast throughout the body to stimulate all of the muscle cells and produce simple behaviors. Since these neurons conduct impulses without direction, such a neural net is called a diffuse nervous system. But animals need better maneuverability to obtain energy and survive in a competitive world, and so a true nervous system appeared in annelids (such as the earthworm). The earthworm is made up of segments formed by subdivisions that partially transect the body cavity. Each segment contains elements of body systems such as the circulatory, nervous, and excretory tracts. Metamerism increases the efficiency of body movement by allowing the effect of muscle contraction to be extremely localized, and it makes possible the development of greater complexity in general body organization. It is very interesting that the concerns of movement are separated by nature! As a result, the segments are cohesive and loosely coupled.

The most important inspiration from the process of natural evolution described above is that complex behaviors are not accomplished by the muscle cells of the segments communicating with one another directly, but by central control from the ganglia (nerve centers). Obviously, this reduces the complexity of the animal; otherwise, for example, each muscle cell of a segment would have to contain the same “movement logic”. We could say that nature encapsulates the “cross-cutting” into the ganglion (neuron). This phenomenon is even more common in vertebrates (Figure 1). The extreme example is the cortex, in which the most complex “cross-cuttings”, such as emotion and the ability to learn, are handled very well.

3.1.2 Inspirations of nervous system

How can we improve the Object-Oriented paradigm to implement “nervous systems” for large-scale software systems that deal with cross-cutting? Let us look at the nervous system in more detail. Nervous systems are made of neurons. Although nervous systems vary enormously in structure and function, neurons function similarly across animals, including humans. A typical neuron has four morphologically defined regions: the cell body, dendrites, the axon, and the presynaptic terminals (Figure 2).

Fig. 1. An example of “cross-cutting” in the nervous system: synchronization of limbs in walking.

Fig. 2. The structure of neurons.

The axon is the main conducting unit of the neuron, capable of conveying electrical signals along distances ranging from as short as 0.1 mm to as long as 2 m, thereby conveying information to different targets. Near its end, the axon divides into fine branches that make contact with other neurons or effectors. The point of contact is known as a synapse. The cell transmitting a signal is the presynaptic cell, and the cell receiving the signal is the postsynaptic cell.

Ramón y Cajal pointed out in his principle of connectional specificity that the connections between neurons, and between neurons and effectors, are specified. The principle states that each cell communicates with certain postsynaptic target cells, but not with others, and always at specialized points of synaptic contact. A specified connection is similar to direct intercellular communication (by gap junctions or by plasma-membrane-bound molecules). But specified connections can change owing to synaptic plasticity. For example, long-term habituation and sensitization involve structural changes in the presynaptic terminals of sensory neurons: long-term habituation leads to a loss of synapses, and long-term sensitization to an increase (Figure 3). Thus, nervous systems, in which “cross-cutting” is modularized, can learn and evolve in a changing environment.

Fig. 3. Long-term habituation leads to a loss of synapses, and long-term sensitization to an increase. (Panels: 1. Control; 2. Long-term habituation; 3. Long-term sensitization; sensory and motor neurons shown.) [10]

If we draw an analogy between objects in the Object-Oriented paradigm and effectors in an animal’s body, we can see that the features of intercellular communication through synapses, which are the key to handling “cross-cutting”, do not exist in the Object-Oriented paradigm, where communication is done by point-to-point method calls similar to the gap-junction communication mentioned above.

From the discussion above, we derive several considerations for augmenting the Object-Oriented paradigm to implement the “nervous system” of large-scale software systems and deal with cross-cutting:

1) add a new communication mechanism similar to synaptic communication;

2) differentiate the “neuron” from the “effector” (object);

3) add a mechanism supporting “synaptic plasticity”.
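The difference between the gap-junction style and the synapse style of communication can be caricatured in plain Java (the names JoinPoint, Muscle, and synapse are illustrative only): instead of a caller invoking its target directly, an effector merely fires its exposed “join point”, and whatever neurons have synapsed onto it react, without the effector knowing them:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class SynapseSketch {
    // Synapse-style channel: the effector exposes a firing point
    // and does not know who listens (cf. join points / pointcuts).
    static class JoinPoint {
        private final List<Consumer<Boolean>> synapses = new ArrayList<>();
        void synapse(Consumer<Boolean> advice) { synapses.add(advice); }
        void fire(boolean signal) { synapses.forEach(a -> a.accept(signal)); }
    }

    // The effector: contains no reference to any "neuron".
    static class Muscle {
        final JoinPoint onStimulus = new JoinPoint();
        boolean contracted = false;
    }

    public static void main(String[] args) {
        Muscle m = new Muscle();
        // The "neuron" attaches its behavior from outside the Muscle class.
        m.onStimulus.synapse(signal -> m.contracted = signal);
        m.onStimulus.fire(true);
        System.out.println(m.contracted);
    }
}
```

This satisfies consideration 1) in miniature; AOP additionally moves the attachment itself out of the program flow and into declarative pointcuts, addressing consideration 2).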

Up to now, we observe that the Aspect-Oriented paradigm is a candidate for the desired paradigm. We discuss this point in the next section.

3.2 A candidate for the desired paradigm: Aspect-Oriented

As an extension of the Object-Oriented paradigm, AOP adds the Aspect, PointCut, and join point to the Object-Oriented paradigm [11]. The Aspect has a natural analogy to the nerve cells of an animal’s nervous system, and an Aspect can be modeled with a simplified neural model (Figure 4).

Fig. 4. Metaphor between AOP and the nervous system. Note that join points defined in an object or aspect correspond to the membrane receptors of a cell. A join point can be a method call, a method execution, etc. [11], and it is through join points that a PointCut synapses with an object or aspect.

Aspect(Neuron)

The Aspect is the “neuron” differentiated from the “effector” (object): a modularized cross-cutting concern, i.e. a module in which cross-cutting is encapsulated. It consists of join points, PointCuts, and Advice. Join points connect with an Advice through the PointCut they belong to and expose the cross-cutting logic implemented in the Advice. In other words, the Advice is surrounded by a “wall” of join points (Figure 5), so the only way to make the Advice function is to activate one of the join points in the wall. Thus, cross-cutting that cannot be handled effectively by the OO paradigm is encapsulated in an Aspect. An Aspect has some features similar to those of a Class. First, an Aspect is cohesive because join points can only activate certain Advice. Second, an Aspect is loosely coupled with the rest of the system because it communicates with other elements only through the join points in the wall. [12]

PointCut(Axon)

Together with join points, the PointCut adds a new communication mechanism into the OO paradigm, similar to synaptic communication. The PointCut is the main conducting unit of the Aspect, capable of obtaining the context of the target objects synapsed by its join points (we also call this the context of the join point). A PointCut, connecting to at least one Advice, may have

Fig. 5. Another representation of Aspect and Class[13]

several join points, which are program execution points, similar to the membrane receptors of cells, defined in an object or aspect. A join point can be a method call, a method execution, etc., and it is through join points that a PointCut synapses with an object or aspect. Whereas the direction of impulse transmission in the axon of most neurons is always from the base of the axon to its terminals, information in a PointCut is transmitted bidirectionally, i.e. an Aspect can pass information both from and to its targets. We call a PointCut that passes information from the target to the Aspect it belongs to a dendrite.

Advice(Signal Transduction)

Advice, similar to the signal-transduction logic in a neuron, is the logic that implements cross-cutting. It can change the behavior of the target objects synapsed by the join points of its own PointCut, and it is activated by join points. The activated Advice performs operations such as before(), around(), and after() in the context of the join point to implement the cross-cutting. For any object, the Advice is inaccessible; in other words, an object cannot call the operations of an Advice, which guarantees the modularity of cross-cutting.

3.3 Sample: a Simplest Nervous Net

The simplest neural network consists of three cells: a sensory neuron connected to a motor neuron connected to a muscle cell, and it can be mapped onto the neuroscience-inspired model described above (Figure 6).

Fig. 6. A simplest nervous net

1  class Effector {
2    boolean state, transmitter;
3    private boolean receptor() {
4      boolean transmitter_received = false;
5      return transmitter_received;
6    }
7    public boolean emit() {
8      transmitter = state;
9      return transmitter;
10   }
11   public void contract() {
12     //to contract and change the state…
13   }
14   public void relax() {
15     //to relax and change the state…
16   }
17   public void go() {
18     if (receptor()) contract();
19     else relax();
20   }
21 }

The sensory neuron of this neuronal network is implemented as an aspect with two pointcuts, axon() and dendrite(), analogous to the corresponding parts of a neuron. The pointcuts differ in the direction of the information transmitted: axon() transmits information (the transmitter to release) from SensoryNeuron to another aspect or object (in this case, MotorNeuron), and dendrite() the reverse. The Advice bound to dendrite(), similar to signal transduction in a cell, receives the information emitted by Effector (line 9), and then the Advice of axon() decides what information to transmit to the target; in this simple case, the received transmitter is transmitted on directly.

1  aspect SensoryNeuron {
2    boolean transmitter_to_release, transmitter_received;
3    pointcut axon(): within(MotorNeuron) && execution(* receptor(..));
4    pointcut dendrite():
5      within(Effector) && execution(* emit(..));
6    boolean around(): axon() {
7      return transmitter_received;
8    }
9    after() returning(boolean x): dendrite() {
10     transmitter_received = x;
11   }
12 }

MotorNeuron is an aspect similar to SensoryNeuron, with two differences. First, MotorNeuron has only one pointcut, axon(), which transmits orders to the Effector according to a certain logic (line 8). Second, MotorNeuron has a membrane receptor() (line 4), similar to that of Effector, which receives the transmitter emitted by SensoryNeuron.

1  aspect MotorNeuron {
2    boolean transmitter_to_release, transmitter_received;
3    pointcut axon(): within(Effector) && execution(* receptor(..));
4    private boolean receptor() {
5      boolean transmitter_received = false;
6      return transmitter_received;
7    }
8    boolean around(): axon() {
9      transmitter_received = receptor();
10     //Motor Neuron received signal = transmitter_received
11     if (transmitter_received) transmitter_to_release = true;
12     else transmitter_to_release = false;
13     //Motor Neuron releases transmitter = transmitter_to_release
14     return transmitter_to_release;
15   }
16 }

4 Applying AO paradigm in ANN

We use AOP to implement an ANN to support one of our points, namely that the way AOP deals with cross-cutting is similar to that of real neural nets, which implies that we could apply ANN theory to large-scale AO systems, in which an ANN is a by-product. ANNs can be implemented using other programming paradigms, but those implementations have the shortcomings mentioned in section 1.

With the paradigm described above, artificial neural networks can be simulated easily. To illustrate this, we present a simple example: a three-tier BP neural network that solves the MONK’s problem I [14].

The MONK’s problems rely on an artificial domain in which robots are described by six different attributes:

x1: head_shape ∈ {round, square, octagon}

x2: body_shape ∈ {round, square, octagon}

x3: is_smiling ∈ {yes, no}

x4: holding ∈ {sword, balloon, flag}

x5: jacket_color ∈ {red, yellow, green, blue}

x6: has_tie ∈ {yes, no}

The learning task is a binary classification task. Each problem is given by a logical description of a class. Robots either belong to this class or not, but instead of providing a complete class description to the learner, only a subset of all 432 possible robots, together with their classifications, is given. The learning task is then to generalize over these examples and, if the particular learning technique at hand allows it, to derive a simple class description. The MONK’s problems comprise three tasks; one of them, the MONK’s problem I, is described below. Figure 7 illustrates the network architecture used for MONK’s problem I. Each unit in the Hidden1 layer represents one of the attributes.

Problem 4.1 (Monk1): (head_shape = body_shape) or (jacket_color = red). Of the 432 possible examples, 124 were randomly selected for the training set. No noise is present.
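The Monk1 target concept is simple enough to state directly in code; the following sketch makes the classification rule explicit (the integer encoding of attribute values is an assumption on our part, loosely following the common convention of numbering attribute values from 1, with red as the first jacket color):

```java
public class Monk1 {
    // Attribute values encoded as small integers:
    // head_shape/body_shape in {1,2,3} (round, square, octagon),
    // jacket_color in {1,2,3,4} where 1 = red.
    static boolean monk1(int headShape, int bodyShape, int jacketColor) {
        return headShape == bodyShape || jacketColor == 1;
    }

    public static void main(String[] args) {
        System.out.println(monk1(1, 1, 3)); // same shapes -> in class
        System.out.println(monk1(2, 3, 1)); // red jacket  -> in class
        System.out.println(monk1(2, 3, 4)); // neither     -> not in class
    }
}
```

The network described below is asked to recover exactly this rule from 124 labeled examples.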


Fig. 7. The network architecture for the MONK’s Problem I.[15]

Fig. 8. The aspect-oriented design model of the network in Fig. 7

Each aspect of the network represents a neuron (we call this an Aspect-Oriented Neural Network (AONN)). The learning algorithm is a typical cross-cutting concern, which should be modeled as an aspect whose pointcuts cross-cut every aspect that needs to learn (in this case, every aspect in all layers except Input). For simplicity, it is not shown in the figure.

The architecture in Figure 7 is mapped into the AO design space as shown in Figure 8. Using the programming technique described in section 3.3, the design model can be implemented easily. We implement the neuron as an abstract aspect with one pointcut, dendrite().

1  public abstract aspect Neuron {
2    //some definitions
3    …
4    //n is the number of inputs of the neuron
5    protected int n;
6    protected float[] input;
7    protected float[] weight;
8    protected Params params; //e.g. weights and threshold of the neuron
9    protected ID id; //identity of the neuron
10   protected OutPut output; //transmitter
11   abstract pointcut dendrite();
12   after() returning(OutPut out): dendrite() {
13     if (!init) init();
14     input[out.getId().getNumber()] = out.getValue();
15     counter++;
16     if (!(counter < n)) {
17       counter = 0;
18       getOutPutWrapper();
19     }
20   }
21   public ID getID() {
22     return id;
23   }
24   /* The *Wrapper() methods just wrap the corresponding *OutPut(float net)
25      methods so that the dendrite() pointcut picks out join points within
26      the correct aspect (see their implementations). This is a trick. */
28   public abstract void init();
29   public abstract OutPut getOutPutWrapper();
30   public abstract OutPut getOutPut(float net);
31   public abstract float getNetInput();
32   public abstract Params getParamsWrapper();
33   public abstract Params getParams(ID id);
34 }

The abstract aspect SigmoidNeuron partially implements Neuron with a sigmoid activation function (line 6).

1  public abstract aspect SigmoidNeuron
2    extends Neuron {
3    //some definitions
4    …
5    //sigmoid activation function
6    public OutPut getOutPut(float net) {
7      //calculate the output with the sigmoid function and return it
8      …
9    }
10   public float getNetInput() {
11     //this execution point will be captured by the learning aspect
12     Params tempParams = getParamsWrapper();
13     //compute and return the net value
14     …
15   }
16   public Params getParams(ID id) {
17     return params;
18   }
19 }
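The elided body of getOutPut (line 6 above) can be filled in with the standard logistic function; a minimal stand-alone version in plain Java (a sketch of the activation function only, not the aspect itself):

```java
public class Sigmoid {
    // Standard logistic activation: 1 / (1 + e^(-net)).
    static double sigmoid(double net) {
        return 1.0 / (1.0 + Math.exp(-net));
    }

    public static void main(String[] args) {
        System.out.println(sigmoid(0.0));   // 0.5 at zero net input
        System.out.println(sigmoid(10.0));  // saturates toward 1
        System.out.println(sigmoid(-10.0)); // saturates toward 0
    }
}
```

The sigmoid is the conventional choice for BP networks because it is differentiable, with the convenient derivative out * (1 - out) used by the learning aspect below.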

We implement the neurons as aspects and name them according to a simple rule. The name of a neuron is composed of two parts: the order name and the group name. We group neurons by the target they synapse onto; neurons with the same target are in the same group (for example, the neurons of hidden layer 2 are in Group 7), so every neuron has a group id. The position of the neuron within its group (from left to right) gives the order name, such as Neuron0, Neuron1, etc. (see Figure 8).

1  aspect Neuron0Group7
2    extends SigmoidNeuron {
3    pointcut dendrite(): within(*Group0) && call(* getOutPut(..));
4    public void init() {
5      //initialize this neuron and set the flag
6      …
7    }
8    public OutPut getOutPutWrapper() {
9      return getOutPut(getNetInput());
10   }
11   public Params getParamsWrapper() {
12     return getParams(id);
13   }
14 }

There are some differences among the neurons. First, the neurons in the input layer are the sensory neurons described in section 3.3: their pointcut dendrite() picks up the join points in the effectors, which in this case are pattern generators (POJOs). Second, we add a new method finalOutPut(ID id) to the neurons in the output layer, through which the learning aspect gets the final output without knowing the scale of the net. Once the pattern generators have produced all their data, the output aspect calls finalOutPut(ID id) to output the result. The net is then ready to run together with a simple main class, and if the parameters are assigned correctly it will surely return the right results for all patterns. But we want it to find the right parameters by itself, i.e. to learn.

Learning is a typical cross-cutting concern, implemented as the aspect Bp with several pointcuts that cross-cut every aspect that needs to learn (in this case, every aspect in all layers except the input layer). The key to implementing the learning aspect is the parameter update. Once the output layer produces an output, the pointcut checkErr() captures it and checks whether the net needs to learn (i.e. whether the result is wrong), and the advice on checkPass() calculates the update using the learning algorithm (the BP algorithm in this case; we can easily replace it with another learning algorithm). While the net is running, if there are parameters that need to be refreshed, the pointcut paramsUpdate() updates them.

1  public aspect Bp {
2    …
3    //we do not care about the firing order of the neurons in the net,
4    //so data is stored in hash tables
5
6    pointcut paramsUpdate(): if (learningFlag && hasUpdate)
7      && call(* getParams(..)) && within(Neuron*);
8    Params around(): paramsUpdate() {
9    }
10
12   pointcut checkErr(): within(Neuron*) && call(* finalOutPut(..));
13   after() returning(float v): checkErr() {
14   }
15
16   //check whether the input neuron fires, i.e. the end of one pass
17   pointcut checkPass(): if (learningFlag && isInputsReady && isParamsReady)
18     && within(Neuron*) && call(* finalOutPut(..));
19   after(): checkPass() {
20     …
21     calculate();
22   }
23
24   //get all outputs of the neurons in the net
25   pointcut getAllOutPut(): if (learningFlag) && within(Neuron*) &&
26     call(* getOutPut(..));
27   after() returning(OutPut out): getAllOutPut() {
28   }
29
30   pointcut getAllParams(): if ((!onePass) && learningFlag) &&
31     within(Neuron*) && call(* getParams(..));
32   after() returning(Params tempParams): getAllParams() {
33     //store the params into a hash table and set the flag
34     …
35   }
36   public void calculate() {
37     //calculate the weight update using the BP algorithm
38     …
39   }
40 }
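The elided body of calculate() is the usual back-propagation weight update. For a single sigmoid neuron it reduces to the delta rule sketched below (plain Java, under textbook assumptions; this is not the authors' code, and a full network would also propagate deltas back through the hidden layers):

```java
public class BpStep {
    // One gradient step for a single sigmoid neuron:
    // out = sigmoid(w·x), error = target - out,
    // delta = error * out * (1 - out), w_i += eta * delta * x_i.
    static void update(double[] w, double[] x, double target, double eta) {
        double net = 0.0;
        for (int i = 0; i < w.length; i++) net += w[i] * x[i];
        double out = 1.0 / (1.0 + Math.exp(-net));
        double delta = (target - out) * out * (1.0 - out);
        for (int i = 0; i < w.length; i++) w[i] += eta * delta * x[i];
    }

    // Squared error, used to observe that repeated updates learn.
    static double error(double[] w, double[] x, double target) {
        double net = 0.0;
        for (int i = 0; i < w.length; i++) net += w[i] * x[i];
        double out = 1.0 / (1.0 + Math.exp(-net));
        return (target - out) * (target - out);
    }

    public static void main(String[] args) {
        double[] w = {0.1, -0.2};
        double[] x = {1.0, 1.0};
        double before = error(w, x, 1.0);
        for (int i = 0; i < 100; i++) update(w, x, 1.0, 0.5);
        // The error on the training pattern decreases after updates.
        System.out.println(error(w, x, 1.0) < before);
    }
}
```

In the AONN, this arithmetic is the only part that lives inside Bp; which executions it applies to is decided entirely by the pointcuts, which is what makes the algorithm replaceable.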

Our preliminary experiment shows that AONN has good traits. First, the parallelism of the neural network is handled automatically. Neural networks are parallel processing systems in which the neurons of a layer fire in parallel, so programmers are usually required to transform this parallelism manually into a serial procedure or into dialogues between objects. That is error-prone, and it is often hard to implement a large-scale neural network that way. In AONN, rather than caring about the details of neuron interaction (the transformation), we just focus on the synapses (join points); the transformation is completed automatically by weaving. (Using process algebras [16,17] we can prove the correctness of the weaving, but for space limitations this will be discussed in another paper.) Second, the approach used in AONN has high traceability: the problem space is mapped directly into the design space, and then into the programming space. Third, AONN has a low change impact: we can add or delete neurons and neuron layers, or change the topology, with no impact on the rest of the system. Furthermore, using dynamic AOP (whose implementations are still immature), such as JAC, JBoss, Axon, etc. [18,19,20], the topology of the NN can be changed at runtime. Fourth, the implementation can be reused easily: the neurons implemented as aspects are cohesive and loosely coupled with the other parts of the system, so they can be used in another system with light modification. Last but not least, the learning algorithm can be changed on demand: the learning algorithm of the neural network is localized in a single aspect, so it can be replaced easily.

On the other hand, we define an aspect for each neuron in the network, which takes considerable effort to write, though we have tried carefully to avoid it. It would seem reasonable to create multiple instances from one neuron class definition, but AspectJ’s pointcut and advice mechanism does not allow instantiating arbitrary aspect instances and requires the advised objects to be distinguished. It might be interesting to apply instance-level aspects to represent the connections between neurons; this is one of our future works. [31,32]

The Aspect-Oriented paradigm, a candidate for the biologically-inspired programming paradigm discussed in section 3.1, is not dedicated to implementing ANNs; it has been used to develop more general applications and can be used to develop biologically-inspired software, which is discussed at an abstract level in the next section.

5 Biologically-Inspired Framework of Evolvable Software

The concepts of biologically-inspired computing have existed in computer science for decades. Early on, W. McCulloch and W. Pitts proposed artificial neural networks [21], John von Neumann presented cellular automata [22], Alan Turing researched morphogenesis [23], and John H. Holland invented genetic algorithms [24]. More recently, new forms of biologically-inspired computing have been presented, such as [25,26], and a mass of applications using biologically-inspired computing have been developed, such as [27,28,29,30].

We argue that although most applications of biologically-inspired computing are implemented as software systems, software technology itself has not benefited from the ideas of biologically-inspired computing, owing to the incongruity of the software paradigm with the structure and mechanisms of biology (particularly nervous systems), such as the absence of a synapse-like communication mechanism in the Object-Oriented paradigm. As discussed earlier, the Aspect-Oriented paradigm is close to the desired paradigm. We believe it can be used to develop the biologically-inspired evolvable software proposed below (Figure 9).

The biologically-inspired framework of evolvable software consists of four layers. The Core Application Layer (CAL), similar to the organs of an organism, includes only the orthogonal functions of the system; it is constructed using the OO paradigm and runs independently. The systemic functions, such as security, transactions, persistence, distribution, etc., are implemented in the Reflex Aspects Layer (RAL), similar to the spinal cord of animals, using the Aspect-Oriented paradigm. The RAL controls the functions of the core application in the CAL, and their effectiveness is adjusted according to the changing environment; the RAL is what the Aspect-Oriented community focuses on now. At the top of the architecture, the Knowledge-Based AONN Layer (KBAL) serves as the “cortex” of the software system, indirectly controlling the core applications via the RAL. Owing to the combination of different biologically-inspired computing technologies, such as knowledge-based neurocomputing, genetic computing and synapse (join point) plasticity, the KBAL learns and adapts to changing environments (new requirements), thus providing us with evolvable software.

Fig. 9. The architecture of Aspect-Oriented evolvable and adaptive software and the biological analog

6 Conclusions

To handle non-orthogonal concerns (cross-cutting), which the Object-Oriented paradigm does not do effectively, we have proposed a biologically-inspired programming paradigm drawing lessons from nervous systems. There are three considerations for the required augmentation: 1) add a new communication mechanism similar to synaptic communication; 2) differentiate the “neuron” from the “effector” (object); 3) add a mechanism supporting “synaptic plasticity”.

The Aspect-Oriented paradigm, which at present satisfies considerations 1) and 2), is a candidate for the desired paradigm. It has been used in an ANN simulation, named Aspect-Oriented Neural Networks (AONN), to support our ideas, and the preliminary experiment shows that AONN has many good traits. Although attention to consideration 3) (i.e. dynamic AOP) is increasing, it has not yet been well implemented; research on synaptic plasticity in neuroscience may give us more inspiration for achieving it. In addition, we have proposed a preliminary biologically-inspired evolvable software framework that can be implemented using the AO paradigm combined with biologically-inspired computing technologies such as neural computing and genetic computing.

References

[1] George D. Manioudakis and Spiridon D. Likothanassis, An Object-Oriented Toolbox for Adaptive Neural Networks' Implementation, International Journal on Artificial Intelligence Tools, Vol. 10, No. 3 (2001), 345–371.

[2] Ellingsen, B.K., An object-oriented approach to neural networks, Technical Report No. 45, ISSN 0803-6489, UIB-IFI. URL: http://citeseer.ist.psu.edu/ellingsen95objectoriented.html.

[3] Dijkstra, Edsger W., "A Discipline of Programming," Englewood Cliffs, NJ: Prentice-Hall Inc., 1976.

[4] Kiczales, G., Lamping, J., Mendhekar, A., Maeda, C., Lopes, C.V., Loingtier, J.M., Irwin, J., Aspect-Oriented Programming, In: Proceedings of the European Conference on Object-Oriented Programming (ECOOP), LNCS 1241, Springer-Verlag, 1997, 220–242. URL: http://citeseer.nj.nec.com/63210.html.

[5] Soares, S., and Borba, P., "Progressive Implementation with Aspect-Oriented Programming", In: The 12th Workshop for PhD Students in Object-Oriented Systems, ECOOP 02, Springer-Verlag, 2002.

[6] Elrad, T., Filman, R.E., and Bader, A., “Aspect oriented programming”, Communications of the ACM, vol. 44, no. 10(2001).

[7] F.Q. Yang, H. Mei, J. Lu and Z. Jin, Some Discussion on the Development of Software Technology (in Chinese), Chinese Journal of Electronics, Vol. 30, No. 12A, 1901–1906, 2002.

[8] William K. Purves, David Sadava, Gordon H. Orians, Craig Heller. “Life: The Science of Biology,” 7th Ed., W H Freeman, Bedford, 2004

[9] E. R. Kandel, J. H. Schwartz, T. M. Jessell, “Essentials of Neural Science and Behavior,” McGraw-Hill, 1996

[10] Bailey, C.H., and Chen, M., Morphological basis of long-term habituation and sensitization in Aplysia, Science 220:91–93, 1983.

[11] The AspectJ Team, The AspectJ Programming Guide, with associated web site http://eclipse.org/aspectj.

[12] L.C. Wang, X.F. Tang, NBO: An Approach for Aspect-Oriented Software Development in Light of Neurobionics, submitted.

[13] Taylor, D.A., "Object-Oriented Technology: A Manager's Guide," New York: Addison-Wesley, 1990.

Wang, Tang and Zhang

[14] Thrun, S.B., Bala, J., Bloedorn, E., Bratko, I., Cestnik, B., Cheng, J., De Jong, K., Dzeroski, S., Fahlman, S.E., Fisher, D., Hamann, R., Kaufman, K., Keller, S., Kononenko, I., Kreuziger, J., Michalski, R.S., Mitchell, T., Pachowicz, P., Reich, Y., Vafaie, H., Van de Welde, W., Wenzel, W., Wnek, J., Zhang, J. (1991), The Monk's Problems: A Performance Comparison of Different Learning Algorithms, Technical Report CMU-CS-91-197, Carnegie Mellon University.

[15] Masumi Ishikawa, Structural learning and rule discovery from data, in S. Amari and N. Kasabov Eds., Brain-Like Computing and Intelligent Information Systems, Chapter 16, pp.396–415, Springer (1998)

[16] C.A.R. Hoare, "Communicating Sequential Processes," Prentice-Hall, Englewood Cliffs, NJ, 1985.

[17] James H. Andrews, Process-Algebraic Foundations of Aspect-Oriented Programming, In: Proceedings of the Third International Conference on Metalevel Architectures and Separation of Crosscutting Concerns, pp. 187–209, 2001.

[18] Pawlak, R., L. Seinturier, L. Duchien, and G. Florin, “JAC: A Flexible Solution for Aspect-Oriented Programming in Java,” in Metalevel Architectures and Separation of Crosscutting Concerns (Reflection 2001), LNCS 2192, pp. 1–24, Springer, 2001

[19] JBoss AOP, http://www.jboss.org/developers/projects/jboss/aop.

[20] Swen Aussmann, Michael Haupt, Axon — Dynamic AOP through Runtime Inspection and Monitoring, In: ECOOP'03 Workshop on Advancing the State of the Art in Runtime Inspection (ASARTI), 2003.

[21] W. McCulloch and W. Pitts, A Logical Calculus of the Ideas Immanent in Nervous Activity, Bulletin of Mathematical Biophysics, Vol. 5, 1943, 115–133.

[22] John von Neumann, Theory of Self-Reproducing Automata. University of Illinois Press. 1966 (Originally published in 1953)

[23] Alan Turing, The Chemical Basis of Morphogenesis, Philosophical Transactions of the Royal Society of London B, 1952.

[24] John H. Holland, Genetic algorithms and the optimal allocation of trials, SIAM Journal on Computing, 2:88–105, 1973.

[25] Gh. Paun, Computing with Membranes, J. Comput. System Sci. 61(1) (2000) 108–143. (See also Turku Center for Computer Science-TUCS Report No. 208, 1998, www.tucs.fi)

[26] Abelson et al., Amorphous Computing, Communications of the ACM, Volume 43, Number 5, May 2000.

[27] M. Wang and T. Suda, “The bio-networking architecture: A biologically inspired approach to the design of scalable, adaptive, and survivable/available network applications,” in Proceedings of the 1st IEEE Symposium on Applications and the Internet (SAINT), (San Diego, CA), IEEE, 8–12 January 2001.


[28] Dario Floreano and Joseba Urzelai. Neural Morphogenesis, Synaptic Plasticity, and Evolution, “Theory in Biosciences,” vol. 120, no. 3–4, pp. 225–240(16), Urban & Fischer, 2001.

[29] Dan C. Marinescu, Ladislau Boloni, Biological Metaphor in the Design of Complex Software Systems, Future Generation Computer Systems, Vol. 17 (2001), 345–360.

[30] Eduardo Sanchez, Daniel Mange, Moshe Sipper, Marco Tomassini, Andrés Pérez-Uribe, André Stauffer, Phylogeny, Ontogeny, and Epigenesis: Three Sources of Biological Inspiration for Softening Hardware, Proceedings of the First International Conference on Evolvable Systems: From Biology to Hardware, pp. 35–54, October 7–8, 1996.

[31] Hridesh Rajan and Kevin Sullivan, Eos: Instance-Level Aspects for Integrated System Design, In 9th European Software Engineering Conference (ESEC) and 11th ACM SIGSOFT International Symposium on the Foundations of Software Engineering (FSE-11), pp.297–306, 2003.

[32] Kouhei Sakurai, Hidehiko Masuhara, Naoyasu Ubayashi, Saeko Matsuura and Seiichi Komiya, Association Aspects, In Proceedings of the 3rd International Conference on Aspect-Oriented Software Development (AOSD’04), 2004.
