The BRAIN-IoT modelling framework

An important concept proposed in the BRAIN-IoT project is the introduction of a set of modelling tools allowing the description, verification and analysis of the different components and subsystems constituting the BRAIN-IoT architecture.

The main goals of this approach are the following:

  • Provide the ability to graphically represent the smart behaviour model.
  • Describe and verify the functional behaviour of the overall BRAIN-IoT architecture.
  • Describe, analyse and verify the requirements of the Smart Behavioural components that should be developed for the BRAIN-IoT use cases.
  • Automatically generate the code and metadata of the Smart Behavioural components.
  • Monitor deployed components and enable fast prototyping of new behaviours.
  • Reduce the development time, ease integration of interoperable solutions and improve design quality.

In order to reach these goals, BRAIN-IoT proposes to design, in WP3, a Modelling Framework as shown in Figure 1.

Figure 1: BRAIN-IoT modeling Framework

The modelling language is used to represent the behaviour model, the device and IoT platform service APIs, and the data model. The modelling tool should support this language so that a graphical model representation can be developed; the graphical model is then transferred to a rigorous semantics, on which statistical model checking is performed before code generation. The Java code is then produced by the Java code generator and built as OSGi bundles, including the OSGi MANIFEST. Finally, the OSGi bundles are released in the artifacts repository.
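As an illustration of the last step, the metadata of a generated bundle could resemble the following OSGi MANIFEST; the bundle and package names are hypothetical, only the headers are standard OSGi ones:

```text
Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-SymbolicName: eu.brain.iot.behaviour.example
Bundle-Name: Example Smart Behaviour
Bundle-Version: 0.1.0
Import-Package: org.osgi.framework;version="[1.8,2)"
Export-Package: eu.brain.iot.behaviour.example.api
```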

The design flow is based on the integration of the Papyrus, sensiNact and BIP tools available in the CEA and UGA laboratories.

These tools will be mainly used in the different design phases, including verification and analysis. This paragraph gives a high-level view of the planned integration of the tools related to IoT-ML in the project and a preliminary timeline for its realization.

Figure 2: Interoperability of modelling tool

IoT-ML modelling tools: Papyrus and sensiNact combined solution

IoT-ML is a UML profile. It implements the IoT-A reference architecture for common concepts in IoT systems. The IoT-ML profile contains stereotypes that extend UML meta-classes, giving them new syntax and semantics. Indeed, IoT-ML has stereotypes that inherit the properties of the OMG-standard MARTE 1.1 stereotypes for real-time embedded system modelling and of the OMG-standard SysML 1.4 stereotypes for system and requirements modelling. Moreover, the profile fosters the construction of models that may be used to make quantitative predictions taking IoT characteristics into account. The difference with standard MARTE is that IoT-ML adds new concepts that have emerged in embedded systems since the foundation of MARTE. Therefore, IoT-ML will become MARTE 2.0, and the standardization process will take place at the OMG.

To model and design a desired IoT system, IoT-ML allows portraying it from different views, e.g.:

  • Functional views including smart and behavioral units.
  • Things views describing the physical entities annotated with Web of Things (WoT) properties.
  • Hardware platform views showing the connections between the instantiated physical entities.
  • Deployment views describing how the instantiated functional units are allocated to the processing units of the physical entities, etc.

In the context of BRAIN-IoT, the IoT-ML models are created in the Papyrus modelling tool. These models will be used to generate metadata for the modelled IoT services, either as WoT Thing Descriptions (JSON-LD files) or as OSGi bundle descriptors (MANIFEST.MF files). Generation will be implemented in Papyrus.
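To give an idea of the first target, a minimal WoT Thing Description in JSON-LD could look as follows; the identifiers and URLs are purely illustrative, only the `@context` and the overall structure follow the W3C WoT Thing Description format:

```json
{
  "@context": "https://www.w3.org/2019/wot/td/v1",
  "id": "urn:dev:ops:brain-iot-robot-1",
  "title": "WarehouseRobot",
  "properties": {
    "position": {
      "type": "object",
      "readOnly": true,
      "forms": [{"href": "http://example.org/robot1/position"}]
    }
  },
  "actions": {
    "move": {
      "input": {"type": "object"},
      "forms": [{"href": "http://example.org/robot1/move"}]
    }
  }
}
```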

The models will also be used for code generation and analysis, through a bridge to the BIP framework (see next paragraph).

Finally, after deployment, the IoT-ML models will be used to monitor the states of devices and services connected to the sensiNact platform. This is done through, first, the synchronization of the sensiNact data model, in sensiNact Studio, with the real devices and services. The sensiNact data model is then synchronized with the IoT-ML model. The latter synchronization is not bijective, as IoT-ML models are described through their particular views, presented above, which differ from the basic sensiNact data model representation. Finally, IoT-ML state machines can be transformed into sensiNact textual DSL descriptions. These DSL descriptions represent simple behaviours of services that can be quickly tested through the sensiNact platform, without redeployment. This allows fast prototyping of new behaviours after deployment.
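As a language-neutral illustration of the kind of simple service behaviour such a state machine (and hence a DSL description) might capture, consider the following minimal Java sketch; the device, the threshold and all names are hypothetical:

```java
// Hypothetical sketch: a two-state behaviour (a lamp reacting to
// luminosity readings) of the kind an IoT-ML state machine could
// describe before transformation into a sensiNact DSL description.
public class LampBehaviour {
    public enum State { OFF, ON }

    private State state = State.OFF;

    // React to a luminosity reading: below the threshold, switch on;
    // at or above it, switch off.
    public State onLuminosity(double lux) {
        if (state == State.OFF && lux < 100.0) {
            state = State.ON;           // transition OFF -> ON
        } else if (state == State.ON && lux >= 100.0) {
            state = State.OFF;          // transition ON -> OFF
        }
        return state;
    }

    public static void main(String[] args) {
        LampBehaviour lamp = new LampBehaviour();
        System.out.println(lamp.onLuminosity(50.0));   // ON
        System.out.println(lamp.onLuminosity(200.0));  // OFF
    }
}
```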

BIP (Behaviour, Interaction, Priority)

In the framework of BRAIN-IoT, the operational semantics of the IoT-ML models is intended to be captured in the BIP language through a model-to-model transformation. The transformed BIP model can then be checked by simulation using the BIP compiler and engines: the compiler generates C++ simulation code, while the BIP engine orchestrates the execution of the generated code. Simulation can moreover be combined with statistical model checking (SMC); models subject to SMC analysis are described in the stochastic BIP language.

In order to represent the model of the system in BIP (see Figure 2), two abstraction levels are planned. The chosen level depends on the degree of detail that the designers want to observe on the system or building block under development:

  • The first refinement corresponds to a purely functional view: only the Things and the smart behaviour are represented, whereas the platform and the middleware are fully abstracted. In this case, Things are endowed with a behaviour expressed as state machines and communicate directly with the smart behaviour. For instance, a robot movement actuator could be endowed with a specific behaviour dictated by a smart component as it adjusts its position, avoiding collisions, based on the AI algorithm. In some cases, this model can also include libraries (i.e. drivers) that interface with Things. This model can be simulated and checked with the BIP tools for any property related to the functionality; for instance, the designer could check whether the robot avoids collisions from the starting point to the ending point.
  • The second refinement brings hardware platform and middleware aspects into the BIP model. In this case, northbound and southbound bridges are integrated at that level for accurate request routing; these bridges embody the so-called sensiNact gateway. On the one hand, southbound bridges are in charge of collecting data from sensors and services, and of sending orders to be performed by actuators. On the other hand, northbound bridges are specialized in interacting with remote systems (Deliverable 3.1). In addition, devices (i.e. sensors and actuators) and physical connectivity (e.g. an Ethernet bus) are represented at that level. Note that the degree of detail depends on the requirements that the designers want to satisfy. The refined models will also enable additional analysis using the BIP tools, in particular quantitative assessments of the performance of the system, for instance: what is the probability that two robots will collide?
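The intuition behind such quantitative questions is that of statistical model checking: estimate the probability of a property by sampling many simulated runs. The following toy Java sketch illustrates the principle only; the random-walk collision model is purely illustrative and is unrelated to the actual BIP semantics:

```java
import java.util.Random;

// Toy illustration of statistical model checking (SMC): estimate the
// probability of a property ("two robots collide") by sampling many
// random runs of a simplified model.
public class CollisionSmc {

    // One simulated run: two robots take `steps` random moves on a line;
    // a "collision" is landing on the same cell.
    static boolean runCollides(Random rng, int steps) {
        int a = 0, b = 10;
        for (int i = 0; i < steps; i++) {
            a += rng.nextBoolean() ? 1 : -1;
            b += rng.nextBoolean() ? 1 : -1;
            if (a == b) return true;
        }
        return false;
    }

    // Monte Carlo estimate: fraction of runs in which a collision occurs.
    static double estimateCollisionProbability(long seed, int runs, int steps) {
        Random rng = new Random(seed);
        int collisions = 0;
        for (int i = 0; i < runs; i++) {
            if (runCollides(rng, steps)) collisions++;
        }
        return (double) collisions / runs;
    }

    public static void main(String[] args) {
        double p = estimateCollisionProbability(42L, 10_000, 50);
        System.out.printf("Estimated collision probability: %.3f%n", p);
    }
}
```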

In addition to analysis, the BIP framework is also intended to generate Java code for specific components in the functional view (see Figure 2). For example, a monitoring service for the home temperature is automatically generated, except for the Things part; the generated service is wrapped as an OSGi bundle and plugged into the PAREMUS Service Fabric. These bundles interact with the external service fabric using sensiNact bridges, through which actuation and data gathering (to and from Things, respectively) are enabled.
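Purely as a sketch of the shape such generated code might take (all names are hypothetical, and the bridge side through which readings arrive is abstracted away):

```java
// Hypothetical sketch of a generated home-temperature monitoring
// service; the sensiNact bridge delivering readings is abstracted
// behind a plain method call, and alerts go to a pluggable sink.
public class TemperatureMonitor {

    public interface AlertSink {            // where alerts are delivered
        void raise(String message);
    }

    private final double maxCelsius;
    private final AlertSink sink;

    public TemperatureMonitor(double maxCelsius, AlertSink sink) {
        this.maxCelsius = maxCelsius;
        this.sink = sink;
    }

    // Called with each reading gathered through a southbound bridge;
    // returns true when an alert was raised.
    public boolean onReading(double celsius) {
        if (celsius > maxCelsius) {
            sink.raise("Temperature " + celsius + " exceeds " + maxCelsius);
            return true;
        }
        return false;
    }
}
```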

It should be noted that the Things are only modelled: in the frame of BRAIN-IoT no new Things will be designed, as they are available off the shelf.

Figure 3:  Model to model transformation levels

