Search results for: interoperability
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 106

46 Distributed Manufacturing (DM)- Smart Units and Collaborative Processes

Authors: Hermann Kuehnle

Abstract:

Developments in ICT are totally reshaping manufacturing, as machines, objects and equipment on the shop floor will be smart and online. Interactions with virtualizations and models of a manufacturing unit will appear exactly as interactions with the unit itself. These virtualizations may be driven by providers offering novel ICT services on demand, which might jeopardize even well-established business models. Context-aware equipment, autonomous orders, scalable machine capacity and networkable manufacturing units will be the terminology to get familiar with in manufacturing and manufacturing management. Such newly appearing smart abilities, with their impact on network behavior, collaboration procedures and human resource development, will make distributed manufacturing a preferred model of production. Computing miniaturization and smart devices are revolutionizing manufacturing set-ups, as virtualization and the atomization of resources unwrap novel manufacturing principles. Processes and resources obey novel, specific laws and have strategic impact on manufacturing as well as major operational implications. Mechanisms from distributed manufacturing that engage interacting smart manufacturing units and decentralized planning and decision procedures already demonstrate important effects of this shift of focus towards collaboration and interoperability.

Keywords: autonomous unit, networkability, smart manufacturing unit, virtualization

Procedia PDF Downloads 501
45 Software Quality Assurance in 5G Technology-Redefining Wireless Communication: A Comprehensive Survey

Authors: Sumbal Riaz, Sardar-un-Nisa, Mehreen Sirshar

Abstract:

5G, the fifth generation of mobile phone and data communication standards, is the next edge of innovation for the whole mobile industry. 5G is envisioned as a real wireless world system that will provide totally wireless communication all over the world without limitations. 5G builds on many 4G technologies and is expected to hit the market in 2020. This research is a comprehensive survey of the quality parameters of 5G technology. 5G promises high performance, interoperability, easy roaming, fully converged services, a friendly interface and scalability at low cost. To meet future traffic demands, fifth-generation wireless communication systems will include: i) higher densification of heterogeneous networks with massive deployment of small base stations supporting various Radio Access Technologies (RATs), ii) use of massive Multiple Input Multiple Output (MIMO) arrays, iii) use of the millimetre-wave spectrum, where wider frequency bands are available, iv) direct device-to-device (D2D) communication, v) simultaneous transmission and reception, and vi) cognitive radio technology.

Keywords: 5G, 5th generation, innovation, standard, wireless communication

Procedia PDF Downloads 413
44 Network Coding with Buffer Scheme in Multicast for Broadband Wireless Network

Authors: Gunasekaran Raja, Ramkumar Jayaraman, Rajakumar Arul, Kottilingam Kottursamy

Abstract:

Broadband Wireless Network (BWN) is a promising technology nowadays due to the increasing number of smartphones. A buffering scheme using network coding addresses reliability and proper degree distribution in a Worldwide Interoperability for Microwave Access (WiMAX) multi-hop network. Network coding provides a secure way of transmission that helps improve throughput and reduce packet loss in the multicast network. At the outset, improved network coding is proposed for the multicast wireless mesh network. To address the problem of performance overhead, the degree distribution guides the buffering decision during the encoding/decoding process. Consequently, BuS (Buffer Scheme) based on network coding is proposed for the multi-hop network. Here the encoding process introduces a buffer for temporary storage, so that packets are transmitted with a proper degree distribution. The simulation results evaluate the number of packets received during encoding/decoding with proper degree distribution under the buffering scheme.
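The relay-side coding idea the abstract builds on can be sketched with the simplest case, XOR network coding. This is an illustrative toy (the names and the two-packet scenario are assumptions); the paper's BuS scheme additionally manages buffer sizing and degree distribution, which this sketch omits.

```python
# Minimal sketch of XOR-based network coding with a relay buffer.

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length packets together."""
    return bytes(x ^ y for x, y in zip(a, b))

class Relay:
    """A relay that buffers incoming packets and broadcasts their XOR."""
    def __init__(self):
        self.buffer = []

    def receive(self, packet: bytes):
        self.buffer.append(packet)

    def encode(self) -> bytes:
        coded = self.buffer[0]
        for p in self.buffer[1:]:
            coded = xor_bytes(coded, p)
        return coded

# Two sources send packets to the relay; a single multicast of the
# coded packet lets each destination recover the packet it is missing.
p1, p2 = b"hello123", b"world456"
relay = Relay()
relay.receive(p1)
relay.receive(p2)
coded = relay.encode()

# Destination A already holds p1, so it recovers p2 (and vice versa):
assert xor_bytes(coded, p1) == p2
assert xor_bytes(coded, p2) == p1
```

One coded transmission replaces two unicast retransmissions, which is the throughput gain multicast network coding exploits.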

Keywords: encoding and decoding, buffer, network coding, degree distribution, broadband wireless networks, multicast

Procedia PDF Downloads 371
43 BIM Modeling of Site and Existing Buildings: Case Study of ESTP Paris Campus

Authors: Rita Sassine, Yassine Hassani, Mohamad Al Omari, Stéphanie Guibert

Abstract:

Building Information Modelling (BIM) is the process of creating, managing, and centralizing information during the building lifecycle. BIM can be used throughout a construction project, from the initiation phase to the planning and execution phases to the maintenance and lifecycle management phase. For existing buildings, BIM can be used for specific applications such as lifecycle management. However, most existing buildings do not have a BIM model. Creating a compatible BIM for existing buildings is very challenging: it requires special equipment for data capturing and effort to convert these data into a BIM model. The main difficulties in such projects are to define the data needed, the level of development (LOD), and the methodology to be adopted. In addition to managing information for an existing building, studying the impact of the built environment is a challenging topic. Integrating the existing terrain that surrounds buildings into the digital model is therefore essential for running simulations such as flood simulation, energy simulation, etc. Making a replica of the physical model and updating its information in real time to obtain its Digital Twin (DT) is very important. The Digital Terrain Model (DTM) represents the ground surface of the terrain by a set of discrete points with unique height values over 2D points, based on a reference surface (e.g., mean sea level, geoid, or ellipsoid). In addition, information on pavement materials, vegetation types and heights, and damaged surfaces can be integrated. Our aim in this study is to define the methodology to be used in order to provide a 3D BIM model of the site and the existing buildings, based on the case study of the “Ecole Spéciale des Travaux Publics (ESTP Paris)” school of engineering campus. The property is located on a hilly site of 5 hectares and is composed of more than 20 buildings with a total area of 32,000 square meters and heights between 50 and 68 meters.
In this work, the campus precise levelling grid is computed according to the NGF-IGN69 altimetric system, and the grid control points are computed in the RGF93 (Réseau Géodésique Français) – Lambert 93 French system, with different methods: (i) land topographic surveying using a robotic total station, (ii) a GNSS (Global Navigation Satellite System) levelling grid in NRTK (Network Real Time Kinematic) mode, and (iii) point clouds generated by laser scanning. These technologies allow the computation of multiple building parameters such as boundary limits, the number of floors, the georeferencing of the floors, the georeferencing of the four base corners of each building, etc. Once the input data are identified, the digital model of each building is created. The DTM is also modeled. The process of altimetric determination is complex and requires effort to collect and analyze multiple data formats. Since many technologies can be used to produce digital models, different file formats such as DraWinG (DWG), LASer (LAS), Comma-Separated Values (CSV), Industry Foundation Classes (IFC) and Revit (RVT) will be generated. Checking the interoperability between BIM models is therefore very important. In this work, all models are linked together and shared on the 3DEXPERIENCE collaborative platform.
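The DTM definition above — discrete height values over 2D points relative to a reference surface — can be sketched as a small lookup structure. All names, grid values and coordinates here are illustrative assumptions, not data from the ESTP campus survey.

```python
# Toy Digital Terrain Model: heights sampled on a regular 2D grid,
# expressed relative to a chosen reference surface (e.g., an altimetric
# datum such as NGF-IGN69). Values are purely illustrative.

class DTM:
    def __init__(self, origin, cell_size, heights):
        self.x0, self.y0 = origin   # grid origin in map coordinates (m)
        self.cell = cell_size       # grid spacing (m)
        self.heights = heights      # heights[row][col] in metres

    def elevation(self, x, y):
        """Nearest-neighbour elevation lookup at map coordinates (x, y)."""
        col = round((x - self.x0) / self.cell)
        row = round((y - self.y0) / self.cell)
        return self.heights[row][col]

heights = [
    [102.1, 103.4, 104.0],
    [101.8, 102.9, 103.6],
    [101.2, 102.0, 102.8],
]
dtm = DTM(origin=(0.0, 0.0), cell_size=10.0, heights=heights)

assert dtm.elevation(0.0, 0.0) == 102.1
assert dtm.elevation(10.0, 20.0) == 102.0
```

Real DTMs interpolate (e.g., bilinearly) between grid points and carry the attribute layers the abstract mentions (pavement, vegetation, damage), but the principle of height-over-2D-point lookup is the same.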

Keywords: building information modeling, digital terrain model, existing buildings, interoperability

Procedia PDF Downloads 74
42 Implementation of the Outputs of Computer Simulation to Support Decision-Making Processes

Authors: Jiri Barta

Abstract:

At the present time, awareness, education, computer simulation and the protection of information systems are very serious and relevant topics. The article deals with perspectives on, and possibilities of, implementing emergency or natural hazard threats into the system developed for communication among members of crisis management staffs. The Czech Hydro-Meteorological Institute, with its System of Integrated Warning Service, represents the largest usable base of information. National information systems are connected to foreign systems, especially to the flood emergency systems of neighboring countries and the systems of the European Union and international organizations of which the Czech Republic is a member. Using the outputs of particular information systems and computer simulations on a single communication interface of the information system for communication among members of crisis management staffs, and establishing site interoperability in the network, will lead to time savings in decision-making processes when solving extraordinary events and crisis situations. Faster management of an extraordinary event or a crisis situation will bring positive effects and minimize the impact of negative effects on the environment.

Keywords: computer simulation, communication, continuity, critical infrastructure, information systems, safety

Procedia PDF Downloads 306
41 Employing KNIME-Based and Open-Source Tools to Identify AMI and VER Metabolites from UPLC-MS Data

Authors: Nouf Alourfi

Abstract:

This study examines the metabolism of amitriptyline (AMI) and verapamil (VER) using a KNIME-based method. KNIME is an open-source data-analytics platform; the improved workflow integrates a number of open-source metabolomics tools, such as CFM-ID and MetFrag, to provide standard data visualisations, predict candidate metabolites, assess them against experimental data, and produce reports on identified metabolites. The use of this workflow is demonstrated by employing three types of liver microsomes (human, rat, and guinea pig) to study the in vitro metabolism of the two drugs (AMI and VER). The workflow is used to process UPLC-MS (Orbitrap) data, and the formulas and structures of the drugs' metabolites can be assigned automatically. The key metabolic routes for amitriptyline are hydroxylation, N-dealkylation, N-oxidation, and conjugation, while N-demethylation, O-demethylation, N-dealkylation, and conjugation are the primary metabolic routes for verapamil. The identified metabolites are consistent with those published, confirming the soundness of the workflow and the value of computational tools like KNIME in supporting the integration and interoperability of emerging novel software packages in the metabolomics area.

Keywords: KNIME, CFM-ID, MetFrag, data analysis, metabolomics

Procedia PDF Downloads 90
40 Advanced Simulation and Enhancement for Distributed and Energy Efficient Scheduling for IEEE802.11s Wireless Enhanced Distributed Channel Access Networks

Authors: Fisayo G. Ojo, Shamala K. Subramaniam, Zuriati Ahmad Zukarnain

Abstract:

As technology advances and wireless applications become dependable sources, with the physical layer of applications embedded into ever tinier layers, the problems of energy efficiency and consumption grow. This paper reviews work done in recent years in wireless applications and distributed computing. We found that applications are becoming dependable and share allocated resources with other applications in distributed computing, and that applications embedded in distributed systems suffer from power stability and efficiency problems. The review also shows that discrete event simulation has been left untouched and has not been adopted in distributed systems as a technique for simulating the scheduling of the events that take place in distributed computing applications. We shed light on techniques and results proposed by several researchers to demonstrate their unsatisfactory outcomes and to show that more work still has to be done on energy efficiency in wireless applications and on congestion in distributed computing.
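The discrete event simulation technique the review advocates reduces, at its core, to processing timestamped events in order. A minimal kernel can be sketched as follows; the event names are illustrative, not taken from any of the surveyed works.

```python
# Minimal discrete event simulation (DES) kernel: a priority queue of
# (time, event) pairs processed in timestamp order.
import heapq

def run(events):
    """Process (time, name) events in increasing time order."""
    heapq.heapify(events)
    log = []
    while events:
        t, name = heapq.heappop(events)
        log.append((t, name))   # a real DES would execute a handler here
    return log

log = run([(5, "tx_done"), (1, "packet_arrival"), (3, "channel_access")])
assert log == [(1, "packet_arrival"), (3, "channel_access"), (5, "tx_done")]
```

A scheduler model for a distributed system would extend this by letting each handler schedule future events (e.g., a `packet_arrival` handler enqueueing a later `channel_access`), which is exactly the adaptation the review argues is missing.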

Keywords: discrete event simulation (DES), distributed computing, energy efficiency (EE), internet of things (IoT), quality of service (QoS), user equipment (UE), wireless mesh network (WMN), wireless sensor network (WSN), worldwide interoperability for microwave access (WiMAX)

Procedia PDF Downloads 160
39 IT Perspective of Service-Oriented e-Government Enterprise

Authors: Anu Paul, Varghese Paul

Abstract:

The focal aspiration of e-Government (eGovt) is to offer citizen-centered service delivery. Accordingly, the citizenry consumes services from multiple government agencies through a national portal. Thus, eGovt is an enterprise with the primary business motive of transparent, efficient and effective public services for its citizenry, and its logical structure is the e-Government Enterprise Architecture (eGEA). Since eGovt is an IT-oriented, multifaceted, service-centric system, EA does not do much for an automated enterprise beyond the business artifacts. The manifestation of Service-Oriented Architecture (SOA) led some governments to apply it in their eGovts, but it limits the source of business artifacts. The concurrent use of EA and SOA in eGovt delivers interoperability and integration and leads to the Service-Oriented e-Government Enterprise (SOeGE). Consequently, an agile eGovt system becomes a reality. From an IT perspective, eGovt comprises centralized public service artifacts together with the existing application logic belonging to various departments at the central, state and local levels. The eGovt is renovated into SOeGE by applying Service-Orientation (SO) principles to the entire system. This paper explores the IT perspective of SOeGE in India, encompassing the public service models, and illustrates it with a case study of the Passport service of India.

Keywords: enterprise architecture, service-oriented e-Government enterprise, service interface layer, service model

Procedia PDF Downloads 483
38 High-Value Health System for All: Technologies for Promoting Health Education and Awareness

Authors: M. P. Sebastian

Abstract:

Health for all is considered a sign of well-being and inclusive growth. New healthcare technologies are contributing to the quality of human lives by promoting health education and awareness, leading to the prevention, early diagnosis and treatment of the symptoms of diseases. Healthcare technologies have now migrated from medical and institutionalized settings to the home and everyday life. This paper explores these new technologies and investigates how they contribute to health education and awareness, promoting the objective of a high-value health system for all. The methodology used for the research is a literature review. The paper also discusses the opportunities and challenges of futuristic healthcare technologies. The combined advances in genomic medicine, wearables and the IoT, together with enhanced data collection in electronic health record (EHR) systems, environmental sensors, and mobile device applications, can contribute in a big way to a high-value health system for all. These technologies promise a reduced total cost of healthcare, a reduced incidence of medical diagnosis errors, and reduced treatment variability. The major barriers to adoption include concerns about the security, privacy, and integrity of healthcare data, regulation and compliance issues, service reliability, interoperability and portability of data, and the user-friendliness and convenience of these technologies.

Keywords: big data, education, healthcare, information communication technologies (ICT), patients, technologies

Procedia PDF Downloads 174
37 From Waste to Wealth: A Future Paradigm for Plastic Management Using Blockchain Technology

Authors: Jim Shi, Jasmine Chang, Nesreen El-Rayes

Abstract:

The world has been experiencing a steadily increasing trend in both the production and consumption of plastic. The global consumer revolution would not have been possible without plastic, thanks to its salient features of inexpensiveness and durability. But, as a double-edged sword, its durability has returned to haunt and even jeopardize us. The exacerbating plastic crisis has attracted various global initiatives and actions. Simultaneously, firms are eager to adopt new technology as they witness and perceive the potential and merit of Industry 4.0 technologies. For example, Blockchain technology (BCT) is drawing the attention of numerous stakeholders because of its wide range of outstanding features that promise to enhance supply chain operations. However, from a research perspective, most of the literature addresses the plastic crisis from either environmental or social perspectives; analysis from the data science and technology perspective is relatively scarce. To this end, this study aims to fill this gap and cover the plastic crisis from a holistic view of environmental, social, technological, and business perspectives. In particular, we propose a mathematical model to examine the inclusion of BCT to enhance and improve efficiency on the upstream and downstream sides of the plastic value chain, where the whole value chain is coordinated systematically and its interoperability can be optimized. Consequently, the Environmental, Social, and Governance (ESG) goals and Circular Economy (CE) sustainability can be maximized.

Keywords: blockchain technology, plastic, circular economy, sustainability

Procedia PDF Downloads 52
36 The Dynamic Metadata Schema in Neutron and Photon Communities: A Case Study of X-Ray Photon Correlation Spectroscopy

Authors: Amir Tosson, Mohammad Reza, Christian Gutt

Abstract:

Metadata stands at the forefront of advancing data management practices within research communities, with particular significance in the realms of neutron and photon scattering. This paper introduces a groundbreaking approach—dynamic metadata schema—within the context of X-ray Photon Correlation Spectroscopy (XPCS). XPCS, a potent technique unravelling nanoscale dynamic processes, serves as an illustrative use case to demonstrate how dynamic metadata can revolutionize data acquisition, sharing, and analysis workflows. This paper explores the challenges encountered by the neutron and photon communities in navigating intricate data landscapes and highlights the prowess of dynamic metadata in addressing these hurdles. Our proposed approach empowers researchers to tailor metadata definitions to the evolving demands of experiments, thereby facilitating streamlined data integration, traceability, and collaborative exploration. Through tangible examples from the XPCS domain, we showcase how embracing dynamic metadata standards bestows advantages, enhancing data reproducibility, interoperability, and the diffusion of knowledge. Ultimately, this paper underscores the transformative potential of dynamic metadata, heralding a paradigm shift in data management within the neutron and photon research communities.
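The core mechanism of a dynamic metadata schema — a fixed FAIR core plus technique-specific fields registered per experiment — can be sketched as follows. The field names and values are illustrative assumptions, not the schema the authors propose.

```python
# Sketch of a dynamic metadata schema: a fixed core of required fields,
# extended at run time with technique-specific fields (here, an XPCS run).

CORE_FIELDS = {"identifier", "creator", "timestamp", "instrument"}

class DynamicSchema:
    def __init__(self):
        self.extra_fields = set()

    def extend(self, *fields):
        """Register additional required fields for a given technique."""
        self.extra_fields.update(fields)

    def validate(self, record: dict) -> bool:
        """A record is valid if it carries every core and extra field."""
        required = CORE_FIELDS | self.extra_fields
        return required <= record.keys()

xpcs = DynamicSchema()
xpcs.extend("q_range", "exposure_time", "correlation_scheme")

record = {
    "identifier": "run-0042",
    "creator": "beamline-A",
    "timestamp": "2023-09-01T12:00:00Z",
    "instrument": "area-detector-1",
    "q_range": [0.01, 0.1],
    "exposure_time": 0.005,
    "correlation_scheme": "multi-tau",
}
assert xpcs.validate(record)
assert not xpcs.validate({"identifier": "run-0043"})
```

Because the required-field set evolves with the experiment rather than being frozen in a static standard, records stay validatable and traceable as acquisition workflows change, which is the interoperability benefit the abstract describes.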

Keywords: metadata, FAIR, data analysis, XPCS, IoT

Procedia PDF Downloads 36
35 An Ontology-Based Framework to Support Asset Integrity Modeling: Case Study of Offshore Riser Integrity

Authors: Mohammad Sheikhalishahi, Vahid Ebrahimipour, Amir Hossein Radman-Kian

Abstract:

This paper proposes an ontology framework for knowledge modeling and representation of the equipment integrity process in a typical oil and gas production plant. Our aim is to construct a knowledge model that facilitates the translation, interpretation, and conversion of human-readable integrity interpretations into computer-readable representations. The framework provides a function structure related to fault propagation, using ISO 14224 and ISO 15926 with OWL-Lite / Resource Description Framework (RDF), to obtain a generic system-level model of asset integrity that can be utilized in the integrity engineering process during the equipment life cycle. It employs the standard terminology developed by ISO 15926 and ISO 14224 to map textual descriptions of equipment failure and then converts them to causality-driven logic by semantic interpretation and computer-based representation using OWL-Lite/RDF. The framework was applied to an offshore gas riser. The results show that the approach can cross-link the failure-related integrity terms and domain-specific logic to obtain a representation structure of equipment integrity with causality inference based on semantic extraction of the inspection report context.
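The mapping step described above — turning a textual failure description into causality-ready triples — can be sketched with plain subject-predicate-object tuples. The predicate and term names below are illustrative stand-ins, not the authors' actual ISO 14224/15926 vocabulary.

```python
# Minimal sketch: map a failure report to subject-predicate-object
# triples, then run a simple causality query over them.

def map_failure_report(equipment: str, failure_mode: str, cause: str):
    """Turn a structured failure description into RDF-style triples."""
    return [
        (equipment, "hasFailureMode", failure_mode),
        (failure_mode, "causedBy", cause),
        (equipment, "requiresInspection", "true"),
    ]

triples = map_failure_report("riser-R1", "externalCorrosion", "coatingDamage")

# Causality inference: what caused the riser's failure mode?
modes = [o for s, p, o in triples if s == "riser-R1" and p == "hasFailureMode"]
causes = [o for s, p, o in triples if s in modes and p == "causedBy"]
assert causes == ["coatingDamage"]
```

A real implementation would serialize such triples as OWL-Lite/RDF with proper URIs and run the query in SPARQL; the tuple form only shows how cross-linked terms support causal chaining.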

Keywords: asset integrity modeling, interoperability, OWL, RDF/XML

Procedia PDF Downloads 153
34 Steps towards the Development of National Health Data Standards in Developing Countries

Authors: Abdullah I. Alkraiji, Thomas W. Jackson, Ian Murray

Abstract:

The proliferation of health data standards today is somewhat overlapping and conflicting, resulting in market confusion and leading to increasing proprietary interests. The government's role and support in the standardization of health data are thought to be crucial in order to establish credible standards for the next decade, to maximize interoperability across the health sector, and to decrease the risks associated with the implementation of non-standard systems. The normative literature has missed out on exploring the different steps required to be undertaken by a government towards the development of national health data standards. Based on the lessons learned from a qualitative study investigating the different issues in the adoption of health data standards in the major tertiary hospitals in Saudi Arabia, and on the opinions and feedback of different experts in the areas of data exchange, standards and medical informatics in Saudi Arabia and the UK, a list of steps required for the development of national health data standards was constructed. The main steps are the existence of: a national formal reference for health data standards, an agreed national strategic direction for medical data exchange, a national medical information management plan, and a national accreditation body; more important still is change management at the national and organizational levels. The outcome of this study can be used by academics and practitioners to plan the development of health data standards, in particular in developing countries.

Keywords: interoperability, medical data exchange, health data standards, case study, Saudi Arabia

Procedia PDF Downloads 303
33 Optimization of a Hand-Fan Shaped Microstrip Patch Antenna by Means of Orthogonal Design Method of Design of Experiments for L-Band and S-Band Applications

Authors: Jaswinder Kaur, Nitika, Navneet Kaur, Rajesh Khanna

Abstract:

A hand-fan shaped microstrip patch antenna (MPA) for L-band and S-band applications is designed, and its characteristics have been investigated. The proposed microstrip patch antenna with a double U-slot defected ground structure (DGS) is fabricated on an FR4 substrate, which is a very readily available and inexpensive material. The suggested antenna is optimized using the Orthogonal Design Method (ODM) of Design of Experiments (DOE) to cover the frequency range from 0.91-2.82 GHz for L-band and S-band applications. The L-band covers the frequency range of 1-2 GHz, which is allocated to telemetry, aeronautical, and military systems for passive satellite sensors, weather radars, radio astronomy, and mobile communication. The S-band covers the frequency range of 2-3 GHz, which is used by weather radars, surface ship radars and communication satellites and is also reserved for various wireless applications such as Worldwide Interoperability for Microwave Access (WiMAX), super high frequency radio frequency identification (SHF RFID), industrial, scientific and medical (ISM) bands, Bluetooth, wireless broadband (Wi-Bro) and wireless local area network (WLAN). The proposed method of optimization is very time-efficient and accurate compared to conventional evolutionary algorithms due to its statistical strategy. Moreover, the antenna is tested, followed by a comparison of simulated and measured results.

Keywords: design of experiments, hand fan shaped MPA, L-Band, orthogonal design method, S-Band

Procedia PDF Downloads 104
32 Smart Security Concept in the East Mediterranean: Anti Asymmetrical Area Denial (A3D)

Authors: Serkan Tezgel

Abstract:

The two qualities of the sea, as a medium of transportation and as a resource, necessitate maritime security for economic stability and good order at sea. The borderless nature of the sea makes it one of the best platforms to contribute to regional peace and international order. For this reason, the establishment of maritime security in the East Mediterranean will enhance the security-peace-democracy triangle in the region. This paper proposes the application of the Smart Security concept in the East Mediterranean. Smart Security aims to secure critical infrastructure, such as hydrocarbon platforms, against asymmetrical threats. The concept is based on Anti Asymmetrical Area Denial (A3D), which necessitates limiting the freedom of action of maritime terrorists and pirates by establishing safe and secure maritime areas along sea lines of communication using short-range capabilities. Smart Security is a regional maritime cooperation concept for the narrow seas. Cooperation and interoperability are essential attributes of this regional security concept. Therefore, multinational centers of excellence, such as the Multinational Maritime Security Center of Excellence-Aksaz in Turkey, which will determine the necessary capabilities and plan/coordinate workshops, training and exercises, are bound to be the principal characteristic of the Smart Security concept and similar regional concepts. Smart Security, a crucial enabler of energy and regional security, can provide an enduring approach for operating in the challenging environment of narrow seas and for countering asymmetrical threats.

Keywords: security, cooperation, asymmetrical, area denial

Procedia PDF Downloads 777
31 Design and Modeling of a Green Building Energy Efficient System

Authors: Berhane Gebreslassie

Abstract:

Conventional commercial buildings are among the most unwise consumers of enormous amounts of energy and, as a consequence, produce significant amounts of carbon dioxide (CO2). Traditional/conventional buildings have been built for years without consideration of their impact on global warming or of their CO2 contributions. Since 1973, the simulation of Green Buildings (GB) for energy efficiency has developed, and many countries, in particular the US, have responded positively to minimizing energy usage and thus reducing CO2 emissions. As a consequence, many software companies have developed their own building energy efficiency simulation software, supporting interoperability with Building Information Modeling (BIM). The last decade has witnessed a rapidly growing number of studies on GB energy efficiency systems. However, the literature also indicates that the results of current GB simulation are not yet satisfactory in meeting the objectives of GB, and most previous studies have left out the simulation of ultimate building energy efficiency. The aim of this project is to meet the objectives of GB by designing, modeling and simulating a building's ultimate energy efficiency system. This research project presents a multi-level, L-shaped office building in which every particular part of the building's materials has been tested for energy efficiency. Overall, 78.62% of energy is saved, approaching net-zero energy saving. Furthermore, the building is implemented with distributed energy resources such as renewable energy and integrated with a Smart Building Automation System (SBAS) for controlling and monitoring energy usage.

Keywords: ultimate energy saving, optimum energy saving, green building, sustainable materials and renewable energy

Procedia PDF Downloads 247
30 Design of Multiband Microstrip Antenna Using Stepped Cut Method for WLAN/WiMAX and C/Ku-Band Applications

Authors: Ahmed Boutejdar, Bishoy I. Halim, Soumia El Hani, Larbi Bellarbi, Amal Afyf

Abstract:

In this paper, a planar monopole antenna for multiband applications is proposed. The antenna structure operates at three frequencies, 3.7, 6.2, and 13.5 GHz, which cover different communication frequency ranges. The antenna consists of a quasi-modified rectangular radiating patch with a partial ground plane and two parasitic elements (open-loop-ring resonators) that serve as coupling bridges. A stepped cut at the lower corners of the radiating patch and the partial ground plane is used to achieve the multiband features. The proposed antenna is manufactured on an FR4 substrate and is simulated and optimized using the High Frequency Structure Simulator (HFSS). The antenna topology occupies a volume of 30.5 x 30 x 1.6 mm3. The measured results demonstrate that the candidate antenna has impedance bandwidths for 10 dB return loss from 3.80-3.90 GHz, 4.10-5.20 GHz, 11.2-11.5 GHz and 12.5-14.0 GHz, which meet the requirements of wireless local area network (WLAN), Worldwide Interoperability for Microwave Access (WiMAX), C-band (uplink) and Ku-band (uplink) applications. Acceptable agreement is obtained between measurement and simulation results. The experiments show that the antenna is successfully simulated and measured, that the multiband response can be achieved by adjusting the lengths of the three elements, and that the antenna gives good gains across all the operating bands.

Keywords: planar monopole antenna, FR4 substrate, HFSS, WLAN, WiMAX, C and Ku

Procedia PDF Downloads 162
29 Normalized Enterprises Architectures: Portugal's Public Procurement System Application

Authors: Tiago Sampaio, André Vasconcelos, Bruno Fragoso

Abstract:

The Normalized Systems Theory, which is designed to be applied to software architectures, provides a set of theorems, elements and rules with the purpose of enabling evolution in information systems and ensuring that they are ready for change. To make that possible, this work applies the Normalized Systems Theory to the domain of enterprise architectures, using ArchiMate. This application is achieved through the adaptation of the elements of the theory, making them artifacts of the modeling language. The theorems are applied through the identification of the viewpoints to be used in the architectures, as well as the transformation of the theory's encapsulation rules into architectural rules. This way, it is possible to create normalized enterprise architectures, thus fulfilling the needs and requirements of the business. This solution was demonstrated using the Portuguese Public Procurement System. The Portuguese government aims to make this system as fair as possible, allowing every organization to have the same business opportunities. The aim is for every economic operator to have access to all public tenders, which are published on any of the 6 existing platforms, independently of where they are registered. To make this possible, we applied our solution to the construction of two different architectures, which are capable of fulfilling the requirements of the Portuguese government. One of those architectures, TO-BE A, has a Message Broker that performs the communication between the platforms. The other, TO-BE B, represents the scenario in which the platforms communicate with each other directly. Apart from these two architectures, we also represent the AS-IS architecture, which demonstrates the current behavior of the Public Procurement System.
Our evaluation is based on a comparison between the AS-IS and the TO-BE architectures, regarding the fulfillment of the rules and theorems of the Normalized Systems Theory and some quality metrics.
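The broker-mediated TO-BE A scenario can be sketched as a publish/forward loop: each platform talks only to the broker instead of to every other platform. Platform names and the tender identifier below are illustrative.

```python
# Toy message broker for the TO-BE A scenario: a tender published on one
# procurement platform is forwarded to all the others via the broker.

class Platform:
    def __init__(self, name):
        self.name = name
        self.tenders = []

    def receive(self, tender):
        self.tenders.append(tender)

class Broker:
    def __init__(self):
        self.subscribers = []

    def subscribe(self, platform):
        self.subscribers.append(platform)

    def publish(self, tender, source):
        # Forward the tender to every platform except its source.
        for platform in self.subscribers:
            if platform is not source:
                platform.receive(tender)

broker = Broker()
platforms = [Platform(f"platform-{i}") for i in range(6)]
for p in platforms:
    broker.subscribe(p)

broker.publish("tender-001", source=platforms[0])
assert all("tender-001" in p.tenders for p in platforms[1:])
assert "tender-001" not in platforms[0].tenders
```

With n platforms, the broker needs n connections, whereas the direct TO-BE B scenario needs on the order of n(n-1) pairwise links; this is one of the trade-offs the evaluation of the two architectures weighs.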

Keywords: archimate, architecture, broker, enterprise, evolvable systems, interoperability, normalized architectures, normalized systems, normalized systems theory, platforms

Procedia PDF Downloads 328
28 Bluetooth Communication Protocol Study for Multi-Sensor Applications

Authors: Joao Garretto, R. J. Yarwood, Vamsi Borra, Frank Li

Abstract:

Bluetooth Low Energy (BLE) has emerged as one of the main wireless communication technologies used in low-power electronics, such as wearables, beacons, and Internet of Things (IoT) devices. BLE's energy efficiency, interoperability with smart mobile devices, and Over the Air (OTA) update capabilities are essential features for ultralow-power devices, which are usually designed under size and cost constraints. Most current research on the power analysis of BLE devices focuses on the theoretical aspects of the advertising and scanning cycles, with results presented mostly as mathematical models and computer simulations. Such modeling and simulation are important for understanding the technology, but hardware measurement is essential for understanding how BLE devices behave in real operation. In addition, recent literature focuses mostly on the BLE technology itself, leaving possible applications and their analysis out of scope. In this paper, a coin cell battery-powered BLE data acquisition device, with a 4-in-1 sensor and one accelerometer, is proposed and evaluated with respect to its power consumption. The device is first evaluated in advertising mode with the sensors turned off completely; power is then analyzed with each sensor individually turned on and transmitting data; finally, power consumption is evaluated with both sensors on and broadcasting their data to a mobile phone. The results presented in this paper are real-time measurements of the electrical current consumption of the BLE device, and the observed energy levels are matched to BLE behavior and sensor activity.
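The duty-cycle reasoning behind such measurements can be sketched with a back-of-the-envelope battery-life estimate: a BLE node sleeps most of the time and wakes briefly to advertise, so its average current is the duty-cycle-weighted mix of the two regimes. All numbers below are illustrative assumptions, not measurements from the paper's device.

```python
# Rough average-current and battery-life estimate for a duty-cycled BLE node.

def average_current_ma(sleep_ma, active_ma, adv_ms, interval_ms):
    """Duty-cycle-weighted average current (mA) over one advertising interval."""
    active_fraction = adv_ms / interval_ms
    return active_ma * active_fraction + sleep_ma * (1 - active_fraction)

def battery_life_hours(capacity_mah, avg_ma):
    """Idealized battery life, ignoring self-discharge and peak-current effects."""
    return capacity_mah / avg_ma

# assumed figures: 2 uA sleep, 8 mA during a 3 ms advertising event every 1 s,
# powered by a CR2032-class 220 mAh coin cell
avg = average_current_ma(sleep_ma=0.002, active_ma=8.0, adv_ms=3, interval_ms=1000)
life_days = battery_life_hours(capacity_mah=220, avg_ma=avg) / 24
print(round(avg, 4), "mA average,", round(life_days), "days")
```

Real measurements diverge from this model precisely because of the sensor activity and connection events the paper quantifies, which is why the hardware measurement it advocates matters.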

Keywords: bluetooth low energy, power analysis, BLE advertising cycle, wireless sensor node

Procedia PDF Downloads 64
27 Block-Chain Land Administration Technology in Nigeria: Opportunities and Challenges

Authors: Babalola Sunday Oyetayo, Igbinomwanhia Uyi Osamwonyi, Idowu T. O., Herbert Tata

Abstract:

This paper explores the potential benefits of adopting blockchain technology in Nigeria's land administration systems while also addressing the challenges and implications of its implementation in the country's unique context. Through a comprehensive literature review and analysis of existing research, the paper delves into the key attributes of blockchain that can revolutionize land administration practices, with a particular focus on simplifying land registration procedures, expediting land title issuance, and enhancing data transparency and security. The decentralized and immutable nature of blockchain offers unique advantages, instilling trust and confidence in land transactions, which are especially crucial in Nigeria's land governance landscape. However, integrating blockchain in Nigeria's land administration ecosystem presents specific challenges, necessitating a critical evaluation of technical, socio-economic, and infrastructural barriers. These challenges encompass data privacy concerns, scalability, interoperability with outdated systems, and gaining acceptance from various stakeholders. By synthesizing these insights, the paper proposes strategies tailored to Nigeria's context to optimize the benefits of blockchain adoption while addressing the identified challenges. The research findings contribute significantly to the ongoing discourse on blockchain technology in Nigeria's land governance, offering evidence-based recommendations to policymakers, land administrators, and stakeholders. Ultimately, the paper aims to promote the effective utilization of blockchain, fostering efficiency, transparency, and trust in Nigeria's land administration systems to drive sustainable development and societal progress.
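The immutability property the abstract relies on comes from hash chaining: each record commits to the previous one, so tampering with history is detectable. The minimal ledger below illustrates only that mechanism; a real blockchain adds consensus, distribution, and much more, and the parcel identifiers here are invented.

```python
# Minimal hash-chained ledger of land transfers: each entry's hash covers its
# content plus the previous entry's hash, making silent edits detectable.
import hashlib
import json

def record_hash(body):
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def append_transfer(chain, parcel, new_owner):
    prev = chain[-1]["hash"] if chain else "0" * 64
    entry = {"parcel": parcel, "owner": new_owner, "prev": prev}
    entry["hash"] = record_hash({k: entry[k] for k in ("parcel", "owner", "prev")})
    chain.append(entry)

def chain_valid(chain):
    for i, entry in enumerate(chain):
        body = {k: entry[k] for k in ("parcel", "owner", "prev")}
        if entry["hash"] != record_hash(body):
            return False            # entry was edited after the fact
        if i > 0 and entry["prev"] != chain[i - 1]["hash"]:
            return False            # chain link broken
    return True

chain = []
append_transfer(chain, "LAG-042", "owner-A")
append_transfer(chain, "LAG-042", "owner-B")
print(chain_valid(chain))           # the honest history verifies
chain[0]["owner"] = "attacker"      # rewriting an old transfer...
print(chain_valid(chain))           # ...breaks validation
```

It is this tamper-evidence, rather than any single database feature, that underpins the trust argument for land registration.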

Keywords: block-chain, technology, stakeholders, land registration

Procedia PDF Downloads 36
26 Towards Improved Public Information on Industrial Emissions in Italy: Concepts and Specific Issues Associated to the Italian Experience in IPPC Permit Licensing

Authors: C. Mazziotti Gomez de Teran, D. Fiore, B. Cola, A. Fardelli

Abstract:

This paper summarizes an analysis of requests to consult the information and data on industrial emissions made publicly available on the website of the Ministry of Environment, Land and Sea concerning integrated pollution prevention and control at large industrial installations, the so-called "AIA Portal". Since local Competent Authorities have also been organizing their own websites on IPPC permit-granting procedures for public consultation, a huge amount of information on national industrial plants is already available on the internet, although it is usually published as textual documentation or images. Thus it is not possible to access all of the relevant information through interoperable systems, to retrieve it for decision-making purposes, or to use it to raise awareness of environmental issues. Moreover, because a substantial number of institutional and private subjects in Italy are involved in managing public information on industrial emissions, access is provided on websites according to different criteria, so at present the information is not structurally homogeneous or comparable. To overcome these difficulties, in the case of the Coordinating Committee for the implementation of the Agreement for the industrial area in Taranto and Statte, which operated before the IPPC permit-granting procedures for the relevant installation in the area, a major effort was devoted to elaborating and validating the data and information on the characterization of soil, groundwater aquifer, and coastal sea held by different subjects, in order to derive a global perspective for decision-making purposes. The paper therefore also focuses on the main outcomes of that experience.

Keywords: public information, emissions into atmosphere, IPPC permits, territorial information systems

Procedia PDF Downloads 242
25 Co-Operation in Hungarian Agriculture

Authors: Eszter Hamza

Abstract:

The competitiveness of economic operators is based on interoperability, which is relatively low in Hungary. The development of co-operation is a high priority in the Common Agricultural Policy 2014-2020. The aim of the paper is to assess co-operation in Hungarian agriculture and to estimate its economic outputs and benefits, based on statistical data processing and the literature. A further objective is to explore the potential of agricultural co-operation with the help of interviews and a questionnaire survey. The research seeks to answer what fundamental factors play a role in the development of co-operation, what motivates the actors, and what the key success factors and pitfalls are. The results were analysed using econometric methods. In Hungarian agriculture we can find several forms of co-operation: cooperatives, producer groups (PGs) and producer organizations (POs), machinery cooperatives, integrator companies, product boards, and interbranch organisations. Despite these many forms of agricultural co-operation, their economic weight is significantly lower in Hungary than in western European countries. In terms of agricultural importance, integrator companies carry the most weight among the forms of co-operation. Hungarian farmers are linked to co-operations or organizations mostly in relation to procurement and sales. Less than 30 percent of the surveyed farmers are members of a producer organization or cooperative. The level of trust among farmers is low. The main obstacles to the development of formalized co-operation are producers' risk aversion and the black economy in agriculture. Producers often prefer informal co-operation to long-term contractual relationships. Hungarian agricultural co-operations are characterized not by dynamic development but by slow qualitative change. Looking ahead, one breakout point could be the association of producer groups and organizations, which, in addition to the benefits of market concentration, can act more effectively in the dissemination of knowledge, the operation of advisory networks, and innovation.

Keywords: agriculture, co-operation, producer organisation, trust level

Procedia PDF Downloads 364
24 Reimagining the Management of Telco Supply Chain with Blockchain

Authors: Jeaha Yang, Ahmed Khan, Donna L. Rodela, Mohammed A. Qaudeer

Abstract:

Traditional supply chain silos still exist today because of the difficulty of establishing trust between partners and because of technological barriers across industries. Companies lose opportunities and revenue and inadvertently make poor business decisions, creating further challenges. Blockchain technology can bring a new level of transparency by sharing information through a distributed ledger in a decentralized manner, creating a basis of trust for business. Blockchain acts as a loosely coupled, hub-style communication network in which trading partners integrate with each other only indirectly, yet work together through the orchestration of their supply chain operations under a coherent, jointly developed process. Blockchain increases efficiency, lowers costs, and improves interoperability, strengthening and automating the supply chain management process while all partners share the risk. The blockchain ledger is built to track the inventory lifecycle for supply chain transparency and keeps a journal of inventory movement for real-time reconciliation. State design patterns are used to capture the lifecycle (behavior) of inventory management as a state machine, yielding a common, transparent, and coherent process that lets trading partners respond more quickly to changes or improvements in process, reconcile discrepancies, and comply with internal governance and external regulations. It enables end-to-end, inter-company visibility at the unit level for more accurate demand planning, with better insight into order fulfillment and replenishment.
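The state-machine view of the inventory lifecycle can be sketched as follows, in the spirit of the state design pattern the abstract describes. The states, transitions, and SKU naming are illustrative assumptions, not the authors' actual model.

```python
# Inventory lifecycle as a state machine: only whitelisted transitions are
# allowed, and every move is journaled for audit and reconciliation.

class InventoryItem:
    # each state maps to the set of states reachable from it
    TRANSITIONS = {
        "ordered":    {"shipped"},
        "shipped":    {"received"},
        "received":   {"reconciled", "disputed"},
        "disputed":   {"reconciled"},
        "reconciled": set(),
    }

    def __init__(self, sku):
        self.sku = sku
        self.state = "ordered"
        self.history = ["ordered"]   # the journal all partners can inspect

    def advance(self, new_state):
        if new_state not in self.TRANSITIONS[self.state]:
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.state = new_state
        self.history.append(new_state)

item = InventoryItem("SKU-001")
for s in ("shipped", "received", "reconciled"):
    item.advance(s)
print(item.history)
```

On a shared ledger, each partner would append these transitions to the same journal, which is what makes the process transparent and reconcilable across companies.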

Keywords: supply chain management, inventory trace-ability, perpetual inventory system, inventory lifecycle, blockchain, inventory consignment, supply chain transparency, digital thread, demand planning, hyper ledger fabric

Procedia PDF Downloads 62
23 Discrete-Event Modeling and Simulation Methodologies: Past, Present and Future

Authors: Gabriel Wainer

Abstract:

Modeling and simulation (M&S) methods have long been used to analyze the behavior of complex physical systems, and it is now common to use simulation as part of the scientific and technological discovery process. M&S advanced thanks to improvements in computer technology, which, in many cases, resulted in simulation software developed using ad-hoc techniques. Formal M&S methods appeared in an effort to improve the development of very complex simulation systems. Some of these techniques proved successful in providing a sound basis for the development of discrete-event simulation models, improving the ease of model definition, enhancing application development, reducing costs, and favoring reuse. The DEVS formalism is one such technique, which proved successful in providing means for modeling while reducing development complexity and costs. DEVS model development rests on a sound theoretical framework, and the independence of M&S tasks made it possible to run DEVS models on different environments (personal computers, parallel computers, real-time equipment, and distributed simulators) and middleware. We will present a historical perspective on discrete-event M&S methodologies, showing different modeling techniques. We will introduce the origins and general ideas of DEVS and compare it with some of these techniques. We will then show the current status of DEVS M&S and discuss a technological perspective for solving current M&S problems (including real-time simulation, interoperability, and model-centered development techniques). We will show examples of the current use of DEVS, including applications in different fields. Finally, we will present open topics in the area, including advanced methods for centralized, parallel, or distributed simulation and the need for real-time modeling techniques, along with our views on these fields.
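The flavor of a DEVS atomic model can be conveyed with a minimal sketch: a state, a time-advance function ta(), an output function, and an internal transition function. This is a didactic toy (the external transition function is omitted, and the driving loop stands in for a root coordinator), not a full DEVS simulator.

```python
# A minimal DEVS-style atomic model: a generator that emits a job at every
# internal event, with the time advance fixed at `period`.

class Generator:
    def __init__(self, period):
        self.period = period
        self.count = 0          # the model's state

    def ta(self):               # time advance: time until the next internal event
        return self.period

    def output(self):           # output function (lambda), fired at internal events
        return f"job-{self.count}"

    def delta_int(self):        # internal transition: update the state
        self.count += 1

# drive the model for three internal events, as a root coordinator would
t, trace = 0.0, []
gen = Generator(period=2.0)
for _ in range(3):
    t += gen.ta()
    trace.append((t, gen.output()))
    gen.delta_int()
print(trace)  # [(2.0, 'job-0'), (4.0, 'job-1'), (6.0, 'job-2')]
```

A full DEVS implementation adds external transitions for incoming events and couples many such atomic models hierarchically, which is what enables the parallel and distributed execution discussed above.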

Keywords: modeling and simulation, discrete-event simulation, hybrid systems modeling, parallel and distributed simulation

Procedia PDF Downloads 298
22 A Web Service-Based Framework for Mining E-Learning Data

Authors: Felermino D. M. A. Ali, S. C. Ng

Abstract:

E-learning is an evolutionary form of distance learning and has improved over time as new technologies have emerged. Today, efforts are still being made to enrich e-learning systems with emerging technologies in order to make them better. Among these advancements, Educational Data Mining (EDM) is gaining increasing popularity due to its wide application in improving the teaching-learning process in online practices. However, even though EDM promises many benefits to the educational industry in general and to e-learning environments in particular, its principal drawback is the lack of easy-to-use tools: current EDM tools usually require users to have additional technical expertise to perform EDM tasks effectively. In response to these limitations, this study designs and implements an EDM application framework that aims to automate and simplify the development of EDM in e-learning environments. The framework introduces a Service-Oriented Architecture (SOA) that hides the complexity of technical details and enables users to perform EDM in an automated fashion. The framework was designed around the principles of abstraction, extensibility, and interoperability, and its implementation comprises three major modules. The first module provides an abstraction for data gathering, implemented by extending the Moodle LMS (Learning Management System) source code. The second module provides data mining methods and techniques as services, implemented by converting the Weka API into a set of web services. The third module acts as an intermediary between the first two; it contains a user-friendly interface that dynamically locates data provider services and runs knowledge discovery tasks on the data mining services. An experiment was conducted to evaluate the overhead of the proposed framework through a combination of simulation and implementation.
The experiments show that the overhead introduced by the SOA mechanism is relatively small; it is therefore concluded that a service-oriented architecture can effectively facilitate educational data mining in e-learning environments.
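The orchestration idea of the third module can be sketched as a registry that locates a data provider service and a mining service and chains them. The service names, interfaces, and the trivial "mining" step below are hypothetical stand-ins, not the actual Moodle or Weka web services.

```python
# Sketch of SOA-style orchestration: locate services by name, then pipe the
# data provider's output into the mining service.

class ServiceRegistry:
    def __init__(self):
        self._services = {}

    def register(self, name, service):
        self._services[name] = service

    def locate(self, name):
        return self._services[name]

# stand-in for a Moodle data-extraction web service
def activity_data_provider():
    return [{"student": "s1", "logins": 42}, {"student": "s2", "logins": 7}]

# stand-in for a Weka-backed mining web service: label students relative to
# the median login count
def cluster_service(records):
    median = sorted(r["logins"] for r in records)[len(records) // 2]
    return {r["student"]: ("active" if r["logins"] >= median else "quiet")
            for r in records}

registry = ServiceRegistry()
registry.register("data", activity_data_provider)
registry.register("mining", cluster_service)

records = registry.locate("data")()
labels = registry.locate("mining")(records)
print(labels)
```

The point of the indirection is that either service can be swapped (another LMS, another mining backend) without touching the orchestration code, which is the interoperability argument the framework makes.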

Keywords: educational data mining, e-learning, distributed data mining, moodle, service-oriented architecture, Weka

Procedia PDF Downloads 218
21 Planning the Journey of Unifying Medical Record Numbers in Five Facilities and the Expected Challenges: Case Study in Saudi Arabia

Authors: N. Al Khashan, H. Al Shammari, W. Al Bahli

Abstract:

Patients who are eligible to receive treatment at the National Guard Health Affairs (NGHA), Saudi Arabia, typically have four medical record numbers (MRNs), one in each of the geographical areas, and more hospitals and primary healthcare facilities will launch soon in other areas, which means more MRNs. Holding four MRNs causes major drawbacks in the quality of care, such as the creation of new medical files in different regions for relocated patients and reliance on a referral system among regions. Consequently, access to a patient's medical record from other regions and the interoperability of health information between the four hospitals' information systems are challenging. Thus, there is a need to unify medical records among these five facilities. As part of the effort to increase the quality of care, a new Hospital Information System (HIS) was implemented in all NGHA facilities by the end of 2016. NGHA's plan is aligned with the Saudi Arabian National Transformation Program 2020, whereby 70% of citizens and residents of Saudi Arabia would have a unified medical record number that enables transactions between multiple Electronic Medical Record (EMR) vendors. The aim of the study is to explore the plan, challenges, and barriers of unifying the four MRNs into one Enterprise Patient Identifier (EPI) in NGHA hospitals by December 2018. A descriptive study methodology was used. A journey map and a project plan were created for the project team to follow to ensure a smooth implementation of the EPI, including: 1) an approved project charter, 2) a project management plan, 3) a change management plan, and 4) project milestone dates. Currently, the HIS uses the regional MRN, so the HIS and all integrated health care systems in all regions will need modification to move from MRN to EPI without interfering with patient care.
For now, NGHA has successfully implemented an EPI connected to the four MRNs, working in the back end of the systems' database.
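The back-end mapping described here is essentially an enterprise master patient index: one EPI cross-walked to the regional MRNs. The sketch below illustrates that idea only; the identifier formats and region names are invented for the example.

```python
# Toy enterprise patient index: one EPI linked to many regional MRNs, with
# lookup in both directions.

class PatientIndex:
    def __init__(self):
        self._epi_to_mrns = {}   # EPI -> {region: MRN}
        self._mrn_to_epi = {}    # (region, MRN) -> EPI

    def link(self, epi, region, mrn):
        self._epi_to_mrns.setdefault(epi, {})[region] = mrn
        self._mrn_to_epi[(region, mrn)] = epi

    def resolve(self, region, mrn):
        """Find the enterprise identifier from any regional MRN."""
        return self._mrn_to_epi[(region, mrn)]

    def mrns_for(self, epi):
        """All regional records belonging to one patient."""
        return self._epi_to_mrns[epi]

index = PatientIndex()
for region, mrn in [("central", "C-1001"), ("western", "W-2002")]:
    index.link("EPI-77", region, mrn)
print(index.resolve("western", "W-2002"))  # same patient, one EPI
```

Keeping the crosswalk in the back end, as described above, lets regional systems continue to operate on their MRNs while cross-region access resolves through the EPI.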

Keywords: consumer health, health informatics, hospital information system, universal medical record number

Procedia PDF Downloads 165
20 Data Management System for Environmental Remediation

Authors: Elizaveta Petelina, Anton Sizo

Abstract:

Environmental remediation projects deal with a wide spectrum of data, including data collected during site assessment, execution of remediation activities, and environmental monitoring. Appropriate data management is therefore required as a key factor in well-grounded decision making. The Environmental Data Management System (EDMS) was developed to address all the necessary data management aspects, including efficient data handling and data interoperability, access to historical and current data, spatial and temporal analysis, 2D and 3D data visualization, mapping, and data sharing. The system focuses on supporting well-grounded decisions about required mitigation measures and on assessing remediation success. The EDMS is a combination of enterprise and desktop-level data management and Geographic Information System (GIS) tools assembled to assist environmental remediation, project planning and evaluation, and environmental monitoring of mine sites. EDMS consists of seven main components: a Geodatabase that provides a spatial database to store and query spatially distributed data; a GIS and Web GIS component that combines desktop and server-based GIS solutions; a Field Data Collection component that contains tools for field work; a Quality Assurance (QA)/Quality Control (QC) component that combines operational procedures for QA with measures for QC; a Data Import and Export component that includes tools and templates to support project data flow; a Lab Data component that connects EDMS to laboratory information management systems; and a Reporting component that includes server-based services for real-time report generation. The EDMS has been successfully implemented for Project CLEANS (Clean-up of Abandoned Northern Mines), a multi-year, multimillion-dollar project aimed at assessing and reclaiming 37 uranium mine sites in northern Saskatchewan, Canada.
The EDMS has effectively facilitated integrated decision-making for CLEANS project managers and transparency amongst stakeholders.
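The kind of spatial query the Geodatabase component serves can be illustrated with a minimal bounding-box filter over monitoring samples. The sample records, coordinates, and concentration field below are invented for the illustration and do not come from the CLEANS project data.

```python
# Select monitoring samples that fall inside a site's bounding box, the
# simplest form of the spatial queries a geodatabase answers.

def samples_in_bbox(samples, xmin, ymin, xmax, ymax):
    return [s for s in samples
            if xmin <= s["x"] <= xmax and ymin <= s["y"] <= ymax]

samples = [
    {"id": "S1", "x": 10.2, "y": 54.1, "uranium_ppm": 3.2},
    {"id": "S2", "x": 99.0, "y": 12.0, "uranium_ppm": 0.4},  # outside the site
    {"id": "S3", "x": 11.7, "y": 55.0, "uranium_ppm": 7.9},
]
site = samples_in_bbox(samples, xmin=10, ymin=54, xmax=12, ymax=56)
print([s["id"] for s in site])
```

A production geodatabase replaces the linear scan with spatial indexing and supports arbitrary polygons, but the query contract is the same: geometry in, matching records out.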

Keywords: data management, environmental remediation, geographic information system, GIS, decision making

Procedia PDF Downloads 125
19 Applications and Development of a Plug Load Management System That Automatically Identifies the Type and Location of Connected Devices

Authors: Amy Lebar, Kim L. Trenbath, Bennett Doherty, William Livingood

Abstract:

Plug and process loads (PPLs) account for 47% of U.S. commercial building energy use. There is huge potential to reduce whole-building consumption by targeting PPLs for energy savings measures or by implementing some form of plug load management (PLM). Despite this potential, no commercial PLM technology has yet been widely adopted. This paper describes the Automatic Type and Location Identification System (ATLIS), a PLM system framework with automatic and dynamic load detection (ADLD). ADLD gives PLM systems the ability to automatically identify devices as they are plugged into the outlets of a building. The ATLIS framework takes advantage of smart, connected devices to identify device locations in a building, meter and control their power, and communicate this information to a central database. ATLIS includes five primary capabilities: location identification, communication, control, energy metering, and data storage. A laboratory proof of concept (PoC) demonstrated all but the data storage capability, and the demonstrated capabilities were validated using an office building scenario. The PoC can identify when a device is plugged into an outlet and where in the building the device is located. When a device is moved, the PoC's dashboard and database are automatically updated with the new location. The PoC also controls devices from the system dashboard so that they maintain correct schedules regardless of where they are plugged in within a building. ATLIS's primary application is improved PLM, but other applications include asset management, energy audits, and interoperability for grid-interactive efficient buildings. A system like ATLIS could also be used to direct power to critical devices, such as ventilators, during a brownout or blackout. Such a framework is an opportunity to make PLM more widespread and to reduce the amount of energy consumed by PPLs in current and future commercial buildings.
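The location-identification behavior can be sketched as a registry that maps outlets to rooms and updates a device's location whenever a smart outlet reports a plug-in event. Device and outlet identifiers below are invented; this is an illustration of the idea, not the ATLIS implementation.

```python
# Toy plug-load registry: outlets know their room, so a plug-in event is
# enough to locate the device; moving a device updates its location.

class PlugLoadRegistry:
    def __init__(self, outlet_locations):
        self.outlet_locations = outlet_locations  # outlet id -> room
        self.devices = {}                         # device id -> current room

    def on_plug_event(self, device_id, outlet_id):
        """Called when a smart outlet detects a device being plugged in."""
        self.devices[device_id] = self.outlet_locations[outlet_id]

registry = PlugLoadRegistry({"out-1": "Room 101", "out-2": "Room 205"})
registry.on_plug_event("monitor-7", "out-1")
registry.on_plug_event("monitor-7", "out-2")   # device moved to another room
print(registry.devices["monitor-7"])
```

Because schedules and controls key off the device rather than the outlet, this is also what lets a device keep its correct schedule wherever it is plugged in.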

Keywords: commercial buildings, grid-interactive efficient buildings (GEB), miscellaneous electric loads (MELs), plug loads, plug load management (PLM)

Procedia PDF Downloads 111
18 Developing a SOA-Based E-Healthcare Systems

Authors: Hend Albassam, Nouf Alrumaih

Abstract:

Nowadays we are in the age of technology and communication, and there is no doubt that technologies such as the Internet can offer many advantages to many business fields, and the health field is no exception. In fact, the Internet provides a new path to improving the quality of health care throughout the world. E-healthcare offers many advantages, such as efficiency (reducing costs and avoiding duplicate diagnostics), empowerment of patients (enabling them to access their medical records), enhanced quality of healthcare, and information exchange and communication between healthcare organizations. Many problems result from using paper as a means of communication; paper-based prescriptions are one example. Usually, the doctor writes a prescription and gives it to the patient, who in turn carries it to the pharmacy; the pharmacist then fills the prescription and gives the medication to the patient. The pharmacist may have difficulty reading the doctor's handwriting, and the patient could alter or counterfeit the prescription. These problems, and many others, heighten the need to improve the quality of healthcare. This project sets out to develop a distributed e-healthcare system that offers several e-health features and addresses some of the above-mentioned problems. The developed system provides an electronic health record (EHR) and enables communication between separate health care organizations such as the clinic, pharmacy, and laboratory. To develop this system, Service-Oriented Architecture (SOA) is adopted as the design approach, which helps to design several independent modules that communicate using web services. The layering design pattern is used in designing each module, as it provides reusability, allowing the business logic layer to be reused by higher layers such as the web service or the website.
The experimental analysis has shown that the project successfully achieves its aims of solving the problems of paper-based healthcare systems and enabling different health organizations to communicate effectively. It implements four independent modules: healthcare provider, pharmacy, laboratory, and medication information provider. Each module provides different functionalities and is used by a different type of user, and the modules interoperate with each other using a set of web services.
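The paper-prescription problem described above maps naturally onto a small service in the business logic layer: the clinic issues an electronic prescription and the pharmacy dispenses it by identifier, removing the hand-carried paper. The interface below is a hypothetical sketch, not the system's actual module API.

```python
# Minimal prescription service: issue returns an unforgeable id, dispense is
# one-shot, so a prescription cannot be reused or hand-altered.
import uuid

class PrescriptionService:
    def __init__(self):
        self._store = {}   # rx_id -> prescription record

    def issue(self, patient, drug, dose):
        rx_id = str(uuid.uuid4())
        self._store[rx_id] = {"patient": patient, "drug": drug,
                              "dose": dose, "dispensed": False}
        return rx_id

    def dispense(self, rx_id):
        rx = self._store[rx_id]   # unknown ids raise: no counterfeit scripts
        if rx["dispensed"]:
            raise ValueError("prescription already filled")
        rx["dispensed"] = True
        return rx

service = PrescriptionService()
rx_id = service.issue("patient-1", "amoxicillin", "500 mg")
print(service.dispense(rx_id)["drug"])
```

In the layered design described above, this class would sit in the business logic layer, with a web service layer exposing `issue` to the clinic module and `dispense` to the pharmacy module.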

Keywords: e-health, services oriented architecture (SOA), web services, interoperability

Procedia PDF Downloads 272
17 Creating and Questioning Research-Oriented Digital Outputs to Manuscript Metadata: A Case-Based Methodological Investigation

Authors: Diandra Cristache

Abstract:

The transition of traditional manuscript studies into the digital framework closely affects the methodological premises upon which manuscript descriptions are modeled, created, and questioned for the purpose of research. This paper explores the issue through a methodological investigation into the process of modeling, creating, and questioning manuscript metadata. The investigation is founded on close observation of the Polonsky Greek Manuscripts Project, a collaboration between the Universities of Cambridge and Heidelberg. Beyond providing realistic ground for methodological exploration, along with a complete metadata set for computational demonstration, the case study contributes to a broader purpose: outlining general methodological principles for making the most of manuscript metadata by means of research-oriented digital outputs. The analysis focuses mainly on the scholarly approach to manuscript descriptions in the specific instance where the act of metadata recording does not have a programmatic research purpose. Close attention is paid to the encounter between 'traditional' practices in manuscript studies and the formal constraints of the digital framework: does the shift in practices (especially from the straight narrative of free writing towards the hierarchical constraints of the TEI encoding model) affect the structure of metadata and its capacity to answer specific research questions? It is argued that the flexible structure of TEI combined with traditional approaches to manuscript description leads to a proliferation of markup: does an 'encyclopedic' descriptive approach ensure the epistemological relevance of the digital outputs? To provide further insight into the computational approach to manuscript metadata, the metadata of the Polonsky project are processed with techniques of distant reading and data networking, resulting in a new group of digital outputs (relational graphs, geographic maps).
The computational process and the digital outputs are thoroughly illustrated and discussed. Finally, a retrospective analysis evaluates how the digital outputs respond to the scientific expectations of research and, conversely, how the requirements of research questions feed back into the creation and enrichment of metadata in an iterative loop.

Keywords: digital manuscript studies, digital outputs to manuscripts metadata, metadata interoperability, methodological issues

Procedia PDF Downloads 114