Search results for: third party monitoring software
Surveillance of Super-Extended Objects: Bimodal Approach
Authors: Andrey V. Timofeev, Dmitry Egorov
Abstract:
This paper describes an effective solution to the task of remote monitoring of super-extended objects (oil and gas pipelines, railways, national frontiers). The suggested solution is based on the principle of simultaneously monitoring the seismoacoustic and optical/infrared physical fields. The principle of simultaneous monitoring of these fields is not new, but in contrast to the known solutions, the suggested approach makes it possible to monitor super-extended objects with very limited operational costs. So-called C-OTDR (Coherent Optical Time Domain Reflectometer) systems are used to monitor the seismoacoustic field. Far-CCTV systems are used to monitor the optical/infrared field. Simultaneous processing of the data provided by both systems allows effective detection and classification of target activities that appear in the vicinity of the monitored objects. The results of practical usage have shown the high effectiveness of the suggested approach.
Keywords: C-OTDR monitoring system, bimodal processing, LPBoost, SVM
Procedia PDF Downloads 470
Classification Method for Turnover While Sleeping Using Multi-Point Unconstrained Sensing Devices
Authors: K. Shiba, T. Kobayashi, T. Kaburagi, Y. Kurihara
Abstract:
The elderly population in the world is increasing, and consequently, so is the burden of nursing them. In such situations, monitoring and evaluating their daily activities facilitates efficient nursing care. In particular, we focus on an unconscious activity during sleep, i.e., turnover. Monitoring turnover during sleep is essential for evaluating various conditions related to sleep; bedsores are one such condition. Caregivers are required to change a patient’s posture every two hours to prevent bedsores. Herein, we attempt to develop an unconstrained nocturnal monitoring system using a sensing device based on piezoelectric ceramics that can detect the vibrations caused by human body movement on the bed. In the proposed method, to construct multi-point sensing, we placed two sensing devices under the right and left legs at the head side of an ordinary bed. Using this equipment, when a subject lies on the bed, a feature is calculated from the output voltages of the sensing devices. To evaluate our proposed method, we conducted an experiment with six healthy male subjects. Consequently, the periods during which turnover occurred were correctly classified as turnover periods with 100% accuracy.
Keywords: turnover, piezoelectric ceramics, multi-point sensing, unconstrained monitoring system
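A minimal sketch of the idea described above: two piezoelectric channels under the bed legs produce voltage samples, body movement raises short-window signal energy, and windows above a threshold are classified as turnover periods. The function names, window length, threshold, and sample values are illustrative assumptions, not the paper's actual design.

```python
def window_energy(samples, window):
    """Mean squared voltage per non-overlapping window (a simple movement feature)."""
    energies = []
    for i in range(0, len(samples) - window + 1, window):
        w = samples[i:i + window]
        energies.append(sum(v * v for v in w) / window)
    return energies

def classify_turnover(left, right, window=4, threshold=0.5):
    """Fuse the two sensing points by summing per-window energies, then threshold."""
    fused = [a + b for a, b in zip(window_energy(left, window),
                                   window_energy(right, window))]
    return [e > threshold for e in fused]

# Quiet sleep (low voltage) followed by a turnover burst on both channels.
left  = [0.01] * 8 + [0.9, -0.8, 0.7, -0.9]
right = [0.02] * 8 + [0.8, -0.7, 0.9, -0.8]
print(classify_turnover(left, right))  # e.g. [False, False, True]
```

In practice the threshold would be calibrated per bed and subject; the point of the sketch is only that a simple energy feature separates resting windows from movement windows.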
Procedia PDF Downloads 194
Design of LabVIEW Based DAQ System
Authors: Omar A. A. Shaebi, Matouk M. Elamari, Salaheddin Allid
Abstract:
The Information Computing System of Monitoring (ICSM) for the Research Reactor of the Tajoura Nuclear Research Centre (TNRC) has been out of operation since early 1991. According to the regulations, the computer is necessary to operate the reactor up to its maximum power (10 MW). Funding was secured via the IAEA to develop a modern computer-based data acquisition system to replace the old computer. This paper presents the development of a LabVIEW-based data acquisition system that allows automated measurements using National Instruments hardware and its LabVIEW software. The developed system consists of an SCXI 1001 chassis; the chassis houses four SCXI 1100 modules, each of which can handle 32 variables. The chassis is interfaced with the PC using an NI PCI-6023 DAQ card. LabVIEW, developed by National Instruments, is used to run and operate the DAQ system. LabVIEW is a graphical programming environment suited for high-level design. It allows integrating different signal processing components or subsystems within a graphical framework. The results showed the system's capabilities in monitoring variables, acquiring and saving data, as well as LabVIEW's capability to control the DAQ hardware.
Keywords: data acquisition, LabVIEW, signal conditioning, National Instruments
Procedia PDF Downloads 494
A Hybrid Feature Selection Algorithm with Neural Network for Software Fault Prediction
Authors: Khalaf Khatatneh, Nabeel Al-Milli, Amjad Hudaib, Monther Ali Tarawneh
Abstract:
Software fault prediction identifies potential faults in software modules during the development process. In this paper, we present a novel approach to software fault prediction that combines a feedforward neural network with particle swarm optimization (PSO). The PSO algorithm is employed as a feature selection technique to identify the most relevant metrics as inputs to the neural network, which enhances the quality of feature selection and subsequently improves the performance of the neural network model. In comprehensive experiments on software fault prediction datasets, the proposed hybrid approach achieves better results, outperforming traditional classification methods. The integration of PSO-based feature selection with the neural network enables the identification of critical metrics that provide more accurate fault prediction. The results show the effectiveness of the proposed approach and its potential for reducing development costs and effort by detecting faults early in the software development lifecycle. Further research and validation on diverse datasets will help solidify the practical applicability of the new approach in real-world software engineering scenarios.
Keywords: feature selection, neural network, particle swarm optimization, software fault prediction
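To make the feature-selection step concrete, here is a heavily simplified binary PSO-style search (global-best only, no velocity term), not the authors' exact algorithm. Each particle is a 0/1 mask over software metrics; the fitness below is a stand-in separability score, where the paper would instead train the neural network on the selected metrics. The dataset, constants, and all names are assumptions for illustration.

```python
import random

random.seed(1)

def fitness(mask, X, y):
    # Stand-in for cross-validated NN accuracy: reward metrics whose mean
    # differs between faulty (y=1) and clean (y=0) modules, penalize mask size.
    score = 0.0
    for j, used in enumerate(mask):
        if used:
            faulty = [row[j] for row, label in zip(X, y) if label == 1]
            clean  = [row[j] for row, label in zip(X, y) if label == 0]
            score += abs(sum(faulty) / len(faulty) - sum(clean) / len(clean))
    return score - 0.1 * sum(mask)

def pso_select(X, y, n_particles=8, iters=20):
    d = len(X[0])
    swarm = [[random.randint(0, 1) for _ in range(d)] for _ in range(n_particles)]
    best = max(swarm, key=lambda m: fitness(m, X, y))
    for _ in range(iters):
        for i, p in enumerate(swarm):
            # Binary "velocity" step: keep each bit of the global best with
            # some probability, otherwise resample it randomly.
            child = [b if random.random() < 0.7 else random.randint(0, 1)
                     for b in best]
            if fitness(child, X, y) > fitness(p, X, y):
                swarm[i] = child
        best = max(swarm + [best], key=lambda m: fitness(m, X, y))
    return best

# Toy data: metric 0 separates faulty from clean modules, metric 1 is noise.
X = [[5.0, 0.3], [4.8, 0.9], [1.0, 0.5], [1.2, 0.7]]
y = [1, 1, 0, 0]
print(pso_select(X, y))  # metric 0 should be selected
```

A real implementation would keep per-particle personal bests and continuous velocities (the standard binary PSO of Kennedy and Eberhart); this sketch only shows how a mask-based search couples to a wrapper fitness function.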
Procedia PDF Downloads 95
Tardiness and Self-Regulation: Degree and Reason for Tardiness in Undergraduate Students in Japan
Authors: Keiko Sakai
Abstract:
In Japan, all stages of public education aim to foster a zest for life. ‘Zest’ implies solving problems by oneself using acquired knowledge and skills, and it is related to the self-regulation of metacognition. To enhance this, establishing good learning habits is important. Tardiness in undergraduate students should therefore be examined in terms of self-regulation. Accordingly, we focused on self-monitoring and self-planning strategies among self-regulated learning factors to examine the causes of tardiness. This study examines the impact of self-monitoring and self-planning learning skills on the degree of and reason for tardiness in undergraduate students. A questionnaire survey was conducted, targeting undergraduate students at University X in the autumn semester of 2018. Participants were 247 students (average age 19.7, SD 1.9; 144 males, 101 females, 2 no answers). The survey contained the following items and measures: school year, number of classes in the semester, degree of tardiness in the semester (subjective degree and objective times), active participation in and action toward schoolwork, self-planning and self-monitoring learning skills, and reason for tardiness (open-ended question). First, the relation between strategies and tardiness was examined by multiple regression. A statistically significant relationship between a self-monitoring learning strategy and the degree of subjective and objective tardiness was revealed after statistically controlling for school year and number of classes. There was no significant relationship between a self-planning learning strategy and the degree of tardiness. These results suggest that self-monitoring skills reduce tardiness. Secondly, the relation between a self-monitoring learning strategy and the reason for tardiness was analysed after classifying the reasons for tardiness into one of seven categories: ‘overslept’, ‘illness’, ‘poor time management’, ‘traffic delays’, ‘carelessness’, ‘low motivation’, and ‘stuff to do’.
Chi-square tests and Fisher’s exact tests showed a statistically significant relationship between a self-monitoring learning strategy and the frequency of ‘traffic delays’. This result implies that self-monitoring skills prevent tardiness caused by traffic delays. Furthermore, there was a weak relationship between the self-monitoring learning strategy score and the reason-for-tardiness categories: when self-monitoring skill is higher, ‘overslept’ and ‘illness’ decrease, while ‘poor time management’, ‘carelessness’, and ‘low motivation’ increase. This suggests that a self-monitoring learning strategy is related to an internal causal attribution of failure and to self-management of how to prevent tardiness. These findings indicate the effectiveness of a self-monitoring learning strategy for reducing tardiness in undergraduate students.
Keywords: higher education, self-monitoring, self-regulation, tardiness
Procedia PDF Downloads 135
Noninvasive Continuous Glucose Monitoring Device Using a Photon-Assisted Tunneling Photodetector Based on a Quantum Metal-Oxide-Semiconductor
Authors: Wannakorn Sangthongngam, Melissa Huerta, Jaewoo Kim, Doyeon Kim
Abstract:
Continuous glucose monitoring systems are essential for diabetics to avoid health complications but come at a high cost, especially when insurance does not fully cover the diabetic testing kits needed. This paper proposes a noninvasive continuous glucose monitoring system to provide an accessible, low-cost, and painless alternative method for accurate glucose measurement and to help improve quality of life. A light source with a wavelength of 850 nm illuminates the fingertip, and a photodetector detects the transmitted light. Using SeeDevice’s photon-assisted tunneling photodetector (PAT-PD)-based QMOS™ sensor, fluctuations of voltage caused by photon absorption in blood cells are comparable to traditional glucose measurements. The performance of the proposed method was validated by comparing the transmitted voltage readings of 4 test participants with measurements obtained from the Accu-Chek glucometer. The proposed method was able to successfully estimate concentrations from linear regression calculations.
Keywords: continuous glucose monitoring, non-invasive continuous glucose monitoring, NIR, photon-assisted tunneling photodetector, QMOS™, wearable device
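A hedged sketch of the calibration step implied above: pair photodetector voltage readings with reference glucometer values, fit a line by ordinary least squares, then predict glucose from a new voltage. The numbers are made up; the paper's actual sensor output and regression coefficients are not given.

```python
def fit_line(x, y):
    """Ordinary least squares for y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

# Transmitted-light voltages (V) vs. reference glucometer readings (mg/dL),
# purely illustrative values.
volts   = [1.10, 1.25, 1.40, 1.55]
glucose = [90.0, 105.0, 120.0, 135.0]
a, b = fit_line(volts, glucose)
print(round(a * 1.30 + b))  # predicted glucose at a new 1.30 V reading
```

A deployed device would recalibrate per user and validate against many paired readings (e.g., with a Clarke error grid), but the core regression is this simple.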
Procedia PDF Downloads 97
Quantifying Parallelism of Vectors Is the Quantification of Distributed N-Party Entanglement
Authors: Shreya Banerjee, Prasanta K. Panigrahi
Abstract:
The three-way distributive entanglement is shown to be related to the parallelism of vectors. Using a measurement-based approach, a set of 2-dimensional vectors is formed, representing the post-measurement states of one of the parties. These vectors originate at the same point and have an angular distance between them. The area spanned by a pair of such vectors is a measure of the entanglement of formation. This leads to a geometrical manifestation of the 3-tangle in 2 dimensions, from an inequality in the area, which generalizes to n qubits to reveal that the n-tangle also has a planar structure. Quantifying the genuine n-party entanglement in every 1|(n − 1) bi-partition, it is shown that the genuine n-way entanglement does not manifest in the n-tangle. A new quantity, geometrically similar to the 3-tangle, is then introduced to represent the genuine n-way entanglement. Extending the formalism to 3 qutrits, nonlocality without entanglement can be seen to arise from a condition under which the post-measurement state vectors of a separable state show parallelism. A connection to a nontrivial sum uncertainty relation, analogous to the Maccone and Pati uncertainty relation, is then presented using a decomposition of the post-measurement state vectors along the parallel and perpendicular directions of the pre-measurement state vectors. This study opens a novel way to understand multiparty entanglement in qubit and qudit systems.
Keywords: geometry of quantum entanglement, multipartite and distributive entanglement, parallelism of vectors, tangle
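The geometric quantity the abstract relies on can be restated in elementary terms (this restatement is ours, not taken from the paper): for two post-measurement state vectors in the plane separated by an angle θ, the spanned parallelogram area is

```latex
A(\vec{v}_1, \vec{v}_2) \;=\; \lVert \vec{v}_1 \rVert \, \lVert \vec{v}_2 \rVert \, \sin\theta ,
```

which vanishes exactly when the vectors are parallel (θ = 0), matching the abstract's link between parallelism of the post-measurement vectors and vanishing entanglement of formation.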
Procedia PDF Downloads 154
Medical Surveillance Management
Authors: Jina K., Kittinan C. Athitaya J., Weerapat B., Amornrat T., Waraphan N.
Abstract:
Working in the exploration and production of petroleum exposes workers to various health risks, including but not limited to physical and chemical risks. Although many barriers have been put in place, e.g., hazard monitoring in the workplace, appropriate training on health hazards, and proper personal protective equipment (PPE), health hazards may still harm the workers if the barriers are not effectively implemented. To prove the effectiveness of these barriers, it is necessary to monitor exposure by putting in place a medical surveillance program via biological monitoring of chemical hazards and physical check-ups for physical hazards. Medical surveillance management is the systematic assessment and monitoring of employees exposed or potentially exposed to occupational hazards, with the goal of reducing and ultimately preventing occupational illness and injury. The paper aims to demonstrate the effectiveness of medical surveillance management in mitigating health risks associated with physical and chemical hazards in the petroleum industry by focusing on implementing programs for biological monitoring and physical examinations, including defining procedures for biological monitoring, urine sample collection, physical examinations, and result management on offshore petroleum platforms. The implementation of medical surveillance management has proven effective in monitoring worker exposure to physical and chemical hazards, significantly reducing medical expenses and the risk associated with work-related diseases.
Keywords: medical surveillance, petroleum industry, occupational hazards, medical surveillance process
Procedia PDF Downloads 17
Automating Test Activities: Test Cases Creation, Test Execution, and Test Reporting with Multiple Test Automation Tools
Authors: Loke Mun Sei
Abstract:
Software testing has become a mandatory process for assuring software product quality. Hence, test management is needed in order to manage the test activities conducted in the software test life cycle. This paper discusses the challenges faced in the software test life cycle, and how the test processes and test activities, mainly test case creation, test execution, and test reporting, are managed and automated using several test automation tools, i.e., Jira, Robot Framework, and Jenkins.
Keywords: test automation tools, test case, test execution, test reporting
Procedia PDF Downloads 583
Optimizing Stormwater Sampling Design for Estimation of Pollutant Loads
Authors: Raja Umer Sajjad, Chang Hee Lee
Abstract:
Stormwater runoff is the leading contributor to the pollution of receiving waters. In response, an efficient stormwater monitoring program is required to quantify and eventually reduce stormwater pollution. The overall goals of stormwater monitoring programs primarily include the identification of high-risk dischargers and the development of total maximum daily loads (TMDLs). The challenge in developing a better monitoring program is to reduce the variability in flux estimates due to sampling errors, as the success of a monitoring program mainly depends on the accuracy of the estimates. Apart from sampling errors, manpower and budgetary constraints also influence the quality of the estimates. This study attempted to develop an optimum stormwater monitoring design considering both the cost and the quality of the estimated pollutant flux. Three years of stormwater monitoring data (2012–2014) from a mixed land use site located within the Geumhak watershed, South Korea, were evaluated. The regional climate is humid, and precipitation is usually well distributed throughout the year. The investigation of a large number of water quality parameters is time-consuming and resource-intensive. In order to identify a suite of easy-to-measure parameters to act as surrogates, Principal Component Analysis (PCA) was applied. Means, standard deviations, coefficients of variation (CV), and other simple statistics were computed using the multivariate statistical analysis software SPSS 22.0. The implication of sampling time on monitoring results, the number of samples required during a storm event, and the impact of the seasonal first flush were also identified. Based on the observations derived from the PCA biplot and the correlation matrix, total suspended solids (TSS) was identified as a potential surrogate for turbidity, total phosphorus, and heavy metals such as lead, chromium, and copper, whereas Chemical Oxygen Demand (COD) was identified as a surrogate for organic matter.
The CV among the different monitored water quality parameters was found to be high (ranging from 3.8 to 15.5). This suggests that using a grab sampling design to estimate mass emission rates in the study area can lead to errors due to the large variability. The TSS discharge load calculation error was only 2% between two different sample-size approaches, i.e., 17 samples per storm event and 6 equally distributed samples per storm event. Both seasonal first flush and event first flush phenomena were observed for most water quality parameters in the study area. Samples taken at the initial stage of a storm event generally overestimate the mass emissions; however, it was found that collecting a grab sample after the initial hour of a storm event more closely approximates the mean concentration of the event. It was concluded that site- and regional-climate-specific interventions can be made to optimize the stormwater monitoring program in order to make it more effective and economical.
Keywords: first flush, pollutant load, stormwater monitoring, surrogate parameters
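As a rough illustration of the variability screening described above, the coefficient of variation (standard deviation divided by the mean) can be computed per parameter from event mean concentrations; a high CV flags parameters whose grab-sample load estimates will be unreliable. The parameter values below are invented, not the study's data.

```python
import statistics

def cv(values):
    """Coefficient of variation: population standard deviation over the mean."""
    return statistics.pstdev(values) / statistics.fmean(values)

# Hypothetical event mean concentrations (mg/L) for two parameters:
# TSS varies strongly between storms, COD is comparatively stable.
tss_events = [120.0, 40.0, 200.0, 80.0]
cod_events = [55.0, 50.0, 60.0, 52.0]

print(round(cv(tss_events), 2), round(cv(cod_events), 2))
```

A monitoring design would then allocate more within-event samples to the high-CV parameters, which is the trade-off the abstract's 17-sample vs. 6-sample comparison explores.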
Procedia PDF Downloads 240
Phase II Monitoring of First-Order Autocorrelated General Linear Profiles
Authors: Yihua Wang, Yunru Lai
Abstract:
Statistical process control has been successfully applied in a variety of industries. In some applications, the quality of a process or product is better characterized and summarized by a functional relationship between a response variable and one or more explanatory variables. A collection of this type of data is called a profile. Profile monitoring is used to understand and check the stability of this relationship, or curve, over time. The assumption of independent error terms is commonly used in existing profile monitoring studies. However, in many applications, the profile data show correlations over time. Therefore, in this study we focus on a general linear regression model with first-order autocorrelation between profiles. We propose an exponentially weighted moving average (EWMA) charting scheme to monitor this type of profile. The simulation study shows that our proposed methods outperform the existing schemes based on the average run length criterion.
Keywords: autocorrelation, EWMA control chart, general linear regression model, profile monitoring
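A hedged sketch of a generic EWMA monitoring statistic of the kind proposed above (not the paper's exact scheme): the chart smooths a stream of profile residuals and signals when the statistic leaves fixed control limits. The smoothing constant, limit width, and residual stream are illustrative.

```python
def ewma_signals(residuals, lam=0.2, limit=0.6):
    """Return the indices at which the EWMA statistic exceeds +/- limit."""
    z, out = 0.0, []
    for t, e in enumerate(residuals):
        z = lam * e + (1 - lam) * z   # classic EWMA recursion z_t = lam*e_t + (1-lam)*z_{t-1}
        if abs(z) > limit:
            out.append(t)
    return out

# In-control noise, then a sustained shift of about +1.5 from index 6 onward.
stream = [0.1, -0.2, 0.0, 0.2, -0.1, 0.1, 1.5, 1.6, 1.4, 1.5]
print(ewma_signals(stream))
```

In the Phase II setting of the paper, the monitored quantities would be functions of the estimated profile coefficients, and the limits would be set from the in-control average run length; the recursion itself is unchanged.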
Procedia PDF Downloads 460
Dynamical Models for Environmental Effect Depuration for Structural Health Monitoring of Bridges
Authors: Francesco Morgan Bono, Simone Cinquemani
Abstract:
This research aims to enhance bridge monitoring by employing innovative techniques that incorporate exogenous factors into the modeling of sensor signals, thereby improving long-term predictability beyond traditional static methods. Using real datasets from two different bridges equipped with Linear Variable Displacement Transducer (LVDT) sensors, the study investigates the fundamental principles governing sensor behavior to obtain more precise long-term forecasts. Additionally, the research evaluates performance on noisy and synthetically damaged data, proposing a residual-based alarm system to detect anomalies in the bridge. In summary, this novel approach combines advanced modeling, exogenous factors, and anomaly detection to extend prediction horizons and improve preemptive damage recognition, significantly advancing structural health monitoring practices.
Keywords: structural health monitoring, dynamic models, SINDy, railway bridges
Procedia PDF Downloads 38
A Research Using Remote Monitoring Technology for Pump Output Monitoring in Distributed Fuel Stations in Nigeria
Authors: Ofoegbu Ositadinma Edward
Abstract:
This research paper discusses a web-based monitoring system that enables effective monitoring of fuel pump output and sales volume from distributed fuel stations under the domain of a single company/organization. The traditional method of operation by these organizations in Nigeria is non-automated, and accounting for dispensed product is usually approximate and manual, as there is little or no technology implemented to provide information relating to the state of affairs in the station, either to on-ground staff or to supervisory staff who are not physically present in the station. This results in unaccountable losses in product and revenue as well as slow decision making. Remote monitoring technology, a vast research field with numerous application areas incorporating various data collation techniques and sensor networks, can be applied to reliably provide information relating to fuel pump status in distributed fuel stations. Thus, the proposed system relies upon a microcontroller, keypad, and pump to demonstrate the traditional fuel dispenser. A web-enabled PC with an accompanying graphical user interface (GUI) designed in Visual Basic is connected to the microcontroller via the serial port to provide the web implementation.
Keywords: fuel pump, microcontroller, GUI, web
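A hypothetical sketch of the server-side aggregation such a system implies: each station's microcontroller periodically reports dispensed volume over the serial link as a simple text record, and the PC tallies totals per pump for the web GUI. The record format and all names are invented for illustration; the paper does not specify its wire protocol.

```python
def parse_record(line):
    """Parse one assumed serial record of the form 'PUMP_ID,litres'."""
    pump_id, litres = line.strip().split(",")
    return pump_id, float(litres)

def aggregate(lines):
    """Tally dispensed volume per pump for display on the dashboard."""
    totals = {}
    for line in lines:
        pump, vol = parse_record(line)
        totals[pump] = totals.get(pump, 0.0) + vol
    return totals

reports = ["PUMP1,40.5", "PUMP2,12.0", "PUMP1,9.5"]
print(aggregate(reports))  # {'PUMP1': 50.0, 'PUMP2': 12.0}
```

In the described system this tally would be exposed to supervisory staff through the web front end, replacing the manual approximation of dispensed product.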
Procedia PDF Downloads 434
Cement-Based Composites with Carbon Nanofillers for Smart Structural Health Monitoring Sensors
Authors: Antonella D'Alessandro, Filippo Ubertini, Annibale Luigi Materazzi
Abstract:
The progress of nanotechnology has resulted in the development of new instruments in the field of civil engineering. In particular, the introduction of carbon nanofillers into construction materials can enhance their mechanical and electrical properties. In construction, concrete is among the most widely used materials. Due to the characteristics of its components and its structure, concrete is suitable for modification, including at the nanometer level. Moreover, to guarantee structural safety, it is desirable to achieve widespread monitoring of structures. Ideally, structures would be able to identify modifications of their behavior, states of incipient damage, or conditions of possible risk to people. This paper presents research on novel cementitious composites with conductive carbon nanoinclusions capable of monitoring their state of deformation, with particular attention to concrete. The self-sensing ability is achieved through the correlation between the variation of stress or strain and that of electrical resistance. Carbon nanofillers appear particularly suitable for such applications. Nanomodified concretes with different carbon nanofillers have been tested. The samples were subjected to cyclic and dynamic loads. The experimental campaign shows the potential of this new type of sensor made of nanomodified concrete for diffuse structural health monitoring.
Keywords: carbon nanofillers, cementitious nanocomposites, smart sensors, structural health monitoring
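A hedged sketch of the self-sensing principle described above: strain in the nanomodified composite is estimated from the fractional change in electrical resistance through a gauge factor, ΔR/R₀ = GF · ε. The gauge factor and resistance readings below are illustrative values, not measurements from the paper.

```python
def strain_from_resistance(r, r0, gauge_factor=50.0):
    """Invert dR/R0 = GF * strain for a piezoresistive cement composite."""
    return (r - r0) / (r0 * gauge_factor)

r0 = 1000.0                          # unloaded resistance, ohms (assumed)
readings = [1000.0, 1002.5, 1005.0]  # resistance under increasing load
print([strain_from_resistance(r, r0) for r in readings])
```

Carbon-nanofilled composites are attractive precisely because their gauge factor is much larger than that of a conventional metal strain gauge, so small strains produce measurable resistance changes across the bulk of the structural element itself.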
Procedia PDF Downloads 335
Usage of “Flowchart of Diagnosis and Treatment” Software in Medical Education
Authors: Boy Subirosa Sabarguna, Aria Kekalih, Irzan Nurman
Abstract:
Introduction: Software in the form of a Clinical Decision Support System can help students understand the mindset of decision-making in diagnosis and treatment at the general practitioner stage. This can accelerate and ease the learning process, which previously took place through books and experience. Method: We gathered 1000 members of the National Medical Multimedia Digital Community (NM2DC) who use the “flowchart of diagnosis and treatment” software, and analysed factors related to display, speed of learning, convenience of learning, and helpfulness and usefulness in the learning process, using a Likert scale through an online questionnaire, with results processed as percentages. Results and Discussion: Out of the 1000 members of NM2DC, 97.0% use the software, and 87.5% of them are students. In terms of the analysed factors related to the display, speed of learning, convenience of learning, and helpfulness and usefulness of the software’s usage, the results indicate fairly good performance at 90.7%. Therefore, the “Flowchart of Diagnosis and Treatment” software has helped students understand the decision-making of diagnosis and treatment. Conclusion: The use of the “Flowchart of Diagnosis and Treatment” software plays a positive role in helping students understand decision-making in diagnosis and treatment.
Keywords: usage, software, diagnosis and treatment, medical education
Procedia PDF Downloads 359
The Organizational Justice-Citizenship Behavior Link in Hotels: Does Customer Orientation Matter?
Authors: Pablo Zoghbi-Manrique-de-Lara, Miguel A. Suárez-Acosta
Abstract:
The goal of the present paper is to bring together two classic lines of research in which employees are the protagonists, organizational justice and citizenship behaviour (OCB), which have never been studied together with customers as the target. We suggest that a hotel’s fair treatment (in terms of distributive, procedural, and interactional justice) of customers will be appreciated by the employees, who will reciprocate in kind by favouring the hotel with increased customer-oriented behaviours (COBs). Data were collected from 204 employees at eight upscale hotels in the Canary Islands (Spain). Unlike perceptions of distributive justice, results of structural equation modelling demonstrate that employees substantively react to interactional and procedural justice toward guests by engaging in customer-oriented behaviours (COBs). The findings offer new reasons why employees decide to engage in COBs, and they highlight the potentially beneficial effects that fair treatment of guests brings to hospitality by promoting COBs.
Keywords: hotel guests’ (mis)treatment, customer-oriented behaviours, employee citizenship, organizational justice, third-party observers, third-party intervention
Procedia PDF Downloads 273
The Problems with the Amendment of a Living Trust in South Africa
Authors: Rika van Zyl
Abstract:
It was ruled that an inter vivos trust must be amended according to the rules of the stipulatio alteri, or ‘contract in favour of a third party’, which South Africa adopted from its Roman-Dutch common law. The application of the principles of the stipulatio alteri to the inter vivos trust has developed in case law to imply that once the beneficiary has accepted benefits, he becomes a party to the contract. This consequently means that he must consent to any amendments that the trustees want to make. This poses practical difficulties, such as finding all the beneficiaries who have accepted so that they can sign the amendment, which the trustees would want to circumvent in administering the trust. One of the questions relating to this issue, however, is whether the principles of the stipulatio alteri are correctly interpreted and consequently applied to the inter vivos trust to mean that the beneficiaries who accepted must consent to any amendment. The subsequent question relates to the rights the beneficiary receives upon acceptance. There seem to be different views of what a vested right or a contingent right of the beneficiary means in relation to the inter vivos trust. These rights also have an impact on the amendment of a trust deed. Such an investigation and refinement of the interpretation of the stipulatio alteri’s application to the inter vivos trust may result in solutions that circumvent the adverse effects of requiring the beneficiary’s consent for amendments.
Keywords: inter vivos trust, stipulatio alteri, amendment, beneficiary rights
Procedia PDF Downloads 170
Hydroinformatics of Smart Cities: Real-Time Water Quality Prediction Model Using a Hybrid Approach
Authors: Elisa Coraggio, Dawei Han, Weiru Liu, Theo Tryfonas
Abstract:
Water is one of the most important resources for human society. The world is currently undergoing a wave of urban growth, and pollution problems have a great impact. Monitoring water quality is a key task for the future of the environment and the human species. In recent times, researchers using Smart Cities technologies are trying to mitigate the problems generated by population growth in urban areas. The availability of huge amounts of data collected by a pervasive urban IoT can increase the transparency of decision making. Several services have already been implemented in Smart Cities, but more and more services will be involved in the future. Water quality monitoring can successfully be implemented in the urban IoT. The combination of water quality sensors, cloud computing, smart city infrastructure, and IoT technology can lead to a bright future for environmental monitoring. In past decades, much effort has been put into monitoring and predicting water quality using traditional approaches based on manual collection and laboratory-based analysis, which are slow and laborious. The present study proposes a methodology for implementing a water quality prediction model using artificial intelligence techniques and compares the results obtained with different algorithms. Furthermore, a 3D numerical model will be created using the software D-Water Quality, and simulation results will be used as a training dataset for the artificial intelligence algorithm. This study derives the methodology and demonstrates its implementation based on information and data collected at the floating harbour in the city of Bristol (UK). The city of Bristol is blessed with the Bristol-Is-Open infrastructure, which includes a Wi-Fi network and virtual machines; it was also named the UK’s smartest city in 2017.
Keywords: artificial intelligence, hydroinformatics, numerical modelling, smart cities, water quality
Procedia PDF Downloads 188
Bridging the Gap between Problem and Solution Space with Domain-Driven Design
Authors: Anil Kumar, Lavisha Gupta
Abstract:
Domain-driven design (DDD) is a pivotal methodology in software development, emphasizing the understanding and modeling of core business domains to create effective solutions. This paper explores the significance of DDD in aligning software architecture with real-world domains, with a focus on its application within Siemens. We delve into the challenges faced by development teams in understanding domains and propose DDD as a solution to bridge the gap between problem and solution spaces. Key concepts of DDD, such as Ubiquitous Language, Bounded Contexts, Entities, Value Objects, and Aggregates, are discussed, along with their practical implications in software development. Through a real project example in the automatic generation of hardware and software plant engineering, we illustrate how DDD principles can transform complex domains into coherent and adaptable software solutions, echoing Siemens' commitment to excellence and innovation.
Keywords: domain-driven design, software architecture, ubiquitous language, bounded contexts, entities, value objects, aggregates
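Two of the DDD building blocks named above can be illustrated in a few lines (this sketch is ours, not from the paper): a Value Object is immutable and compared by its attributes, while an Entity is compared by identity and may change over time. The plant-engineering domain names are assumptions chosen to echo the example in the abstract.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PartNumber:
    """Value Object: immutable, equality by value."""
    code: str

@dataclass(eq=False)
class HardwareModule:
    """Entity: identity persists while attributes change."""
    uid: str
    part: PartNumber

    def __eq__(self, other):
        return isinstance(other, HardwareModule) and self.uid == other.uid

a = HardwareModule("HW-1", PartNumber("6ES7-511"))
b = HardwareModule("HW-1", PartNumber("6ES7-512"))  # same identity, new part
print(PartNumber("6ES7-511") == PartNumber("6ES7-511"), a == b)  # True True
```

The distinction matters when modeling a Bounded Context: swapping a module's part number produces the "same" module in the domain's Ubiquitous Language, whereas two identical part numbers are interchangeable values.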
Procedia PDF Downloads 367745 The Use of Geographic Information System Technologies for Geotechnical Monitoring of Pipeline Systems
Authors: A. G. Akhundov
Abstract:
Obtaining unbiased data on the status of pipeline systems for oil and oil-product transportation becomes especially important when laying and operating pipelines under severe natural and climatic conditions. Particular attention is paid here to researching exogenous processes and their impact on the linear facilities of the pipeline system. Reliable operation of pipelines under severe natural and climatic conditions, as well as timely planning and implementation of compensating measures, is only possible if the operating conditions of pipeline systems are regularly monitored and changes in permafrost soil and hydrological conditions are accounted for. One of the main reasons for emergencies is the geodynamic factor. Experience shows that emergencies occur within areas characterized by certain environmental conditions and develop according to similar scenarios depending on the active processes. Analyzing the natural and technical systems of main pipelines at different stages of monitoring makes it possible to forecast the dynamics of change. The integration of GIS technologies, traditional means of geotechnical monitoring (in-line inspection, geodetic methods, field observations), and remote methods (aerial visual inspection, aerial photography, airborne and ground laser scanning) provides the most efficient solution to the problem. A unified geographic information system (GIS) environment is a convenient way to implement the monitoring system on main pipelines, since it provides the means to describe a complex natural and technical system, and every element thereof, with any set of parameters. Such a GIS enables convenient modelling of main pipelines (both in 2D and 3D), the analysis of situations, and the selection of recommendations to prevent negative natural or man-made processes and to mitigate their consequences.
The specifics of such systems include multi-dimensional modelling of the facilities in the pipeline system, mathematical modelling of the processes to be observed, and the use of efficient numerical algorithms and software packages for forecasting and analysis. One of the most interesting uses of the monitoring results is the generation of up-to-date 3D models of a facility and the surrounding area on the basis of airborne laser scanning, aerial photography, and data from in-line inspection and instrument measurements. The resulting 3D model becomes the basis of an information system providing the means to store and process geotechnical observation data with references to the facilities of the main pipeline, to plan compensating measures, and to control their implementation. The use of GISs for geotechnical monitoring of pipeline systems is aimed at improving the reliability of their operation, reducing the probability of negative events (accidents and disasters), and mitigating the consequences of those that still occur.Keywords: databases, 3D GIS, geotechnical monitoring, pipelines, laser scanning
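The core of "references to the facilities of the main pipeline" is linear referencing: keying each observation to a chainage along the line. A minimal sketch under an assumed schema (the field names and query interface are illustrative, not the system described above):

```python
# Hedged sketch: linearly-referenced geotechnical observations on a pipeline.
from dataclasses import dataclass
from bisect import insort

@dataclass(order=True)
class Observation:
    chainage_km: float    # linear reference along the pipeline
    kind: str             # e.g. "in-line inspection", "laser scan"
    value: float          # measured parameter (units depend on kind)

class PipelineMonitoringLayer:
    def __init__(self, name: str):
        self.name = name
        self._obs = []    # kept sorted by chainage for interval queries

    def add(self, obs: Observation) -> None:
        insort(self._obs, obs)

    def within(self, start_km: float, end_km: float):
        """All observations in a chainage interval, e.g. a permafrost zone."""
        return [o for o in self._obs if start_km <= o.chainage_km <= end_km]

layer = PipelineMonitoringLayer("main-line")
layer.add(Observation(12.4, "laser scan", 0.18))
layer.add(Observation(3.1, "in-line inspection", 0.05))
print([o.chainage_km for o in layer.within(0, 10)])  # [3.1]
```

In a real GIS the layer would additionally carry geometry and attribute tables; the interval query is the operation that supports zone-by-zone forecasting and planning of compensating measures.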
Procedia PDF Downloads 1897744 A Portable Cognitive Tool for Engagement Level and Activity Identification
Authors: Terry Teo, Sun Woh Lye, Yufei Li, Zainuddin Zakaria
Abstract:
Wearable devices such as electroencephalography (EEG) headsets hold immense potential for monitoring and assessing a person's task engagement, especially at remote or online sites, so research into their use in measuring an individual's cognitive state during task activities is expected to increase. Despite the growing body of EEG research into a person's brain functioning, key challenges remain in adopting EEG for real-time operations. These include limited portability, long preparation time, high channel dimensionality, intrusiveness, and the level of accuracy in acquiring neurological data. This paper proposes an approach using 4 to 6 EEG channels to determine the cognitive states of a subject undertaking a set of passive and active monitoring tasks, with dynamic air traffic controller (ATC) tasks used as a proxy. The work found that, using the channel reduction and identifier algorithm, good trend adherence of 89.1% can be obtained between a commercially available 14-channel Emotiv EPOC+ BCI EEG headset and a carefully selected reduced set of 4 to 6 channels. The approach can also identify different levels of engagement activity, ranging from general monitoring to ad hoc and repeated active monitoring activities involving information search, extraction, and memory.Keywords: assessment, neurophysiology, monitoring, EEG
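The abstract does not specify the channel-reduction algorithm, so the sketch below uses a generic correlation-based stand-in: keep the k channels whose signals best track the mean of all channels. It is purely illustrative, run here on synthetic data:

```python
# Hedged sketch: correlation-based EEG channel reduction (generic stand-in,
# not the paper's "channel reduction and identifier algorithm").
import numpy as np

def select_channels(eeg: np.ndarray, k: int) -> list:
    """eeg: (channels, samples). Indices of the k most representative channels."""
    reference = eeg.mean(axis=0)                 # grand-average signal
    scores = [abs(np.corrcoef(ch, reference)[0, 1]) for ch in eeg]
    return sorted(np.argsort(scores)[-k:].tolist())

rng = np.random.default_rng(1)
common = rng.normal(size=512)                    # shared cortical activity
weights = [1.0, 0.9, 0.1, 0.8, 0.05, 0.95]       # per-channel signal strength
eeg = np.array([common * w + rng.normal(scale=0.5, size=512) for w in weights])
print(select_channels(eeg, 4))                   # picks the high-weight channels
```

A real reduction from the 14 EPOC+ channels would be validated against the full-montage engagement metric, which is what the 89.1% trend-adherence figure above measures.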
Procedia PDF Downloads 767743 Description of a Structural Health Monitoring and Control System Using Open Building Information Modeling
Authors: Wahhaj Ahmed Farooqi, Bilal Ahmad, Sandra Maritza Zambrano Bernal
Abstract:
In structural engineering, monitoring structural responses over time is of great importance, particularly in view of recent developments in construction technology. Recent advances in computing tools have enabled researchers to better execute structural health monitoring (SHM) and control systems. In the last decade, building information modeling (BIM) has substantially enhanced the workflow of planning and operating engineering structures. Typically, building information can be stored and exchanged via model files based on the Industry Foundation Classes (IFC) standard. In this study, a modeling approach for the semantic modeling of SHM and control systems is integrated into the BIM methodology using the IFC standard. For validation of the modeling approach, a laboratory test structure, a four-story shear frame, is modeled using a conventional BIM software tool. An IFC schema extension is applied to describe information related to the monitoring and control of a prototype SHM and control system installed on the laboratory test structure. The SHM and control system is described by a semantic model applying the Unified Modeling Language (UML); subsequently, the semantic model is mapped into the IFC schema. The test structure is composed of four aluminum slabs, and the plate-to-column connections are fully fixed. In the center of the top story, a semi-active tuned liquid column damper (TLCD) is installed. The TLCD is used to reduce structural responses to dynamic vibration and displacement. The wireless prototype SHM and control system is composed of wireless sensor nodes. For testing the SHM and control system, the acceleration response is automatically recorded by the sensor nodes, which are equipped with accelerometers, and analyzed using embedded computing.
As a result, SHM and control systems can be described within open BIM, and dynamic responses and damage information can be stored, documented, and exchanged on the formal basis of the IFC standard.Keywords: structural health monitoring, open building information modeling, industry foundation classes, unified modeling language, semi-active tuned liquid column damper, nondestructive testing
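The mapping from the UML semantic model into an IFC-style structure can be sketched as below. The property set name (`Pset_ShmSensorNode`) and attributes are hypothetical illustrations, not the actual schema extension developed in the study:

```python
# Hedged sketch: serializing SHM sensor metadata as an IfcPropertySet-like dict.
from dataclasses import dataclass, asdict

@dataclass
class AccelerometerNode:        # UML-like semantic model of one sensor node
    node_id: str
    story: int
    sampling_rate_hz: float

def to_ifc_property_set(node: AccelerometerNode) -> dict:
    """Map the semantic model onto an IFC-style property set structure."""
    return {
        "IfcPropertySet": {
            "Name": "Pset_ShmSensorNode",   # hypothetical Pset name
            "HasProperties": [
                {"Name": k, "NominalValue": v} for k, v in asdict(node).items()
            ],
        }
    }

pset = to_ifc_property_set(AccelerometerNode("N1", 4, 100.0))
print(pset["IfcPropertySet"]["Name"])  # Pset_ShmSensorNode
```

In a full implementation the property set would be attached to the relevant IFC building element, so that monitoring data travels with the model file.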
Procedia PDF Downloads 1517742 Design and Development of an Autonomous Underwater Vehicle for Irrigation Canal Monitoring
Authors: Mamoon Masud, Suleman Mazhar
Abstract:
The Indus river basin's irrigation system in Pakistan is extremely complex, spanning over 50,000 km. Maintaining and monitoring it demands enormous resources. This paper describes the development of a streamlined, low-cost autonomous underwater vehicle (AUV) for monitoring irrigation canals, including water quality monitoring and water theft detection. The vehicle is a hovering-type AUV, designed mainly for monitoring irrigation canals, with a fully documented design and open-source code. It has a length of 17 inches and a radius of 3.5 inches, with a depth rating of 5 m. Multiple sensors are present onboard the AUV for monitoring water quality parameters, including pH, turbidity, total dissolved solids (TDS), and dissolved oxygen. A 9-DOF inertial measurement unit (IMU), the GY-85, is used, which incorporates an accelerometer (ADXL345), a gyroscope (ITG-3200), and a magnetometer (HMC5883L). The readings from these sensors are fused using the direction cosine matrix (DCM) algorithm, providing the AUV with its heading angle, while a pressure sensor gives its depth. Two sonar-based range sensors are used for obstacle detection, enabling the vehicle to align itself with the edges of the irrigation canals. Four thrusters control the vehicle's surge, heading, and heave, providing 3 DOF. The thrusters are controlled using a proportional-integral-derivative (PID) feedback control system, with heading angle and depth as the controller's inputs and the thruster motor speeds as its outputs. A flow sensor has been incorporated to monitor the canal water level and detect water-theft events in the irrigation system. In addition to water theft detection, the vehicle also provides information on water quality, giving us the ability to identify the source(s) of water contamination. Detection of such events can provide useful policy inputs for improving irrigation efficiency and reducing water contamination.
Being low-cost, small, and suited to autonomous maneuvering and to water level and quality monitoring in irrigation canals, the AUV can be used for irrigation network monitoring at a large scale.Keywords: autonomous underwater vehicle, irrigation canal monitoring, water quality monitoring, underwater line tracking
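The PID depth loop described above can be sketched as follows; the gains, time step, and the first-order toy plant are illustrative assumptions, not the vehicle's actual tuning:

```python
# Hedged sketch: one-axis PID controller of the kind used for AUV depth hold.
class PID:
    def __init__(self, kp: float, ki: float, kd: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint: float, measured: float, dt: float) -> float:
        """One control step: returns the thruster command."""
        error = setpoint - measured
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy closed loop: depth responds linearly to the thruster command
pid = PID(kp=2.0, ki=0.1, kd=0.5)
depth, target, dt = 0.0, 3.0, 0.1
for _ in range(300):
    command = pid.update(target, depth, dt)
    depth += 0.2 * command * dt       # simplistic plant model, not real dynamics
print(round(depth, 2))                # settles near the 3.0 m target
```

The heading loop has the same structure, with the DCM-fused heading angle as the measured input and differential thruster speeds as the output.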
Procedia PDF Downloads 1477741 An Architectural Approach for the Dynamic Adaptation of Services-Based Software
Authors: Mohhamed Yassine Baroudi, Abdelkrim Benammar, Fethi Tarik Bendimerad
Abstract:
This paper proposes a software architecture for dynamic service adaptation. The services are constituted from reusable software components, and the goal of the adaptation is to optimize the services as a function of their execution context. As a first step, the context takes into account only user needs, but other elements will be added. A particular feature of our proposition is that profiles are used not only to describe the elements of the context but also the components themselves. An adapter analyzes the compatibility between all these profiles and detects the points where they are not compatible. The same adapter then searches for and applies the possible adaptation solutions: component customization, insertion, extraction, or replacement.Keywords: adaptive service, software component, service, dynamic adaptation
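The adapter's compatibility check can be sketched as below, assuming (purely for illustration) that profiles are flat key/value maps; the paper's profile representation is not specified:

```python
# Hedged sketch: profile comparison and naive adaptation by an adapter.
def incompatibilities(component_profile: dict, context_profile: dict) -> list:
    """The profile keys on which the component and the context disagree."""
    return [k for k, required in context_profile.items()
            if component_profile.get(k) != required]

def adapt(component_profile: dict, context_profile: dict) -> dict:
    """Naive adaptation: customize the component on each incompatible point.
    (Stands in for customization/insertion/extraction/replacement decisions.)"""
    adapted = dict(component_profile)
    for key in incompatibilities(component_profile, context_profile):
        adapted[key] = context_profile[key]
    return adapted

ctx = {"language": "fr", "bandwidth": "low"}              # context profile
comp = {"language": "en", "bandwidth": "low", "version": "2.1"}  # component profile
print(incompatibilities(comp, ctx))   # ['language']
print(adapt(comp, ctx)["language"])   # fr
```

In the proposed architecture the real adapter would choose among the four adaptation solutions per incompatible point rather than always overwriting.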
Procedia PDF Downloads 2987740 Suitability of Black Box Approaches for the Reliability Assessment of Component-Based Software
Authors: Anjushi Verma, Tirthankar Gayen
Abstract:
Although reliability is an important attribute of quality, especially for mission-critical systems, there still does not exist any versatile model for the reliability assessment of component-based software. The existing black box models are found to make various assumptions which may not always be realistic and may be quite contrary to the actual behaviour of software. They focus on observing the manner in which the system behaves without considering the structure of the system, the components composing the system, their interconnections, dependencies, usage frequencies, etc. As a result, the entropy (uncertainty) in assessment using these models is quite high. Although some models are based on the operational profile, it is sometimes extremely difficult to obtain the exact operational profile for a given operation. This paper discusses the drawbacks, deficiencies, and limitations of black box approaches from the perspective of various authors and finally proposes a conceptual model for the reliability assessment of software.Keywords: black box, faults, failure, software reliability
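The classic Nelson-style input-domain estimate illustrates just how little information a black box model consumes: only run counts and failure counts, with no view of components or their interconnections (the criticism made above):

```python
# Hedged sketch: a Nelson-style black-box reliability estimate.
def nelson_reliability(failures: int, runs: int) -> float:
    """Estimated probability that a randomly selected run succeeds."""
    if runs <= 0:
        raise ValueError("need at least one run")
    return 1.0 - failures / runs

# e.g. 3 failures observed in 1000 runs of the whole system
print(nelson_reliability(3, 1000))
```

Everything about which component failed, how often it is exercised, and how failures propagate is collapsed into the single ratio, which is precisely why such estimates carry high uncertainty for component-based software.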
Procedia PDF Downloads 4437739 Implementation of a Web-Based Clinical Outcomes Monitoring and Reporting Platform across the Fortis Network
Authors: Narottam Puri, Bishnu Panigrahi, Narayan Pendse
Abstract:
Background: Clinical outcomes are the globally agreed upon, evidence-based, measurable changes in health or quality of life resulting from patient care. Reporting outcomes and monitoring them continuously provides an opportunity for both assessing and improving the quality of patient care. In 2012, the International Consortium for Health Outcomes Measurement (ICHOM) was founded, which has defined global standard sets for measuring the outcomes of various treatments. Method: Monitoring of clinical outcomes was identified as a pillar of Fortis' core value of patient centricity. The project was started as an in-house Clinical Outcomes Reporting Portal developed by the Fortis Medical IT team, using the standard sets of outcome measurement developed by ICHOM. A pilot was run at Fortis Escorts Heart Institute from Aug '13 to Dec '13. Starting Jan '14, it was implemented across 11 hospitals of the group. The scope was hospital-wide, covering the major clinical specialties: cardiac sciences and orthopedics & joint replacement. The internally developed portal had limitations in report generation, and its capture of patient-related outcomes was restricted. A year later, the company provisioned an ICHOM-certified software product that could provide a platform for data capture and reporting, ensuring compliance with all ICHOM requirements. A year after the launch of the software, Fortis Healthcare became the first healthcare provider in Asia to publish clinical outcomes data for the Coronary Artery Disease standard set (comprising Coronary Artery Bypass Grafting and Percutaneous Coronary Interventions) in the public domain (January 2016). Results: This project has helped firmly establish a culture of monitoring and reporting clinical outcomes across Fortis hospitals.
Given the diverse nature of the healthcare delivery model at the Fortis network, which comprises hospitals of varying size and specialty mix practically covering the entire span of the country, standardizing the data collection and reporting methodology is a huge achievement in itself. 95% case reporting was achieved, with more than 90% data completion, at the end of Phase 1 (March 2016). Post implementation, the group now has one year of data from its own hospitals. This has helped identify gaps, plan ways to bridge them, and establish internal benchmarks for continual improvement. Beyond this, the value created for the group includes: 1. The entire Fortis community has been sensitized to the importance of clinical outcomes monitoring for patient-centric care; initial skepticism and cynicism have been countered by effective stakeholder engagement and automation of processes. 2. Measuring quality is the first step in improving quality; data analysis has helped compare clinical results with best-in-class hospitals and identify improvement opportunities. 3. The clinical fraternity is extremely pleased to be part of this initiative and has taken ownership of the project. Conclusion: Fortis Healthcare is a pioneer in the monitoring of clinical outcomes. Implementation of ICHOM standards has helped the Fortis Clinical Excellence Program improve patient engagement and strengthen its commitment to the core value of patient centricity. Validation and certification of the clinical outcomes data by an ICHOM-certified supplier adds confidence to its claim of being a leader in this space.Keywords: clinical outcomes, healthcare delivery, patient centricity, ICHOM
Procedia PDF Downloads 2377738 The Review of Permanent Downhole Monitoring System
Abstract:
As the development and operating environment of exploration becomes increasingly difficult, there are many new challenges in developing and exploiting oil and gas resources. These include the ability to dynamically monitor wells and provide data and assurance for the completion and production of high-cost, complex wells. A key technology in providing these assurances and maximizing oilfield profitability is real-time permanent reservoir monitoring. Optical fiber sensing systems have gradually begun to replace traditional electronic systems. Traditional temperature sensors can only achieve single-point temperature monitoring, whereas fiber optic sensing systems based on the Bragg grating principle offer a high level of reliability, accuracy, stability, and resolution, enabling cost-effective monitoring that can be done in real time, at any time, and without well intervention, with continuous data acquisition along the entire wellbore. An integrated package with a downhole pressure gauge, packer, and surface system can also realize real-time dynamic monitoring of the pressure in selected downhole sections, avoiding oil well intervention and eliminating the production delays and operational risks of conventional surveys. Real-time information obtained through permanent optical fibers can also provide critical reservoir monitoring data for production and recovery optimization.Keywords: PDHM, optical fiber, coiled tubing, photoelectric composite cable, digital oilfield
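The Bragg-grating temperature measurement rests on the standard relation Δλ/λ = (α + ξ)ΔT, where α is the fiber's thermal expansion coefficient and ξ its thermo-optic coefficient. A minimal sketch, using typical silica-fiber coefficient values assumed for illustration:

```python
# Hedged sketch: inferring temperature change from an FBG wavelength shift.
ALPHA = 0.55e-6   # thermal expansion coefficient of silica, 1/degC (typical value)
XI = 6.7e-6       # thermo-optic coefficient, 1/degC (typical value)

def temperature_shift(base_wavelength_nm: float, shifted_wavelength_nm: float) -> float:
    """Temperature change (degC) inferred from a Bragg wavelength shift."""
    relative_shift = (shifted_wavelength_nm - base_wavelength_nm) / base_wavelength_nm
    return relative_shift / (ALPHA + XI)

# A 1550 nm grating shifting by ~0.112 nm corresponds to roughly +10 degC
print(round(temperature_shift(1550.0, 1550.112), 1))
```

Distributed sensing then amounts to repeating this conversion for each grating (or each resolvable section) along the wellbore, which is what gives the continuous temperature profile described above.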
Procedia PDF Downloads 797737 Iterative Design Process for Development and Virtual Commissioning of Plant Control Software
Authors: Thorsten Prante, Robert Schöch, Ruth Fleisch, Vaheh Khachatouri, Alexander Walch
Abstract:
The development of industrial plant control software is a complex and often very expensive task. One of the core problems is that much of the implementation and adaptation work can only be done after the plant hardware has been installed. In this paper, we present our approach to virtually developing and validating the plant-level control software of production plants. This way, plant control software can be virtually commissioned before the actual ramp-up of a plant, reducing commissioning costs and time. Technically, this is achieved by linking the actual plant-wide process control software (often called the plant server) and an elaborate virtual plant model to form an emulation system. Methodologically, we suggest a four-step iterative process with well-defined increments and time frames. Our work is based on practical experience from the planning, commissioning, and start-up of several cut-to-size plants.Keywords: iterative system design, virtual plant engineering, plant control software, simulation and emulation, virtual commissioning
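The emulation idea above, control logic exercised against a virtual plant model instead of real hardware, can be sketched minimally; the interfaces and the toy dynamics are illustrative assumptions, not the plant-server API from the paper:

```python
# Hedged sketch: a virtual commissioning loop against an emulated plant model.
class VirtualPlantModel:
    """Stands in for the real cut-to-size plant hardware."""
    def __init__(self):
        self.position = 0.0
    def apply(self, command: float, dt: float) -> float:
        self.position += command * dt   # simplistic saw-carriage dynamics
        return self.position            # emulated sensor feedback

def control_step(target: float, measured: float) -> float:
    """Plant-level control logic under test (proportional move-to-target)."""
    return 1.5 * (target - measured)

plant = VirtualPlantModel()
for _ in range(100):                    # one virtual commissioning run
    feedback = plant.apply(control_step(2.0, plant.position), dt=0.1)
print(round(feedback, 2))               # converges toward the 2.0 target
```

Because the control logic only sees the `apply`/feedback interface, the same code can later be pointed at the real plant, which is what makes pre-ramp-up commissioning possible.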
Procedia PDF Downloads 4907736 Blueprinting of a Normalized Supply Chain Processes: Results in Implementing Normalized Software Systems
Authors: Bassam Istanbouli
Abstract:
With technology evolving every day and with the increase in global competition, industries are always under pressure to be the best. They need to provide good-quality products at competitive prices, when and how the customer wants them. In order to achieve this level of service, products and their respective supply chain processes need to be flexible and evolvable; otherwise, changes will be extremely expensive and slow, and will have many combinatorial effects. Those combinatorial effects impact the whole organizational structure from the management, financial, documentation, and logistics perspectives, and especially from the perspective of the Enterprise Resource Planning (ERP) information system. By applying the normalized systems concept/theory to segments of the supply chain, we believe the effects can be minimized, especially at the time of launching an organization-wide software project. The purpose of this paper is to point out that if an organization wants to develop software from scratch or implement existing ERP software for its business needs, and if its business processes are normalized and modular, then most probably this will yield a normalized and modular software system that can be easily modified when the business evolves. Another important goal of this paper is to increase awareness regarding the design of the business processes in a software implementation project: if the blueprints created are normalized, then the software developers and configurators will use those modular blueprints to map them into modular software. This paper only prepares the ground for further studies; the above concept will be supported by going through the steps of developing, configuring, and/or implementing a software system for an organization using two methods: the Software Development Lifecycle method (SDLC) and the Accelerated SAP implementation method (ASAP).
Both methods start with the customer requirements, proceed to the blueprinting of the business processes, and finally map those processes into a software system. Since those requirements and processes are the starting point of the implementation, normalizing them will result in normalized software.Keywords: blueprint, ERP, modular, normalized
Procedia PDF Downloads 1397735 Free, Fair, and Credible Election and Democratic Governance in Bangladesh
Authors: Md. Awal Hossain Mollah
Abstract:
The aim of this study is to evaluate the relationship between free, fair, and credible elections and democratic governance in Bangladesh. The paper is a case study (of Bangladesh), qualitative in nature, and based on secondary sources. The study first clarifies the relevant concepts and identifies elements of free, fair, and credible elections. It then evaluates how far these elements have been ensured in Bangladeshi elections by analyzing all the national elections held since independence. Beyond this, the major factors in, and challenges to, holding free, fair, and credible elections in Bangladesh are examined through the following research questions: 1. Does the role of the election commission matter for free, fair, and credible elections to form a democratic government? 2. Does the role of political parties matter for democratic governance? 3. Does the role of government matter for conducting free, fair, and credible elections to ensure democratic governance? 4. Does a non-party caretaker government matter for conducting free, fair, and credible elections? 5. Does democratic governance depend on multi-dimensional factors and actors? Major findings of this study are as follows. Since the independence of Bangladesh, ten national elections have been held under various regimes. Four of the ten have been found free, fair, and credible, all of them conducted by a non-party caretaker government. The remaining elections, held under elected governments, were controversial and marked by manipulation. However, the caretaker government system was abolished by the AL government through the 15th amendment of the constitution. The present AL government was elected through the 10th parliamentary election under the incumbent (AL) government, but a major opposition alliance (20 parties) led by the BNP boycotted this election, and 154 of the 300 seats were uncontested.
As a result, the AL again came to power without a competitive election; most national and international election observers, as well as the media, considered this election unfair, and the government suffers from a lack of legitimacy. Therefore, the governance of present-day Bangladesh is not democratic at all; it is better considered one-party (a 14-party alliance led by the AL) authoritarian governance under the shade of parliamentary governance, as both the treasury and opposition benches of the parliament belong to the 14-party alliance led by the AL.Keywords: democracy, governance, free, fair and credible elections, Bangladesh
Procedia PDF Downloads 326