Search results for: software in the loop
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5293

3943 Triangular Hesitant Fuzzy TOPSIS Approach in Investment Projects Management

Authors: Irina Khutsishvili

Abstract:

The presented study develops a decision support methodology for a multi-criteria group decision-making problem. The proposed methodology is based on the TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) approach in a hesitant fuzzy environment. The core of such a decision-making problem is the selection of the single best alternative, or the ranking of several alternatives, from a set of feasible alternatives. Typically, the decision process rests on the evaluation of certain criteria. In many MCDM problems (such as medical diagnosis, project management, business and financial management, etc.), the process involves experts' assessments. These assessments are frequently expressed as fuzzy numbers, confidence intervals, intuitionistic fuzzy values, hesitant fuzzy elements, and so on. A more realistic approach, however, is to use linguistic expert assessments (linguistic variables). In the proposed methodology, both the values and the weights of the criteria take the form of linguistic variables given by all decision makers. These assessments are then expressed as triangular fuzzy numbers. Consequently, the proposed approach rests on a triangular hesitant fuzzy TOPSIS decision-making model. Following the TOPSIS algorithm, the fuzzy positive-ideal solution (FPIS) and the fuzzy negative-ideal solution (FNIS) are defined first. The alternatives are then ranked according to their distances from both the FPIS and the FNIS. Based on the proposed approach, a software package was developed and used to rank investment projects in a real investment decision-making problem. The application and testing of the software were carried out on data provided by the 'Bank of Georgia'.
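
The ranking step described above can be sketched in a few lines of Python. This is a minimal illustration only: the triangular fuzzy numbers, the normalization scheme, the choice of FPIS/FNIS, and all input values below are assumptions for illustration, not the paper's actual implementation.

```python
import math

def tfn_distance(a, b):
    # vertex distance between two triangular fuzzy numbers (l, m, u)
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / 3.0)

def fuzzy_topsis(matrix, weights):
    """matrix[i][j]: TFN rating of alternative i on benefit criterion j;
    weights[j]: TFN weight of criterion j. Returns closeness coefficients."""
    n_alt, n_cri = len(matrix), len(matrix[0])
    weighted = []
    for i in range(n_alt):
        row = []
        for j in range(n_cri):
            # normalize by the largest upper bound per criterion, then weight
            u_max = max(matrix[k][j][2] for k in range(n_alt))
            l, m, u = matrix[i][j]
            wl, wm, wu = weights[j]
            row.append((l / u_max * wl, m / u_max * wm, u / u_max * wu))
        weighted.append(row)
    # common simplification: FPIS = (1,1,1), FNIS = (0,0,0) per criterion
    fpis, fnis = (1.0, 1.0, 1.0), (0.0, 0.0, 0.0)
    cc = []
    for row in weighted:
        d_pos = sum(tfn_distance(v, fpis) for v in row)
        d_neg = sum(tfn_distance(v, fnis) for v in row)
        cc.append(d_neg / (d_pos + d_neg))   # closer to 1 = better alternative
    return cc
```

An alternative whose ratings dominate on every criterion receives the larger closeness coefficient and is ranked first.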

Keywords: fuzzy TOPSIS approach, investment project, linguistic variable, multi-criteria decision making, triangular hesitant fuzzy set

Procedia PDF Downloads 427
3942 The Effect of Foundation on the Earth Fill Dam Settlement

Authors: Masoud Ghaemi, Mohammadjafar Hedayati, Faezeh Yousefzadeh, Hoseinali Heydarzadeh

Abstract:

Careful monitoring of earth dams, to measure deformation caused by settlement and movement, has always been a concern for engineers in the field. To measure settlement and deformation of earth dams, the combined inclinometer and settlement-set precision instrument, commonly referred to as the IS instrument, is usually used. In some dams, because the alluvium is thick and cannot be removed (for technical, economic, or performance reasons), it is impossible to seat the end of the IS instrument in the rock foundation. Engineers are therefore forced to install the pipes in the weak, deformable alluvial foundation, which leads to errors in the calculation of the actual (absolute) settlement in different parts of the dam body. The purpose of this paper is to present new and refined criteria for predicting settlement and deformation in earth dams. The study is based on conditions at three dams with highly deformable alluvial foundations (Agh Chai, Narmashir and Gilan-e Gharb), in order to derive settlement criteria that account for the alluvial foundation. To achieve this goal, the settlement of the dams was simulated with the finite difference method in the FLAC3D software, and the modeling results were compared with the IS instrument readings. Finally, after calibrating the model, validating the results using regression analysis, and comparing the modeling parameters with the real conditions, new settlement criteria based on the elasticity modulus, cohesion, friction angle, and density of the earth dam and the alluvial foundation were obtained using MATLAB and its Curve Fitting toolbox.
The results of these studies show that, with the new criteria, the settlement and deformation of dams on alluvial foundations can be corrected after the instrument readings, and the error in the IS instrument readings can be greatly reduced.
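
The correction step rests on a regression fitted between instrument readings and simulated (actual) settlements. As a simplified, single-variable stand-in for the paper's multi-parameter criterion (which also involves elasticity modulus, cohesion, friction angle, and density), an ordinary least-squares fit can be sketched as:

```python
def fit_linear(readings, true_settlements):
    """Ordinary least squares for s_true = a * s_read + b.
    A one-variable illustration of the regression-based correction."""
    n = len(readings)
    sx, sy = sum(readings), sum(true_settlements)
    sxx = sum(x * x for x in readings)
    sxy = sum(x * y for x, y in zip(readings, true_settlements))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def correct_reading(reading, a, b):
    # apply the fitted correction to a raw IS instrument reading
    return a * reading + b
```

Once `a` and `b` are calibrated against the numerical model, every subsequent instrument reading can be mapped to a corrected absolute settlement.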

Keywords: earth-fill dam, foundation, settlement, finite difference, MATLAB, curve fitting

Procedia PDF Downloads 193
3941 Life Cycle Assessment of Rare Earth Metals Production: Hotspot Analysis of Didymium Electrolysis Process

Authors: Sandra H. Fukurozaki, Andre L. N. Silva, Joao B. F. Neto, Fernando J. G. Landgraf

Abstract:

Nowadays, rare earth (RE) metals play an important role in emerging technologies that are crucial for the decarbonisation of the energy sector. Their unique properties have led to increasing clean energy applications, such as wind turbine generators and hybrid and electric vehicles. Despite the substantial media coverage that has recently surrounded the mining and processing of rare earth metals, very little quantitative information is available concerning their subsequent life stages, especially the metallic production of didymium (Nd-Pr) in a fluoride molten salt system. Here we present a gate-to-gate life cycle assessment (LCA) of didymium electrolysis based on three scenarios of operational conditions. The product system is modeled with the SimaPro Analyst 8.0.2 software, and IMPACT 2002+ is applied as the impact assessment method. To develop the life cycle inventories, built-in software databases, patents, and other published sources, together with energy and mass balances, were used. The analysis indicates that, of the 14 midpoint impact categories evaluated, global warming potential (GWP) is the main contributor to the total environmental burden, ranging from 2.7E2 to 3.2E2 kg CO2eq/kg Nd-Pr. At the damage assessment step, the results suggest that slight changes in material flows associated with an enhancement of current efficiency (between 2.5% and 5%) could reduce human health and climate change damage by up to 12% and 15%, respectively. Additionally, this paper highlights the knowledge gaps and the future research efforts needed to understand the environmental impacts of the Nd-Pr electrolysis process from a life cycle perspective.

Keywords: didymium electrolysis, environmental impacts, life cycle assessment, rare earth metals

Procedia PDF Downloads 185
3940 Comparative Analysis of Control Techniques Based Sliding Mode for Transient Stability Assessment for Synchronous Multicellular Converter

Authors: Rihab Hamdi, Amel Hadri Hamida, Fatiha Khelili, Sakina Zerouali, Ouafae Bennis

Abstract:

This paper presents a comparative performance study of sliding mode control (SMC) for the closed-loop voltage control of a direct current to direct current (DC-DC) three-cell buck converter connected in parallel and operating in continuous conduction mode (CCM): SMC based on pulse-width modulation (PWM) versus SMC based on hysteresis modulation (HM) with an adaptive feedforward technique. For the PWM-based SMC, the approach is to incorporate a fixed-frequency PWM scheme that is effectively a variant of SM control. For the HM-based SMC, an adaptive feedforward control is introduced that makes the hysteresis band in the hysteresis modulator variable, with the aim of restricting the switching frequency variation in the event of a change in the line input voltage or the output load. The results obtained under load, input, and reference changes clearly demonstrate a similar dynamic response for both techniques: fast and smooth tracking of the desired output voltage. The PWM-based SM technique greatly improves the dynamic behavior, with a slight advantage over the HM-based SM technique, and provides stability in all operating conditions. Simulation studies in the MATLAB/Simulink environment have been performed to verify the concept.
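
The adaptive hysteresis-band idea can be sketched as follows. The band formula below uses the textbook inductor-current ripple relation for a single buck cell; the paper's exact adaptation law, component values, and function names are assumptions for illustration.

```python
def hysteresis_switch(error, state, band):
    """Two-level hysteresis modulator: turn the cell on above +band,
    off below -band, otherwise keep the previous switch state."""
    if error > band:
        return 1
    if error < -band:
        return 0
    return state

def adaptive_band(v_in, v_out, inductance, f_target):
    """Feedforward band for one buck cell keeping the switching frequency
    near f_target despite input-voltage changes. Uses the standard ripple
    relation delta_i = v_out * (1 - v_out / v_in) / (L * f); the band is
    half the ripple (illustrative, not the paper's exact law)."""
    return v_out * (1.0 - v_out / v_in) / (2.0 * inductance * f_target)
```

Recomputing the band from the measured input voltage each cycle is what keeps the switching frequency roughly constant when the line voltage varies.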

Keywords: DC-DC converter, hysteresis modulation, parallel multi-cells converter, pulse-width modulation, robustness, sliding mode control

Procedia PDF Downloads 166
3939 Quality Analysis of Vegetables Through Image Processing

Authors: Abdul Khalique Baloch, Ali Okatan

Abstract:

The quality analysis of food and vegetables from images is a hot topic nowadays, and researchers keep improving on previous findings through different techniques and methods. In this research we reviewed the literature, identified gaps in it, proposed an improved approach, designed the algorithm, and developed software to measure quality from images, comparing its accuracy with previous work. The application uses an open-source dataset and the Python language with the TensorFlow Lite framework. The focus of this research is sorting food and vegetables from images: after processing the images, the application can sort and grade them, producing fewer errors than manual, human-based sorting. Digital picture datasets were created and the collected images were arranged by class. The classification accuracy of the system was about 94%. As fruits and vegetables play a major role in day-to-day life, their quality is essential in evaluating agricultural produce, and customers always want to buy good-quality fruits and vegetables. This document is about the quality detection of fruits and vegetables using images. Many customers suffer from unhealthy fruits and vegetables supplied to them, and no proper quality measurement level is followed by hotel managements. We have developed software that measures the quality of fruits and vegetables from images and reports whether they are fresh or rotten. The algorithms reviewed in this work include digital image processing, ResNet, VGG16, CNNs, and transfer learning for grading feature extraction.
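
The paper's pipeline grades produce with a CNN/transfer-learning model. Purely as a toy stand-in for that classifier (not the actual model, thresholds, or dataset), the fresh/rotten decision can be illustrated with a dark-pixel heuristic:

```python
def grade_produce(pixels, dark_threshold=80, rotten_fraction=0.3):
    """Toy grading rule standing in for the CNN classifier: label an image
    'rotten' when the share of dark (bruised-looking) pixels exceeds
    rotten_fraction. pixels: iterable of (r, g, b) values in 0-255."""
    pixels = list(pixels)
    dark = sum(1 for r, g, b in pixels if (r + g + b) / 3.0 < dark_threshold)
    return "rotten" if dark / len(pixels) >= rotten_fraction else "fresh"
```

A real deployment would replace this rule with the trained model's prediction, but the surrounding sorting logic (image in, grade out) has the same shape.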

Keywords: deep learning, computer vision, image processing, rotten fruit detection, fruits quality criteria, vegetables quality criteria

Procedia PDF Downloads 68
3938 Automated Method Time Measurement System for Redesigning Dynamic Facility Layout

Authors: Salam Alzubaidi, G. Fantoni, F. Failli, M. Frosolini

Abstract:

The dynamic facility layout problem is a critical issue in the competitive industrial market; solving it requires robust design and effective simulation systems. Sustainable simulation requires reliable and accurate input data. This paper therefore describes an automated system, integrated into the real environment, that measures the duration of material handling operations, collects the data in real time, and determines the variances between the actual and estimated operation schedules in order to update the simulation software and redesign the facility layout periodically. The automated method-time measurement system collects the real data using radio frequency identification (RFID) and Internet of Things (IoT) technologies: attaching an RFID antenna reader and RFID tags enables the system to identify the location of objects and gather timing data. The gathered durations are then processed by calculating the moving average duration of the material handling operations, choosing the shortest material handling path, and updating the simulation software to redesign the facility layout to accommodate the shortest (real) operation schedule. Periodic simulation in real time is more sustainable and reliable than a simulation system relying on the analysis of historical data. The case study for this methodology was carried out in cooperation with a workshop team producing mechanical parts. Although there are some technical limitations, the methodology is promising and can be significantly useful in redesigning manufacturing layouts.
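
The moving-average and shortest-path steps described above can be sketched as a small tracker. The window size, class name, and API are illustrative assumptions, not the paper's actual system:

```python
from collections import defaultdict, deque

class HandlingTimeTracker:
    """Rolling statistics over RFID-timestamped material handling operations."""

    def __init__(self, window=5):
        # per-path deque keeps only the most recent `window` durations
        self.samples = defaultdict(lambda: deque(maxlen=window))

    def record(self, path, duration_s):
        # one completed handling operation along `path`, timed via RFID reads
        self.samples[path].append(duration_s)

    def moving_average(self, path):
        s = self.samples[path]
        return sum(s) / len(s)

    def shortest_path(self):
        # path with the lowest recent average duration, fed back to the simulation
        return min(self.samples, key=self.moving_average)
```

The simulation model would then be periodically re-run with `shortest_path()` and the current averages instead of static historical estimates.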

Keywords: dynamic facility layout problem, internet of things, method time measurement, radio frequency identification, simulation

Procedia PDF Downloads 119
3937 Comparison of Different Hydrograph Routing Techniques in XPSTORM Modelling Software: A Case Study

Authors: Fatema Akram, Mohammad Golam Rasul, Mohammad Masud Kamal Khan, Md. Sharif Imam Ibne Amir

Abstract:

A variety of routing techniques are available to develop surface runoff hydrographs from rainfall. The selection of the runoff routing method is vital, as it is directly related to the type of watershed and the required degree of accuracy. Different modelling software packages are available to explore the rainfall-runoff process in urban areas. XPSTORM, a link-node based, integrated storm-water modelling package, has been used in this study to develop the surface runoff hydrograph for a golf course area located in Rockhampton in Central Queensland, Australia. Four commonly used methods, namely SWMM runoff, Kinematic wave, Laurenson, and Time-Area, are employed to generate runoff hydrographs for the design storm of this study area. In the runoff mode of XPSTORM, the rainfall, infiltration, evaporation and depression storage for the sub-catchments were simulated, and the runoff from each sub-catchment to its collection node was calculated. The simulation results are presented, discussed and compared. The total surface runoff generated by the SWMM runoff, Kinematic wave and Time-Area methods is found to be reasonably close, which indicates that any of these methods can be used for developing the runoff hydrograph of the study area. The Laurenson method produces comparatively less total surface runoff but the highest runoff peak of all the methods, which may make it suitable for hilly regions. Although the Laurenson hydrograph technique is a widely accepted surface runoff routing technique in Queensland (Australia), extensive investigation with detailed topographic and hydrologic data is recommended in order to assess its suitability for the case study area.
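
Of the four methods compared, the Time-Area method is the simplest to state: the hydrograph is the discrete convolution of the rainfall excess with the catchment's time-area histogram. A minimal sketch (units left abstract; a real model applies a depth-to-discharge conversion factor):

```python
def time_area_hydrograph(rain_excess, incremental_areas):
    """Discrete Time-Area routing: Q[t] = sum_k rain_excess[t - k] * area[k].
    rain_excess: effective rainfall depth per time step;
    incremental_areas: catchment area contributing at lag k (time-area histogram)."""
    n = len(rain_excess) + len(incremental_areas) - 1
    q = [0.0] * n
    for t, r in enumerate(rain_excess):
        for k, a in enumerate(incremental_areas):
            q[t + k] += r * a
    return q
```

Because the scheme is a pure convolution, it conserves volume: the hydrograph total equals total rainfall excess times total contributing area.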

Keywords: ARI, design storm, IFD, rainfall temporal pattern, routing techniques, surface runoff, XPSTORM

Procedia PDF Downloads 452
3936 Preparation of Electrospun PLA/ENR Fibers

Authors: Jaqueline G. L. Cosme, Paulo H. S. Picciani, Regina C. R. Nunes

Abstract:

Electrospinning is a technique for the fabrication of nanoscale fibers. The general electrospinning system consists of a syringe filled with polymer solution, a syringe pump, a high-voltage source and a grounded counter electrode. During electrospinning, a volumetric flow is set by the syringe pump and an electric voltage is applied, forming an electric potential between the needle and the counter electrode (collector plate) that results in the formation of a Taylor cone and a jet. The jet moves towards the lower potential, the counter electrode, where the solvent of the polymer solution evaporates and the polymer fiber is formed. On the way to the counter electrode, the fiber is accelerated by the electric field. Bending instabilities produce helical looping movements of the jet, which result from the Coulomb repulsion of the surface charge. Through these bending instabilities the jet is stretched, so that the fiber diameter decreases. In this study, a thermoplastic/elastomeric binary blend of non-vulcanized epoxidized natural rubber (ENR) and poly(lactic acid) (PLA) was electrospun. Specifically, 15% (w/v) PLA/ENR solutions with blend proportions of 5, 10, 25, and 50% (w/w) were prepared in chloroform. The morphological and thermal properties of the electrospun mats were investigated by scanning electron microscopy (SEM) and differential scanning calorimetry. The SEM images demonstrated the production of micrometer- and sub-micrometer-sized fibers with no bead formation. The blend miscibility was evaluated by thermal analysis, which showed that blending did not improve the thermal stability of the systems.

Keywords: epoxidized natural rubber, poly(lactic acid), electrospinning, chemistry

Procedia PDF Downloads 408
3935 An Adaptive Controller Method Based on Full-State Linear Model of Variable Cycle Engine

Authors: Jia Li, Huacong Li, Xiaobao Han

Abstract:

Because a variable cycle aircraft engine (VCE) has more variable geometry parameters than a conventional engine, this paper presents an adaptive controller method based on a full-state linear model of the VCE and evaluates it in simulation to solve the multivariable controller design problem over the whole flight envelope. First, the static and dynamic effects of the variable geometry components on the bypass ratio and other state parameters are analyzed, and a nonlinear component-level model of the VCE is developed. Then, based on the component model, small-deviation linearization with respect to the main fuel flow (Wf), the tail nozzle throat area (A8) and the rear bypass ejector angle (A163) yields a family of linear models in which the variable geometry parameters serve as inputs. Second, adaptive controllers are designed for the VCE linear models at different nominal points. Considering modeling uncertainties and external disturbances, the adaptive law is derived via a Lyapunov function. The simulation results show that the adaptive controller method based on the full-state linear model, using the rear bypass ejector angle as an input, effectively solves the multivariable control problem of the VCE. At all nominal points, the closed loop tracked the desired reference instructions: the settling time was less than 1.2 s, the overshoot was less than 1%, the steady-state errors were less than 0.5%, and the dynamic tracking errors were less than 1%. In addition, the designed controller effectively suppressed interference and reached the desired commands under different external random noise signals.
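
The Lyapunov-derived adaptation mechanism can be illustrated on a scalar toy plant. This is only the mechanism, not the multivariable VCE model: the plant, gains, and step sizes below are assumptions for illustration.

```python
def simulate_adaptive_regulator(a_true=2.0, gamma=5.0, dt=0.001, steps=20000):
    """Toy scalar version of a Lyapunov-based adaptive law. Plant:
    xdot = a*x + u with unknown a. Control u = -(theta + 1)*x with
    adaptation thetadot = gamma*x**2 gives, for the Lyapunov function
    V = x**2/2 + (theta - a)**2/(2*gamma), Vdot = -x**2, so the state x
    is driven to zero despite the unknown parameter."""
    x, theta = 1.0, 0.0
    for _ in range(steps):
        u = -(theta + 1.0) * x          # control with current gain estimate
        x += dt * (a_true * x + u)      # forward-Euler plant update
        theta += dt * gamma * x * x     # Lyapunov-based adaptation law
    return x, theta
```

The state grows briefly while the adaptive gain is too small, then the gain catches up and the state decays to zero, mirroring the tracking behavior reported above.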

Keywords: variable cycle engine (VCE), full-state linear model, adaptive control, by-pass ratio

Procedia PDF Downloads 313
3934 Design and Implementation of an Affordable Electronic Medical Records in a Rural Healthcare Setting: A Qualitative Intrinsic Phenomenon Case Study

Authors: Nitika Sharma, Yogesh Jain

Abstract:

Introduction: An efficient information system helps improve service delivery and provides the foundation for the policy and regulation of the other building blocks of a health system. Health care organizations require the integrated working of their various sub-systems. Efficient EMR software boosts teamwork among these sub-systems, thereby improving service delivery. Although there has been a huge impetus for EMRs under the Digital India initiative, they have still not been mandated in India and are generally implemented only in well-funded public or private healthcare organizations. Objective: The study was conducted to understand the factors that lead to the successful adoption of an affordable EMR in a low-resource healthcare organization. It intended to understand the design of the EMR and the solutions to the challenges faced in its adoption. Methodology: The study was conducted in a non-profit registered healthcare organization that has been providing healthcare facilities to more than 2500 villages, including certain areas that are difficult to access. The data were collected with the help of field notes, in-depth interviews and participant observation. A total of 16 participants using the EMR, from different departments, were enrolled via a purposive sampling technique. The participants included in the study had been working in the organization since before the implementation of the EMR system. The study was conducted over a one-month period, from 25 June to 20 July 2018. Ethical approval was obtained from the institute, along with the prior consent of the participants. Data analysis: A Word document of more than 4000 words was obtained after transcribing and translating the respondents' answers. It was further analyzed by focused coding, a line-by-line review of the transcripts that underlines words, phrases or sentences suggesting themes, in order to carry out a thematic narrative analysis.
Results: The results were thematically grouped under four headings: 1. governance of the organization, 2. architecture and design of the software, 3. features of the software, 4. challenges faced in adoption and the solutions to address them. It was inferred that the successful implementation was attributable to the easy and comprehensive design of the system, which not only facilitates easy data storage and retrieval but also provides a decision support system for the staff. Portability led to increased acceptance by physicians. The proper division of labor, increased staff efficiency, incorporation of auto-correction features and facilitation of task shifting led to increased acceptance among users in the various departments. Geographical inhibitions, low computer literacy and high patient load were the major challenges faced during implementation. Despite the dual efforts made by the architects and administrators to combat these challenges, the organization still faces certain ongoing challenges. Conclusion: Whenever a new technology is adopted, there are innovators, early adopters, late adopters and laggards, and the same pattern was followed in the adoption of this software. The challenges were overcome through the joint efforts of the organization's administrators and users. This case study thereby provides a framework for implementing similar systems in the public sector of countries struggling to digitize healthcare amid a shortage of human and financial resources.

Keywords: EMR, healthcare technology, e-health, EHR

Procedia PDF Downloads 105
3933 The Impact of Green Building Envelopes on the Urban Microclimate of the Urban Canopy-Case Study: Fawzy Moaz Street, Alexandria, Egypt

Authors: Amany Haridy, Ahmed Elseragy, Fahd Omar

Abstract:

The issue of temperature increase in the urban microclimate has been at the center of attention recently, especially in dense urban areas such as the city of Alexandria in Egypt, where building surfaces have become the dominant element (more than green areas and streets). Temperatures have been rising during the daytime as well as at night; this research, however, focused on the rise of air temperature at night, a phenomenon known as the urban heat island. This phenomenon has many effects on ecological life as well as on human health. This study provided evidence of the possibility of reducing the urban heat island by using a green building envelope (green wall and green roof) in Alexandria, Egypt, a city that has witnessed a boom in the growth of its urban fabric and population. A simulation analysis using the Envi-met software was performed to find the resulting air temperature reduction. The simulation depended on the orientation of the green areas and their density, which was defined through a climatic analysis carried out with the DIVA plugin for the Grasshopper software. Results showed that the reduction in air temperature varies from 0.8–2.0 °C, increasing with the density of green areas. Many green wall and green roof systems can be found in the local market; however, retrofitting an existing building requires a careful choice of system to fit the building's structural load capacity and the surrounding environment. Among the systems of choice were the 'geometric system' of vertical greening, which can be fixed on a light aluminum structure for walls, and the extensive green system for roofs. Finally, native plants were the best choice in the long term because they fare well in the local climate.

Keywords: envi-met, green building envelope, urban heat island, urban microclimate

Procedia PDF Downloads 205
3932 Evaluating the Understanding of the University Students (Basic Sciences and Engineering) about the Numerical Representation of the Average Rate of Change

Authors: Saeid Haghjoo, Ebrahim Reyhani, Fahimeh Kolahdouz

Abstract:

The present study aimed to evaluate the understanding of students at Tehran universities (Iran) of the numerical representation of the average rate of change, based on the Structure of Observed Learning Outcomes (SOLO) taxonomy. In this descriptive survey research, the statistical population included undergraduate students (basic sciences and engineering) at the universities of Tehran. The sample comprised 604 students selected by random multi-stage clustering. The measurement tool was a task whose face and content validity were confirmed by mathematics and mathematics education professors. Using Cronbach's alpha, the reliability coefficient of the task was found to be 0.95, which verified its reliability. The collected data were analyzed with descriptive and inferential statistics (chi-squared and independent t-tests) in the SPSS-24 software. According to the SOLO model, at the prestructural, unistructural, and multistructural levels, basic science students showed a higher percentage of understanding than engineering students, although the outcome was reversed at the relational level. However, there was no significant difference in the average understanding of the two groups. The results indicated that students failed to achieve a proper understanding of the numerical representation of the average rate of change and held misconceptions when applying physics formulas to solve the problem. In addition, multiple solution methods, along with the dominant ones, were identified during the qualitative analysis. Based on the results, it is proposed that teachers and professors focus on context problems involving approximate calculation and numerical representation, use software, and connect the common relations between mathematics and physics in their teaching.
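
The chi-squared comparison reported above (SOLO level by student group) reduces to the Pearson statistic on a contingency table. A minimal sketch, standing in for the SPSS computation; the example counts are hypothetical:

```python
def chi_squared_statistic(table):
    """Pearson chi-squared statistic for an r x c contingency table,
    e.g. rows = SOLO levels, columns = student groups."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # expected count under independence of rows and columns
            expected = row_totals[i] * col_totals[j] / grand
            chi2 += (observed - expected) ** 2 / expected
    return chi2
```

The statistic is then compared against the chi-squared distribution with (r-1)(c-1) degrees of freedom to judge significance.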

Keywords: average rate of change, context problems, derivative, numerical representation, SOLO taxonomy

Procedia PDF Downloads 91
3931 Passive Vibration Isolation Analysis and Optimization for Mechanical Systems

Authors: Ozan Yavuz Baytemir, Ender Cigeroglu, Gokhan Osman Ozgen

Abstract:

Vibration is an important issue in the design of various components of aerospace, marine and vehicular applications. In order to preserve the components' function and operational performance, vibration isolation design, involving the selection of optimum isolator properties and the positioning of isolators, is a critical study. Given the growing need for vibration isolation system design, this paper presents two software tools capable of performing modal analysis, response analysis for both random and harmonic excitations, static deflection analysis, and Monte Carlo simulations, in addition to parameter and location optimization for different types of isolation problem scenarios. A survey of the literature reveals no study that develops a software-based tool capable of performing all of these analysis, simulation and optimization studies on one platform simultaneously. In this paper, the theoretical system model is generated for a 6-DOF rigid body. The vibration isolation system of any mechanical structure can be optimized using a hybrid method involving both global search and gradient-based methods. After the optimization design variables are defined, different types of optimization scenarios are listed in detail. Aware of the need for a user-friendly vibration isolation problem solver, two graphical user interfaces (GUIs) were prepared and verified using a commercial finite element analysis program, ANSYS Workbench 14.0. Using the analysis and optimization capabilities of these GUIs, a real application used in an air platform is also presented as a case study at the end of the paper.
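
The global-search half of the hybrid optimization can be illustrated on a 1-DOF stand-in for the paper's 6-DOF model: pick the isolator stiffness that minimizes transmissibility at the excitation frequency. All values and the random-search formulation below are illustrative assumptions.

```python
import math
import random

def transmissibility(freq_ratio, zeta):
    """Steady-state force transmissibility of a 1-DOF isolated mass
    (freq_ratio = excitation frequency / natural frequency)."""
    r2 = freq_ratio ** 2
    num = 1.0 + (2.0 * zeta * freq_ratio) ** 2
    den = (1.0 - r2) ** 2 + (2.0 * zeta * freq_ratio) ** 2
    return math.sqrt(num / den)

def random_search_stiffness(mass, excite_hz, k_range, zeta=0.05,
                            trials=2000, seed=1):
    """Global (random) search over isolator stiffness; a gradient step could
    then refine the best candidate, as in the hybrid method described above."""
    rng = random.Random(seed)
    best_k, best_t = None, float("inf")
    for _ in range(trials):
        k = rng.uniform(*k_range)
        wn_hz = math.sqrt(k / mass) / (2.0 * math.pi)  # natural frequency, Hz
        t = transmissibility(excite_hz / wn_hz, zeta)
        if t < best_t:
            best_k, best_t = k, t
    return best_k, best_t
```

The search naturally drives the stiffness down until the natural frequency sits well below the excitation frequency, the classical isolation condition.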

Keywords: hybrid optimization, Monte Carlo simulation, multi-degree-of-freedom system, parameter optimization, location optimization, passive vibration isolation analysis

Procedia PDF Downloads 564
3930 Investigating the Relationship Between the Auditor’s Personality Type and the Quality of Financial Reporting in Companies Listed on the Tehran Stock Exchange

Authors: Seyedmohsen Mortazavi

Abstract:

The purpose of this research is to investigate the effect of internal auditors' personality types on the quality of financial reporting in companies listed on the Tehran Stock Exchange. Personality type is one of the key topics in the study of auditors' behavior, a field that has attracted the attention of shareholders and listed companies, because an auditor's personality can affect the type and quality of financial reporting. The research is applied in terms of purpose, and descriptive and correlational in terms of method; a researcher-made questionnaire was used to test the research hypotheses. The statistical population of the research comprises all auditors, accountants and financial managers of companies listed on the Tehran Stock Exchange; owing to their large number and the uncertainty about their exact count, 384 people were taken as the statistical sample using Morgan's table. The researcher-made questionnaire was approved by experts in the field, and its validity and reliability were then assessed using software. For validity, confirmatory factor analysis was examined first, followed by divergent and convergent validity; the Fornell-Larcker criterion and the cross-loadings test confirmed the validity of the questionnaire. The reliability of the questionnaire was then examined using Cronbach's alpha and composite reliability, and the results of these two tests showed its adequate reliability. After the validity and reliability checks, PLS software was used to test the hypotheses. The results showed that the personalities of internal auditors can affect the quality of financial reporting. The personality types investigated in this research, namely neuroticism, extroversion, flexibility, agreeableness and conscientiousness, can all affect the quality of financial reporting.

Keywords: flexibility, quality of financial reporting, agreeableness, conscientiousness

Procedia PDF Downloads 100
3929 Valorization of the Waste Generated in Building Energy-Efficiency Rehabilitation Works as Raw Materials for Gypsum Composites

Authors: Paola Villoria Saez, Mercedes Del Rio Merino, Jaime Santacruz Astorqui, Cesar Porras Amores

Abstract:

In construction, the circular economy covers the whole cycle of building construction: from production and consumption to waste management and the market for secondary raw materials. The circular economy will contribute decisively to 'closing the loop' of construction product lifecycles through greater recycling and re-use, helping to build a market for reused construction materials salvaged from demolition sites, boosting global competitiveness and fostering sustainable economic growth. In this context, this paper presents the latest research of the 'Waste to Resources (W2R)' project funded by the Spanish Government, which seeks new solutions to improve energy efficiency in buildings by developing new building materials and products that are less expensive, more durable, of higher quality and more environmentally friendly. This project differs from others in that its main objective is to reduce to almost zero the construction and demolition waste (CDW) generated in building rehabilitation works. To achieve this objective, the group is looking for new ways of recycling CDW as raw material for new conglomerate materials. With these new materials, construction elements that reduce building energy consumption will be proposed. In this paper, the results obtained in the project are presented. Several tests were performed on gypsum samples containing different percentages of the CDW generated in Spanish building retrofitting works. The results were further analyzed, and one of the gypsum composites is highlighted and discussed. Acknowledgements: This research was supported by the Spanish State Secretariat for Research, Development and Innovation of the Ministry of Economy and Competitiveness under the 'Waste 2 Resources' project (BIA2013-43061-R).

Keywords: building waste, CDW, gypsum, recycling, resources

Procedia PDF Downloads 328
3928 A Tool to Represent People Approach to the Use of Pharmaceuticals and Related Criticality and Needs: A Territory Experience

Authors: Barbara Pittau, Piergiorgio Palla, Antonio Mastino

Abstract:

Communication is fundamental to health education. The proper use of medicinal products is a crucial aspect of the health of citizens that affects both safety and health care spending. Therefore, encouraging and promoting communication concerning the importance of the proper use of pharmaceuticals has substantial implications in terms of individual health, health care, and health care system sustainability. In view of these considerations, in the context of two projects, one of which is still in progress, a relational database-backed web application named COLLABORAFARMACISOLA has been designed and developed as a tool to analyze and visualize how people approach the use of medicinal products, with the aim of improving and enhancing communication efficacy. The software application is being used to collect information (anonymously and voluntarily) from the citizens of Sardinia, an Italian region, regarding their knowledge, experiences, and opinions towards pharmaceuticals. This study, conducted to date on thousands of interviewed people, has focused on different aspects, such as treatment interruption and 'self-prescription' without medical consultation, the attention paid to reading the leaflets, awareness of the economic value of pharmaceuticals, the importance of avoiding the waste of medicinal products, and attitudes towards the use of generics. To this purpose, our software application provides a set of ad hoc parsing routines to store information in a relational database and to process and visualize it through a set of interactive tools aimed at emphasizing the findings and insights obtained. The results of our preliminary analysis show the efficacy of the awareness plan and, at the same time, the criticalities and needs of the territory under examination.
The ultimate goal of our study is to provide a contribution to the community by improving communication that can result in a benefit for public health in a context strictly connected to the reality of the territory.

Keywords: communication, pharmaceuticals, public health, relational database, tool, web application

Procedia PDF Downloads 137
3927 Performance Comparison and Visualization of COMSOL Multiphysics, Matlab, and Fortran for Predicting the Reservoir Pressure on Oil Production in a Multiple Leases Reservoir with Boundary Element Method

Authors: N. Alias, W. Z. W. Muhammad, M. N. M. Ibrahim, M. Mohamed, H. F. S. Saipol, U. N. Z. Ariffin, N. A. Zakaria, M. S. Z. Suardi

Abstract:

This paper presents a performance comparison of computational software for solving problems with the boundary element method (BEM). The BEM formulation is a numerical technique with high potential for solving the advanced mathematical models that predict the production of arbitrarily shaped oil wells in a multiple-leases reservoir. The limited availability of data for validating that a program meets the accuracy of the mathematical model is the research motivation of this paper. Based on this limitation, three steps are involved in validating the accuracy of the oil production simulation. In the first step, the mathematical model, a partial differential equation (PDE) of Poisson-elliptic type, is identified and discretized with the BEM. In the second step, the 2D BEM discretization is implemented in COMSOL Multiphysics and in the MATLAB programming language. In the last step, the numerical performance indicators of both implementations are analyzed against a validated Fortran implementation. The performance comparisons are investigated in terms of percentage error, comparison graphs and 2D visualization of the reservoir pressure during oil production in the multiple-leases reservoir. According to the performance comparison, structured programming in Fortran is a suitable alternative for implementing an accurate numerical simulation of the BEM. In conclusion, the numerical computation and performance evaluation show that Fortran is well suited for capturing and visualizing the production of arbitrarily shaped oil wells.
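As a minimal illustration of the percentage-error indicator used in the comparison (the pressure values below are hypothetical, not the paper's data), one can compare a computed result against a reference solution:

```python
def percentage_error(reference, computed):
    """Relative error (%) of a computed value against a reference solution."""
    return abs(computed - reference) / abs(reference) * 100.0

# Hypothetical reservoir pressures: Fortran reference vs. a MATLAB result.
fortran_ref, matlab_val = 2500.0, 2487.5
print(percentage_error(fortran_ref, matlab_val))  # 0.5 (percent)
```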

Keywords: performance comparison, 2D visualization, COMSOL Multiphysics, MATLAB, Fortran, modelling and simulation, boundary element method, reservoir pressure

Procedia PDF Downloads 490
3926 Probabilistic Crash Prediction and Prevention of Vehicle Crash

Authors: Lavanya Annadi, Fahimeh Jafari

Abstract:

Transportation brings immense benefits to society, but it also has its costs. These include the cost of infrastructure, personnel and equipment, but also the loss of life and property in road traffic accidents, delays due to traffic congestion and various indirect costs. Much research has been done to identify the factors that affect road accidents, such as road infrastructure, traffic, sociodemographic characteristics, land use, and the environment. The aim of this research is to predict the crash probability of vehicles in the United States using machine learning, considering natural and structural causes and excluding behavioral causes such as speeding. These factors range from weather factors, such as weather conditions, precipitation, visibility, wind speed, wind direction, temperature, pressure, and humidity, to road structure factors, such as bumps, roundabouts, no-exit roads, turning loops, give-way signs, etc. Probabilities are divided into ten classes. All predictions are based on multiclass classification techniques, which are a form of supervised learning. This study considers all crashes that happened in all states, as collected by the US government. To calculate the probability, the multinomial expected value was used and assigned as the classification label. We applied three classification models: multiclass Logistic Regression, Random Forest and XGBoost. The numerical results show that XGBoost achieved a 75.2% accuracy rate, which indicates the part played by natural and structural causes in crashes. The paper also provides in-depth insights through exploratory data analysis.
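To illustrate the labeling step the abstract describes (assigning a multinomial expected value as a classification label), here is a minimal sketch; the severity outcome weights and function names are illustrative assumptions, not taken from the paper:

```python
def expected_value(probabilities, outcomes):
    """Multinomial expected value: sum of outcome * probability."""
    return sum(p * o for p, o in zip(probabilities, outcomes))

def probability_class(ev, n_classes=10, ev_max=1.0):
    """Bin an expected value in [0, ev_max] into one of n_classes labels."""
    ev = min(max(ev, 0.0), ev_max)
    return min(int(ev / ev_max * n_classes), n_classes - 1)

# One crash record: estimated probabilities over four severity outcomes.
probs = [0.70, 0.20, 0.07, 0.03]
outcomes = [0.1, 0.4, 0.7, 1.0]
ev = expected_value(probs, outcomes)
print(probability_class(ev))  # class label used as the target for the classifiers
```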

Keywords: road safety, crash prediction, exploratory analysis, machine learning

Procedia PDF Downloads 109
3925 In situ Real-Time Multivariate Analysis of Methanolysis Monitoring of Sunflower Oil Using FTIR

Authors: Pascal Mwenge, Tumisang Seodigeng

Abstract:

The combination of world population growth and the third industrial revolution has led to a high demand for fuels. On the other hand, the decrease of global fossil fuel deposits and the air pollution caused by these fuels have compounded the challenges the world faces due to its need for energy. Therefore, new forms of environmentally friendly and renewable fuels, such as biodiesel, are needed. The primary analytical techniques for monitoring methanolysis yield have been chromatography and spectroscopy; these methods have proven reliable but are demanding, costly and do not provide real-time monitoring. In this work, the in situ monitoring of biodiesel production from sunflower oil using FTIR (Fourier Transform Infrared) spectroscopy has been studied; the study was performed in an EasyMax Mettler Toledo reactor equipped with a DiComp (diamond) probe. The quantitative monitoring of methanolysis was performed by building a quantitative model with multivariate calibration using the iC Quant module of the iC IR 7.0 software. Fifteen samples of known concentrations, taken in duplicate, were used for model calibration and cross-validation; the data were pre-processed using mean centering and variance scaling, spectrum math square root and solvent subtraction. These pre-processing methods improved the performance indexes from 7.98 to 0.0096 (RMSEC), 11.2 to 3.41 (RMSECV), 6.32 to 2.72 (RMSEP) and 0.9416 to 0.9999 (R2Cum). The R2 values of 1 (training), 0.9918 (test) and 0.9946 (cross-validation) indicated the fitness of the model. The model was tested against a univariate model; small discrepancies were observed at low concentrations due to unmodelled intermediates, but the two agreed closely at concentrations above 18%. The software eliminated the complexity of Partial Least Squares (PLS) chemometrics. It was concluded that the model obtained could be used to monitor the methanolysis of sunflower oil at industrial and laboratory scale.
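The performance indexes quoted above (RMSEC, RMSECV, RMSEP, R2) follow standard definitions; a minimal sketch with hypothetical concentration data, not the study's measurements:

```python
import math

def rmse(actual, predicted):
    """Root mean square error between reference and predicted concentrations."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

def r_squared(actual, predicted):
    """Coefficient of determination of the calibration fit."""
    mean = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean) ** 2 for a in actual)
    return 1.0 - ss_res / ss_tot

# Hypothetical biodiesel concentrations (%): reference vs. model predictions.
actual = [5.0, 10.0, 15.0, 20.0, 25.0]
predicted = [5.2, 9.8, 15.1, 19.7, 25.2]
print(round(rmse(actual, predicted), 3), round(r_squared(actual, predicted), 4))
```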

Keywords: biodiesel, calibration, chemometrics, methanolysis, multivariate analysis, transesterification, FTIR

Procedia PDF Downloads 146
3924 Geospatial Technologies in Support of Civic Engagement and Cultural Heritage: Lessons Learned from Three Participatory Planning Workshops for Involving Local Communities in the Development of Sustainable Tourism Practices in Latiano, Brindisi

Authors: Mark Opmeer

Abstract:

The fruitful relationship between cultural heritage and digital technology is evident. Due to the development of user-friendly software, an increasing amount of heritage scholars use ict for their research activities. As a result, the implementation of information technology for heritage planning has become a research objective in itself. During the last decades, we have witnessed a growing debate and literature about the importance of computer technologies for the field of cultural heritage and ecotourism. Indeed, implementing digital technology in support of these domains can be very fruitful for one’s research practice. However, due to the rapid development of new software scholars may find it challenging to use these innovations in an appropriate way. As such, this contribution seeks to explore the interplay between geospatial technologies (geo-ict), civic engagement and cultural heritage and tourism. In this article, we discuss our findings on the use of geo-ict in support of civic participation, cultural heritage and sustainable tourism development in the southern Italian district of Brindisi. In the city of Latiano, three workshops were organized that involved local members of the community to distinguish and discuss interesting points of interests (POI’s) which represent the cultural significance and identity of the area. During the first workshop, a so called mappa della comunità was created on a touch table with collaborative mapping software, that allowed the participators to highlight potential destinations for tourist purposes. Furthermore, two heritage-based itineraries along a selection of identified POI’s was created to make the region attractive for recreants and tourists. These heritage-based itineraries reflect the communities’ ideas about the cultural identity of the region. 
Both trails were subsequently implemented in a dedicated mobile application (app) and was evaluated using a mixed-method approach with the members of the community during the second workshop. In the final workshop, the findings of the collaboration, the heritage trails and the app was evaluated with all participants. Based on our conclusions, we argue that geospatial technologies have a significant potential for involving local communities in heritage planning and tourism development. The participants of the workshops found it increasingly engaging to share their ideas and knowledge using the digital map of the touch table. Secondly, the use of a mobile application as instrument to test the heritage-based itineraries in the field was broadly considered as fun and beneficial for enhancing community awareness and participation in local heritage. The app furthermore stimulated the communities’ awareness of the added value of geospatial technologies for sustainable tourism development in the area. We conclude this article with a number of recommendations in order to provide a best practice for organizing heritage workshops with similar objectives.

Keywords: civic engagement, geospatial technologies, tourism development, cultural heritage

Procedia PDF Downloads 285
3923 Design of an Innovative Geothermal Heat Pump with a PCM Thermal Storage

Authors: Emanuele Bonamente, Andrea Aquino

Abstract:

This study presents an innovative design for geothermal heat pumps with the goal of maximizing the system efficiency (COP - Coefficient of Performance), reducing the soil use (e.g. length/depth of geothermal boreholes) and initial investment costs. Based on experimental data obtained from a two-year monitoring of a working prototype implemented for a commercial building in the city of Perugia, Italy, an upgrade of the system is proposed and the performance is evaluated via CFD simulations. The prototype was designed to include a thermal heat storage (i.e. water), positioned between the boreholes and the heat pump, acting as a flywheel. Results from the monitoring campaign show that the system is still capable of providing the required heating and cooling energy with a reduced geothermal installation (approx. 30% of the standard length). In this paper, an optimization of the system is proposed, re-designing the heat storage to include phase change materials (PCMs). Two stacks of PCMs, characterized by melting temperatures equal to those needed to maximize the system COP for heating and cooling, are disposed within the storage. During the working cycle, the latent heat of the PCMs is used to heat (cool) the water used by the heat pump while the boreholes independently cool (heat) the storage. The new storage is approximately 10 times smaller and can be easily placed close to the heat pump in the technical room. First, a validation of the CFD simulation of the storage is performed against experimental data. The simulation is then used to test possible alternatives of the original design and it is finally exploited to evaluate the PCM-storage performance for two different configurations (i.e. single- and double-loop systems).
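A back-of-envelope comparison makes the roughly tenfold size reduction plausible: near its melting point a PCM stores latent heat, while a water store only holds sensible heat over a small usable temperature swing. The property values below are typical textbook figures, assumed for illustration rather than taken from the paper:

```python
# Energy density of a latent (PCM) store vs. a sensible (water) store.
C_WATER = 4.186      # kJ/(kg K), specific heat of liquid water
LATENT_PCM = 200.0   # kJ/kg, typical latent heat of fusion of a paraffin PCM
DELTA_T = 5.0        # K, assumed usable temperature swing of the water store

sensible_kJ_per_kg = C_WATER * DELTA_T  # energy stored per kg of water
latent_kJ_per_kg = LATENT_PCM           # energy stored per kg of PCM at melting
ratio = latent_kJ_per_kg / sensible_kJ_per_kg
print(round(ratio, 1))  # roughly an order of magnitude smaller store
```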

Keywords: geothermal heat pump, phase change materials (PCM), energy storage, renewable energies

Procedia PDF Downloads 313
3922 Modelling of Damage as Hinges in Segmented Tunnels

Authors: Gelacio Juárez-Luna, Daniel Enrique González-Ramírez, Enrique Tenorio-Montero

Abstract:

Frame elements coupled with spring elements are used to model the development of hinges in segmented tunnels; the spring elements model rotational, transversal and axial failure. These spring elements are equipped with constitutive models that independently include the moment, shear force and axial force, respectively. The constitutive models are formulated based on damage mechanics and experimental tests reported in the literature. The mesh of the segmented tunnels was discretized in the software GiD, and the nonlinear analyses were carried out in the finite element software ANSYS. These analyses provide the capacity curves of the primary and secondary linings of a segmented tunnel. Two numerical examples of segmented tunnels show the capability of the spring elements to release energy through the development of hinges. The first example is a segmental concrete lining discretized with frame elements and loaded until hinges occurred in the lining. The second example is a tunnel with primary and secondary linings, discretized with a double-ring frame model. The outer ring simulates the segmental concrete lining and the inner ring simulates the secondary cast-in-place concrete lining. Spring elements also model the joints between the segments in the circumferential direction and the ring joints, which connect parallel adjacent rings. The computed load vs. displacement curves are consistent with numerical and experimental results reported in the literature. It is shown that modelling a tunnel with primary and secondary linings with frame elements and springs provides reasonable results and saves computational cost compared with 2D or 3D models equipped with smeared crack models.

Keywords: damage, hinges, lining, tunnel

Procedia PDF Downloads 387
3921 Study on the Integration Schemes and Performance Comparisons of Different Integrated Solar Combined Cycle-Direct Steam Generation Systems

Authors: Liqiang Duan, Ma Jingkai, Lv Zhipeng, Haifan Cai

Abstract:

The integrated solar combined cycle (ISCC) system has a series of advantages, such as increasing the system power generation, reducing the cost of solar power generation, and lower pollutant and CO2 emissions. In this paper, parabolic trough collectors with direct steam generation (DSG) technology are considered to replace the heat load of heating surfaces in the heat recovery steam generator (HRSG) of a conventional natural gas combined cycle (NGCC) system containing a PG9351FA gas turbine and a triple-pressure HRSG with reheat. The detailed model of the NGCC system is built in the ASPEN PLUS software, and the parabolic trough collectors with DSG technology are modeled in the EBSILON software. ISCC-DSG systems with the replacement of single, two, three and four heating surfaces are studied in this paper. Results show that: (1) The ISCC-DSG systems with the replacement heat loads of HPB, HPB+LPE, HPE2+HPB+HPS and HPE1+HPE2+HPB+HPS are the best integration schemes when single, two, three and four stages of heating surfaces, respectively, are partly replaced by parabolic trough solar collectors with DSG technology. (2) Both the changes of feed water flow and the heat load of the heating surfaces in ISCC-DSG systems with the replacement of multi-stage heating surfaces are smaller than those in ISCC-DSG systems with the replacement of a single heating surface. (3) ISCC-DSG systems with the replacement of the HPB+LPE heating surfaces can increase the solar power output significantly. (4) The ISCC-DSG system with the replacement of the HPB heating surface has the highest solar-thermal-to-electricity efficiency (47.45%) and solar radiation energy-to-electricity efficiency (30.37%), as well as the highest exergy efficiency of the solar field (33.61%).

Keywords: HRSG, integration scheme, parabolic trough collectors with DSG technology, solar power generation

Procedia PDF Downloads 251
3920 A Web-Based Real Property Updating System for Efficient and Sustainable Urban Development: A Case Study in Ethiopia

Authors: Eyosiyas Aga

Abstract:

The development of information and communication technology has transformed paper-based mapping and land registration processes into computerized and networked systems. The computerization and networking of real property information systems play a vital role in good governance and the sustainable development of emerging countries through cost-effective, easy and accessible service delivery for the customer. An efficient, transparent and sustainable real property system is becoming basic infrastructure for urban development; it improves the data management system and service delivery in organizations. In Ethiopia, real property administration is paper based; as a result, it is confronted with problems of data management, illegal transactions, corruption, and poor service delivery. In order to solve these problems and to facilitate the real property market, the implementation of a web-based real property updating system is crucial. Web-based real property updating is one of the automation methods that facilitate data sharing and reduce the time and cost of service delivery in a real property administration system. In addition, it is useful for the integration of data across different information systems and organizations. The system is designed by combining and customizing open source software supported by the Open Geospatial Consortium (OGC). OGC standards such as the Web Feature Service (WFS) and Web Map Service (WMS) are the most widely used standards to support and improve web-based real property updating. These services allow the integration of data from different sources and can be used to maintain the consistency of data throughout transactions. PostgreSQL and GeoServer are used to manage the real property data and connect it to the flex viewer and user interface.
The system is designed for both internal updating (by the municipality), which mainly concerns spatial and textual information, and the external system (the customer), which focuses on providing information to and interacting with the customer. This research assessed the potential of open source web applications and adopted this technology for a real property updating system in Ethiopia in a simple, cost-effective and secure way. The existing workflow for real property updating was analyzed to identify the bottlenecks, and a new workflow was designed for the system. The requirements were identified through questionnaires and a literature review, and the system was prototyped for the study area. The research mainly aimed to integrate human resources with technology in the design of the system to reduce data inconsistency and security problems. In addition, the research reflects on the current situation of real property administration and the contribution of an effective data management system to efficient, transparent and sustainable urban development in Ethiopia.
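For illustration, a typical OGC WFS GetFeature request against a GeoServer endpoint might look like the following (the host and layer name are hypothetical, not taken from the study):

```
https://example.org/geoserver/ows?service=WFS&version=2.0.0&request=GetFeature&typeNames=cadastre:parcels&outputFormat=application/json
```

Such a request returns the parcel features as GeoJSON, which a web viewer can render and which keeps the spatial data consistent across client applications.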

Keywords: cadaster, real property, sustainable, transparency, web feature service, web map service

Procedia PDF Downloads 264
3919 Study of Mobile Game Addiction Using Electroencephalography Data Analysis

Authors: Arsalan Ansari, Muhammad Dawood Idrees, Maria Hafeez

Abstract:

The use of mobile phones has increased considerably over the past decade. Currently, they are one of the main sources of communication and information. Initially, mobile phones were limited to calls and messages, but with the advent of new technology, smartphones came to be used for many other purposes, including video games. Despite positive outcomes, addiction to video games on mobile phones has become a leading cause of psychological and physiological problems for many people. Several researchers have examined different aspects of behavioral addiction with the use of different scales. The objective of this study is to examine any distinction between mobile-game-addicted and non-addicted players with the use of electroencephalography (EEG), based upon psycho-physiological indicators. The players were asked to play a mobile game while EEG signals were recorded by BIOPAC equipment with AcqKnowledge as the data acquisition software. Electrodes were placed following the 10-20 system. EEG was recorded at a sampling rate of 200 samples/sec (12,000 samples/min). EEG recordings were obtained from the frontal (Fp1, Fp2), parietal (P3, P4), and occipital (O1, O2) lobes of the brain. The frontal lobe is associated with behavioral control, personality, and emotions; the parietal lobe is involved in perception, understanding logic, and arithmetic; and the occipital lobe plays a role in visual tasks. For this study, a 60-second time window was chosen for analysis. Preliminary analysis of the signals was carried out with the AcqKnowledge software from BIOPAC Systems. From a survey based on the CGS manual study (2010), it was concluded that five of the fifteen participants were in the addicted category. This was used as prior information to separate the addicted and non-addicted groups in the physiological analysis. Statistical analysis showed that by applying a clustering analysis technique, the authors were able to categorize addicted and non-addicted players, specifically on the theta frequency range of the occipital area.
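As a hedged illustration of extracting power in the theta frequency range from an occipital channel (the study's own pipeline used AcqKnowledge; the synthetic signal and naive DFT routine below are a stdlib sketch, not the authors' code):

```python
import math

FS = 200  # sampling rate used in the study (samples/sec)

def band_power(signal, fs, f_lo, f_hi):
    """Naive DFT power summed over bins whose frequency lies in [f_lo, f_hi] Hz."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if f_lo <= freq <= f_hi:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power += (re * re + im * im) / n
    return power

# Synthetic 2-second occipital trace: a strong 6 Hz (theta) component
# plus a weaker 11 Hz (alpha) component.
n = 2 * FS
sig = [math.sin(2 * math.pi * 6 * t / FS) + 0.3 * math.sin(2 * math.pi * 11 * t / FS)
       for t in range(n)]

theta = band_power(sig, FS, 4, 8)    # theta band
alpha = band_power(sig, FS, 8, 13)   # alpha band
print(theta > alpha)  # the theta component dominates
```

Per-band power values like these, computed per participant, are the kind of feature a clustering analysis can then separate into groups.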

Keywords: mobile game, addiction, psycho-physiology, EEG analysis

Procedia PDF Downloads 164
3918 Slope Stability Analysis and Evaluation of Road Cut Slope in Case of Goro to Abagada Road, Adama

Authors: Ezedin Geta Seid

Abstract:

Slope failures are among the common geo-environmental natural hazards in the hilly and mountainous terrain of the world, causing damage to human life and destruction of infrastructure. In Ethiopia, the demand for the construction of infrastructure, especially highways and railways, has increased to connect the developmental centers. However, the failure of roadside slopes formed due to difficult geographical locations is a major obstacle to this development. As a result, a comprehensive site-specific investigation of destabilizing agents and a suitable selection of slope profiles are needed during design. Hence, this study emphasized the stability analysis and performance evaluation of slope profiles (single slope, multi-slope, and benched slope). The analysis was conducted for static and dynamic loading conditions using the limit equilibrium method (Slide software) and the finite element method (PLAXIS software). The analysis results in selected critical sections show that the slope is marginally stable, with FS varying from 1.2 to 1.5 in static conditions, and unstable, with FS below 1, in dynamic conditions. From the comparison of analysis methods, the finite element method provides more valuable information about the failure surface of a slope than limit equilibrium analysis. Performance evaluation of geometric profiles shows that geometric modification provides better and more economical slope stability. Benching provides significant stability for cut slopes (i.e., the use of 2 m and 3 m benches improves the factor of safety by 7.5% and 12%, respectively, over a single slope profile). The method is more effective on steep slopes. Similarly, the use of a multi-slope profile improves the stability of the slope in stratified soil of varied strength. The performance is more significant when it is used in combination with benches. The study also recommends drainage control and slope reinforcement as remedial measures for cut slopes.
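The stability thresholds and bench improvements reported above can be captured in a small sketch (the function names are illustrative; the FS thresholds and percentages come from the abstract):

```python
def classify_fs(fs):
    """Stability category from the factor-of-safety thresholds used above."""
    if fs < 1.0:
        return "unstable"
    if fs < 1.5:
        return "marginally stable"
    return "stable"

def benched_fs(single_slope_fs, improvement_pct):
    """Factor of safety after benching, given a percentage improvement."""
    return single_slope_fs * (1.0 + improvement_pct / 100.0)

fs_static = 1.2  # lower bound of the static FS range reported above
print(classify_fs(fs_static))                 # marginally stable
print(round(benched_fs(fs_static, 7.5), 3))   # with a 2 m bench
print(round(benched_fs(fs_static, 12.0), 3))  # with a 3 m bench
```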

Keywords: slope failure, slope profile, bench slope, multi slope

Procedia PDF Downloads 29
3917 Groundhog Day as a Model for the Repeating Spectator and the Film Academic: Re-Watching the Same Films Again Can Create Different Experiences and Ideas

Authors: Leiya Ho Yin Lee

Abstract:

Groundhog Day (Harold Ramis, 1993) may seem a fairly unremarkable Hollywood comedy of the 90s, but it is argued that the film, with its protagonist Phil (Bill Murray), inadvertently but perfectly demonstrates an important aspect of filmmaking, film spectatorship and film research: repetition. Very rarely does a narrative film use one, and only one, take in its shooting. The multiple 'repeats' of Phil's various endeavours due to his being trapped in a perpetual loop of the same day — from stealing money and tricking a woman into a casual relationship, to his multiple suicides, to eventually helping people in need — make the process of shooting multiple 'takes' in filmmaking explicit. But perhaps more significantly, Phil represents a perfect model for the spectator or cinephile who has seen their favourite film so many times that they can remember every single detail. Crucially, their favourite film never changes, as it is a recording, but the cinephile's experience of that same film is most likely different each time they watch it again, just as Phil's character and personality are completely transformed, from selfish and egotistic, to depressed and nihilistic, and ultimately to sympathetic and caring, even though he is living the exact same day. Furthermore, the author did not come up with this stimulating juxtaposition of film spectatorship and Groundhog Day the first time the author saw the film; it took a few casual re-viewings to notice the film's self-reflexivity. Then, when working on it in the author's research, the author had to re-view the film more times, and subsequently noticed even more things previously unnoticed. In this way, Groundhog Day not only stands as a model for filmmaking and film spectatorship, it also illustrates the act of academic research, especially in Film Studies, where repeatedly viewing the same films is a prerequisite before new ideas and concepts are discovered from old material.
This also recalls Deleuze's thesis on difference and repetition, in that repetition creates difference, and it is difference that creates thought.

Keywords: narrative comprehension, repeated viewing, repetition, spectatorship

Procedia PDF Downloads 318
3916 Fundamentals of Mobile Application Architecture

Authors: Mounir Filali

Abstract:

Companies use many innovative ways to reach their customers and stay ahead of the competition. Along with the growing demand for innovative business solutions comes a demand for new technology. The most noticeable area of demand for business innovation is the mobile application industry. Recently, companies have recognized the growing need to integrate proprietary mobile applications into their suite of services; they have realized that developing mobile apps gives them a competitive edge. As a result, many have begun to rapidly develop mobile apps to stay ahead of the competition. Mobile application development helps companies meet the needs of their customers. Mobile apps also help businesses take advantage of every potential opportunity to generate leads that convert into sales. With the recent rise in demand for business-related mobile apps, there has been a similar rise in the range of mobile app solutions on offer. Today, companies can use the traditional route of a software development team to build their own mobile applications. However, there are also many platform-ready "low-code and no-code" mobile app options to choose from. These development options offer more streamlined business processes and help companies be more responsive to their customers without having to be coding experts. Companies must have a basic understanding of mobile app architecture to attract and maintain the interest of mobile app users. Mobile application architecture refers to the structural systems and design elements that make up a mobile application, as well as the technologies, processes, and components used during application development. All elements of the mobile application architecture form the underlying foundation of an application; developing a good mobile app architecture requires proper planning and strategic design.
The technology framework or platform on the back end and the user-facing side of a mobile application are both part of the application's mobile architecture. In application development, software programmers loosely refer to this set of architecture systems and processes as the "technology stack."

Keywords: mobile applications, development, architecture, technology

Procedia PDF Downloads 104
3915 An Overview of Technology Availability to Support Remote Decentralized Clinical Trials

Authors: Simone Huber, Bianca Schnalzer, Baptiste Alcalde, Sten Hanke, Lampros Mpaltadoros, Thanos G. Stavropoulos, Spiros Nikolopoulos, Ioannis Kompatsiaris, Lina Pérez- Breva, Vallivana Rodrigo-Casares, Jaime Fons-Martínez, Jeroen de Bruin

Abstract:

Developing new medicines and health solutions and improving patient health currently rely on the successful execution of clinical trials, which generate relevant safety and efficacy data. For a trial's success, recruitment and retention of participants are among the most challenging aspects of protocol adherence. The main barriers include: i) lack of awareness of clinical trials; ii) long distance from the clinical site; iii) the burden on participants, including the duration and number of clinical visits; and iv) high dropout rates. Most of these aspects could be addressed with a new paradigm, namely Remote Decentralized Clinical Trials (RDCTs). Furthermore, the COVID-19 pandemic has highlighted additional advantages and challenges for RDCTs in practice, such as allowing participants to join trials from home without depending on site visits. Nevertheless, RDCTs should follow the processes and quality assurance of conventional clinical trials, which involve several steps. For each part of the trial, the Building Blocks (existing software and technologies) were assessed through a systematic search. The technology needed to perform RDCTs is widely available and validated but is still segmented and developed in silos, as different software solutions address different parts of the trial at various levels. The current paper analyzes the availability of technology to perform RDCTs, identifying gaps and providing an overview of the Basic Building Blocks and functionalities that need to be covered to support the described processes.

Keywords: architectures and frameworks for health informatics systems, clinical trials, information and communications technology, remote decentralized clinical trials, technology availability

Procedia PDF Downloads 215
3914 Multi-Label Approach to Facilitate Test Automation Based on Historical Data

Authors: Warda Khan, Remo Lachmann, Adarsh S. Garakahally

Abstract:

The increasing complexity of software and its applicability in a wide range of industries, e.g., automotive, call for enhanced quality assurance techniques. Test automation is one option to tackle the prevailing challenges by supporting test engineers with fast, parallel, and repetitive test executions. A high degree of test automation allows for a shift from mundane (manual) testing tasks to a more analytical assessment of the software under test. However, a high initial investment of test resources is required to establish test automation, which is, in most cases, a limitation given the time constraints provided for quality assurance of complex software systems. Hence, computer-aided creation of automated test cases is crucial to increase the benefit of test automation. This paper proposes the application of machine learning for the generation of automated test cases. The approach uses supervised learning to analyze test specifications and existing test implementations. The analysis facilitates the identification of patterns between test steps and their implementation with test automation components. For the test case generation, the approach exploits historical data of test automation projects. The identified patterns are the foundation for predicting the implementation of unknown test case specifications. With this support, a test engineer only has to review and parameterize the test automation components instead of writing them manually, resulting in a significant time reduction for establishing test automation. Compared to other generation approaches, this ML-based solution can handle different writing styles, authors, application domains, and even languages. Furthermore, test automation tools require expert knowledge in the form of programming skills, whereas this approach only requires historical data to generate test cases. The proposed solution is evaluated using various multi-label evaluation criteria (EC) and two small-sized real-world systems. The most prominent EC is Subset Accuracy. The promising results show an accuracy of at least 86% for test cases where a 1:1 relationship (multi-class) between test step specification and test automation component exists. For complex multi-label problems, i.e., where one test step can be implemented by several components, the prediction accuracy is still 60%, which improves on current state-of-the-art results. The prediction quality is expected to increase for larger systems with corresponding historical data. Consequently, this technique facilitates the time reduction for establishing test automation and is independent of the application domain and project. As a work in progress, the next steps are to investigate incremental and active learning as additions to increase the usability of this approach, e.g., when labelled historical data is scarce.
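The core idea of the abstract above, mapping a test step specification to a *set* of automation components learned from historical projects and scoring predictions with Subset Accuracy, can be illustrated with a minimal sketch. All names and data here are hypothetical, and a simple token-overlap nearest neighbour stands in for the paper's supervised model; it is a toy under those assumptions, not the authors' implementation:

```python
def tokens(text):
    """Crude tokenizer: lowercase whitespace split into a set."""
    return set(text.lower().split())

def predict_components(spec, history):
    """Stand-in for the trained model: return the component set of the
    historical specification with the largest token overlap."""
    best = max(history, key=lambda ex: len(tokens(spec) & tokens(ex[0])))
    return best[1]

def subset_accuracy(pairs):
    """Fraction of cases where the predicted label set matches the true
    label set exactly (the strictest multi-label criterion)."""
    return sum(pred == true for pred, true in pairs) / len(pairs)

# Historical data: (test step specification, implementing components).
history = [
    ("press the power button", {"ButtonDriver"}),
    ("read the speed signal on the CAN bus", {"CanReader"}),
    ("press the brake pedal and check the warning lamp",
     {"PedalActuator", "LampChecker"}),
]

# Unseen specifications with their ground-truth component sets.
new_specs = [
    ("press the power button twice", {"ButtonDriver"}),
    ("check the warning lamp after the brake pedal is pressed",
     {"PedalActuator", "LampChecker"}),
]

results = [(predict_components(spec, history), true)
           for spec, true in new_specs]
print(subset_accuracy(results))  # 1.0 on this toy data
```

The second case is the multi-label situation the abstract describes: one test step maps to two components, and Subset Accuracy only counts the prediction as correct if both are recovered together.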

Keywords: machine learning, multi-class, multi-label, supervised learning, test automation

Procedia PDF Downloads 131