Search results for: FELA software
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4792

3862 Integrated Gas Turbine Performance Diagnostics and Condition Monitoring Using Adaptive GPA

Authors: Yi-Guang Li, Suresh Sampath

Abstract:

Gas turbine performance degrades over time, and the degradation is greatly affected by environmental, ambient, and operating conditions. The engines may degrade slowly under favorable conditions and result in a waste of engine life if a scheduled maintenance scheme is followed. They may also degrade fast and fail before a scheduled overhaul if the conditions are unfavorable, resulting in serious secondary damage, loss of engine availability, and increased maintenance costs. To overcome these problems, gas turbine owners are gradually moving from scheduled maintenance to condition-based maintenance, where condition monitoring is one of the key supporting technologies. This paper presents an integrated adaptive GPA diagnostics and performance monitoring system developed at Cranfield University for gas turbine gas path condition monitoring. It has the capability to predict the performance degradation of major gas path components of gas turbine engines, such as compressors, combustors, and turbines, using gas path measurement data. It is also able to predict engine key performance parameters for condition monitoring, such as turbine entry temperature, that cannot be directly measured. The developed technology has been implemented into the digital twin computer software Pythia to support the condition monitoring of gas turbine engines. The capabilities of the integrated GPA condition monitoring system are demonstrated in three test cases using a model gas turbine engine similar to the GE aero-derivative LM2500 engine widely used in power generation and marine propulsion. It shows that when the compressor of the model engine degrades, the adaptive GPA is able to predict the degradation and the changing engine performance accurately using gas path measurements. The presented technology and software are generic, can be applied to different types of gas turbine engines, and provide crucial engine health and performance parameters to support condition monitoring and condition-based maintenance.
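
At its core, gas path analysis treats small deviations in gas path measurements as approximately linear functions of component health parameters and inverts that relation. Below is a minimal sketch of that estimation step, assuming a purely illustrative influence-coefficient matrix and measurement deltas; it is not the adaptive, non-linear GPA implemented in Pythia.

```python
# A minimal sketch of linear Gas Path Analysis (GPA); the influence-coefficient
# matrix and measurement deltas are illustrative placeholders only.
import numpy as np

# Hypothetical influence-coefficient matrix H relating health-parameter deltas
# (compressor flow capacity %, compressor efficiency %) to measurement deltas
# (compressor exit pressure %, exit temperature %, fuel flow %).
H = np.array([[1.10, -0.30],
              [0.25, -0.80],
              [0.15, -0.60]])

# Measurement deviations from the clean-engine baseline (illustrative values).
delta_z = np.array([-0.9, 0.4, 0.5])

# Least-squares estimate of the component degradation that explains the deltas.
delta_x, *_ = np.linalg.lstsq(H, delta_z, rcond=None)
print("Estimated flow-capacity change (%):", round(float(delta_x[0]), 3))
print("Estimated efficiency change (%):  ", round(float(delta_x[1]), 3))
```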

Keywords: gas turbine, adaptive GPA, performance, diagnostics, condition monitoring

Procedia PDF Downloads 88
3861 Development of E-Tendering Models for Nigerian Public Procuring Entities

Authors: Bello Abdullahi, Kabir Bala, Yahaya M. Ibrahim, Ahmed D. Ibrahim

Abstract:

Public sector tendering has traditionally been conducted using manual paper-based processes which are known to be inefficient, less transparent, and more prone to manipulations and errors. However, the advent of the Internet and its associated technologies has led to the development of numerous e-Tendering systems that addressed many of the problems associated with the manual paper-based tendering system. Currently, in Nigeria, the public tendering processes are largely conducted based on a manual paper-based system that is bedevilled by a number of problems such as inordinate delays, inefficiencies, manipulation of the tender evaluation process, corruption, and lack of transparency and competition, among other problems. These problems can be addressed through the adoption of existing web-based e-Tendering systems which are known to address most of these problems. However, the existing e-Tendering systems that have been developed are not based on the Nigerian legal procurement processes, and as such their suitability for local application is very limited. This paper is part of a larger study that attempts to address this problem through the development of an e-Tendering system that is based on the requirements of Nigerian public procuring entities. In this paper, the identified tendering processes commonly used by Nigerian public procuring entities in the selection of construction sources are presented. A multi-method research approach was used to identify those tendering processes. Specifically, 19 existing business use cases used by Nigerian public procuring entities were identified and 61 system use cases were prescribed based on the identified business use cases. The use cases were used as the basis for the development of domain and software conceptual models. The models were successfully used to guide the development of an e-Tendering system called NPS-eTender. Ripple and the Unified Process were adopted as the software development methodologies.

Keywords: e-tendering, e-procurement, requirement model, conceptual model, public sector tendering, public procurement

Procedia PDF Downloads 195
3860 Development of an Interactive and Robust Image Analysis and Diagnostic Tool in R for Early Detection of Cervical Cancer

Authors: Kumar Dron Shrivastav, Ankan Mukherjee Das, Arti Taneja, Harpreet Singh, Priya Ranjan, Rajiv Janardhanan

Abstract:

Cervical cancer is one of the most common cancers among women worldwide, and it can be cured if detected early. Manual pathology, which is typically utilized at present, has many limitations. The current gold standard for cervical cancer diagnosis is exhaustive and time-consuming because it relies heavily on the subjective knowledge of oncopathologists, which leads to misdiagnosis and missed diagnosis, resulting in false negatives and false positives. To reduce the time and complexities associated with early diagnosis, we require an interactive diagnostic tool for early detection, particularly in developing countries where cervical cancer incidence and related mortality are high. Incorporation of digital pathology in place of manual pathology for cervical cancer screening and diagnosis can increase the precision and strongly reduce the chances of error in a time-specific manner. Thus, we propose a robust and interactive cervical cancer image analysis and diagnostic tool, which can categorically process both histopathological and cytopathological images to identify abnormal cells in the least amount of time and in settings with minimum resources. Furthermore, incorporation of a set of specific parameters that are typically referred to for identification of abnormal cells with the help of the open-source software R is one of the major highlights of the tool. The software has the ability to automatically identify and quantify the morphological features, color intensity, sensitivity, and other parameters digitally to differentiate abnormal from normal cells, which may improve and accelerate screening and early diagnosis, ultimately leading to timely treatment of cervical cancer.
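
As an illustration of the kind of per-cell quantification described above (morphological features and colour intensity), the following sketch uses Python's scikit-image on a synthetic image; the thresholds, the flagging rule, and the use of Python rather than R are assumptions for demonstration, not the tool's actual pipeline.

```python
# A minimal sketch of per-region feature extraction on a synthetic "cytology" image.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

# Synthetic grayscale image with two bright blobs standing in for nuclei.
img = np.zeros((100, 100))
img[20:35, 20:40] = 0.8          # an enlarged, irregular "nucleus"
img[60:68, 60:68] = 0.6          # a smaller, regular "nucleus"

mask = img > threshold_otsu(img)              # segment candidate nuclei
for region in regionprops(label(mask), intensity_image=img):
    features = {
        "area": region.area,
        "eccentricity": region.eccentricity,
        "mean_intensity": region.mean_intensity,
    }
    # Hypothetical rule: flag unusually large nuclei for pathologist review.
    features["flagged"] = region.area > 200
    print(features)
```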

Keywords: cervical cancer, early detection, digital pathology, screening

Procedia PDF Downloads 178
3859 Use of Computer and Peripherals in the Archaeological Surveys of Sistan in Eastern Iran

Authors: Mahyar Mehrafarin, Reza Mehrafarin

Abstract:

The Sistan region in eastern Iran is a significant archaeological area in Iran and the Middle East, encompassing 10,000 square kilometers. Previous archaeological field surveys have identified 1662 ancient sites dating from prehistoric periods to the Islamic period. Research Aim: This article aims to explore the utilization of modern technologies and computers in archaeological field surveys in Sistan, Iran, and the benefits derived from their implementation. Methodology: The research employs a descriptive-analytical approach combined with field methods. New technologies and software, such as GPS, drones, magnetometers, equipped cameras, satellite images, and software programs like GIS, MapSource, and Excel, were utilized to collect information and analyze data. Findings: The use of modern technologies and computers in archaeological field surveys proved to be essential. Traditional archaeological activities, such as excavation and field surveys, are time-consuming and costly. Employing modern technologies helps in preserving ancient sites, accurately recording archaeological data, reducing errors and mistakes, and facilitating correct and accurate analysis. Creating a comprehensive and accessible database, generating statistics, and producing graphic designs and diagrams are additional advantages derived from the use of efficient technologies in archaeology. Theoretical Importance: The integration of computers and modern technologies in archaeology contributes to interdisciplinary collaborations and facilitates the involvement of specialists from various fields, such as geography, history, art history, anthropology, laboratory sciences, and computer engineering. The utilization of computers in archaeology spans diverse areas, including database creation, statistical analysis, graphics implementation, laboratory and engineering applications, and even artificial intelligence, which remains an unexplored area in Iranian archaeology. Data Collection and Analysis Procedures: Information was collected using modern technologies and software, capturing geographic coordinates, aerial images, archaeogeophysical data, and satellite images. These data were then input into various software programs for analysis, including GIS, MapSource, and Excel. The research employed both descriptive and analytical methods to present findings effectively. Question Addressed: The primary question addressed in this research is how the use of modern technologies and computers in archaeological field surveys in Sistan, Iran, can enhance archaeological data collection, preservation, analysis, and accessibility. Conclusion: The utilization of modern technologies and computers in archaeological field surveys in Sistan, Iran, has proven to be necessary and beneficial. These technologies aid in preserving ancient sites, accurately recording archaeological data, reducing errors, and facilitating comprehensive analysis. The creation of accessible databases, statistics generation, graphic designs, and interdisciplinary collaborations are further advantages observed. It is recommended to explore the potential of artificial intelligence in Iranian archaeology as an unexplored area. The research has implications for cultural heritage organizations, archaeology students, and universities involved in archaeological field surveys in Sistan and Baluchistan province. Additionally, it contributes to enhancing the understanding and preservation of Iran's archaeological heritage.

Keywords: archaeological surveys, computer use, Iran, modern technologies, Sistan

Procedia PDF Downloads 78
3858 Ensuring Safe Operation by Providing an End-to-End Field Monitoring and Incident Management Approach for Autonomous Vehicles Based on ML/DL SW Stack

Authors: Lucas Bublitz, Michael Herdrich

Abstract:

By achieving the first commercialization approval in San Francisco, the Autonomous Driving (AD) industry has proven the technological maturity of SAE L4 AD systems and the corresponding software and hardware stack. This milestone reflects the upcoming phase in the industry, where the focus is now on scaling and supervising larger autonomous vehicle (AV) fleets in different operation areas. This requires an operation framework, which organizes and assigns responsibilities to the relevant AV technology and operation stakeholders, from the AV system provider and the Remote Intervention Operator to the MaaS provider and the regulatory and approval authorities. This holistic operation framework consists of technological, processual, and organizational activities to ensure safe operation for fully automated vehicles. Regarding the supervision of large autonomous vehicle fleets, a major focus is on continuous field monitoring. The field monitoring approach must reflect the safety and security criticality of incidents in the field during driving operation. This includes an automatic containment approach, with the overall goal to avoid safety-critical incidents and reduce downtime caused by malfunctions of the AD software stack. An end-to-end (E2E) field monitoring approach detects critical faults in the field, uses a knowledge-based approach for evaluating the safety criticality, and supports the automatic containment of these E/E faults. Applying such an approach will ensure the scalability of AV fleets, which is determined by the handling of incidents in the field and the continuous regulatory compliance of the technology after enhancing the Operational Design Domain (ODD) or the function scope by Functions on Demand (FoD) over the entire digital product lifecycle.

Keywords: field monitoring, incident management, multicompliance management for AI in AD, root cause analysis, database approach

Procedia PDF Downloads 75
3857 i2kit: A Tool for Immutable Infrastructure Deployments

Authors: Pablo Chico De Guzman, Cesar Sanchez

Abstract:

Microservice architectures are increasingly used in distributed cloud applications due to their advantages in software composition, development speed, release cycle frequency, and business logic time to market. On the other hand, these architectures also introduce some challenges in the testing and release phases of applications. Container technology solves some of these issues by providing reproducible environments, ease of software distribution, and isolation of processes. However, there are other issues that remain unsolved in current container technology when dealing with multiple machines, such as networking for multi-host communication, service discovery, load balancing, or data persistency (even though some of these challenges are already solved by traditional cloud vendors in a very mature and widespread manner). Container cluster management tools, such as Kubernetes, Mesos or Docker Swarm, attempt to solve these problems by introducing a new control layer where the unit of deployment is the container (or the pod, a set of strongly related containers that must be deployed on the same machine). These tools are complex to configure and manage, and they do not follow a pure immutable infrastructure approach, since servers are reused between deployments. Indeed, these tools introduce dependencies at execution time for solving networking or service discovery problems. If an error on the control layer occurs, which would affect running applications, specific expertise is required to perform ad-hoc troubleshooting. As a consequence, it is not surprising that container cluster support is becoming a source of revenue for consulting services. This paper presents i2kit, a deployment tool based on the immutable infrastructure pattern, where the virtual machine is the unit of deployment. The input for i2kit is a declarative definition of a set of microservices, where each microservice is defined as a pod of containers. Microservices are built into machine images using linuxkit, a tool for creating minimal Linux distributions specialized in running containers. These machine images are then deployed to one or more virtual machines, which are exposed through a cloud vendor load balancer. Finally, the load balancer endpoint is set into other microservices using an environment variable, providing service discovery. The toolkit i2kit reuses the best ideas from container technology to solve problems like reproducible environments, process isolation, and software distribution, and at the same time relies on mature, proven cloud vendor technology for networking, load balancing, and persistency. The result is a more robust system with no learning curve for troubleshooting running applications. We have implemented an open source prototype that transforms i2kit definitions into AWS CloudFormation templates, where each microservice AMI (Amazon Machine Image) is created on the fly using linuxkit. Even though container cluster management tools have more flexibility for resource allocation optimization, we argue that adding a new control layer implies more significant disadvantages. Resource allocation is greatly improved by using linuxkit, which introduces a very small footprint (around 35MB). Also, the system is more secure since linuxkit installs the minimum set of dependencies to run containers. The toolkit i2kit is currently under development at the IMDEA Software Institute.
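
To make the workflow concrete, the sketch below turns a small declarative service definition into a minimal CloudFormation-style template with one EC2 instance per replica; the definition schema, resource properties, and AMI placeholder are hypothetical simplifications and do not reflect i2kit's actual file format or generated templates.

```python
# A minimal sketch of the "declarative definition -> CloudFormation template" idea;
# the service schema and resource properties are hypothetical, not i2kit's format.
import json

service = {                      # hypothetical microservice definition
    "name": "orders",
    "replicas": 2,
    "containers": [{"image": "registry.example.com/orders:1.4"}],
}

template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        # One immutable VM (via a prebuilt AMI) per replica; a load balancer
        # resource would normally be added in front of these instances.
        f"{service['name'].capitalize()}Instance{i}": {
            "Type": "AWS::EC2::Instance",
            "Properties": {"ImageId": "ami-PLACEHOLDER",   # AMI built with linuxkit
                           "InstanceType": "t3.micro"},
        }
        for i in range(service["replicas"])
    },
}
print(json.dumps(template, indent=2))
```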

Keywords: container, deployment, immutable infrastructure, microservice

Procedia PDF Downloads 179
3856 Techno-Economic Optimization and Evaluation of an Integrated Industrial Scale NMC811 Cathode Active Material Manufacturing Process

Authors: Usama Mohamed, Sam Booth, Aliysn J. Nedoma

Abstract:

As part of the transition to electric vehicles, there has been a recent increase in demand for battery manufacturing. Cathodes typically account for approximately 50% of the total lithium-ion battery cell cost and are a pivotal factor in determining the viability of new industrial infrastructure. Cathodes which offer lower costs whilst maintaining or increasing performance, such as nickel-rich layered cathodes, have a significant competitive advantage when scaling up the manufacturing process. This project evaluates the techno-economic value proposition of an integrated industrial scale cathode active material (CAM) production process, closing the mass and energy balances and optimizing the operating conditions using a sensitivity analysis. This is done by developing a process model of a co-precipitation synthesis route in Aspen Plus software, validated against experimental data. The mechanism chemistry and equilibrium conditions were established based on previous literature and HSC-Chemistry software. This is then followed by integrating the energy streams, adding waste recovery and treatment processes, as well as testing the effect of key parameters (temperature, pH, reaction time, etc.) on CAM production yield and emissions. Finally, an economic analysis estimates the fixed and variable costs (including capital expenditure, labor costs, raw materials, etc.) to calculate the cost of CAM ($/kg and $/kWh), total plant cost ($), and net present value (NPV). This work sets the foundational blueprint for future research into sustainable industrial scale processes for CAM manufacturing.
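
As a quick illustration of the NPV figure mentioned above, the following sketch discounts a hypothetical stream of cash flows; the capital expenditure, yearly margin, plant lifetime, and discount rate are placeholders, not the study's results.

```python
# A self-contained sketch of a net present value (NPV) calculation.
def npv(rate, cash_flows):
    """Net present value of cash flows, where cash_flows[0] occurs at year 0."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

capex = -150e6                                  # hypothetical plant investment ($)
annual_net_cash_flow = 25e6                     # hypothetical yearly margin ($)
flows = [capex] + [annual_net_cash_flow] * 10   # 10 years of operation
print(f"NPV at 8% discount rate: ${npv(0.08, flows):,.0f}")
```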

Keywords: cathodes, industrial production, nickel-rich layered cathodes, process modelling, techno-economic analysis

Procedia PDF Downloads 100
3855 Triangular Hesitant Fuzzy TOPSIS Approach in Investment Projects Management

Authors: Irina Khutsishvili

Abstract:

The presented study develops a decision support methodology for a multi-criteria group decision-making problem. The proposed methodology is based on the TOPSIS (Technique for Order Performance by Similarity to Ideal Solution) approach in the hesitant fuzzy environment. The main idea of the decision-making problem is the selection of the best alternative, or the ranking of several alternatives, from a set of feasible alternatives. Typically, the process of decision-making is based on an evaluation of certain criteria. In many MCDM problems (such as medical diagnosis, project management, business and financial management, etc.), the process of decision-making involves experts' assessments. These assessments are frequently expressed as fuzzy numbers, confidence intervals, intuitionistic fuzzy values, hesitant fuzzy elements, and so on. However, a more realistic approach is using linguistic expert assessments (linguistic variables). In the proposed methodology both the values and the weights of the criteria take the form of linguistic variables, given by all decision makers. Then, these assessments are expressed as triangular fuzzy numbers. Consequently, the proposed approach is based on a triangular hesitant fuzzy TOPSIS decision-making model. Following the TOPSIS algorithm, first, the fuzzy positive ideal solution (FPIS) and the fuzzy negative ideal solution (FNIS) are defined. Then the ranking of alternatives is performed in accordance with the proximity of their distances to both the FPIS and the FNIS. Based on the proposed approach, a software package has been developed and used to rank investment projects in a real investment decision-making problem. The application and testing of the software were carried out based on the data provided by the ‘Bank of Georgia’.
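
The ranking step can be sketched compactly: once the weighted, normalized triangular fuzzy ratings are available, the FPIS and FNIS are formed, distances are computed, and alternatives are ranked by their closeness coefficients. The sketch below uses one common variant (element-wise ideal solutions and the vertex distance) with made-up ratings; it is not the paper's hesitant fuzzy model or its data.

```python
# A compact sketch of the triangular-fuzzy TOPSIS ranking step.
import numpy as np

# Two alternatives x two criteria; each rating is a triangular fuzzy number (l, m, u)
# assumed here to be already normalized and weighted.
ratings = np.array([
    [[0.5, 0.7, 0.9], [0.3, 0.5, 0.7]],   # alternative A1
    [[0.4, 0.6, 0.8], [0.6, 0.8, 1.0]],   # alternative A2
])

fpis = ratings.max(axis=0)   # fuzzy positive ideal solution, per criterion
fnis = ratings.min(axis=0)   # fuzzy negative ideal solution, per criterion

def vertex_distance(a, b):
    """Distance between two triangular fuzzy numbers (l, m, u)."""
    return np.sqrt(np.mean((a - b) ** 2))

for i, alt in enumerate(ratings, start=1):
    d_plus = sum(vertex_distance(alt[j], fpis[j]) for j in range(alt.shape[0]))
    d_minus = sum(vertex_distance(alt[j], fnis[j]) for j in range(alt.shape[0]))
    cc = d_minus / (d_plus + d_minus)     # closeness coefficient used for ranking
    print(f"A{i}: closeness coefficient = {cc:.3f}")
```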

Keywords: fuzzy TOPSIS approach, investment project, linguistic variable, multi-criteria decision making, triangular hesitant fuzzy set

Procedia PDF Downloads 428
3854 The Effect of Foundation on the Earth Fill Dam Settlement

Authors: Masoud Ghaemi, Mohammadjafar Hedayati, Faezeh Yousefzadeh, Hoseinali Heydarzadeh

Abstract:

Careful monitoring of earth dams to measure deformation caused by settlement and movement has always been a concern for engineers in the field. In order to measure the settlement and deformation of earth dams, precision instruments combining a settlement set and an inclinometer, commonly referred to as the IS instrument, are usually used. In some dams, because the thickness of the alluvium is high and there is no possibility of alluvium removal (technically, economically, and in terms of performance), there is no possibility of placing the end of the IS instrument (the precision inclinometer-settlement set) in the rock foundation. Inevitably, engineers have to accept installing the pipes in the weak and deformable alluvial foundation, which leads to errors in the calculation of the actual settlement (absolute settlement) in different parts of the dam body. The purpose of this paper is to present new and refined criteria for predicting settlement and deformation in earth dams. The study is based on the conditions of three dams with quite deformable alluvial foundations (Agh Chai, Narmashir and Gilan-e Gharb) to provide settlement criteria affected by the alluvial foundation. To achieve this goal, the settlement of the dams was simulated by using the finite difference method with FLAC3D software, and then the modeling results were compared with the IS instrument readings. In the end, the model was calibrated and the results validated by using regression analysis techniques and by scrutinizing the modeling parameters against real conditions; then, by using MATLAB software and its Curve Fitting Toolbox, new criteria for the settlement, based on the elasticity modulus, cohesion, friction angle, and density of the earth dam and the alluvial foundation, were obtained. The results of these studies show that, by using the new criteria, the amount of settlement and deformation for dams with an alluvial foundation can be corrected after instrument readings, and the error rate in the IS instrument readings can be greatly reduced.
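
The final curve-fitting step can be illustrated in a few lines; the sketch below fits a hypothetical power-law settlement-correction factor to synthetic data with SciPy (standing in for the MATLAB Curve Fitting Toolbox), whereas the paper's criteria depend on the elasticity modulus, cohesion, friction angle, and density of both the dam and the foundation.

```python
# A minimal curve-fitting sketch; the model form and data are fabricated.
import numpy as np
from scipy.optimize import curve_fit

def settlement_model(E_alluvium, a, b):
    """Hypothetical correction: settlement ratio as a power law of foundation modulus (MPa)."""
    return a * E_alluvium ** (-b)

# Synthetic "observed" settlement ratios for different alluvial foundation moduli.
E = np.array([20.0, 40.0, 60.0, 80.0, 120.0])        # MPa
ratio = np.array([0.52, 0.31, 0.24, 0.19, 0.14])     # settlement correction factor

params, _ = curve_fit(settlement_model, E, ratio, p0=(5.0, 0.7))
print("fitted a, b:", params)
print("predicted ratio at E = 100 MPa:", settlement_model(100.0, *params))
```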

Keywords: earth-fill dam, foundation, settlement, finite difference, MATLAB, curve fitting

Procedia PDF Downloads 195
3853 Life Cycle Assessment of Rare Earth Metals Production: Hotspot Analysis of Didymium Electrolysis Process

Authors: Sandra H. Fukurozaki, Andre L. N. Silva, Joao B. F. Neto, Fernando J. G. Landgraf

Abstract:

Nowadays, the rare earth (RE) metals play an important role in emerging technologies that are crucial for the decarbonisation of the energy sector. Their unique properties have led to increasing clean energy applications, such as wind turbine generators and hybrid and electric vehicles. Despite the substantial media coverage that has recently surrounded the mining and processing of rare earth metals, very little quantitative information is available concerning their subsequent life stages, especially related to the metallic production of didymium (Nd-Pr) in a fluoride molten salt system. Here we investigate a gate-to-gate life cycle assessment (LCA) of didymium electrolysis based on three different scenarios of operational conditions. The product system is modeled with SimaPro Analyst 8.0.2 software, and IMPACT 2002+ was applied as the impact assessment tool. In order to develop the life cycle inventories, built-in software databases, patents, and other published sources, together with energy/mass balances, were utilized. The analysis indicates that, of the 14 midpoint impact categories evaluated, global warming potential (GWP) is the main contributor to the total environmental burden, ranging from 2.7E2 to 3.2E2 kg CO2eq/kg Nd-Pr. At the damage assessment step, the results suggest that slight changes in material flows associated with the enhancement of current efficiency (between 2.5% and 5%) could lead to reductions of up to 12% and 15% in human health and climate change damage, respectively. Additionally, this paper highlights the knowledge gaps and future research efforts needed to understand the environmental impacts of the Nd-Pr electrolysis process from a life cycle perspective.

Keywords: didymium electrolysis, environmental impacts, life cycle assessment, rare earth metals

Procedia PDF Downloads 186
3852 Quality Analysis of Vegetables Through Image Processing

Authors: Abdul Khalique Baloch, Ali Okatan

Abstract:

The quality analysis of food and vegetables from images is a hot topic nowadays, with researchers improving on previous findings through different techniques and methods. In this research, we have reviewed the literature, identified gaps, proposed an improved approach, designed the algorithm, and developed software to measure quality from images, where the image-based accuracy shows better results, and compared the results with previous work done so far. The application uses an open-source dataset and the Python language with the TensorFlow Lite framework. In this research, we focus on sorting food and vegetables from images; the application can sort and grade them after processing the images, producing fewer errors than human-based sorting and manual grading. Digital picture datasets were created, and the collected images were arranged by classes. The classification accuracy of the system was about 94%. As fruits and vegetables play a main role in day-to-day life, their quality is essential in evaluating agricultural produce, and customers always want to buy good-quality fruits and vegetables. This document is about quality detection of fruits and vegetables using images. Many customers suffer due to unhealthy food and vegetables from suppliers, and there is no proper quality measurement level followed by hotel management. We have developed software to measure the quality of fruits and vegetables using images; it indicates whether the fruits and vegetables are fresh or rotten. Some algorithms reviewed in this work include digital image processing, ResNet, VGG16, CNN, and transfer learning for grading and feature extraction. The application uses an open-source dataset of images and the Python language, and designs a framework for the system.
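
As an illustration of the transfer-learning setup mentioned above (VGG16 as a frozen feature extractor with a small classification head), the following sketch uses TensorFlow/Keras; the class count, input size, and training call are placeholders, since the paper's dataset and exact architecture are not reproduced here.

```python
# A minimal transfer-learning sketch with VGG16; dataset and head are illustrative.
import tensorflow as tf

base = tf.keras.applications.VGG16(weights="imagenet", include_top=False,
                                   input_shape=(224, 224, 3))
base.trainable = False                      # freeze the pretrained feature extractor

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),   # e.g. fresh vs rotten
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # datasets not shown here
```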

Keywords: deep learning, computer vision, image processing, rotten fruit detection, fruits quality criteria, vegetables quality criteria

Procedia PDF Downloads 70
3851 Automated Method Time Measurement System for Redesigning Dynamic Facility Layout

Authors: Salam Alzubaidi, G. Fantoni, F. Failli, M. Frosolini

Abstract:

The dynamic facility layout problem is a critical issue in the competitive industrial market; thus, solving this problem requires robust design and effective simulation systems. Sustainable simulation requires inputting reliable and accurate data into the system. This paper therefore describes an automated system integrated into the real environment to measure the duration of material handling operations, collect the data in real time, and determine the variances between the actual and estimated time schedules of the operations in order to update the simulation software and redesign the facility layout periodically. The automated method-time measurement system collects the real data using Radio Frequency Identification (RFID) and Internet of Things (IoT) technologies. Hence, attaching an RFID antenna reader and RFID tags enables the system to identify the location of the objects and to gather the time data. The real durations gathered are manipulated by calculating the moving average duration of the material handling operations, choosing the shortest material handling path, and then updating the simulation software to redesign the facility layout in accordance with the shortest/real operation schedule. Periodic simulation in real time is more sustainable and reliable than a simulation system relying on an analysis of historical data. The case study of this methodology is in cooperation with a workshop team producing mechanical parts. Although there are some technical limitations, this methodology is promising, and it can be significantly useful in the redesign of the manufacturing layout.
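
The duration bookkeeping can be sketched as follows: pair RFID reads at a pick antenna with the following reads at a drop antenna, convert the gaps to durations, and keep a moving average that is fed back to the simulation. The tag names, antennas, and timestamps below are fabricated for illustration.

```python
# A minimal sketch of computing moving-average handling durations from RFID reads.
import pandas as pd

reads = pd.DataFrame({
    "tag":     ["P1", "P1", "P1", "P1", "P1", "P1"],
    "antenna": ["pick", "drop", "pick", "drop", "pick", "drop"],
    "time":    pd.to_datetime(["2024-01-01 08:00", "2024-01-01 08:04",
                               "2024-01-01 09:00", "2024-01-01 09:07",
                               "2024-01-01 10:00", "2024-01-01 10:05"]),
})

# Pair each pick with the following drop to get one handling duration per trip.
picks = reads[reads["antenna"] == "pick"].reset_index(drop=True)
drops = reads[reads["antenna"] == "drop"].reset_index(drop=True)
durations_min = (drops["time"] - picks["time"]).dt.total_seconds() / 60.0

# Moving average over the last 3 trips, used to update the simulation input.
print(durations_min.rolling(window=3, min_periods=1).mean())
```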

Keywords: dynamic facility layout problem, internet of things, method time measurement, radio frequency identification, simulation

Procedia PDF Downloads 120
3850 Comparison of Different Hydrograph Routing Techniques in XPSTORM Modelling Software: A Case Study

Authors: Fatema Akram, Mohammad Golam Rasul, Mohammad Masud Kamal Khan, Md. Sharif Imam Ibne Amir

Abstract:

A variety of routing techniques are available to develop surface runoff hydrographs from rainfall. The selection of the runoff routing method is vital, as it is directly related to the type of watershed and the required degree of accuracy. Different modelling software packages are available to explore the rainfall-runoff process in urban areas. XPSTORM, a link-node based, integrated storm-water modelling software, has been used in this study for developing the surface runoff hydrograph for a golf course area located in Rockhampton in Central Queensland, Australia. Four commonly used methods, namely SWMM runoff, Kinematic wave, Laurenson, and Time-Area, are employed to generate the runoff hydrograph for the design storm of this study area. In the runoff mode of XPSTORM, the rainfall, infiltration, evaporation, and depression storage for sub-catchments were simulated, and the runoff from the sub-catchment to the collection node was calculated. The simulation results are presented, discussed, and compared. The total surface runoff generated by the SWMM runoff, Kinematic wave, and Time-Area methods is found to be reasonably close, which indicates that any of these methods can be used for developing the runoff hydrograph of the study area. The Laurenson method produces a comparatively smaller amount of surface runoff; however, it creates the highest peak of surface runoff among all methods, which may be suitable for hilly regions. Although the Laurenson hydrograph technique is a widely accepted surface runoff routing technique in Queensland (Australia), extensive investigation is recommended with detailed topographic and hydrologic data in order to assess its suitability for use in the case study area.
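
For readers unfamiliar with it, the Time-Area method routes rainfall excess by convolving it with the catchment's time-area histogram. The sketch below shows that convolution with illustrative numbers only; it is not XPSTORM's implementation or the study's design-storm data.

```python
# A minimal Time-Area routing sketch; histogram and rainfall values are illustrative.
import numpy as np

dt_min = 10                                      # time step (minutes)
area_km2 = np.array([0.2, 0.5, 0.3])             # area contributing in each travel-time band
rain_excess_mm = np.array([5.0, 12.0, 4.0])      # rainfall excess per time step

# Discharge ordinates: convolution of rainfall excess with the time-area diagram.
# 1 mm over 1 km2 = 1000 m3; divide by the time step in seconds to get m3/s.
volumes_m3 = np.convolve(rain_excess_mm, area_km2) * 1000.0
q_m3s = volumes_m3 / (dt_min * 60.0)
print(np.round(q_m3s, 2))
```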

Keywords: ARI, design storm, IFD, rainfall temporal pattern, routing techniques, surface runoff, XPSTORM

Procedia PDF Downloads 453
3849 Design and Implementation of an Affordable Electronic Medical Records in a Rural Healthcare Setting: A Qualitative Intrinsic Phenomenon Case Study

Authors: Nitika Sharma, Yogesh Jain

Abstract:

Introduction: An efficient information system helps in improving service delivery as well as providing the foundation for policy and regulation of the other building blocks of the health system. Health care organizations require an integrated working of their various sub-systems. Efficient EMR software boosts teamwork amongst the various sub-systems, thereby resulting in improved service delivery. Although there has been a huge impetus to EMR under the Digital India initiative, it has still not been mandated in India. It is generally implemented only in large, well-funded public or private healthcare organizations. Objective: The study was conducted to understand the factors that lead to the successful adoption of an affordable EMR in a low-level healthcare organization. It intended to understand the design of the EMR and address the solutions to the challenges faced in its adoption. Methodology: The study was conducted in a registered non-profit healthcare organization that has been providing healthcare facilities to more than 2500 villages, including certain areas that are difficult to access. The data were collected with the help of field notes, in-depth interviews, and participant observation. A total of 16 participants using the EMR from different departments were enrolled via a purposive sampling technique. The participants included in the study were working in the organization before the implementation of the EMR system. The study was conducted in a one-month period from 25 June to 20 July 2018. Ethical approval was obtained from the institute along with the prior approval of the participants. Data analysis: A word document of more than 4000 words was obtained after transcribing and translating the answers of the respondents. It was further analyzed by focused coding, a line-by-line review of the transcripts, underlining words, phrases, or sentences that might suggest themes, in order to do a thematic narrative analysis. Results: Based on the answers, the results were thematically grouped under four headings: 1. governance of the organization, 2. architecture and design of the software, 3. features of the software, 4. challenges faced in adoption and the solutions to address them. It was inferred that the successful implementation was attributed to the easy and comprehensive design of the system, which has facilitated not only easy data storage and retrieval but also contributed to constructing a decision support system for the staff. Portability has led to increased acceptance by physicians. The proper division of labor, increased efficiency of staff, incorporation of auto-correction features, and facilitation of task shifting have led to increased acceptance amongst the users of various departments. Geographical inhibitions, low computer literacy, and high patient load were the major challenges faced during its implementation. Despite the dual efforts made by both the architects and administrators to combat these challenges, there are still certain ongoing challenges faced by the organization. Conclusion: Whenever any new technology is adopted, there are certain innovators, early adopters, late adopters, and laggards. The same pattern was followed in the adoption of this software. The challenges were overcome with the joint efforts of the organization's administrators and users. This case study thereby provides a framework for implementing similar systems in the public sector of countries that are struggling to digitize healthcare amid a crunch of human and financial resources.

Keywords: EMR, healthcare technology, e-health, EHR

Procedia PDF Downloads 105
3848 The Impact of Green Building Envelopes on the Urban Microclimate of the Urban Canopy - Case Study: Fawzy Moaz Street, Alexandria, Egypt

Authors: Amany Haridy, Ahmed Elseragy, Fahd Omar

Abstract:

The issue of temperature increase in the urban microclimate has been at the center of attention recently, especially in dense urban areas, such as the city of Alexandria in Egypt, where building surfaces have become the dominant element (more than green areas and streets). Temperatures have been rising during daytime as well as nighttime; however, the research focused on the rise of air temperature at night, a phenomenon known as the urban heat island. This phenomenon has many effects on ecological life, as well as human health. This study provided evidence of the possibility of reducing the urban heat island by using a green building envelope (green wall and green roof) in Alexandria, Egypt. This city has witnessed a boom in growth in its urban fabric and population. A simulation analysis using the Envi-met software was performed to find the ratio of air temperature reduction. The simulation depended on the orientation of the green areas and their density, which were defined through a process of climatic analysis made with the Diva plugin using the Grasshopper software. Results showed that the reduction in air temperature varies from 0.8 to 2.0 °C, increasing with the increasing density of green areas. Many systems of green wall and green roof can be found in the local market. However, treating an existing building requires a careful choice of system to fit the building construction load and the surrounding nature. Among the systems of choice, there was the ‘geometric system’ of vertical greening that can be fixed on a light aluminum structure for walls, and the extensive green system for roofs. Finally, native plants were the best choice in the long term because they fare well in the local climate.

Keywords: envi-met, green building envelope, urban heat island, urban microclimate

Procedia PDF Downloads 208
3847 Evaluating the Understanding of the University Students (Basic Sciences and Engineering) about the Numerical Representation of the Average Rate of Change

Authors: Saeid Haghjoo, Ebrahim Reyhani, Fahimeh Kolahdouz

Abstract:

The present study aimed to evaluate the understanding of students in Tehran universities (Iran) of the numerical representation of the average rate of change, based on the Structure of Observed Learning Outcomes (SOLO) taxonomy. In the present descriptive-survey research, the statistical population included undergraduate students (basic sciences and engineering) in the universities of Tehran. The sample was 604 students selected by random multi-stage clustering. The measurement tool was a task whose face and content validity were confirmed by mathematics and mathematics education professors. Using Cronbach's alpha criterion, the reliability coefficient of the task was 0.95, which verified its reliability. The collected data were analyzed by descriptive statistics and inferential statistics (chi-squared and independent t-tests) under SPSS-24 software. According to the SOLO model, at the prestructural, unistructural, and multistructural levels, basic science students had a higher percentage of understanding than engineering students, although the outcome was reversed at the relational level. However, there was no significant difference in the average understanding of the two groups. The results indicated that students failed to have a proper understanding of the numerical representation of the average rate of change and showed misconceptions when using physics formulas to solve the problem. In addition, multiple solutions were derived along with their dominant methods during the qualitative analysis. The current research proposes focusing on context problems with approximate calculations and numerical representation, using software, and connecting common relations between mathematics and physics in the teaching process of teachers and professors.
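
The inferential tests named above (chi-squared and independent t-tests) can be reproduced with standard libraries; the sketch below uses SciPy with fabricated counts and scores, not the study's SPSS output for the 604 students.

```python
# A minimal sketch of a chi-squared test of independence and an independent t-test.
import numpy as np
from scipy import stats

# Hypothetical contingency table: SOLO level (rows) by group (basic science, engineering).
table = np.array([[40, 55],     # prestructural
                  [120, 140],   # uni/multistructural
                  [60, 45]])    # relational
chi2, p_chi, dof, _ = stats.chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p_chi:.3f}, dof = {dof}")

# Hypothetical understanding scores for the two groups.
basic = np.random.default_rng(0).normal(12.1, 3.0, 300)
engineering = np.random.default_rng(1).normal(11.8, 3.2, 304)
t, p_t = stats.ttest_ind(basic, engineering)
print(f"t = {t:.2f}, p = {p_t:.3f}")
```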

Keywords: average rate of change, context problems, derivative, numerical representation, SOLO taxonomy

Procedia PDF Downloads 92
3846 Passive Vibration Isolation Analysis and Optimization for Mechanical Systems

Authors: Ozan Yavuz Baytemir, Ender Cigeroglu, Gokhan Osman Ozgen

Abstract:

Vibration is an important issue in the design of various components of aerospace, marine, and vehicular applications. In order not to lose the components' function and operational performance, vibration isolation design, involving the selection of optimum isolator properties and isolator positioning, appears to be a critical study. Knowing the growing need for vibration isolation system design, this paper aims to present two types of software capable of implementing modal analysis, response analysis for both random and harmonic types of excitation, static deflection analysis, and Monte Carlo simulations, in addition to parameter and location optimization studies for different types of isolation problem scenarios. Investigating the literature, there is no study developing a software-based tool that is capable of implementing all those analysis, simulation, and optimization studies in one platform simultaneously. In this paper, the theoretical system model is generated for a 6-DOF rigid body. The vibration isolation system of any mechanical structure can be optimized using a hybrid method involving both global search and gradient-based methods. Defining the optimization design variables, different types of optimization scenarios are listed in detail. Being aware of the need for a user-friendly vibration isolation problem solver, two types of graphical user interfaces (GUIs) are prepared and verified using a commercial finite element analysis program, Ansys Workbench 14.0. Using the analysis and optimization capabilities of those GUIs, a real application used in an air platform is also presented as a case study at the end of the paper.
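
As a toy version of the Monte Carlo capability described above, the sketch below samples isolator stiffness and damping from assumed tolerance ranges and evaluates the transmissibility of a single-DOF isolator; the paper's tools work with a full 6-DOF rigid-body model, so this only illustrates the idea.

```python
# A minimal Monte Carlo sketch for a linear single-DOF isolator; all values are assumed.
import numpy as np

rng = np.random.default_rng(42)
m = 50.0                       # isolated mass (kg)
f_exc = 60.0                   # excitation frequency (Hz)
w = 2 * np.pi * f_exc

# Sample isolator stiffness (N/m) and damping ratio from assumed tolerance ranges.
k = rng.uniform(4e4, 6e4, 10000)
zeta = rng.uniform(0.02, 0.10, 10000)

wn = np.sqrt(k / m)
r = w / wn
# Absolute transmissibility of a linear single-DOF isolator.
T = np.sqrt((1 + (2 * zeta * r) ** 2) / ((1 - r**2) ** 2 + (2 * zeta * r) ** 2))

print(f"median transmissibility: {np.median(T):.4f}")
print(f"95th percentile:         {np.percentile(T, 95):.4f}")
```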

Keywords: hybrid optimization, Monte Carlo simulation, multi-degree-of-freedom system, parameter optimization, location optimization, passive vibration isolation analysis

Procedia PDF Downloads 565
3845 Investigating the Relationship Between the Auditor’s Personality Type and the Quality of Financial Reporting in Companies Listed on the Tehran Stock Exchange

Authors: Seyedmohsen Mortazavi

Abstract:

The purpose of this research is to investigate the effect of the personality types of internal auditors on the quality of financial reporting in companies listed on the Tehran Stock Exchange. Personality type is one of the issues emphasized in the field of auditors' behavior, and this field has attracted the attention of shareholders and listed companies today, because auditors' personalities can affect the type of financial reporting and its quality. The research is applied in terms of purpose and descriptive and correlational in terms of method, and a researcher-made questionnaire was used to check the research hypotheses. The statistical population of the research is all the auditors, accountants, and financial managers of the companies listed on the Tehran Stock Exchange; due to their large number and the uncertainty about their exact number, 384 people were considered as the statistical sample using Morgan's table. The researcher-made questionnaire was approved by experts in the field, and then its validity and reliability were assessed using software. For the validity of the questionnaire, confirmatory factor analysis was first examined; then, using divergent and convergent validity (the Fornell-Larcker criterion and cross-loading tests), the validity of the questionnaire was confirmed. Next, the reliability of the questionnaire was examined using Cronbach's alpha and composite reliability, and the results of these two tests showed the appropriate reliability of the questionnaire. After checking validity and reliability, PLS software was used to test the research hypotheses. The results of the research showed that the personalities of internal auditors can affect the quality of financial reporting. The personality traits investigated in this research include neuroticism, extroversion, flexibility, agreeableness, and conscientiousness, and all of these can affect the quality of financial reporting.
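
Cronbach's alpha, one of the reliability measures mentioned, is straightforward to compute from an item-score matrix. The sketch below shows the standard formula on fabricated Likert-style responses; it is not the study's questionnaire data.

```python
# A minimal Cronbach's alpha sketch on made-up five-item Likert responses.
import numpy as np

def cronbach_alpha(items):
    """items: respondents x items matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

responses = np.array([
    [4, 5, 4, 4, 5],
    [3, 3, 4, 3, 3],
    [5, 5, 5, 4, 5],
    [2, 3, 2, 2, 3],
    [4, 4, 5, 4, 4],
])
print(f"Cronbach's alpha = {cronbach_alpha(responses):.3f}")
```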

Keywords: flexibility, quality of financial reporting, agreeableness, conscientiousness

Procedia PDF Downloads 102
3844 A Tool to Represent People's Approach to the Use of Pharmaceuticals and Related Criticality and Needs: A Territory Experience

Authors: Barbara Pittau, Piergiorgio Palla, Antonio Mastino

Abstract:

Communication is fundamental to health education. The proper use of medicinal products is a crucial aspect of citizens' health that affects both safety and health care spending. Therefore, encouraging and promoting communication concerning the importance of the proper use of pharmaceuticals has substantial implications in terms of individual health, health care, and health care system sustainability. In view of these considerations, in the context of two projects, one of which is still in progress, a relational database-backed web application named COLLABORAFARMACISOLA has been designed and developed as a tool to analyze and visualize how people approach the use of medicinal products, with the aim of improving and enhancing communication efficacy. The software application is being used to collect information (anonymously and voluntarily) from the citizens of Sardinia, an Italian region, regarding their knowledge, experiences, and opinions towards pharmaceuticals. This study, conducted to date on thousands of interviewed people, has focused on different aspects such as treatment interruption and ‘self-prescription’ without medical consultation, the attention paid to reading the leaflets, the awareness of the economic value of pharmaceuticals, the importance of avoiding the waste of medicinal products, and attitudes towards the use of generics. To this purpose, our software application provides a set of ad hoc parsing routines to store information in the structure of a relational database and to process and visualize it through a set of interactive tools aimed at emphasizing the findings and the insights obtained. The results of our preliminary analysis show the efficacy of the awareness plan and, at the same time, the criticality and the needs of the territory under examination. The ultimate goal of our study is to provide a contribution to the community by improving communication, which can result in a benefit for public health in a context strictly connected to the reality of the territory.

Keywords: communication, pharmaceuticals, public health, relational database, tool, web application

Procedia PDF Downloads 137
3843 Performance Comparison and Visualization of COMSOL Multiphysics, Matlab, and Fortran for Predicting the Reservoir Pressure on Oil Production in a Multiple Leases Reservoir with Boundary Element Method

Authors: N. Alias, W. Z. W. Muhammad, M. N. M. Ibrahim, M. Mohamed, H. F. S. Saipol, U. N. Z. Ariffin, N. A. Zakaria, M. S. Z. Suardi

Abstract:

This paper presents a performance comparison of several computational software packages for solving problems with the boundary element method (BEM). The BEM formulation is a numerical technique with high potential for solving the advanced mathematical models that predict the production of oil wells in arbitrarily shaped, multiple-lease reservoirs. The limitation of data validation for ensuring that a program meets the accuracy of the mathematical modeling is considered the research motivation of this paper. Thus, based on this limitation, three steps are involved in validating the accuracy of the oil production simulation process. In the first step, the mathematical model, based on a partial differential equation (PDE) of Poisson-elliptic type, is identified in order to perform the BEM discretization. In the second step, the simulation of the 2D BEM discretization is implemented using COMSOL Multiphysics and the MATLAB programming language. In the last step, the numerical performance indicators for both implementations are analyzed by using a Fortran implementation for validation. The performance comparisons of the numerical analysis are investigated in terms of percentage error, comparison graphs, and 2D visualization of the pressure on the oil production of the multiple-lease reservoir. According to the performance comparison, structured programming in Fortran is the alternative software approach for implementing an accurate numerical simulation of the BEM. In conclusion, the high-level numerical computation and the numerical performance evaluation are satisfied, proving that Fortran is well suited for capturing the visualization of the production of oil wells in arbitrarily shaped reservoirs.

Keywords: performance comparison, 2D visualization, COMSOL multiphysic, MATLAB, Fortran, modelling and simulation, boundary element method, reservoir pressure

Procedia PDF Downloads 491
3842 In situ Real-Time Multivariate Analysis of Methanolysis Monitoring of Sunflower Oil Using FTIR

Authors: Pascal Mwenge, Tumisang Seodigeng

Abstract:

The combination of world population growth and the third industrial revolution has led to a high demand for fuels. On the other hand, the decrease of global fossil fuel deposits and the environmental air pollution caused by these fuels have compounded the challenges the world faces due to its need for energy. Therefore, new forms of environmentally friendly and renewable fuels such as biodiesel are needed. The primary analytical techniques for methanolysis yield monitoring have been chromatography and spectroscopy; these methods have been proven reliable but are demanding, costly, and do not provide real-time monitoring. In this work, the in situ monitoring of biodiesel from sunflower oil using FTIR (Fourier Transform Infrared) has been studied; the study was performed using an EasyMax Mettler Toledo reactor equipped with a DiComp (diamond) probe. The quantitative monitoring of methanolysis was performed by building a quantitative model with multivariate calibration using the iC Quant module from the iC IR 7.0 software. Fifteen samples of known concentrations, taken in duplicate for model calibration and cross-validation, were used for the modelling; the data were pre-processed using mean centering and variance scaling, spectrum math square root, and solvent subtraction. These pre-processing methods improved the performance indexes from 7.98 to 0.0096, 11.2 to 3.41, 6.32 to 2.72, and 0.9416 to 0.9999 for RMSEC, RMSECV, RMSEP, and cumulative R², respectively. The R² values of 1 (training), 0.9918 (test), and 0.9946 (cross-validation) indicated the fitness of the model built. The model was tested against a univariate model; small discrepancies were observed at low concentrations due to unmodelled intermediates, but the results were quite close at concentrations above 18%. The software eliminated the complexity of the Partial Least Squares (PLS) chemometrics. It was concluded that the model obtained could be used to monitor the methanolysis of sunflower oil at industrial and lab scale.
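
The calibration/validation workflow (a PLS model, RMSEC on the calibration set, RMSEP on a held-out set) can be sketched with scikit-learn on synthetic spectra, as below; the spectra, split, and number of latent variables are assumptions, not the iC Quant model reported above.

```python
# A minimal PLS calibration sketch on synthetic "spectra"; data are fabricated.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
concentration = rng.uniform(0, 30, 60)                       # e.g. % methyl ester
spectra = np.outer(concentration, rng.normal(1, 0.05, 200))  # fake absorbance bands
spectra += rng.normal(0, 0.5, spectra.shape)                 # measurement noise

X_cal, X_test, y_cal, y_test = train_test_split(spectra, concentration,
                                                test_size=0.3, random_state=1)
pls = PLSRegression(n_components=3).fit(X_cal, y_cal)

rmsec = mean_squared_error(y_cal, pls.predict(X_cal).ravel()) ** 0.5
rmsep = mean_squared_error(y_test, pls.predict(X_test).ravel()) ** 0.5
print(f"RMSEC = {rmsec:.3f}, RMSEP = {rmsep:.3f}")
```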

Keywords: biodiesel, calibration, chemometrics, methanolysis, multivariate analysis, transesterification, FTIR

Procedia PDF Downloads 148
3841 Geospatial Technologies in Support of Civic Engagement and Cultural Heritage: Lessons Learned from Three Participatory Planning Workshops for Involving Local Communities in the Development of Sustainable Tourism Practices in Latiano, Brindisi

Authors: Mark Opmeer

Abstract:

The fruitful relationship between cultural heritage and digital technology is evident. Due to the development of user-friendly software, an increasing number of heritage scholars use ICT for their research activities. As a result, the implementation of information technology for heritage planning has become a research objective in itself. During the last decades, we have witnessed a growing debate and literature about the importance of computer technologies for the field of cultural heritage and ecotourism. Indeed, implementing digital technology in support of these domains can be very fruitful for one's research practice. However, due to the rapid development of new software, scholars may find it challenging to use these innovations in an appropriate way. As such, this contribution seeks to explore the interplay between geospatial technologies (geo-ICT), civic engagement, and cultural heritage and tourism. In this article, we discuss our findings on the use of geo-ICT in support of civic participation, cultural heritage, and sustainable tourism development in the southern Italian district of Brindisi. In the city of Latiano, three workshops were organized that involved local members of the community in distinguishing and discussing interesting points of interest (POIs) which represent the cultural significance and identity of the area. During the first workshop, a so-called mappa della comunità was created on a touch table with collaborative mapping software, which allowed the participants to highlight potential destinations for tourist purposes. Furthermore, two heritage-based itineraries along a selection of identified POIs were created to make the region attractive for recreationists and tourists. These heritage-based itineraries reflect the community's ideas about the cultural identity of the region. Both trails were subsequently implemented in a dedicated mobile application (app) and evaluated using a mixed-methods approach with the members of the community during the second workshop. In the final workshop, the findings of the collaboration, the heritage trails, and the app were evaluated with all participants. Based on our conclusions, we argue that geospatial technologies have a significant potential for involving local communities in heritage planning and tourism development. The participants of the workshops found it increasingly engaging to share their ideas and knowledge using the digital map on the touch table. Secondly, the use of a mobile application as an instrument to test the heritage-based itineraries in the field was broadly considered fun and beneficial for enhancing community awareness of and participation in local heritage. The app furthermore stimulated the community's awareness of the added value of geospatial technologies for sustainable tourism development in the area. We conclude this article with a number of recommendations in order to provide a best practice for organizing heritage workshops with similar objectives.

Keywords: civic engagement, geospatial technologies, tourism development, cultural heritage

Procedia PDF Downloads 287
3840 Modelling of Damage as Hinges in Segmented Tunnels

Authors: Gelacio JuáRez-Luna, Daniel Enrique GonzáLez-RamíRez, Enrique Tenorio-Montero

Abstract:

Frame elements coupled with spring elements are used for modelling the development of hinges in segmented tunnels; the spring elements model rotational, transversal, and axial failure. These spring elements are equipped with constitutive models to include independently the moment, shear force, and axial force, respectively. These constitutive models are formulated based on damage mechanics and experimental tests reported in the literature. The mesh of the segmented tunnels was discretized in the software GiD, and the nonlinear analyses were carried out in the finite element software ANSYS. These analyses provide the capacity curves of the primary and secondary linings of a segmented tunnel. Two numerical examples of segmented tunnels show the capability of the spring elements to release energy by the development of hinges. The first example is a segmental concrete lining discretized with frame elements and loaded until hinges occurred in the lining. The second example is a tunnel with primary and secondary linings, discretized with a double-ring frame model. The outer ring simulates the segmental concrete lining and the inner ring simulates the secondary cast-in-place concrete lining. Spring elements also modelled the joints between the segments in the circumferential direction and the ring joints, which connect parallel adjacent rings. The computed load vs. displacement curves are congruent with numerical and experimental results reported in the literature. It is shown that the modelling of a tunnel with primary and secondary linings with frame elements and springs provides reasonable results and saves computational cost compared with 2D or 3D models equipped with smeared crack models.

Keywords: damage, hinges, lining, tunnel

Procedia PDF Downloads 390
3839 Study on the Integration Schemes and Performance Comparisons of Different Integrated Solar Combined Cycle-Direct Steam Generation Systems

Authors: Liqiang Duan, Ma Jingkai, Lv Zhipeng, Haifan Cai

Abstract:

The integrated solar combined cycle (ISCC) system has a series of advantages, such as increased system power generation, a reduced cost of solar power generation, and lower pollutant and CO2 emissions. In this paper, parabolic trough collectors with direct steam generation (DSG) technology are considered to replace the heat load of heating surfaces in the heat recovery steam generator (HRSG) of a conventional natural gas combined cycle (NGCC) system containing a PG9351FA gas turbine and a triple-pressure HRSG with reheat. The detailed model of the NGCC system is built in ASPEN PLUS software, and the parabolic trough collectors with DSG technology are modeled in EBSILON software. ISCC-DSG systems with the replacement of one, two, three, and four heating surfaces are studied in this paper. Results show that: (1) the ISCC-DSG systems with the replacement heat loads of HPB, HPB+LPE, HPE2+HPB+HPS, and HPE1+HPE2+HPB+HPS are the best integration schemes when one, two, three, and four stages of heating surfaces are partly replaced by the parabolic trough solar energy collectors with DSG technology. (2) Both the changes in feed water flow and the heat load of the heating surfaces in ISCC-DSG systems with the replacement of multi-stage heating surfaces are smaller than those in ISCC-DSG systems with the replacement of a single heating surface. (3) ISCC-DSG systems with the replacement of HPB+LPE heating surfaces can increase the solar power output significantly. (4) The ISCC-DSG system with the replacement of the HPB heating surface has the highest solar-thermal-to-electricity efficiency (47.45%) and solar radiation energy-to-electricity efficiency (30.37%), as well as the highest exergy efficiency of the solar field (33.61%).

Keywords: HRSG, integration scheme, parabolic trough collectors with DSG technology, solar power generation

Procedia PDF Downloads 253
3838 A Web-Based Real Property Updating System for Efficient and Sustainable Urban Development: A Case Study in Ethiopia

Authors: Eyosiyas Aga

Abstract:

The development of information and communication technology has transformed paper-based mapping and land registration processes into computerized and networked systems. The computerization and networking of real property information systems play a vital role in good governance and sustainable development of emerging countries through cost-effective, easy, and accessible service delivery for the customer. An efficient, transparent, and sustainable real property system is becoming basic infrastructure for urban development, improving data management and service delivery in organizations. In Ethiopia, real property administration is paper based; as a result, it is confronted with problems of data management, illegal transactions, corruption, and poor service delivery. In order to solve these problems and to facilitate the real property market, the implementation of a web-based real property updating system is crucial. Web-based real property updating is one of the automation (computerization) methods to facilitate data sharing and reduce the time and cost of service delivery in a real property administration system. In addition, it is useful for the integration of data into different information systems and organizations. The system is designed by combining and customizing open source software supported by Open Geospatial Consortium (OGC) standards. OGC standards such as the Web Feature Service and Web Map Service are the most widely used standards to support and improve web-based real property updating; they allow the integration of data from different sources and can be used to maintain consistency of data throughout transactions. PostgreSQL and GeoServer are used to manage real property data and connect it to the flex viewer and user interface. The system is designed for both the internal updating system (municipality), which mainly updates spatial and textual information, and the external system (customer), which focuses on providing information to and interacting with the customer. This research assessed the potential of open source web applications and adopted this technology for a real property updating system in Ethiopia in a simple, cost-effective, and secure way. The existing workflow for real property updating was analyzed to identify bottlenecks, and a new workflow was designed for the system. The requirements were identified through a questionnaire and literature review, and the system was prototyped for the study area. The research mainly aimed to integrate human resources with technology in designing the system to reduce data inconsistency and security problems. In addition, the research reflects on the current situation of real property administration and the contribution of an effective data management system to efficient, transparent, and sustainable urban development in Ethiopia.
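To illustrate how a client would consume the Web Feature Service exposed by such a GeoServer/PostgreSQL stack, the following Python sketch issues a standard WFS GetFeature request and reads back GeoJSON. The endpoint URL, workspace, layer name, and attribute names are hypothetical placeholders, not values from the described deployment.

```python
# Minimal sketch of pulling parcel features over OGC WFS from a GeoServer
# instance. URL, "cadastre:parcels" layer, and attribute names are assumed.
import requests

WFS_URL = "http://localhost:8080/geoserver/wfs"  # hypothetical GeoServer endpoint

params = {
    "service": "WFS",
    "version": "1.1.0",
    "request": "GetFeature",
    "typeName": "cadastre:parcels",        # hypothetical workspace:layer
    "outputFormat": "application/json",     # GeoJSON output supported by GeoServer
    "maxFeatures": 10,
}

resp = requests.get(WFS_URL, params=params, timeout=30)
resp.raise_for_status()
for feature in resp.json().get("features", []):
    props = feature.get("properties", {})
    print(props.get("parcel_id"), props.get("owner"))  # hypothetical attributes
```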

Keywords: cadaster, real property, sustainable, transparency, web feature service, web map service

Procedia PDF Downloads 267
3837 Study of Mobile Game Addiction Using Electroencephalography Data Analysis

Authors: Arsalan Ansari, Muhammad Dawood Idrees, Maria Hafeez

Abstract:

Use of mobile phones has increased considerably over the past decade. Currently, they are one of the main sources of communication and information. Initially, mobile phones were limited to calls and messages, but with the advent of new technology, smartphones began to be used for many other purposes, including video games. Despite positive outcomes, addiction to video games on mobile phones has become a leading cause of psychological and physiological problems for many people. Several researchers have examined different aspects of behavioral addiction with the use of different scales. The objective of this study is to examine whether addicted and non-addicted mobile game players can be distinguished using electroencephalography (EEG), based upon psycho-physiological indicators. The mobile players were asked to play a mobile game, and EEG signals were recorded with BIOPAC equipment using AcqKnowledge as the data acquisition software. Electrodes were placed following the 10-20 system. EEG was recorded at a sampling rate of 200 samples/s (12,000 samples/min). EEG recordings were obtained from the frontal (Fp1, Fp2), parietal (P3, P4), and occipital (O1, O2) lobes of the brain. The frontal lobe is associated with behavioral control, personality, and emotions; the parietal lobe is involved in perception, understanding logic, and arithmetic; and the occipital lobe plays a role in visual tasks. For this study, a 60-second time window was chosen for analysis. Preliminary analysis of the signals was carried out with the AcqKnowledge software of BIOPAC Systems. From a survey based on the CGS manual study of 2010, it was concluded that five participants out of fifteen were in the addicted category. This was used as prior information to group the addicted and non-addicted players by physiological analysis. Statistical analysis showed that, by applying a clustering technique, the authors were able to distinguish the addicted from the non-addicted players, specifically in the theta frequency range of the occipital area.
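The kind of analysis described, extracting theta-band power from the occipital channels and clustering participants into two groups, can be sketched in a few lines of Python. This is an illustrative pipeline, not the authors' exact processing chain; the synthetic data, channel ordering, and parameter choices are assumptions.

```python
# Illustrative sketch: theta-band (4-8 Hz) power from occipital channels O1/O2,
# then k-means clustering of the 15 players into two candidate groups.
import numpy as np
from scipy.signal import welch
from sklearn.cluster import KMeans

FS = 200  # sampling rate used in the study, samples/s

def theta_power(signal, fs=FS, band=(4.0, 8.0)):
    """Mean power spectral density in the theta band for one channel."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

# eeg[player, channel, sample]: 15 players, channels O1 and O2, 60 s window
# (synthetic noise here, standing in for the recorded signals)
rng = np.random.default_rng(0)
eeg = rng.standard_normal((15, 2, 60 * FS))

features = np.array([[theta_power(p[0]), theta_power(p[1])] for p in eeg])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print(labels)  # 0/1 cluster assignments: candidate addicted vs. non-addicted groups
```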

Keywords: mobile game, addiction, psycho-physiology, EEG analysis

Procedia PDF Downloads 164
3836 Slope Stability Analysis and Evaluation of Road Cut Slope in Case of Goro to Abagada Road, Adama

Authors: Ezedin Geta Seid

Abstract:

Slope failures are among the common geo-environmental natural hazards in the hilly and mountainous terrains of the world, causing loss of human life and destruction of infrastructure. In Ethiopia, the demand for the construction of infrastructure, especially highways and railways, has increased in order to connect developmental centers. However, the failure of roadside cut slopes, formed where difficult geographical terrain must be crossed, is a major obstacle to this development. As a result, a comprehensive site-specific investigation of destabilizing agents and a suitable selection of slope profiles are needed during design. Hence, this study emphasizes the stability analysis and performance evaluation of slope profiles (single slope, multi-slope, and benched slope). The analysis was conducted for static and dynamic loading conditions using the limit equilibrium method (Slide software) and the finite element method (Plaxis software). The analysis results at selected critical sections show that the slope is marginally stable, with FS varying from 1.2 to 1.5 under static conditions, and unstable, with FS below 1, under dynamic conditions. From the comparison of analysis methods, the finite element method provides more valuable information about the failure surface of a slope than limit equilibrium analysis. Performance evaluation of geometric profiles shows that geometric modification provides better and more economical slope stability. Benching provides significant stability for cut slopes (i.e., the use of 2 m and 3 m benches improves the factor of safety by 7.5% and 12%, respectively, relative to a single slope profile). The method is more effective on steep slopes. Similarly, the use of a multi-slope profile improves the stability of the slope in stratified soil with varied strength; the performance is more significant when it is used in combination with benches. The study also recommends drainage control and slope reinforcement as remedial measures for cut slopes.
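A back-of-the-envelope way to see why benching helps is that it reduces the effective slope angle, which raises the factor of safety. The sketch below uses the classical infinite-slope formula for a dry planar slide; the soil parameters are assumed for illustration and are not the values from the Goro to Abagada cut.

```python
# Illustrative sketch (infinite-slope model, dry case): flatter effective slope
# angle -> higher factor of safety. Soil parameters below are assumed.
import math

def infinite_slope_fs(c_kPa, phi_deg, gamma_kN_m3, depth_m, beta_deg):
    """Factor of safety of a planar slide parallel to the slope surface."""
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    driving = gamma_kN_m3 * depth_m * math.sin(beta) * math.cos(beta)
    resisting = c_kPa + gamma_kN_m3 * depth_m * math.cos(beta) ** 2 * math.tan(phi)
    return resisting / driving

for angle in (60, 55, 50):  # a flatter effective angle mimics a benched profile
    fs = infinite_slope_fs(c_kPa=15.0, phi_deg=30.0, gamma_kN_m3=18.0,
                           depth_m=5.0, beta_deg=angle)
    print(f"effective slope angle {angle} deg -> FS = {fs:.2f}")
```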

Keywords: slope failure, slope profile, bench slope, multi slope

Procedia PDF Downloads 31
3835 Fundamentals of Mobile Application Architecture

Authors: Mounir Filali

Abstract:

Companies use many innovative ways to reach their customers and stay ahead of the competition. Along with the growing demand for innovative business solutions comes a demand for new technology. The most noticeable area of demand for business innovation is the mobile application industry. Recently, companies have recognized the growing need to integrate proprietary mobile applications into their suite of services, realizing that developing mobile apps gives them a competitive edge; as a result, many have begun to rapidly develop mobile apps to stay ahead of the competition. Mobile application development helps companies meet the needs of their customers. Mobile apps also help businesses take advantage of every potential opportunity to generate leads that convert into sales. Download statistics show that, with the recent rise in demand for business-related mobile apps, there has been a similar rise in the range of mobile app solutions on offer. Today, companies can take the traditional route of a software development team to build their own mobile applications. However, there are also many platform-ready "low-code and no-code" mobile app options to choose from. These development options offer more streamlined business processes and help companies be more responsive to their customers without having to be coding experts. Companies must have a basic understanding of mobile app architecture to attract and maintain the interest of mobile app users. Mobile application architecture refers to the structural systems and design elements that make up a mobile application; it also includes the technologies, processes, and components used during application development. All elements of the mobile application architecture form the underlying foundation of an application, and developing a good mobile app architecture requires proper planning and strategic design. The technology framework or platform on the back end and on the user-facing side of a mobile application is part of its mobile architecture. Software programmers loosely refer to this set of mobile architecture systems and processes as the "technology stack."

Keywords: mobile applications, development, architecture, technology

Procedia PDF Downloads 105
3834 An Overview of Technology Availability to Support Remote Decentralized Clinical Trials

Authors: Simone Huber, Bianca Schnalzer, Baptiste Alcalde, Sten Hanke, Lampros Mpaltadoros, Thanos G. Stavropoulos, Spiros Nikolopoulos, Ioannis Kompatsiaris, Lina Pérez-Breva, Vallivana Rodrigo-Casares, Jaime Fons-Martínez, Jeroen de Bruin

Abstract:

Developing new medicines and health solutions and improving patient health currently rely on the successful execution of clinical trials, which generate relevant safety and efficacy data. For their success, recruitment and retention of participants are among the most challenging aspects of protocol adherence. Main barriers include: i) lack of awareness of clinical trials; ii) long distance from the clinical site; iii) the burden on participants, including the duration and number of clinical visits; and iv) high dropout rates. Most of these aspects could be addressed with a new paradigm, namely Remote Decentralized Clinical Trials (RDCTs). Furthermore, the COVID-19 pandemic has highlighted additional advantages and challenges for RDCTs in practice, such as allowing participants to join trials from home without depending on site visits. Nevertheless, RDCTs should follow the processes and quality assurance of conventional clinical trials. For each part of the trial, the Building Blocks and existing software and technologies were assessed through a systematic search. The technology needed to perform RDCTs is widely available and validated but remains segmented and developed in silos, as different software solutions address different parts of the trial and at various levels. The current paper analyzes the availability of technology to perform RDCTs, identifying gaps and providing an overview of the Basic Building Blocks and functionalities that need to be covered to support the described processes.

Keywords: architectures and frameworks for health informatics systems, clinical trials, information and communications technology, remote decentralized clinical trials, technology availability

Procedia PDF Downloads 218
3833 Multi-Label Approach to Facilitate Test Automation Based on Historical Data

Authors: Warda Khan, Remo Lachmann, Adarsh S. Garakahally

Abstract:

The increasing complexity of software and its applicability in a wide range of industries, e.g., automotive, call for enhanced quality assurance techniques. Test automation is one option to tackle the prevailing challenges by supporting test engineers with fast, parallel, and repetitive test executions. A high degree of test automation allows for a shift from mundane (manual) testing tasks to a more analytical assessment of the software under test. However, a high initial investment of test resources is required to establish test automation, which is, in most cases, a limitation given the time constraints provided for quality assurance of complex software systems. Hence, computer-aided creation of automated test cases is crucial to increase the benefit of test automation. This paper proposes the application of machine learning for the generation of automated test cases. It is based on supervised learning to analyze test specifications and existing test implementations. The analysis facilitates the identification of patterns between test steps and their implementation with test automation components. For the test case generation, this approach exploits historical data of test automation projects. The identified patterns are the foundation for predicting the implementation of unknown test case specifications. With this support, a test engineer only has to review and parameterize the test automation components instead of writing them manually, resulting in a significant time reduction for establishing test automation. Compared to other generation approaches, this ML-based solution can handle different writing styles, authors, application domains, and even languages. Furthermore, test automation tools require expert knowledge in the form of programming skills, whereas this approach only requires historical data to generate test cases. The proposed solution is evaluated using various multi-label evaluation criteria (EC) and two small-sized real-world systems. The most prominent EC is ‘Subset Accuracy’. The promising results show an accuracy of at least 86% for test cases where a 1:1 relationship (multi-class) between test step specification and test automation component exists. For complex multi-label problems, i.e., where one test step can be implemented by several components, the prediction accuracy is still at 60%, which is better than current state-of-the-art results. The prediction quality is expected to increase for larger systems with corresponding historical data. Consequently, this technique facilitates the time reduction for establishing test automation and is independent of the application domain and project. As work in progress, the next steps are to investigate incremental and active learning as additions to increase the usability of this approach, e.g., in case labelled historical data is scarce.
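The framing described, mapping natural-language test step specifications to the automation components that implement them, is a multi-label text classification problem. The Python sketch below shows one way to set it up and to compute subset accuracy; it is not the authors' model, and the example steps, component names, and classifier choice are assumptions for illustration.

```python
# Illustrative sketch: multi-label classification of test step texts onto
# automation components, evaluated with subset accuracy (exact-match ratio).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.preprocessing import MultiLabelBinarizer
from sklearn.multiclass import OneVsRestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score  # on indicator matrices == subset accuracy

# Historical data: test step text -> set of automation components (hypothetical)
steps = [
    "switch ignition on and wait 2 seconds",
    "send CAN message and check response id",
    "switch ignition off",
    "verify error memory is empty",
]
components = [
    {"IgnitionControl", "Wait"},
    {"CanSend", "CanCheck"},
    {"IgnitionControl"},
    {"DiagnosticsRead"},
]

mlb = MultiLabelBinarizer()
Y = mlb.fit_transform(components)          # binary indicator matrix of labels
X = TfidfVectorizer().fit_transform(steps)  # bag-of-words features of step texts

clf = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X, Y)
Y_pred = clf.predict(X)
print("subset accuracy:", accuracy_score(Y, Y_pred))  # exact match per test step
```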

Keywords: machine learning, multi-class, multi-label, supervised learning, test automation

Procedia PDF Downloads 132