Search results for: software fault prediction
5803 An Ontology Model for Systems Engineering Derived from ISO/IEC/IEEE 15288: 2015: Systems and Software Engineering - System Life Cycle Processes
Authors: Lan Yang, Kathryn Cormican, Ming Yu
Abstract:
ISO/IEC/IEEE 15288: 2015, Systems and Software Engineering - System Life Cycle Processes is an international standard that provides generic top-level process descriptions to support systems engineering (SE). However, the processes defined in the standard need improvement to strengthen their integrity and consistency. The goal of this research is to address this by building an ontology model of the SE standard to manage SE knowledge. The ontology model gives a whole picture of the SE knowledge domain by building connections between SE concepts. Moreover, it creates a hierarchical classification of the concepts to fulfil different requirements for displaying and analysing SE knowledge.
Keywords: knowledge management, model-based systems engineering, ontology modelling, systems engineering ontology
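As a rough illustration of the kind of ontology this abstract describes, the sketch below uses the Python owlready2 library to encode a small hierarchy of SE process concepts and one relation between them. The class, property, and individual names here are illustrative assumptions, not the paper's actual model.

```python
# A minimal sketch, assuming owlready2, of a hierarchical SE-process ontology.
# The ontology IRI, property names, and individuals are hypothetical.
from owlready2 import get_ontology, Thing, ObjectProperty

onto = get_ontology("http://example.org/se15288.owl")  # hypothetical IRI

with onto:
    class Process(Thing): pass               # top-level SE concept
    class TechnicalProcess(Process): pass    # hierarchical classification
    class AgreementProcess(Process): pass
    class Outcome(Thing): pass
    class hasOutcome(ObjectProperty):        # a connection between SE concepts
        domain = [Process]
        range = [Outcome]

# Instantiate and query the hierarchy
design = onto.TechnicalProcess("DesignDefinition")
design.hasOutcome = [onto.Outcome("DesignCharacteristics")]
print(list(Process.subclasses()))  # -> [TechnicalProcess, AgreementProcess]
```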
Procedia PDF Downloads 427
5802 Computational Fluid Dynamics Simulation of Reservoir for Dwell Time Prediction
Authors: Nitin Dewangan, Nitin Kattula, Megha Anawat
Abstract:
The hydraulic reservoir is a key component in mobile construction vehicles; most off-road earth-moving construction machinery requires large side-mounted hydraulic reservoirs. Reservoir construction is highly non-uniform because designers exploit the space available under the vehicle. Apart from virtual simulation, there is no way to determine how well the oil utilizes the reservoir space or to validate the design. Computational fluid dynamics (CFD) helps to predict reservoir space utilization through vortex mapping, path-line plots, and dwell time prediction, to make sure the design is valid and efficient for the vehicle. The dwell time acceptance criterion for an effective reservoir design is 15 seconds. The paper describes the hydraulic reservoir simulation, which is carried out using the CFD tool AcuSolve with an automated mesh strategy. Free-surface flow and a moving reference mesh are used to define the oil flow level inside the reservoir. The first baseline design was not able to meet the acceptance criterion, i.e., its dwell time was below 15 seconds, because the oil entry and exit ports were very close together. CFD is used to redefine the port locations so that the oil dwell time in the reservoir increases. The CFD analysis also proposed a baffle design for effective space utilization. The final design proposed through the CFD analysis is used for physical validation on the machine.
Keywords: reservoir, turbulence model, transient model, level set, free-surface flow, moving frame of reference
Procedia PDF Downloads 155
5801 Classification Framework of Production Planning and Scheduling Solutions from a Supply Chain Management Perspective
Authors: Kwan Hee Han
Abstract:
In today’s business environments, frequent changes in customer requirements are a tough challenge for manufacturing companies. To cope with these challenges, a production planning and scheduling (PP&S) function might be established to provide accountability for both customer service and operational efficiency. Nowadays, many manufacturing firms have adopted PP&S software solutions to generate a realistic production plan and schedule and to adapt to external changes efficiently. However, companies considering the introduction of a PP&S software solution still have difficulty selecting the solution that meets their specific needs. Since PP&S is one of the major building blocks of supply chain management (SCM) architecture, dealing with short-term decision making in the production process of SCM, its functionalities should be analysed within the whole SCM process. The aim of this paper is to analyse PP&S functionalities and system architecture from the SCM perspective, using the criteria of planning-hierarchy level, the four major SCM processes, and problem-solving approaches, and finally to propose a classification framework of PP&S solutions to facilitate comparison among various commercial software solutions. Using the proposed framework, several major PP&S solutions are classified and positioned according to their functional characteristics. With this framework, practitioners considering the introduction of computerized PP&S solutions in manufacturing firms can prepare evaluation and benchmarking sheets for selecting the most suitable solution with ease and in less time.
Keywords: production planning, production scheduling, supply chain management, advanced planning system
Procedia PDF Downloads 199
5800 In-Flight Aircraft Performance Model Enhancement Using Adaptive Lookup Tables
Authors: Georges Ghazi, Magali Gelhaye, Ruxandra Botez
Abstract:
Over the years, the Flight Management System (FMS) has seen continuous improvement of its many features, to the point of becoming the pilot’s primary interface for flight planning operations on the airplane. With the assistance of the FMS, the concepts of distance and time have been completely revolutionized, providing crew members with the optimized route (or flight plan) from the departure airport to the arrival airport. To accomplish this function, the FMS needs an accurate Aircraft Performance Model (APM) of the aircraft. In general, the APMs that equip most modern FMSs are established before an individual aircraft enters service and result from the combination of a set of ordinary differential equations and a set of performance databases. Unfortunately, an aircraft in service is constantly exposed to dynamic loads that degrade its flight characteristics. These degradations have two main origins: airframe deterioration (control surface rigging, seals missing or damaged, etc.) and engine performance degradation (increased fuel consumption for a given thrust). Thus, after several years of service, the performance databases and the APM associated with a specific aircraft are no longer representative enough of the actual aircraft performance. It is important to monitor the trend of the performance deterioration and correct the uncertainties of the aircraft model in order to improve the accuracy of the flight management system predictions. The basis of this research lies in the new ability to continuously update an APM during flight using an adaptive lookup table technique. This methodology was developed and applied to the well-known Cessna Citation X business aircraft. For the purpose of this study, a level D Research Aircraft Flight Simulator (RAFS) was used as a test aircraft. According to the Federal Aviation Administration, level D is the highest certification level for flight dynamics modeling. Basically, using data available in the Flight Crew Operating Manual (FCOM), a first APM describing the variation of the engine fan speed and aircraft fuel flow with respect to flight conditions was derived. This model was then improved using the proposed methodology. To do so, several cruise flights were performed using the RAFS. An algorithm was developed to frequently sample the aircraft sensor measurements during flight and compare the model predictions with the actual measurements. Based on these comparisons, a correction was applied to the current APM in order to minimize the error between the predicted and measured data. In this way, as the aircraft flies, the APM is continuously enhanced, making the FMS more and more precise and the prediction of trajectories more realistic and more reliable. The results obtained are very encouraging. Indeed, using tables initialized with the FCOM data, only a few iterations were needed to reduce the fuel flow prediction error from an average relative error of 12% to 0.3%. Similarly, the maximum error deviation of the FCOM prediction of the engine fan speed was reduced from 5.0% to 0.2% after only ten flights.
Keywords: aircraft performance, cruise, trajectory optimization, adaptive lookup tables, Cessna Citation X
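A minimal sketch of the adaptive lookup-table idea this abstract describes is shown below: a fuel-flow table indexed by altitude and Mach number is nudged toward in-flight measurements. The grid resolution, correction gain, and flight data are illustrative, not the paper's actual values.

```python
# Sketch of adaptive lookup-table correction; all numbers are assumed.
import numpy as np

alt_grid = np.linspace(0, 45000, 10)        # altitude grid (ft)
mach_grid = np.linspace(0.3, 0.9, 7)        # Mach grid
fuel_flow_table = np.full((10, 7), 1200.0)  # lb/h, as if initialized from FCOM data

def update_table(table, alt, mach, measured, gain=0.2):
    """Correct the nearest table entry by a fraction of the prediction error."""
    i = np.abs(alt_grid - alt).argmin()
    j = np.abs(mach_grid - mach).argmin()
    error = measured - table[i, j]   # prediction error at this flight condition
    table[i, j] += gain * error      # move the prediction toward the measurement
    return error

# As the aircraft "flies", sampled sensor data keep refining the model
for alt, mach, ff in [(35000, 0.8, 1310.0), (35000, 0.8, 1295.0), (35000, 0.8, 1302.0)]:
    e = update_table(fuel_flow_table, alt, mach, ff)
    print(f"residual error: {e:+.1f} lb/h")
```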
Procedia PDF Downloads 266
5799 Hamiltonian-Related Properties with and without Faults of the Dual-Cube Interconnection Network and Their Variations
Authors: Shih-Yan Chen, Shin-Shin Kao
Abstract:
In this paper, a thorough review of dual-cubes, DCn, the related studies, and their variations is given. DCn was introduced as a network that retains the pleasing properties of the hypercube Qn but has a much smaller diameter. In fact, it is constructed so that the number of vertices of DCn is equal to the number of vertices of Q2n+1. However, each vertex in DCn is adjacent to n + 1 neighbors, so DCn has (n + 1) × 2^2n edges in total, which is roughly half the number of edges of Q2n+1. In addition, the diameter of any DCn is 2n + 2, which is of the same order as that of Q2n+1. For self-completeness, basic definitions, construction rules, and symbols are provided. We chronicle the results, presenting eleven significant theorems, and include some open problems at the end.
Keywords: dual-cubes, dual-cube extensive networks, dual-cube-like networks, hypercubes, fault-tolerant hamiltonian property
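The sketch below builds one common dual-cube construction and checks the counts quoted in this abstract (2^(2n+1) vertices of degree n + 1, hence (n + 1) × 2^2n edges). The exact labeling convention is an assumption; several equivalent definitions exist in the literature.

```python
# Sketch of a dual-cube DC_n construction, assuming the usual class-bit labeling.
import itertools

def dual_cube_edges(n):
    """Vertices are (c, x, y): class bit c and two n-bit fields x, y.
    Class-0 vertices form an n-cube in x; class-1 vertices form an n-cube in y;
    one cross edge joins (0, x, y) to (1, x, y)."""
    edges = set()
    for c, x, y in itertools.product((0, 1), range(2**n), range(2**n)):
        for b in range(n):                       # n hypercube neighbors
            if c == 0:
                edges.add(frozenset({(0, x, y), (0, x ^ (1 << b), y)}))
            else:
                edges.add(frozenset({(1, x, y), (1, x, y ^ (1 << b))}))
        edges.add(frozenset({(0, x, y), (1, x, y)}))  # the single cross edge
    return edges

n = 3
E = dual_cube_edges(n)
assert len(E) == (n + 1) * 2**(2 * n)   # edge count stated in the abstract
print(len(E), "edges for DC_", n)
```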
Procedia PDF Downloads 472
5798 Applications for Accounting of Inherited Object-Oriented Class Members
Authors: Jehad Al Dallal
Abstract:
A class in an Object-Oriented (OO) system is the basic unit of design, and it encapsulates a set of attributes and methods. In OO systems, instead of redefining the attributes and methods included in other classes, a class can inherit these attributes and methods and implement only its unique attributes and methods, which reduces code redundancy and improves code testability and maintainability. Such a mechanism is called class inheritance. However, some software engineering applications may require accounting for all the inherited class members (i.e., attributes and methods). This paper explains how to account for inherited class members and discusses the software engineering applications that require such consideration.
Keywords: class flattening, external quality attribute, inheritance, internal quality attribute, object-oriented design
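As a small illustration of the "class flattening" idea named in the keywords, the sketch below collects both locally defined and inherited members of a class by walking its inheritance chain. The example classes are illustrative, not taken from the paper.

```python
# Sketch of accounting for inherited class members via flattening.
class Shape:
    def __init__(self):
        self.name = "shape"
    def area(self):
        raise NotImplementedError

class Circle(Shape):
    def __init__(self, r):
        super().__init__()
        self.r = r
    def area(self):                  # overrides the inherited method
        return 3.14159 * self.r ** 2

def flatten_members(cls):
    """Return {member_name: defining_class} over the whole inheritance chain."""
    members = {}
    for klass in reversed(cls.__mro__[:-1]):   # walk ancestors, skip `object`
        for name in vars(klass):
            if not name.startswith("__"):      # ignore dunder machinery
                members[name] = klass.__name__ # a more derived definition wins
    return members

print(flatten_members(Circle))   # {'area': 'Circle'}: inherited then overridden
```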
Procedia PDF Downloads 276
5797 Development of PM2.5 Forecasting System in Seoul, South Korea Using Chemical Transport Modeling and ConvLSTM-DNN
Authors: Ji-Seok Koo, Hee-Yong Kwon, Hui-Young Yun, Kyung-Hui Wang, Youn-Seo Koo
Abstract:
This paper presents a forecasting system for PM2.5 levels in Seoul, South Korea, leveraging a combination of chemical transport modeling and ConvLSTM-DNN machine learning technology. Exposure to PM2.5 has known detrimental impacts on public health, making its prediction crucial for establishing preventive measures. Existing forecasting models, like the Community Multiscale Air Quality (CMAQ) and Weather Research and Forecasting (WRF) models, are hindered by their reliance on uncertain input data, such as anthropogenic emissions and meteorological patterns, as well as certain intrinsic model limitations. The system we have developed specifically addresses these issues by integrating machine learning and using carefully selected input features that account for local and distant sources of PM2.5. In South Korea, the PM2.5 concentration is greatly influenced by both local emissions and long-range transport from China, and our model effectively captures these spatial and temporal dynamics. Our PM2.5 prediction system combines the strengths of advanced hybrid machine learning algorithms, ConvLSTM and DNN, to improve upon the limitations of the traditional CMAQ model. Data used in the system include forecast information from the CMAQ and WRF models, along with actual PM2.5 concentrations and weather variable data from monitoring stations in China and South Korea. The system was implemented specifically for Seoul's PM2.5 forecasting.
Keywords: PM2.5 forecast, machine learning, ConvLSTM, DNN
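A minimal sketch of a hybrid ConvLSTM + DNN architecture of the kind described here is shown below, using Keras. All shapes and hyperparameters are illustrative assumptions; the paper's actual configuration (stations, lead times, CMAQ/WRF features) is not specified here.

```python
# Sketch of a ConvLSTM feature extractor feeding a DNN regression head.
import tensorflow as tf
from tensorflow.keras import layers, models

T, H, W, C = 8, 16, 16, 4   # time steps, grid height/width, channels (assumed)

model = models.Sequential([
    layers.Input(shape=(T, H, W, C)),
    layers.ConvLSTM2D(32, kernel_size=3, padding="same",
                      return_sequences=False),   # spatio-temporal feature extractor
    layers.Flatten(),
    layers.Dense(128, activation="relu"),        # DNN head fuses learned features
    layers.Dense(1),                             # e.g., next-day PM2.5 concentration
])
model.compile(optimizer="adam", loss="mse")
model.summary()
```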
Procedia PDF Downloads 57
5796 Biomechanical Study of a Type II Superior Labral Anterior to Posterior Lesion in the Glenohumeral Joint Using Finite Element Analysis
Authors: Javier A. Maldonado E., Duvert A. Puentes T., Diego F. Villegas B.
Abstract:
The SLAP lesion (Superior Labral Anterior to Posterior) involves the labrum, causing pain and mobility problems in the glenohumeral joint. This injury is common in athletes practicing sports that require throwing or those who receive traumatic impacts on the shoulder area. This paper determines the biomechanical behavior of the soft tissues of the glenohumeral joint when a type II SLAP lesion is present. This pathology is characterized by a tear in the superior labrum, which is simulated in a 3D model of the shoulder joint. A 3D model of the glenohumeral joint was obtained using the free software Slice. Then, a finite element analysis was performed using general-purpose software to simulate a compression test with external rotation. First, the model was validated against a previous study assuming a healthy shoulder joint. Once the initial model was validated, a lesion of the labrum was built using CAD software, and the same test was performed again. The results obtained were the stress and strain distributions of the synovial capsule and the injured labrum. ANOVA was performed for the healthy and injured glenohumeral joints, finding significant differences between them. This study will help orthopedic surgeons to understand the biomechanics involved in this type of lesion, as well as the other surrounding structures affected by loading the injured joint.
Keywords: biomechanics, computational model, finite elements, glenohumeral joint, superior labral anterior to posterior lesion
Procedia PDF Downloads 210
5795 Realistic Modeling of the Preclinical Small Animal Using Commercial Software
Authors: Su Chul Han, Seungwoo Park
Abstract:
With the increasing incidence of cancer, the technology and modalities of radiotherapy have advanced, and the importance of preclinical models in cancer research is growing. Furthermore, small animal dosimetry is an essential part of evaluating the relationship between the absorbed dose in a preclinical small animal and the biological effect in a preclinical study. In this study, we carried out realistic modeling of a preclinical small animal phantom that makes it possible to verify the irradiated dose using commercial software. The small animal phantom was modeled from the 4D digital mouse whole-body (Moby) phantom. To manipulate the Moby phantom in commercial software (Mimics, Materialise, Leuven, Belgium), we converted it to DICOM CT image files using Matlab; the two-dimensional CT images were then converted to a three-dimensional image, making it possible to segment and crop the CT images in sagittal, coronal, and axial views. The CT images of the small animal were modeled by the following process. Based on the profile line values, thresholding was carried out to make a mask connecting all regions within the same threshold range. Using this thresholding method, we segmented the images into three parts (bone, body tissue, and lung); to separate the neighboring pixels between lung and body tissue, we used the region-growing function of the Mimics software. A 3D object was acquired by 3D calculation on the segmented images. The generated 3D object was smoothed by a remeshing operation with a smoothing factor of 0.4 and five iterations. The edge mode was selected to perform triangle reduction, with a tolerance of 0.1 mm, an edge angle of 15 degrees, and five iterations. The processed 3D object was converted to an STL file for output on a 3D printer. We then modified the 3D small animal file using 3-Matic Research (Materialise, Leuven, Belgium) to make space for radiation dosimetry chips. We thus acquired a 3D object of a realistic small animal phantom: its width was 2.631 cm, its thickness 2.361 cm, and its length 10.817 cm. The Mimics software provided efficient 3D object generation and easy conversion to STL files. The development of a preclinical small animal phantom should increase the reliability of absorbed dose verification in small animals for preclinical studies.
Keywords: mimics, preclinical small animal, segmentation, 3D printer
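The sketch below reproduces the threshold-then-region-grow segmentation steps described above on a synthetic CT-like volume, using NumPy and scikit-image. The threshold values and volume are illustrative; the study performed these steps inside the Mimics software.

```python
# Sketch of threshold + region-growing segmentation; all values are assumed.
import numpy as np
from skimage.segmentation import flood

ct = np.random.normal(0, 30, (64, 64, 64))   # stand-in for a CT volume
ct[20:40, 20:40, 20:40] = -700               # synthetic "lung" air pocket
ct[5:10, :, :] = 800                         # synthetic "bone" slab

bone = ct > 300                               # thresholding -> bone mask
tissue = (ct > -200) & (ct <= 300)            # thresholding -> body tissue mask
# Region growing from a seed inside the lung separates it from neighboring tissue
lung = flood(ct, seed_point=(30, 30, 30), tolerance=150)

print("bone voxels:", bone.sum(), "lung voxels:", lung.sum())
```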
Procedia PDF Downloads 368
5794 An Improvement of ComiR Algorithm for MicroRNA Target Prediction by Exploiting Coding Region Sequences of mRNAs
Authors: Giorgio Bertolazzi, Panayiotis Benos, Michele Tumminello, Claudia Coronnello
Abstract:
MicroRNAs are small non-coding RNAs that post-transcriptionally regulate the expression levels of messenger RNAs. MicroRNA regulation activity depends on the recognition of binding sites located on mRNA molecules. ComiR (Combinatorial miRNA targeting) is a user-friendly web tool designed to predict the targets of a set of microRNAs, starting from their expression profile. ComiR incorporates miRNA expression in a thermodynamic binding model, and it associates each gene with the probability of being a target of a set of miRNAs. The ComiR algorithms were trained with information regarding binding sites in the 3'UTR region, using a reliable dataset containing the targets of microRNAs endogenously expressed in D. melanogaster S2 cells. This dataset was obtained by comparing the results from two different experimental approaches, i.e., inhibition and immunoprecipitation of the AGO1 protein, a component of the microRNA-induced silencing complex. In this work, we tested whether including coding region binding sites in the ComiR algorithm improves the performance of the tool in predicting microRNA targets. We focused the analysis on the D. melanogaster species and updated the ComiR underlying database with the currently available releases of mRNA and microRNA sequences. As a result, we find that the ComiR algorithm trained with the information related to the coding regions is more effective in predicting the microRNA targets than the algorithm trained with 3'UTR information. On the other hand, we show that the 3'UTR-based predictions can be seen as complementary to the coding-region-based predictions, which suggests that both predictions, from the 3'UTR and coding regions, should be considered in a comprehensive analysis. Furthermore, we observed that the lists of targets obtained by analyzing data from only one experimental approach, that is, inhibition or immunoprecipitation of AGO1, are not reliable enough to test the performance of our microRNA target prediction algorithm. Further analysis will be conducted to investigate the effectiveness of the tool with data from other species, provided that validated datasets, as obtained from the comparison of RISC protein inhibition and immunoprecipitation experiments, become available for the same samples. Finally, we propose to upgrade the existing ComiR web tool by including the coding-region-based trained model, available together with the 3'UTR-based one.
Keywords: AGO1, coding region, Drosophila melanogaster, microRNA target prediction
Procedia PDF Downloads 453
5793 Towards End-to-End Disease Prediction from Raw Metagenomic Data
Authors: Maxence Queyrel, Edi Prifti, Alexandre Templier, Jean-Daniel Zucker
Abstract:
Analysis of the human microbiome using metagenomic sequencing data has demonstrated high ability in discriminating various human diseases. Raw metagenomic sequencing data require multiple complex and computationally heavy bioinformatics steps prior to data analysis. Such data contain millions of short sequences, read from the fragmented DNA sequences and stored as fastq files. Conventional processing pipelines consist of multiple steps, including quality control, filtering, and alignment of sequences against genomic catalogs (genes, species, taxonomic levels, functional pathways, etc.). These pipelines are complex to use, time-consuming, and rely on a large number of parameters that often introduce variability and impact the estimation of the microbiome elements. Training deep neural networks directly on raw sequencing data is a promising approach to bypass some of the challenges associated with mainstream bioinformatics pipelines. Most of these methods use the concept of word and sentence embeddings, which creates a meaningful numerical representation of DNA sequences while extracting features and reducing the dimensionality of the data. In this paper, we present an end-to-end approach, metagenome2vec, that classifies patients into disease groups directly from raw metagenomic reads. This approach is composed of four steps: (i) generating a vocabulary of k-mers and learning their numerical embeddings; (ii) learning DNA sequence (read) embeddings; (iii) identifying the genome from which the sequence is most likely to come; and (iv) training a multiple instance learning classifier which predicts the phenotype based on the vector representation of the raw data. An attention mechanism is applied in the network so that the model can be interpreted, assigning a weight to the influence of each genome on the prediction. Using two public real-life datasets as well as a simulated one, we demonstrated that this original approach reaches high performance, comparable with the state-of-the-art methods applied directly to data processed through mainstream bioinformatics workflows. These results are encouraging for this proof-of-concept work. We believe that with further dedication, DNN models have the potential to surpass mainstream bioinformatics workflows in disease classification tasks.
Keywords: deep learning, disease prediction, end-to-end machine learning, metagenomics, multiple instance learning, precision medicine
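A minimal sketch of steps (i)-(ii) of the approach is shown below: tokenize raw reads into k-mers and learn k-mer embeddings, then average them into a read embedding. The reads, k, and vector size are toy assumptions; the full method adds genome identification and an attention-based multiple-instance classifier.

```python
# Sketch of k-mer tokenization and embedding, assuming gensim is available.
from gensim.models import Word2Vec
import numpy as np

reads = ["ACGTGACCTGAT", "TTGACCGTAACG", "ACGTGACGTAAC"]  # toy raw reads
k = 4

def kmers(read, k):
    return [read[i:i + k] for i in range(len(read) - k + 1)]

corpus = [kmers(r, k) for r in reads]            # each read acts as a "sentence"
w2v = Word2Vec(corpus, vector_size=16, window=3, min_count=1, epochs=50)

def read_embedding(read):
    """Read embedding as the mean of its k-mer embeddings."""
    return np.mean([w2v.wv[km] for km in kmers(read, k)], axis=0)

print(read_embedding(reads[0]).shape)   # (16,) -> input to the MIL classifier
```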
Procedia PDF Downloads 127
5792 Evaluation of Turbulence Prediction over Washington, D.C.: Comparison of DCNet Observations and North American Mesoscale Model Outputs
Authors: Nebila Lichiheb, LaToya Myles, William Pendergrass, Bruce Hicks, Dawson Cagle
Abstract:
Atmospheric transport of hazardous materials in urban areas is increasingly under investigation due to the potential impact on human health and the environment. In response to health and safety concerns, several dispersion models have been developed to analyze and predict the dispersion of hazardous contaminants. The models of interest usually rely on meteorological information obtained from the meteorological models of NOAA’s National Weather Service (NWS). However, due to the complexity of the urban environment, NWS forecasts provide an inadequate basis for dispersion computation in urban areas. A dense meteorological network in Washington, DC, called DCNet, has been operated by NOAA since 2003 to support the development of urban monitoring methodologies and to provide the driving meteorological observations for atmospheric transport and dispersion models. This study focuses on the comparison of wind observations from the DCNet station on the U.S. Department of Commerce Herbert C. Hoover Building against North American Mesoscale (NAM) model outputs for the period 2017-2019. The goal is to develop a simple methodology for modifying NAM outputs so that the dispersion requirements of the city and its urban area can be satisfied. This methodology will allow us to quantify the prediction errors of the NAM model and to propose adjustments of the key variables controlling dispersion model calculations.
Keywords: meteorological data, Washington D.C., DCNet data, NAM model
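The sketch below illustrates the kind of NAM-versus-observation comparison described here: compute bias and RMSE for wind speed, then fit a simple linear adjustment of NAM output toward the DCNet observations. The arrays are illustrative stand-ins for the 2017-2019 matched time series, not the study's data.

```python
# Sketch of model-vs-observation error statistics and a linear correction.
import numpy as np

nam_ws = np.array([4.2, 6.1, 3.5, 7.8, 5.0, 2.9])     # NAM wind speed (m/s), assumed
dcnet_ws = np.array([3.6, 5.2, 3.1, 6.9, 4.3, 2.5])   # DCNet rooftop observation, assumed

bias = np.mean(nam_ws - dcnet_ws)
rmse = np.sqrt(np.mean((nam_ws - dcnet_ws) ** 2))
print(f"bias = {bias:.2f} m/s, RMSE = {rmse:.2f} m/s")

# Least-squares adjustment obs ~ a * NAM + b, applicable to future NAM forecasts
a, b = np.polyfit(nam_ws, dcnet_ws, 1)
adjusted = a * nam_ws + b
print(f"adjusted RMSE = {np.sqrt(np.mean((adjusted - dcnet_ws) ** 2)):.2f} m/s")
```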
Procedia PDF Downloads 235
5791 Optimization of Hydraulic Fracturing for Horizontal Wells in Enhanced Geothermal Reservoirs
Authors: Qudratullah Muradi
Abstract:
Geothermal energy is a renewable energy source that can be found in abundance on our planet. Only a small fraction of it is currently converted to electrical power, though in recent years installed geothermal capacity has increased considerably all over the world. In this paper, we assume a model for the design of an Enhanced Geothermal System (EGS). We used Computer Modelling Group (CMG) reservoir simulation software to create a typical Hot Dry Rock (HDR) reservoir. Two wells, one injecting cold water and one producing hot water, are included in the model, and several hydraulic fractures were created with the same software. Cold water is injected in order to produce energy from the reservoir. The results of injecting cold water into the reservoir and extracting geothermal energy are presented in graphs at the end of this paper, and the energy production is quantified over a period of 10 years.
Keywords: geothermal energy, EGS, HDR, hydraulic fracturing
Procedia PDF Downloads 202
5790 RA-Apriori: An Efficient and Faster MapReduce-Based Algorithm for Frequent Itemset Mining on Apache Flink
Authors: Sanjay Rathee, Arti Kashyap
Abstract:
Extraction of useful information from large datasets is one of the most important research problems, and association rule mining is one of the best methods for this purpose. Finding possible associations between items in large transaction-based datasets (finding frequent patterns) is the most important part of association rule mining. Many algorithms exist to find frequent patterns, but the Apriori algorithm remains a preferred choice due to its ease of implementation and natural tendency to be parallelized. Many single-machine-based Apriori variants exist, but the massive amount of data available these days is beyond the capacity of a single machine. Therefore, to meet the demands of this ever-growing data, there is a need for an Apriori algorithm based on multiple machines. For these types of distributed applications, MapReduce is a popular fault-tolerant framework. Hadoop is one of the best open-source software frameworks with the MapReduce approach for distributed storage and distributed processing of huge datasets using clusters built from commodity hardware. However, the heavy disk I/O at each iteration of a highly iterative algorithm like Apriori makes Hadoop inefficient. A number of MapReduce-based platforms have been developed for parallel computing in recent years. Among them, two platforms, namely Spark and Flink, have attracted a lot of attention because of their in-built support for distributed computations. Earlier, we proposed a Reduced-Apriori algorithm on the Spark platform which outperforms parallel Apriori, first because of the use of Spark and second because of the improvement we proposed to standard Apriori. This work is therefore a natural sequel and targets implementing, testing, and benchmarking Apriori, Reduced-Apriori, and our new algorithm, ReducedAll-Apriori, on Apache Flink, and compares them with the Spark implementation. Flink, a streaming dataflow engine, overcomes the disk I/O bottlenecks of MapReduce, providing an ideal platform for distributed Apriori. Flink's pipelining-based structure allows the next iteration to start as soon as partial results of the earlier iteration are available; there is no need to wait for all reducer results before starting the next iteration. We conduct in-depth experiments to gain insight into the effectiveness, efficiency, and scalability of the Apriori and RA-Apriori algorithms on Flink.
Keywords: Apriori, Apache Flink, MapReduce, Spark, Hadoop, R-Apriori, frequent itemset mining
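For reference, the sketch below shows the single-machine Apriori iteration that the distributed R-Apriori/RA-Apriori variants parallelize: generate k-item candidates from the frequent (k-1)-itemsets, then count support in one pass over the transactions. The toy dataset and minimum support are illustrative; the pruning step of full Apriori is omitted for brevity.

```python
# Sketch of the core Apriori loop on toy data.
from itertools import combinations

transactions = [{"a", "b", "c"}, {"a", "c"}, {"a", "d"}, {"b", "c"}, {"a", "b", "c"}]
min_support = 2

def frequent_itemsets(transactions, min_support):
    items = {i for t in transactions for i in t}
    level = [frozenset({i}) for i in items]       # start with 1-itemsets
    result = {}
    while level:
        counts = {c: sum(c <= t for t in transactions) for c in level}
        frequent = {c: n for c, n in counts.items() if n >= min_support}
        result.update(frequent)
        # candidate generation: union pairs of frequent k-itemsets into (k+1)-itemsets
        keys = list(frequent)
        k = len(keys[0]) + 1 if keys else 0
        level = {a | b for a, b in combinations(keys, 2) if len(a | b) == k}
    return result

for itemset, support in sorted(frequent_itemsets(transactions, min_support).items(),
                               key=lambda kv: -kv[1]):
    print(set(itemset), support)
```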
Procedia PDF Downloads 299
5789 Studies on Separation of Scandium from Sulfate Environment Using Ion Exchange Technique
Authors: H. Hajmohammadi, A. H. Jafari, M. Eskandari Nasab
Abstract:
The ion exchange method was used to assess scandium adsorption from sulfate media prepared from laboratory-grade materials. The Taguchi method was employed to determine the optimum conditions for scandium adsorption. Results show that the optimum conditions for scandium adsorption from sulfate were obtained with Purolite C100 cationic resin in 0.1 g/l sulfuric acid and a scandium concentration of 2 g/l at 25 °C. The studies also showed that lowering the H₂SO₄ concentration and the aqueous-phase temperature leads to an increase in Sc adsorption. Visual Minteq software was used to ascertain the various possible cation types and the effect of the concentration of scandium ion species on scandium adsorption by cationic resins. The simulation results of this software show that the scandium ion species are mostly cationic, which is consistent with the experimental data.
Keywords: scandium, ion exchange resin, simulation, leach copper
Procedia PDF Downloads 143
5788 Prediction of Slaughter Body Weight in Rabbits: Multivariate Approach through Path Coefficient and Principal Component Analysis
Authors: K. A. Bindu, T. V. Raja, P. M. Rojan, A. Siby
Abstract:
The multivariate path coefficient approach was employed to study the effects of various production and reproduction traits on the slaughter body weight of rabbits. Information on 562 rabbits maintained at the university rabbit farm attached to the Centre for Advanced Studies in Animal Genetics and Breeding, Kerala Veterinary and Animal Sciences University, Kerala State, India was utilized. The manifest variables used in the study were age and weight of the dam, birth weight, litter size at birth and weaning, and weight at the first, second, and third months. Linear multiple regression analysis was performed keeping slaughter weight as the dependent variable and the remaining traits as independent variables. The model explained 48.60 percent of the total variation present in the market weight of the rabbits. Even though the model was significant, the standardized beta coefficients for the independent variables age and weight of the dam, birth weight, and litter sizes at birth and weaning were less than one, indicating their negligible influence on the slaughter weight. However, the standardized beta coefficient of the second-month body weight was the largest, followed by that of the first-month weight, indicating their major role in the market weight. All the other factors exert their influence only indirectly, through these two variables. Hence it was concluded that the slaughter body weight can be predicted using the first- and second-month body weights. Principal components were also developed so as to achieve more accuracy in the prediction of the market weight of rabbits.
Keywords: component analysis, multivariate, slaughter, regression
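A minimal sketch of the analysis style described here is shown below: standardized beta coefficients from a multiple regression of slaughter weight on earlier traits, followed by principal components. The data are randomly generated stand-ins for the 562-rabbit records, with assumed effect sizes.

```python
# Sketch of standardized-beta regression plus PCA on synthetic data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n = 562
X = rng.normal(size=(n, 4))          # e.g. birth wt, litter size, 1st- and 2nd-month wt
y = 0.2 * X[:, 2] + 0.6 * X[:, 3] + rng.normal(scale=0.5, size=n)  # slaughter weight

Xs = StandardScaler().fit_transform(X)
ys = (y - y.mean()) / y.std()
beta = LinearRegression().fit(Xs, ys).coef_       # standardized betas
print("standardized betas:", np.round(beta, 3))   # 2nd-month weight should dominate

pca = PCA(n_components=2).fit(Xs)                 # principal components as predictors
print("explained variance:", np.round(pca.explained_variance_ratio_, 3))
```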
Procedia PDF Downloads 168
5787 GIS Database Creation for Impacts of Domestic Wastewater Disposal on Bida Town, Niger State, Nigeria
Authors: Ejiobih Hyginus Chidozie
Abstract:
A Geographic Information System (GIS) is a configuration of computer hardware and software specifically designed to effectively capture, store, update, manipulate, analyse, and display all forms of spatially referenced information. The GIS database is referred to as the heart of GIS; it holds location data, attribute data, and the spatial relationships between the objects and their attributes. Sewage and wastewater management have assumed increased importance lately as a result of the general concern expressed worldwide about the problems of environmental pollution: contamination of the atmosphere, rivers, lakes, oceans, and groundwater. In this research, a GIS database was created to study the impacts of domestic wastewater disposal methods on Bida town, Niger State, as a model for investigating similar impacts on other cities in Nigeria. Results from a GIS database are very useful to decision makers and researchers. Bida town was subdivided into four regions, eight zones, and 24 sectors based on the prevailing natural morphology of the town. A GPS receiver and a structured questionnaire were used to collect information and attribute data from 240 households in the study area. Domestic wastewater samples were collected from the 24 sectors of the study area for laboratory analysis. ArcView 3.2a GIS software was used to create the GIS databases for the ecological, health, and socioeconomic impacts of domestic wastewater disposal methods in Bida town.
Keywords: environment, GIS, pollution, software, wastewater
Procedia PDF Downloads 421
5786 Prediction Factors of Recurrent Supraventricular Tachycardia After Adenosine Treatment in the Emergency Department
Authors: Welawat Tienpratarn, Chaiyaporn Yuksen, Rungrawin Promkul, Chetsadakon Jenpanitpong, Pajit Bunta, Suthap Jaiboon
Abstract:
Supraventricular tachycardia (SVT) is an abnormally fast atrial tachycardia characterized by narrow (≤ 120 ms) and constant QRS complexes. Adenosine is the drug of choice; the first dose is 6 mg, and it can be repeated with second and third doses of 12 mg, with greater than 90% success. Previous work found that patients observed for 4 hours after return to normal sinus rhythm showed no recurrence within 24 hours. The objective of this study was to investigate the factors that influence the recurrence of SVT after adenosine treatment in the emergency department (ED). The study was conducted as a retrospective, exploratory, prognostic study at the ED of the Faculty of Medicine, Ramathibodi Hospital, a university-affiliated super-tertiary care hospital in Bangkok, Thailand, over the ten-year period between 2010 and 2020. The inclusion criteria were age > 15 years, visiting the ED with SVT, and treatment with adenosine. Recurrence of SVT in the ED was recorded for these patients. A multivariable logistic regression model was used to develop the predictive model and a prediction score for recurrent PSVT. 264 patients met the study criteria; of those, 24 patients (10%) had recurrent PSVT. Five independent factors were predictive of recurrent PSVT: age > 65 years, heart rate (after adenosine) > 100 per minute, structural heart disease, and the dose of adenosine. The clinical risk score developed to predict recurrent PSVT had an accuracy of 74.41%. A score of > 6 had a likelihood ratio for recurrent PSVT of 5.71; a clinical predictive score of > 6 was thus associated with recurrent PSVT in the ED.
Keywords: supraventricular tachycardia, recurrence, emergency department, adenosine
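The sketch below illustrates one common way a multivariable logistic regression is turned into an integer clinical risk score, as described above. The synthetic predictors mirror the factors named in the abstract, but the data, effect sizes, point scaling, and cutoff are all assumptions, not the study's derivation.

```python
# Sketch of deriving an integer risk score from logistic-regression coefficients.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 264
X = np.column_stack([
    rng.integers(0, 2, n),     # age > 65 years (binary)
    rng.integers(0, 2, n),     # HR after adenosine > 100/min (binary)
    rng.integers(0, 2, n),     # structural heart disease (binary)
    rng.integers(1, 4, n),     # adenosine dose step (6/12/12 mg), assumed coding
])
logit = -3.0 + 0.9 * X[:, 0] + 0.8 * X[:, 1] + 0.7 * X[:, 2] + 0.3 * X[:, 3]
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))   # ~10% recurrence rate

model = LogisticRegression().fit(X, y)
coefs = model.coef_[0]
# Integer points: scale each coefficient relative to the smallest one in magnitude
points = np.round(coefs / np.abs(coefs).min()).astype(int)
print("points per factor:", points)
score = X @ points              # per-patient risk score; a cutoff (e.g. > 6) flags risk
print("flagged:", (score > 6).sum(), "patients")
```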
Procedia PDF Downloads 119
5785 Multifluid Computational Fluid Dynamics Simulation for Sawdust Gasification inside an Industrial Scale Fluidized Bed Gasifier
Authors: Vasujeet Singh, Pruthiviraj Nemalipuri, Vivek Vitankar, Harish Chandra Das
Abstract:
For the correct prediction of thermal and hydraulic performance (bed voidage, suspension density, pressure drop, heat transfer, and combustion kinetics), one should incorporate the correct parameters into the computational fluid dynamics simulation of a fluidized bed gasifier. Given the scarcity of fossil fuels and the need to fulfill the energy demand of an increasing population, researchers need to shift their attention to alternatives to fossil fuels. The current research work focuses on the hydrodynamic behavior and gasification of sawdust inside a 2D industrial-scale FBG using the Eulerian-Eulerian multifluid model. The present numerical model is validated against experimental data. This model is then extended to predict the gasification characteristics of sawdust by incorporating eight heterogeneous reactions (moisture release, volatile cracking, tar cracking, tar oxidation, char combustion, CO₂ gasification, steam gasification, and methanation) and five homogeneous reactions (oxidation of CO, CH₄, and H₂, and the forward and backward water gas shift (WGS) reactions). In the results section, the composition of the gasification products is analyzed, along with the hydrodynamics of the sawdust and sand phases, the heat transfer between the gas, sand, and sawdust, and the rates of the different homogeneous and heterogeneous reactions along the height of the domain.
Keywords: devolatilization, Eulerian-Eulerian, fluidized bed gasifier, mathematical modelling, sawdust gasification
Procedia PDF Downloads 108
5784 Shoulder Range of Motion Measurements Using Computer Vision Compared to Hand-Held Goniometric Measurements
Authors: Lakshmi Sujeesh, Aaron Ramzeen, Ricky Ziming Guo, Abhishek Agrawal
Abstract:
Introduction: Range of motion (ROM) is often measured by physiotherapists using a hand-held goniometer as part of the mobility assessment for diagnosis. Due to the nature of the hand-held goniometer measurement procedure, readings tend to vary depending on the physical therapist taking the measurements (Riddle et al.). This study aims to validate computer vision software readings against goniometric measurements, for quick and consistent ROM measurements to be taken by clinicians. The use of this computer vision software is hoped to improve the future of the musculoskeletal space with more efficient diagnosis, by recording a patient's ROM with minimal human error across different physical therapists. Methods: Using the hand-held long-arm goniometer measurements as the gold standard, healthy study participants (n = 20) performed 4 exercises: front elevation, abduction, internal rotation, and external rotation, using both arms. Active ROM was assessed with the computer vision software at different angles set by the goniometer for each exercise. The Intraclass Correlation Coefficient (ICC), using a 2-way random-effects model, Box-Whisker plots, and the root mean square error (RMSE) were used to find the degree of correlation and the absolute error between set and recorded angles across repeated trials by the same rater. Results: The ICC (2,1) values for all 4 exercises are above 0.9, indicating excellent reliability. The lowest overall RMSE was for external rotation (5.67°) and the highest for front elevation (8.00°). Box-Whisker plots showed a potential zero error in the measurements made by the computer vision software for abduction, where the absolute errors for measurements taken at 0 degrees are shifted away from the ideal zero line, the lowest recorded error being 8°. Conclusion: Our results indicate that the computer vision software is valid and reliable for use in clinical settings by physiotherapists measuring shoulder ROM. Overall, computer vision helps improve access to quality care for individual patients, with the ability to assess the ROM of their condition at home throughout a full cycle of musculoskeletal care (American Academy of Orthopaedic Surgeons) without the need for a trained therapist.
Keywords: physiotherapy, frozen shoulder, joint range of motion, computer vision
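The sketch below computes the two agreement statistics used above: ICC(2,1) from a two-way random-effects ANOVA decomposition, plus RMSE against the set angles, for a matrix of n set angles by k repeated trials. The angle values are illustrative, not the study's data.

```python
# Sketch of ICC(2,1) and RMSE on a toy set-angle x trial matrix.
import numpy as np

X = np.array([[ 0.0,  8.0,  7.5],    # recorded angles at set angle 0 deg
              [30.0, 33.5, 31.0],
              [60.0, 62.0, 64.5],
              [90.0, 95.0, 93.0]])   # rows: set angles, cols: repeated trials
set_angles = np.array([0.0, 30.0, 60.0, 90.0])

n, k = X.shape
grand = X.mean()
SSR = k * ((X.mean(axis=1) - grand) ** 2).sum()   # between set angles (rows)
SSC = n * ((X.mean(axis=0) - grand) ** 2).sum()   # between trials (columns)
SSE = ((X - grand) ** 2).sum() - SSR - SSC        # residual
MSR, MSC, MSE = SSR / (n - 1), SSC / (k - 1), SSE / ((n - 1) * (k - 1))

icc21 = (MSR - MSE) / (MSR + (k - 1) * MSE + k * (MSC - MSE) / n)
rmse = np.sqrt(((X - set_angles[:, None]) ** 2).mean())
print(f"ICC(2,1) = {icc21:.3f}, RMSE = {rmse:.2f} deg")
```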
Procedia PDF Downloads 110
5783 Practical Method for Failure Prediction of Mg Alloy Sheets during Warm Forming Processes
Authors: Sang-Woo Kim, Young-Seon Lee
Abstract:
An important concern in metal forming, even at elevated temperatures, is whether a desired deformation can be accomplished without any failure of the material. A detailed understanding of the critical condition for crack initiation provides not only the workability limit of a material but also a guideline for process design. This paper describes the use of ductile fracture criteria in conjunction with the finite element method (FEM) for predicting the onset of fracture in the warm metal working of magnesium alloy sheets. Critical damage values for various ductile fracture criteria were determined from uniaxial tensile tests and expressed as functions of strain rate and temperature. In order to find the best criterion for failure prediction, Erichsen cupping tests under isothermal conditions and FE simulations combined with the ductile fracture criteria were carried out. Based on the plastic deformation histories obtained from the FE analyses of the Erichsen cupping tests and the critical damage value curves, the initiation time and location of fracture were predicted under a biaxial tensile condition. The results were compared with experimental results and the best criterion is recommended. In addition, the proposed methodology was used to predict the onset of fracture in non-isothermal deep drawing processes using an irregularly shaped blank, and the results were verified experimentally.
Keywords: magnesium, AZ31 alloy, ductile fracture, FEM, sheet forming, Erichsen cupping test
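The sketch below shows how a ductile fracture criterion is typically evaluated along an FE deformation history: here, a Cockcroft-Latham-style integral of maximum principal stress over effective strain is accumulated and compared against a critical value that depends on strain rate and temperature. The flow curve, damage surface, and all constants are illustrative assumptions, not AZ31 data from the paper.

```python
# Sketch of damage accumulation against a critical-value curve.
import numpy as np

eps = np.linspace(0.0, 0.6, 200)                  # effective strain history
sigma_max = 180.0 * (1.0 - np.exp(-6.0 * eps))    # max principal stress (MPa), toy flow curve

def c_crit(strain_rate, temp_C):
    """Illustrative critical damage surface 'fitted' from tensile tests."""
    return 60.0 - 0.05 * (temp_C - 200.0) + 2.0 * np.log10(strain_rate)

# Trapezoidal accumulation of the damage integral along the strain history
damage = np.concatenate(
    [[0.0], np.cumsum(np.diff(eps) * 0.5 * (sigma_max[1:] + sigma_max[:-1]))])
C = c_crit(strain_rate=0.1, temp_C=250.0)
idx = np.argmax(damage >= C) if damage[-1] >= C else None
print("fracture onset at strain", None if idx is None else round(float(eps[idx]), 3))
```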
Procedia PDF Downloads 375
5782 Software Vulnerability Markets: Discoverers and Buyers
Authors: Abdullah M. Algarni, Yashwant K. Malaiya
Abstract:
Some key aspects of vulnerabilities (discovery, dissemination, and disclosure) have received attention recently. However, the role of the interaction between vulnerability discoverers and vulnerability acquirers has not yet been adequately addressed. Our study suggests that a major percentage of discoverers, a majority in some cases, are unaffiliated with the software developers and thus are free to disseminate the vulnerabilities they discover in any way they like. As a result, multiple vulnerability markets have emerged. In some of these markets, the exchange is regulated, but in others there is little or no regulation. In the recent vulnerability discovery literature, the vulnerability discoverers have remained anonymous individuals. Although there has been an attempt to model the level of their efforts, information regarding their identities, modes of operation, and what they do with the discovered vulnerabilities has not been explored. Reports of buying and selling of vulnerabilities are now appearing in the press; however, the existence of such markets requires validation, and the nature of these markets needs to be analysed. To address this need, we have attempted to collect detailed information. We have identified the most prolific vulnerability discoverers of the past decade and examined their motivations and methods. A large percentage of these discoverers are located in Eastern and Western Europe and in the Far East. We have contacted several of them in order to collect first-hand information regarding their techniques, motivations, and involvement in the vulnerability markets. We examine why many of the discoverers appear to retire after a highly successful vulnerability-finding career. The paper identifies the actual vulnerability markets, rather than the hypothetical ideal markets that are often examined. The emergence of worldwide government agencies as vulnerability buyers has significant implications. We discuss potential factors that can impact the risk to society and the need for detailed exploration.
Keywords: risk management, software security, vulnerability discoverers, vulnerability markets
Procedia PDF Downloads 254
5781 Experimenting with Error Performance of Systems Employing Pulse Shaping Filters on a Software-Defined-Radio Platform
Authors: Chia-Yu Yao
Abstract:
This paper presents experimental results on testing the symbol-error-rate (SER) performance of quadrature amplitude modulation (QAM) systems employing symmetric pulse-shaping square-root (SR) filters designed by minimizing the roughness function and by minimizing the peak-to-average power ratio (PAR). The device used in the experiments is the 'bladeRF' software-defined-radio platform. PAR is a well-known measure, whereas the roughness function is a concept for measuring jitter-induced interference. The experimental results show that the system employing minimum-roughness pulse-shaping SR filters outperforms the system employing minimum-PAR pulse-shaping SR filters in terms of SER performance.
Keywords: pulse-shaping filters, FIR filters, jittering, QAM
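The sketch below shows how the PAR of a pulse-shaped QAM signal is measured: upsample random 16-QAM symbols, apply a pulse-shaping FIR filter, and compute the peak-to-average power ratio of the shaped baseband signal. A generic windowed-sinc filter stands in for the paper's optimized square-root filters, which are not reproduced here.

```python
# Sketch of PAR measurement for a pulse-shaped 16-QAM baseband signal.
import numpy as np
from scipy.signal import lfilter, firwin

rng = np.random.default_rng(0)
L = 4                                           # samples per symbol
levels = np.array([-3, -1, 1, 3])
symbols = rng.choice(levels, 4096) + 1j * rng.choice(levels, 4096)   # 16-QAM

upsampled = np.zeros(symbols.size * L, dtype=complex)
upsampled[::L] = symbols                        # zero-stuff between symbols
h = firwin(8 * L + 1, 1.0 / L)                  # stand-in pulse-shaping FIR filter
x = lfilter(h, 1.0, upsampled)                  # shaped baseband signal

par_db = 10 * np.log10(np.max(np.abs(x) ** 2) / np.mean(np.abs(x) ** 2))
print(f"PAR = {par_db:.2f} dB")
```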
Procedia PDF Downloads 342
5780 Defect Estimation of Embedded System Components by a Bond Graph Approach
Authors: I. Gahlouz, A. Chellil
Abstract:
This paper concerns the estimation of system component faults using an unknown-input observer. To reach this goal, we used the bond graph approach to physical modelling. We showed that this graphical tool allows system component faults to be represented as unknown inputs within the state representation of the considered physical system. A study of the causal and structural features of the system (controllability, observability, finite structure, and infinite structure) based on the bond graph approach was then carried out in order to design an unknown-input observer, which is used for estimating the system component faults.
Keywords: estimation, bond graph, controllability, observability
Procedia PDF Downloads 416
5779 Building a Clean Environment Information System Through the SMS Gateway
Authors: Lutpi Ginanjar
Abstract:
Environmental hygiene is indispensable if people are to live healthy, safe, and peaceful lives. In a small environment, cleanliness is very easy to manage, but a larger environment requires more complicated management and considerable investment. In general, environmental hygiene is managed by the department of hygiene and landscaping; some of this management is good, but much of it is not. The difficulties often encountered in waste management are also partly caused by the public's own level of awareness. In addition, communities have difficulty reporting rubbish problems because no good information system has been built. This paper aims to build a clean-environment information system, especially for the handling of waste in the city of Bandung, West Java province. The system was built with PHP software. It is expected that, after the construction of the environmental hygiene information system, it can be demonstrated to the community and contribute to the health of the environment.
Keywords: information systems, SMS gateway, management, software, PHP
Procedia PDF Downloads 491
5778 Multi-Agent TeleRobotic Security Control System: Requirements Definitions of Multi-Agent System Using the Behavioral Patterns Analysis (BPA) Approach
Authors: Assem El-Ansary
Abstract:
This paper illustrates the event-oriented Behavioral Pattern Analysis (BPA) modeling approach in developing a Multi-Agent TeleRobotic Security Control System (MTSCS). The event defined in BPA is a real-life conceptual entity that is unrelated to any implementation. The major contributions of this research are the Behavioral Pattern Analysis (BPA) modeling methodology and the development of an interactive software tool (DECISION), which is based on a combination of the Analytic Hierarchy Process (AHP) and the ELECTRE Multi-Criteria Decision Making (MCDM) methods.
Keywords: analysis, multi-agent, TeleRobotics control, security, modeling methodology, software modeling, event-oriented, behavioral pattern, use cases
Procedia PDF Downloads 440
5777 Determination of Lithology, Porosity and Water Saturation for Mishrif Carbonate Formation
Authors: F. S. Kadhim, A. Samsuri, H. Alwan
Abstract:
Well logging records can help to answer many questions, from a wide range of specialized information and basic petrophysical properties to the formation evaluation of oil and gas reservoirs. The accurate calculation of porosity in carbonate reservoirs is among the most challenging aspects of well log analysis. Many equations have been developed over the years, based on known physical principles or on empirically derived relationships, to calculate porosity and estimate lithology and water saturation; in the current study, these parameters are calculated from well logs using modern techniques. The Nasiriya (NS) oilfield is one of the giant oilfields in the Middle East, and the formation under study is the Mishrif carbonate formation, the shallowest hydrocarbon-bearing zone in the NS oilfield. Neurolog software (V5, 2008) was used to digitize the scanned copies of the available logs. Environmental corrections were made as per the Schlumberger charts (2005) supplied in the Interactive Petrophysics software (IP, V3.5, 2008). Three saturation models were used to calculate the water saturation of the carbonate formation: the simple Archie equation, the dual-water model, and the Indonesia model. Results indicate that the Mishrif formation consists mainly of limestone, with some dolomite and shale. The porosity interpretation shows that the logging tools give good-quality readings after the environmental corrections. The average formation water saturation for the Mishrif formation is around 0.4-0.6. This study provides the accurate behavior of petrophysical properties with depth for this formation using modern software.
Keywords: lithology, porosity, water saturation, carbonate formation, Mishrif formation
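For reference, the sketch below evaluates the simple Archie water-saturation model named above, Sw = ((a·Rw) / (phi^m · Rt))^(1/n). The exponents and log readings are illustrative textbook-style values, not the NS oilfield parameters.

```python
# Sketch of the simple Archie equation on assumed log readings.
import numpy as np

a, m, n = 1.0, 2.0, 2.0               # tortuosity, cementation, saturation exponents
Rw = 0.03                             # formation water resistivity (ohm-m), assumed
phi = np.array([0.12, 0.18, 0.22])    # porosity from logs
Rt = np.array([8.0, 4.5, 2.0])        # deep resistivity (ohm-m)

Sw = ((a * Rw) / (phi ** m * Rt)) ** (1.0 / n)
print(np.round(Sw, 2))                # water saturation per depth sample
```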
Procedia PDF Downloads 376
5776 AIPM: An Integrator and Pull Request Matching Model in GitHub
Authors: Zhifang Liao, Yanbing Li, Li Xu, Yan Zhang, Xiaoping Fan, Jinsong Wu
Abstract:
The Pull Request (PR) is the primary method for code contributions from external contributors on GitHub. PR review is an essential part of open source software development for maintaining software quality. Matching a new PR to an appropriate integrator makes the PR review more effective. However, PR-integrator matching is currently organized manually on GitHub. To reduce this cost, we present the AIPM model to predict highly relevant integrators for incoming PRs. AIPM uses a topic model to extract topics from the PRs and builds a one-to-one correspondence between topics and integrators. Then, AIPM finds the most suitable integrator according to the maximum entry of the topic-document distribution. On average, AIPM can reach a precision of 60%, and in some projects it can even reach a precision of 80%.
Keywords: pull request, integrator matching, GitHub, open source project, topic model
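The sketch below illustrates the matching idea: extract topics from PR texts with LDA, map each topic to an integrator, and route a new PR to the integrator of its dominant topic. The corpus, topic count, and topic-integrator table are illustrative assumptions, not AIPM's trained model.

```python
# Sketch of topic-based PR-to-integrator routing with sklearn's LDA.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

prs = [
    "fix memory leak in network buffer pool",
    "add unit tests for http client retries",
    "refactor buffer allocation for network io",
    "improve http request timeout handling",
]
vec = CountVectorizer()
X = vec.fit_transform(prs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

topic_to_integrator = {0: "alice", 1: "bob"}     # assumed one-to-one mapping

new_pr = ["network buffer resize bug"]
dist = lda.transform(vec.transform(new_pr))[0]   # topic-document distribution
print("assigned integrator:", topic_to_integrator[dist.argmax()])
```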
Procedia PDF Downloads 302
5775 Cooperative CDD Scheme Based on Hierarchical Modulation in OFDM System
Authors: Seung-Jun Yu, Yeong-Seop Ahn, Young-Min Ko, Hyoung-Kyu Song
Abstract:
In order to achieve high data rates and increase spectral efficiency, the multiple-input multiple-output (MIMO) system has been proposed. However, multiple antennas are limited by size and cost. Therefore, the recently developed cooperative diversity scheme, which obtains transmit diversity with existing hardware by constituting a virtual antenna array, can be a solution. However, most of the cooperative techniques introduced so far share a common fault: a decreased transmission rate, because the destination must receive decodable compositions of symbols from both the source and the relay. In this paper, we propose a cooperative cyclic delay diversity (CDD) scheme that uses hierarchical modulation. This scheme is free from the rate loss and allows seamless cooperative communication.
Keywords: MIMO, cooperative communication, CDD, hierarchical modulation
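The sketch below shows the basic cyclic delay diversity mechanism in OFDM: each (virtual) transmit antenna sends a cyclically shifted copy of the same time-domain OFDM symbol, which the receiver sees as additional frequency selectivity. The FFT size and delay values are illustrative; the paper's combination with hierarchical modulation and relaying is not reproduced here.

```python
# Sketch of cyclic delay diversity over one QPSK-modulated OFDM symbol.
import numpy as np

N = 64                                         # OFDM subcarriers
rng = np.random.default_rng(0)
qpsk = (2 * rng.integers(0, 2, N) - 1 + 1j * (2 * rng.integers(0, 2, N) - 1)) / np.sqrt(2)
time_symbol = np.fft.ifft(qpsk)                # baseline time-domain OFDM symbol

cyclic_delays = [0, 3]                         # per-antenna (or per-relay) delays
tx = [np.roll(time_symbol, d) for d in cyclic_delays]

# The superposition of shifted copies adds a phase ramp per subcarrier,
# i.e., artificial frequency diversity at the receiver.
rx_freq = np.fft.fft(sum(tx))
print(np.round(np.abs(rx_freq[:8] / qpsk[:8]), 3))   # per-subcarrier gain variation
```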
Procedia PDF Downloads 550
5774 Aspects of the Detail Design of an Automated Biomethane Test
Authors: Ilias Katsanis, Paraskevas Papanikos, Nikolas Zacharopoulos, Vassilis C. Moulianitis, Evgenios Scourboutis, Diamantis T. Panagiotarakos
Abstract:
This paper presents aspects of the detailed design of an automated biomethane potential (BMP) measurement system using CAD techniques. First, the design specifications, grouped into eight sets and used to generate the design alternatives, are briefly presented. Then, the major components of the final concept, as well as the design of the test, are presented. The material selection process is carried out using the ANSYS EduPack database software. The mechanical behavior of one component, developed in Creo v.5, is evaluated using finite element analysis. Finally, aspects of the software development that integrates the BMP test are presented. This paper shows the advantages of CAD techniques in product design, applied to the design of a mechatronic product.
Keywords: automated biomethane test, detail mechatronics design, materials selection, mechanical analysis
Procedia PDF Downloads 90