Search results for: results validation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 37861


37621 Quantification of Leachate Potential of the Quezon City Controlled Dumping Facility Using Help Model

Authors: Paul Kenneth D. Luzon, Maria Antonia N. Tanchuling

Abstract:

The Quezon City Controlled Dumping Facility, also known as Payatas, produces leachate that can contaminate the soil and water environment in the area. The goal of this study is to quantify the leachate produced by the QCCDF using the Hydrologic Evaluation of Landfill Performance (HELP) model. The results could be used as input for groundwater contaminant transport studies. The HELP model is based on a simple water budget and is an essential “model requirement” used by the US Environmental Protection Agency (EPA). The annual waste profile of the QCCDF was calculated. Based on topographical maps and an estimation of settlement due to overburden pressure and degradation, a total of 10M m^3 of waste is contained in the landfill. The inputs necessary for the HELP model are weather data, soil properties, and landfill design. Results showed that from 1988 to 2011, an average of 50% of the total precipitation percolated through the bottom layer. Validation of the results is still needed because of the assumptions made in the study. Decreasing the porosity of the top soil cover proved the best mitigation for minimizing the percolation rate. This study concludes that there is a need for a better leachate management system in the QCCDF.
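The water-budget idea behind the HELP model can be illustrated with a minimal sketch (this is not the HELP model itself, and the annual figures below are hypothetical, not taken from the study):

```python
# Simplified annual water budget (illustrative, not the full HELP model):
# percolation = precipitation - surface runoff - evapotranspiration - storage change.
# All values in mm/year; the numbers below are hypothetical.

def annual_percolation(precip_mm, runoff_mm, evapotrans_mm, storage_change_mm):
    """Return percolation through the bottom layer and its share of precipitation."""
    perc = precip_mm - runoff_mm - evapotrans_mm - storage_change_mm
    return perc, perc / precip_mm

perc, fraction = annual_percolation(2400.0, 500.0, 650.0, 50.0)
print(f"percolation: {perc:.0f} mm/yr ({fraction:.0%} of precipitation)")
```

With these hypothetical inputs, half of the precipitation percolates, matching the order of magnitude reported for the QCCDF.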

Keywords: help model, landfill, payatas trash slide, quezon city controlled dumping facility

Procedia PDF Downloads 291
37620 An Inverse Docking Approach for Identifying New Potential Anticancer Targets

Authors: Soujanya Pasumarthi

Abstract:

Inverse docking is a relatively new technique that has been used to identify potential receptor targets of small molecules. Our docking software package MDock is well suited for such an application, as it is computationally efficient while showing adequate results in binding affinity predictions and enrichment tests. As a validation study, we present the first-stage results of an inverse-docking study which seeks to identify potential direct targets of PRIMA-1. PRIMA-1 is well known for its ability to restore mutant p53's tumor suppressor function, leading to apoptosis in several types of cancer cells. For this reason, we believe that potential direct targets of PRIMA-1 identified in silico should be experimentally screened for their ability to inhibit cancer cell growth. The highest-ranked human protein in our PRIMA-1 docking results is oxidosqualene cyclase (OSC), which is part of the cholesterol synthetic pathway. The results of two follow-up experiments which treat OSC as a possible anti-cancer target are promising. We show that both PRIMA-1 and Ro 48-8071, a known potent OSC inhibitor, significantly reduce the viability of BT-474 breast cancer cells relative to normal mammary cells. In addition, like PRIMA-1, we find that Ro 48-8071 results in increased binding of mutant p53 to DNA in BT-474 cells (which highly express p53). For the first time, Ro 48-8071 is shown to be a potent agent in killing human breast cancer cells. The potential of OSC as a new target for developing anticancer therapies is worth further investigation.

Keywords: inverse docking, in silico screening, protein-ligand interactions, molecular docking

Procedia PDF Downloads 448
37619 Evaluation of Stone Column Behavior Strengthened Circular Raft Footing under Static Load

Authors: R. Ziaie Moayed, B. Mohammadi-Haji

Abstract:

Stone columns have been widely employed to improve the load-settlement characteristics of soft soils. The results of two small-scale displacement-controlled loading tests on stone columns were used to validate numerical finite element simulations. Additionally, a series of numerical calculations of static loading was performed on a strengthened raft footing to investigate the effects of using stone columns on the bearing capacity of footings. The bearing capacity of single and grouped stone columns under static loading was compared with that of the unimproved ground.

Keywords: circular raft footing, numerical analysis, validation, vertically encased stone column

Procedia PDF Downloads 290
37618 An Expert System Designed to Be Used with MOEAs for Efficient Portfolio Selection

Authors: Kostas Metaxiotis, Kostas Liagkouras

Abstract:

This study presents an Expert System specially designed to be used with Multiobjective Evolutionary Algorithms (MOEAs) for the solution of the portfolio selection problem. The validation of the proposed hybrid system is done by using data sets from the Hang Seng 31 in Hong Kong, the DAX 100 in Germany and the FTSE 100 in the UK. The performance of the proposed system is assessed in comparison with the Non-dominated Sorting Genetic Algorithm II (NSGA-II). The evaluation of the performance is based on different performance metrics that evaluate both the proximity of the solutions to the Pareto front and their dispersion on it. The results show that the proposed hybrid system is efficient for the solution of this kind of problem.
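The notions of proximity to and dispersion on the Pareto front rest on non-dominance. A minimal sketch of extracting the non-dominated set for the two classic portfolio objectives (minimize risk, maximize return), using hypothetical portfolios rather than the study's data:

```python
# Each portfolio is a (risk, return) pair; the candidates below are hypothetical.

def dominates(a, b):
    """a dominates b if a is no worse in both objectives and strictly better in one."""
    no_worse = a[0] <= b[0] and a[1] >= b[1]
    strictly = a[0] < b[0] or a[1] > b[1]
    return no_worse and strictly

def pareto_front(portfolios):
    """Keep only portfolios not dominated by any other candidate."""
    return [p for p in portfolios
            if not any(dominates(q, p) for q in portfolios if q is not p)]

portfolios = [(0.10, 0.06), (0.12, 0.09), (0.15, 0.08), (0.20, 0.12)]
print(pareto_front(portfolios))  # (0.15, 0.08) is dominated by (0.12, 0.09)
```

MOEAs such as NSGA-II apply this non-dominance test repeatedly while also rewarding dispersion along the resulting front.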

Keywords: expert systems, multi-objective optimization, evolutionary algorithms, portfolio selection

Procedia PDF Downloads 440
37617 Basic One-Dimensional Modelica®-Model for Simulation of Gas-Phase Adsorber Dynamics

Authors: Adrian Rettig, Silvan Schneider, Reto Tamburini, Mirko Kleingries, Ulf Christian Muller

Abstract:

Industrial adsorption processes are, mainly due to simultaneous heat and mass transfer, characterized by a high level of complexity. The conception of such processes often does not take place systematically; instead, scale-up/down or number-up/down methods based on existing systems are used. This paper shows how Modelica® can be used to develop a transient model enabling a more systematic design of such ad- and desorption components and processes. The core of this model is a lumped-element submodel of a single adsorbent grain, where the thermodynamic equilibria and the kinetics of the ad- and desorption processes are implemented and solved on the basis of mass, momentum and energy balances. For validation of this submodel, a fixed bed adsorber, whose characteristics are described in detail in the literature, was modeled and simulated. The simulation results are in good agreement with the experimental results from the literature. Therefore, the model development will be continued, and the extended model will be applied to further adsorber types such as rotor adsorbers and moving bed adsorbers.

Keywords: adsorption, desorption, linear driving force, dynamic model, Modelica®, integral equation approach

Procedia PDF Downloads 371
37616 Predicting Indonesia External Debt Crisis: An Artificial Neural Network Approach

Authors: Riznaldi Akbar

Abstract:

In this study, we compared the performance of an Artificial Neural Network (ANN) model with a back-propagation algorithm in correctly predicting in-sample and out-of-sample external debt crises in Indonesia. We found that the exchange rate, foreign reserves, and exports are the major determinants of experiencing an external debt crisis. The ANN in-sample performance provides relatively superior results: the model is able to correctly classify 89.12 per cent of crises with reasonably low false alarms of 7.01 per cent. Out-of-sample, the prediction performance deteriorates considerably compared to the in-sample performance. This can be explained by the ANN model's tendency to over-fit the in-sample data, so that it could not fit the out-of-sample data very well. Ten-fold cross-validation was used to improve the out-of-sample prediction accuracy. The results also offer policy implications. The out-of-sample performance can be very sensitive to the size of the samples, as it could yield a higher total misclassification error and lower prediction accuracy. The ANN model can be used to identify past crisis episodes with some accuracy, but predicting crises outside the estimation sample is much more challenging because of the presence of uncertainty.
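The 10-fold cross-validation mentioned above can be sketched as a generic index-splitting routine (not the authors' implementation): the sample is divided into 10 folds, and each fold serves once as the hold-out set.

```python
# Generic k-fold index generator: every observation appears in exactly one
# test fold, and fold sizes differ by at most one.

def k_fold_indices(n_obs, k=10):
    """Yield (train_indices, test_indices) for each of k folds."""
    fold_sizes = [n_obs // k + (1 if i < n_obs % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        test = list(range(start, start + size))
        train = list(range(0, start)) + list(range(start + size, n_obs))
        yield train, test
        start += size

folds = list(k_fold_indices(100, k=10))
print(len(folds), len(folds[0][1]))  # 10 folds, 10 test observations each
```

Averaging the misclassification error over the 10 hold-out folds gives a less sample-dependent estimate of out-of-sample accuracy.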

Keywords: debt crisis, external debt, artificial neural network, ANN

Procedia PDF Downloads 445
37615 Comparative Study of Ad Hoc Routing Protocols in Vehicular Ad-Hoc Networks for Smart City

Authors: Khadija Raissi, Bechir Ben Gouissem

Abstract:

In this paper, we perform an investigation of some routing protocols in the Vehicular Ad-Hoc Network (VANET) context. Specifically, we study the efficiency of protocols like Dynamic Source Routing (DSR), Ad hoc On-demand Distance Vector Routing (AODV), Destination Sequenced Distance Vector (DSDV), Optimized Link State Routing (OLSR) and the Vehicular Multi-hop Algorithm for Stable Clustering (VMASC) in terms of packet delivery ratio (PDR) and throughput. The performance evaluation and comparison between the studied protocols show that VMASC is the best protocol regarding fast data transmission and link stability in VANETs. All results are validated using the NS3 simulator.
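The two comparison metrics can be sketched as simple ratios over simulation counters (the values below are illustrative, not actual NS3 output):

```python
# Packet delivery ratio (PDR) and throughput from hypothetical simulation counters.

def pdr(packets_received, packets_sent):
    """Fraction of sent packets that reached their destination."""
    return packets_received / packets_sent

def throughput_kbps(bytes_received, sim_time_s):
    """Received payload converted to kilobits per second."""
    return bytes_received * 8 / 1000 / sim_time_s

print(f"PDR = {pdr(950, 1000):.2%}, throughput = {throughput_kbps(1_250_000, 100):.1f} kbps")
```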

Keywords: VANET, smart city, AODV, OLSR, DSR, VMASC, routing protocols, NS3

Procedia PDF Downloads 298
37614 Systematic Review of Quantitative Risk Assessment Tools and Their Effect on Racial Disproportionality in Child Welfare Systems

Authors: Bronwen Wade

Abstract:

Over the last half-century, child welfare systems have increasingly relied on quantitative risk assessment tools, such as actuarial or predictive risk tools. These tools are developed by performing statistical analysis of how attributes captured in administrative data are related to future child maltreatment. Some scholars argue that attributes in administrative data can serve as proxies for race and that quantitative risk assessment tools reify racial bias in decision-making. Others argue that these tools provide more “objective” and “scientific” guides for decision-making instead of subjective social worker judgment. This study performs a systematic review of the literature on the impact of quantitative risk assessment tools on racial disproportionality; it examines methodological biases in work on this topic, summarizes key findings, and provides suggestions for further work. A search of CINAHL, PsycINFO, the ProQuest Social Science Premium Collection, and the ProQuest Dissertations and Theses Collection was performed. Academic and grey literature were included. The review includes studies that use quasi-experimental methods and development, validation, or re-validation studies of quantitative risk assessment tools. PROBAST (Prediction model Risk of Bias Assessment Tool) and CHARMS (CHecklist for critical Appraisal and data extraction for systematic Reviews of prediction Modelling Studies) were used to assess the risk of bias and guide data extraction for risk development, validation, or re-validation studies. ROBINS-I (Risk of Bias in Non-Randomized Studies of Interventions) was used to assess bias and guide data extraction for the quasi-experimental studies identified. Due to heterogeneity among papers, a meta-analysis was not feasible, and a narrative synthesis was conducted. Eleven papers met the eligibility criteria, and each has an overall high risk of bias based on the PROBAST and ROBINS-I assessments.
This is deeply concerning, as major policy decisions have been made based on a limited number of studies with a high risk of bias. The findings on racial disproportionality have been mixed and depend on the tool and approach used. Authors use various definitions for racial equity, fairness, or disproportionality. These concepts of statistical fairness are connected to theories about the reason for racial disproportionality in child welfare or social definitions of fairness that are usually not stated explicitly. Most findings from these studies are unreliable, given the high degree of bias. However, some of the less biased measures within studies suggest that quantitative risk assessment tools may worsen racial disproportionality, depending on how disproportionality is mathematically defined. Authors vary widely in their approach to defining and addressing racial disproportionality within studies, making it difficult to generalize findings or approaches across studies. This review demonstrates the power of authors to shape policy or discourse around racial justice based on their choice of statistical methods; it also demonstrates the need for improved rigor and transparency in studies of quantitative risk assessment tools. Finally, this review raises concerns about the impact that these tools have on child welfare systems and racial disproportionality.

Keywords: actuarial risk, child welfare, predictive risk, racial disproportionality

Procedia PDF Downloads 54
37613 Validation of the Arabic Version of the Positive and Negative Syndrome Scale (PANSS)

Authors: Arij Yehya, Suhaila Ghuloum, Abdlmoneim Abdulhakam, Azza Al-Mujalli, Mark Opler, Samer Hammoudeh, Yahya Hani, Sundus Mari, Reem Elsherbiny, Ziyad Mahfoud, Hassen Al-Amin

Abstract:

Introduction: The Positive and Negative Syndrome Scale (PANSS) is a valid instrument developed by Kay and colleagues to assess symptoms of patients with schizophrenia. It consists of 30 items that factor the symptoms into three subscales: positive, negative and general psychopathology. This scale has been translated and validated in several languages. Objective: This study aims to determine the validity and psychometric properties of the Arabic version of the PANSS. Methods: A standardized translation and cultural adaptation method was adopted. Patients diagnosed with schizophrenia (n=98), according to a psychiatrist’s diagnosis based on DSM-IV criteria, were recruited from the Psychiatry Department at Rumailah Hospital, Qatar. A first rater confirmed the diagnosis using the Arabic version of the Mini International Neuropsychiatric Interview (MINI 6). A second, independent rater administered the Arabic version of the PANSS. Also, a control group (n=101) with no history of psychiatric disorder was recruited from the family and friends of the patients and from primary health care centers in Qatar. Results: There were more males than females in our sample of patients with schizophrenia (68.9% and 31.6%, respectively), whereas in the control group the number of females outweighed that of males (58.4% and 41.6%, respectively). The scale had good internal consistency, with a Cronbach’s alpha of 0.91. There was a significant difference between the scores on the three subscales of the PANSS. Patients with schizophrenia scored significantly higher (p<.0001) than the control subjects on the subscales for positive symptoms (20.01, SD=7.21 vs. 7.30, SD=1.38), negative symptoms (18.89, SD=8.88 vs. 7.37, SD=2.38) and general psychopathology (34.41, SD=11.56 vs. 16.93, SD=3.93). Factor analysis and ROC curve analysis were carried out to further test the psychometrics of the scale.
Conclusions: The Arabic version of the PANSS is a reliable and valid tool to assess both positive and negative symptoms of patients with schizophrenia in a balanced manner. In addition to providing the Arab population with a standardized tool to monitor symptoms of schizophrenia, this version provides a gateway to compare the prevalence of positive and negative symptoms in the Arab world with that reported elsewhere.
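The Cronbach's alpha reported above (0.91) is the standard internal-consistency statistic; a minimal sketch, with hypothetical ratings rather than the study's data:

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).
# items: one score list per item, all rated on the same subjects (hypothetical here).

def cronbach_alpha(items):
    k = len(items)
    n = len(items[0])
    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    item_vars = sum(var(item) for item in items)
    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - item_vars / var(totals))

items = [[3, 4, 2, 5], [3, 5, 2, 4], [2, 4, 3, 5]]  # 3 items x 4 subjects
print(round(cronbach_alpha(items), 2))
```

Values approaching 1 indicate that the items vary together across subjects, i.e. they measure a common construct.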

Keywords: Arabic version, assessment, diagnosis, schizophrenia, validation

Procedia PDF Downloads 635
37612 Stock Prediction and Portfolio Optimization Thesis

Authors: Deniz Peksen

Abstract:

This thesis aims to predict the trend movement of stock closing prices and to maximize a portfolio by utilizing the predictions. In this context, the study aims to define a stock portfolio strategy from models created using Logistic Regression, Gradient Boosting and Random Forest. Recently, predicting the trend of stock prices has gained a significant role in making buy and sell decisions and generating returns with investment strategies formed by machine-learning-based decisions. There are plenty of studies in the literature on the prediction of stock prices in capital markets using machine learning methods, but most of them focus on closing prices instead of the direction of the price trend. Our study differs from the literature in terms of target definition: ours is a classification problem focusing on the market trend in the next 20 trading days. To predict the trend direction, fourteen years of data were used for training, the following three years for validation, and the last three years for testing. Training data are between 2002-06-18 and 2016-12-30; validation data are between 2017-01-02 and 2019-12-31; testing data are between 2020-01-02 and 2022-03-17. We determined the Hold Stock Portfolio, the Best Stock Portfolio and the USD-TRY exchange rate as benchmarks to outperform, and compared our machine-learning-based portfolio return on the test data with the returns of these benchmarks. We assessed model performance with the help of the ROC-AUC score and lift charts. We used Logistic Regression, Gradient Boosting and Random Forest with a grid search approach to fine-tune hyper-parameters. As a result of the empirical study, the existence of an uptrend or downtrend in the five stocks could not be predicted by the models. When these predictions were used to define buy and sell decisions for a model-based portfolio, that portfolio failed on the test dataset.
It was found that model-based buy and sell decisions generated a stock portfolio strategy whose returns cannot outperform non-model portfolio strategies on the test dataset. We found that any effort to predict a trend formulated on the stock price is a challenge. Our results are consistent with the Random Walk Theory, which claims that stock prices and price changes are unpredictable. Although we built several good models on the validation dataset, our model iterations failed on the test dataset. We implemented Random Forest, Gradient Boosting and Logistic Regression, and discovered that the complex models provided no advantage or additional performance compared with Logistic Regression; more complexity did not lead to better performance, so using a complex model is not the answer to the stock prediction problem. Our approach was to predict the trend instead of the price, which converted the problem into classification. However, this labeling approach did not solve the stock prediction problem, nor did it refute the Random Walk Theory for stock prices.
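The chronological split described above can be sketched as a date-based filter (rows below are hypothetical; splitting by date rather than shuffling avoids look-ahead leakage in trend prediction):

```python
# Chronological train/validation/test split using the thesis's date boundaries.
from datetime import date

def split_by_date(rows):
    """rows: list of (date, features, label); returns (train, valid, test)."""
    train = [r for r in rows if date(2002, 6, 18) <= r[0] <= date(2016, 12, 30)]
    valid = [r for r in rows if date(2017, 1, 2) <= r[0] <= date(2019, 12, 31)]
    test = [r for r in rows if date(2020, 1, 2) <= r[0] <= date(2022, 3, 17)]
    return train, valid, test

rows = [(date(2010, 5, 1), None, 1), (date(2018, 3, 1), None, 0),
        (date(2021, 7, 1), None, 1)]
train, valid, test = split_by_date(rows)
print(len(train), len(valid), len(test))  # 1 1 1
```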

Keywords: stock prediction, portfolio optimization, data science, machine learning

Procedia PDF Downloads 81
37611 Experimental Investigation, Analysis and Optimization of Performance and Emission Characteristics of Composite Oil Methyl Esters at 160 bar, 180 bar and 200 bar Injection Pressures by Multifunctional Criteria Technique

Authors: Yogish Huchaiah, Chandrashekara Krishnappa

Abstract:

This study considers the optimization and validation of experimental results using the Multi-Functional Criteria Technique (MFCT), which is concerned with structuring and solving decision and planning problems involving multiple variables. Biodiesel was produced from Composite Oil Methyl Esters (COME) of Jatropha and Pongamia oils mixed in various proportions, using a two-step transesterification process; the biodiesel thus obtained was tested for various physico-chemical properties, which were ascertained to be within the limits proposed by ASTM. The methyl esters were blended with petrodiesel in various proportions and coded. These blends were used as fuels in a computerized CI DI engine to investigate performance and emission characteristics. From the analysis of the results, it was found that the 180MEM4B20 blend had the maximum performance and minimum emissions. To validate the experimental results, MFCT was used. Characteristics such as Fuel Consumption (FC), Brake Power (BP), Brake Specific Fuel Consumption (BSFC), Brake Thermal Efficiency (BTE), carbon dioxide (CO2), carbon monoxide (CO), hydrocarbons (HC) and nitrogen oxides (NOx) were considered as dependent variables. The application of this method showed that the optimized combination of Injection Pressure (IP), mix and blend is 178MEM4.2B24. The overall variation between the optimization and experimental results was found to be 7.45%.

Keywords: COME, IP, MFCT, optimization, PI, PN, PV

Procedia PDF Downloads 211
37610 Spatial Climate Changes in the Province of Macerata, Central Italy, Analyzed by GIS Software

Authors: Matteo Gentilucci, Marco Materazzi, Gilberto Pambianchi

Abstract:

Climate change is an increasingly central issue in the world because it affects many human activities. In this context, regional studies are of great importance because they sometimes differ from the general trend. This research focuses on a small area of central Italy overlooking the Adriatic Sea, the province of Macerata. The aim is to analyze space-based climate changes in precipitation and temperature over the last three climatological standard normals (1961-1990; 1971-2000; 1981-2010) using GIS software. The data collected from 30 weather stations for temperature and 61 rain gauges for precipitation were subjected to quality controls: validation and homogenization. These data were fundamental for the spatialization of the variables (temperature and precipitation) through geostatistical techniques. Cross-validation results were used to assess the best geostatistical technique for interpolation. Among the methods analysed, co-kriging with altitude as the independent variable produced the best cross-validation results for all time periods, with 'root mean square error standardized' close to 1, 'mean standardized error' close to 0, and 'average standard error' and 'root mean square error' with similar values. The maps resulting from the analysis were compared by subtraction between rasters, producing 3 maps of annual variation and three further maps for each month of the year (1961/1990-1971/2000; 1971/2000-1981/2010; 1961/1990-1981/2010). The results show an increase in average annual temperature of about 0.1°C between 1961-1990 and 1971-2000 and 0.6°C between 1961-1990 and 1981-2010. Annual precipitation shows an opposite trend, with an average difference from 1961-1990 to 1971-2000 of about 35 mm and from 1961-1990 to 1981-2010 of about 60 mm. Furthermore, the differences between the areas were highlighted with area graphs and summarized in several tables as descriptive analysis.
For temperature between 1961-1990 and 1971-2000, the most areally represented frequency is 0.08°C (77.04 km² of a total of about 2800 km²), with a kurtosis of 3.95 and a skewness of 2.19. The differences for temperature from 1961-1990 to 1981-2010 show a most areally represented frequency of 0.83°C (36.9 km²), with a kurtosis of -0.45 and a skewness of 0.92. It can therefore be said that the distribution is more peaked for 1961/1990-1971/2000, and smoother but with stronger growth for 1961/1990-1981/2010. In contrast, precipitation shows a very similar shape of distribution, although with different intensities, for both variation periods (1961/1990-1971/2000 and 1961/1990-1981/2010), with similar values of kurtosis (1st = 1.93; 2nd = 1.34), skewness (1st = 1.81; 2nd = 1.62) and area of the most represented frequency (1st = 60.72 km²; 2nd = 52.80 km²). In conclusion, this methodology allows the assessment of small-scale climate change for each month of the year and could be further investigated in relation to regional atmospheric dynamics.
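The cross-validation statistics used above to compare interpolation methods can be sketched as follows (data are illustrative; an unbiased interpolator with well-estimated uncertainty gives a mean standardized error near 0 and a standardized RMSE near 1):

```python
# Leave-one-out style cross-validation statistics for an interpolator:
# each station value is predicted from the others; pred_std is the
# interpolator's own standard-error estimate at that station.
import math

def cv_stats(observed, predicted, pred_std):
    n = len(observed)
    errors = [p - o for o, p in zip(observed, predicted)]
    std_errors = [e / s for e, s in zip(errors, pred_std)]
    return {
        "mean standardized error": sum(std_errors) / n,
        "root mean square error": math.sqrt(sum(e * e for e in errors) / n),
        "rmse standardized": math.sqrt(sum(z * z for z in std_errors) / n),
    }

stats = cv_stats(observed=[10.0, 12.0, 11.0], predicted=[10.2, 11.8, 11.1],
                 pred_std=[0.2, 0.2, 0.1])
print(stats)
```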

Keywords: climate change, GIS, interpolation, co-kriging

Procedia PDF Downloads 128
37609 Engineering Method to Measure the Impact Sound Improvement with Floor Coverings

Authors: Katarzyna Baruch, Agata Szelag, Jaroslaw Rubacha, Bartlomiej Chojnacki, Tadeusz Kamisinski

Abstract:

The methodology used to measure the reduction of transmitted impact sound by floor coverings on a massive floor is described in ISO 10140-3:2010. To carry out such tests, a standardised reverberation room separated from a second measuring room by a standard floor is required. The need for a special laboratory results in high cost and low accessibility of this measurement. The authors propose their own engineering method to measure the impact sound improvement with floor coverings, which requires neither standard rooms nor a standard floor. This paper describes the measurement procedure of the proposed engineering method. Furthermore, verification tests were performed. Validation of the proposed method was based on an analytical model, a Statistical Energy Analysis (SEA) model and empirical measurements. The results received were compared with the corresponding ones obtained from ISO 10140-3:2010 measurements. The study confirmed the usefulness of the engineering method.

Keywords: building acoustic, impact noise, impact sound insulation, impact sound transmission, reduction of impact sound

Procedia PDF Downloads 324
37608 Multiscale Model of Blast Explosion Human Injury Biomechanics

Authors: Raj K. Gupta, X. Gary Tan, Andrzej Przekwas

Abstract:

Bomb blasts from Improvised Explosive Devices (IEDs) account for the vast majority of terrorist attacks worldwide. Injuries caused by IEDs result from a combination of the primary blast wave, penetrating fragments, and human body accelerations and impacts. This paper presents a multiscale computational model of coupled blast physics, whole human body biodynamics and injury biomechanics of sensitive organs. The disparity of the involved space- and time-scales is used to conduct sequential modeling of an IED explosion event, CFD simulation of blast loads on the human body and FEM modeling of body biodynamics and injury biomechanics. The paper presents simulation results for blast-induced brain injury, coupling macro-scale brain biomechanics and the micro-scale response of sensitive neuro-axonal structures. Validation results on animal models and physical surrogates are discussed. Results of our model can be used to 'replicate' field blast loadings in laboratory-controlled experiments using animal models and in vitro neuro-cultures.

Keywords: blast waves, improvised explosive devices, injury biomechanics, mathematical models, traumatic brain injury

Procedia PDF Downloads 249
37607 Developing a Model of Teaching Writing Based On Reading Approach through Reflection Strategy for EFL Students of STKIP YPUP

Authors: Eny Syatriana, Ardiansyah

Abstract:

The purpose of the present study was to develop a learning model for writing based on reading texts, using a reflection strategy. The strategy allows the students to read a text, write back its main idea, and then develop the text using their own sentences. Writing practice thus begins with reading an interesting text, which the students then develop into their own writing. The research questions are: (1) What kind of learning model can develop the students' writing ability? (2) What is the achievement of the students of STKIP YPUP through the reflection strategy? (3) Is the use of the strategy effective in developing students' competence in writing? (4) To what level are the students interested in the use of the strategy in the writing subject? This development research consisted of several steps: (1) need analysis, (2) model design, (3) implementation, and (4) model evaluation. The need analysis was carried out through discussion among the writing lecturers to create a learning model for the writing subject. To see the effectiveness of the model, an experiment was delivered to one class. The instruments and learning materials were validated by experts. At every step of material development there was a learning process, which was validated by an expert. The research used a design-and-development approach: the researcher performed a need analysis, created a prototype, conducted content validation, and ran a limited empirical experiment with the sample. At each step, the drafts were assessed and revised before continuing to the next step. In the second year, the prototype was tested empirically in four classes of the English department at STKIP YPUP. The test was implemented through action research, followed by evaluation and validation by the experts.

Keywords: learning model, reflection, strategy, reading, writing, development

Procedia PDF Downloads 365
37606 An Online Priority-Configuration Algorithm for Obstacle Avoidance of the Unmanned Air Vehicles Swarm

Authors: Lihua Zhu, Jianfeng Du, Yu Wang, Zhiqiang Wu

Abstract:

Collision avoidance problems of a swarm of unmanned air vehicles (UAVs) flying in an obstacle-laden environment are investigated in this paper. Given that the UAV swarm needs to adapt to the obstacle distribution in dynamic operation, a priority configuration is designed to guide the UAVs through the obstacles in turn. Based on the collision cone approach and the prediction of the collision time, a collision evaluation model is established to judge the urgency of the imminent collision of each UAV; the evaluation result is used to assign each UAV's priority, which in turn instructs them to pass through the obstacles in descending order of urgency. Finally, the simulation results provide promising validation of the efficiency and scalability of the proposed approach.
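The collision-time prediction underlying the priority assignment can be sketched for the constant-relative-velocity case (a generic closest-approach computation, not the authors' evaluation model): minimizing |p_rel + v_rel·t| over t gives the predicted time of closest approach, and smaller times indicate more urgent conflicts.

```python
# Time of closest approach for two agents with constant relative velocity.
# p_rel, v_rel: 2D relative position/velocity; the numbers below are illustrative.

def time_of_closest_approach(p_rel, v_rel):
    """Return t >= 0 minimizing |p_rel + v_rel * t| (0 if already receding)."""
    vv = v_rel[0] ** 2 + v_rel[1] ** 2
    if vv == 0.0:
        return 0.0  # no relative motion
    t = -(p_rel[0] * v_rel[0] + p_rel[1] * v_rel[1]) / vv
    return max(t, 0.0)

# A UAV 100 m from an obstacle along x, closing at 20 m/s:
print(time_of_closest_approach((100.0, 0.0), (-20.0, 0.0)))  # 5.0
```

Ranking the UAVs by this predicted time (ascending) yields a priority order in which the most urgent vehicle maneuvers first.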

Keywords: UAV swarm, collision avoidance, complex environment, online priority design

Procedia PDF Downloads 215
37605 Stabilization of the Bernoulli-Euler Plate Equation: Numerical Analysis

Authors: Carla E. O. de Moraes, Gladson O. Antunes, Mauro A. Rincon

Abstract:

The aim of this paper is to study the internal stabilization of the Bernoulli-Euler plate equation numerically. For this, we consider a square plate subjected to a feedback/damping force distributed only in a subdomain. An algorithm for obtaining an approximate solution to this problem was proposed and implemented. The numerical method used was the Finite Difference Method. Numerical simulations were performed and showed the behavior of the solution, confirming the theoretical results that have already been proved in the literature. In addition, we studied the validation of the proposed numerical scheme, followed by an analysis of the numerical error and a study of the decay of the associated energy.

Keywords: Bernoulli-Euler plate equation, numerical simulations, stability, energy decay, finite difference method

Procedia PDF Downloads 416
37604 Numerical Simulation and Laboratory Tests for Rebar Detection in Reinforced Concrete Structures using Ground Penetrating Radar

Authors: Maha Al-Soudani, Gilles Klysz, Jean-Paul Balayssac

Abstract:

The aim of this paper is to use Ground Penetrating Radar (GPR) as a non-destructive testing (NDT) method and to increase its accuracy in recognizing the geometry of reinforced concrete structures, in particular the position of steel bars. This information will help managers assess the state of their structures, on the one hand with respect to safety constraints and, on the other, to quantify the need for maintenance and repair. Several configurations for acquiring and processing the simulated signal were tested in order to propose and develop an appropriate imaging algorithm for locating the rebar accurately in the propagation medium. A subsequent experimental validation tested the imaging algorithm on real reinforced concrete structures. The results indicate that this algorithm is capable of estimating the reinforcing steel bar position to within 0-1 mm.

Keywords: GPR, NDT, reinforced concrete structures, rebar location

Procedia PDF Downloads 504
37603 COVID-19 Detection from Computed Tomography Images Using UNet Segmentation, Region Extraction, and Classification Pipeline

Authors: Kenan Morani, Esra Kaya Ayana

Abstract:

This study aimed to develop a novel pipeline for COVID-19 detection using a large and rigorously annotated database of computed tomography (CT) images. The pipeline consists of UNet-based segmentation, lung extraction, and a classification part, with optional slice removal techniques following the segmentation part. In this work, batch normalization was added to the original UNet model to produce a lighter model with better localization, which was then used to build a full pipeline for COVID-19 diagnosis. To evaluate the effectiveness of the proposed pipeline, various segmentation methods were compared in terms of their performance and complexity. The proposed segmentation method with batch normalization outperformed traditional methods and other alternatives, resulting in a higher Dice score on a publicly available dataset. Moreover, at the slice level, the proposed pipeline demonstrated high validation accuracy, indicating the efficiency of prediction on 2D slices. At the patient level, the full approach exhibited higher validation accuracy and macro F1 score than other alternatives, surpassing the baseline. The classification component of the proposed pipeline uses a convolutional neural network (CNN) to make the final diagnosis decisions. The COV19-CT-DB dataset, which contains a large number of CT scans with various types of slices rigorously annotated for COVID-19 detection, was used for classification. The proposed pipeline outperformed many other alternatives on this dataset.

Keywords: classification, computed tomography, lung extraction, macro F1 score, UNet segmentation

Procedia PDF Downloads 133
37602 Strategic Management Model for High Performance Sports Centers

Authors: Jose Ramon Sanabria Navarro, Yahilina Silveira Perez, Valentin Molina Moreno, Digna Dionisia Perez Bravo

Abstract:

The general objective of this research is to conceive a strategic management model for Latin American high-performance sports centers in order to improve their results. The sample comprises 62 managers, 187 trainers, 2,930 athletes, and 62 expert researchers from centers in Cuba, Venezuela, Ecuador, Colombia, and Argentina, for a total of 3,241 participants. The measurement instrument includes 12 key variables in the strategic management process, which are consolidated through factorial analysis and one-way ANOVA in SPSS 24.0. The reliability of the scale obtained an alpha higher than 0.7 in each sample. On this basis, a model is obtained that addresses the deficiencies detected in the diagnosis, based on the needs of the members of these organizations and drawing on criteria and theories of strategic management for the improvement of organizational results. The validation of the model for high-performance sports centers in the countries analyzed aims to develop joint strategies that generate synergies in their mode of operation, which in turn strengthens the sports organization.

Keywords: sports organization, information management, decision making, control

Procedia PDF Downloads 132
37601 Classifying Students for E-Learning in Information Technology Course Using ANN

Authors: Sirilak Areerachakul, Nat Ployong, Supayothin Na Songkla

Abstract:

The objective of this research is to select the most accurate model, using an artificial neural network technique, for screening prospective students who enroll in the e-learning Information Technology course at Suan Sunandha Rajabhat University. It is designed to help students select appropriate courses by themselves. The results showed that the most accurate model was the one evaluated with 100-fold cross-validation, which achieved an accuracy of 73.58%.
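To make the evaluation procedure concrete, here is a minimal, hedged sketch of k-fold cross-validation on synthetic two-class data. It uses a simple nearest-centroid classifier as a stand-in for the paper's neural network, and the data, fold count, and all parameters are invented for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class data standing in for student-enrolment features.
X = np.vstack([rng.normal(0.0, 1.0, (50, 4)), rng.normal(2.0, 1.0, (50, 4))])
y = np.array([0] * 50 + [1] * 50)

def nearest_centroid_accuracy(X_tr, y_tr, X_te, y_te):
    """Train a nearest-centroid classifier and score it on held-out data."""
    c0 = X_tr[y_tr == 0].mean(axis=0)
    c1 = X_tr[y_tr == 1].mean(axis=0)
    pred = (np.linalg.norm(X_te - c1, axis=1)
            < np.linalg.norm(X_te - c0, axis=1)).astype(int)
    return (pred == y_te).mean()

def k_fold_accuracy(X, y, k=10, seed=0):
    """Shuffle, split into k folds, hold each fold out once, average accuracy."""
    idx = np.random.default_rng(seed).permutation(len(X))
    folds = np.array_split(idx, k)
    accs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        accs.append(nearest_centroid_accuracy(X[train], y[train], X[test], y[test]))
    return float(np.mean(accs))

acc = k_fold_accuracy(X, y, k=10)
```

The same averaging over held-out folds underlies the "100-fold cross-validation" figure in the abstract, only with their neural network and student data in place of this toy setup.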

Keywords: artificial neural network, classification, students, e-learning

Procedia PDF Downloads 427
37600 Flow Visualization in Biological Complex Geometries for Personalized Medicine

Authors: Carlos Escobar-del Pozo, César Ahumada-Monroy, Azael García-Rebolledo, Alberto Brambila-Solórzano, Gregorio Martínez-Sánchez, Luis Ortiz-Rincón

Abstract:

Numerical simulations of flow in complex biological structures have gained considerable attention in recent years. However, the major issue is the validation of the results. The present work shows a Particle Image Velocimetry (PIV) flow visualization technique in complex biological structures, particularly in intracranial aneurysms. A methodology to reconstruct and produce a transparent model has been developed, together with visualization and particle tracking techniques. The resulting transparent models allow the flow patterns to be visualized with a regular camera using these techniques. The final goal is to use visualization as a tool to provide more information for treatment and surgery decisions in aneurysms.

Keywords: aneurysms, PIV, flow visualization, particle tracking

Procedia PDF Downloads 92
37599 Assessment of the Correlation of Rice Yield Traits by Simulation and Modelling Methods

Authors: Davood Barari Tari

Abstract:

In order to investigate the correlation of rice traits under different nitrogen management methods using model programming, an experiment was laid out in a rice paddy at an experimental field in the Caspian coastal region from 2013 to 2014. The variety used was Shiroudi, a high-yielding variety. Nitrogen was managed in two ways: the amount of nitrogen at four levels (30, 60, 90, and 120 kg N ha-1) plus a control, and nitrogen splitting at four levels (T1: 50% basal + 50% at the maximum tillering stage; T2: 33.33% basal + 33.33% at the maximum tillering stage + 33.33% at the panicle initiation stage; T3: 25% basal + 37.5% at the maximum tillering stage + 37.5% at the panicle initiation stage; T4: 25% basal + 25% at the maximum tillering stage + 50% at the panicle initiation stage). Results showed that nitrogen traits, total grain number, filled spikelets, and panicle number per m2 had a significant correlation with grain yield. The results of the calibration and validation of the rice model indicated that the correlation between rice yield and yield components was reproduced accurately. The correlation between panicle length and grain yield was the weakest. Physiological indices were simulated with low accuracy. According to the results, investigating the correlation between yield and the physiological, morphological, and phenological traits of rice by modelling and simulation methods is very useful.

Keywords: rice, physiology, modelling, simulation, yield traits

Procedia PDF Downloads 344
37598 Author Name Disambiguation for Biomedical Literature

Authors: Parthiban Srinivasan

Abstract:

PubMed provides online access to the National Library of Medicine database (MEDLINE) and other publications, which together contain close to 25 million scientific citations from 1865 to the present, comprising close to 80 million author name instances. For any work of literature, a fundamental issue is to identify the individual(s) who wrote it and, conversely, to identify all of the works that belong to a given individual. Due to the lack of universal standards for name information, name ambiguity has two aspects: name synonymy (a single author with multiple name representations) and name homonymy (multiple authors sharing the same name representation). In this talk, we present some results from our extensive work on author name disambiguation for PubMed citations. Information will be presented on the effectiveness and shortcomings of the different components of successful name disambiguation, such as parsing, validation, standardization, and normalization.

Keywords: disambiguation, normalization, parsing, PubMed

Procedia PDF Downloads 301
37597 Utilizing Extended Reality in Disaster Risk Reduction Education: A Scoping Review

Authors: Stefano Scippo, Damiana Luzzi, Stefano Cuomo, Maria Ranieri

Abstract:

Background: In response to the rise in natural disasters linked to climate change, numerous studies on Disaster Risk Reduction Education (DRRE) have emerged since the '90s, mainly using a didactic transmission-based approach. Effective DRRE should align with an interactive, experiential, and participatory educational model, which can be costly and risky. A potential solution is using simulations facilitated by eXtended Reality (XR). Research Question: This study aims to conduct a scoping review to explore educational methodologies that use XR to enhance knowledge among teachers, students, and citizens about environmental risks, natural disasters (including climate-related ones), and their management. Method: A search string of 66 keywords was formulated, spanning three domains: 1) education and target audience, 2) environment and natural hazards, and 3) technologies. On June 21st, 2023, the search string was used across five databases: EBSCOhost, IEEE Xplore, PubMed, Scopus, and Web of Science. After deduplication and removing papers without abstracts, 2,152 abstracts (published between 2013 and 2023) were analyzed and 2,062 papers were excluded, followed by the exclusion of 56 papers after full-text scrutiny. Excluded studies focused on unrelated technologies, non-environmental risks, and lacked educational outcomes or accessible texts. Main Results: The 34 reviewed papers were analyzed for context, risk type, research methodology, learning objectives, XR technology use, outcomes, and educational affordances of XR. Notably, since 2016, there has been a rise in scientific publications, focusing mainly on seismic events (12 studies) and floods (9), with a significant contribution from Asia (18 publications), particularly Japan (7 studies). Methodologically, the studies were categorized into empirical (26) and non-empirical (8). 
Empirical studies involved user or expert validation of XR tools, while non-empirical studies included systematic reviews and theoretical proposals without experimental validation. Empirical studies were further classified into quantitative, qualitative, or mixed-method approaches. Six qualitative studies involved small groups of users or experts, while 20 quantitative or mixed-method studies used seven different research designs, with most (17) employing a quasi-experimental, one-group post-test design, focusing on XR technology usability over educational effectiveness. Non-experimental studies had methodological limitations, making their results hypothetical and in need of further empirical validation. Educationally, the learning objectives centered on knowledge and skills for surviving natural disaster emergencies. All studies recommended XR technologies for simulations or serious games but did not develop comprehensive educational frameworks around these tools. XR-based tools showed potential superiority over traditional methods in teaching risk and emergency management skills. However, conclusions were more valid in studies with experimental designs; otherwise, they remained hypothetical without empirical evidence. The educational affordances of XR, mainly user engagement, were confirmed by the studies. Authors’ Conclusions: The analyzed literature lacks specific educational frameworks for XR in DRRE, focusing mainly on survival knowledge and skills. There is a need to expand educational approaches to include uncertainty education, developing competencies that encompass knowledge, skills, and attitudes like risk perception.

Keywords: disaster risk reduction education, educational technologies, scoping review, XR technologies

Procedia PDF Downloads 25
37596 Translation, Cross-Cultural Adaptation, and Validation of the Vividness of Movement Imagery Questionnaire 2 (VMIQ-2) to Classical Arabic Language

Authors: Majid Alenezi, Abdelbare Algamode, Amy Hayes, Gavin Lawrence, Nichola Callow

Abstract:

The purpose of this study was to translate and culturally adapt the Vividness of Movement Imagery Questionnaire-2 (VMIQ-2) from English to produce a new Arabic version (VMIQ-2A), and to evaluate the reliability and validity of the translated questionnaire. The questionnaire assesses how vividly and clearly individuals are able to imagine themselves performing everyday actions. Its purpose is to measure individuals’ ability to conduct movement imagery, which can be defined as “the cognitive rehearsal of a task in the absence of overt physical movement.” Movement imagery has been introduced in physiotherapy as a promising intervention technique, especially when physical exercise is not possible (e.g., pain or immobilisation). Considerable evidence indicates that movement imagery interventions improve physical function, but to maximize efficacy it is important to know the imagery abilities of the individuals being treated. Given the increase in the global sharing of knowledge, it is desirable to use standard measures of imagery ability across languages and cultures, which motivated this project. The translation procedure followed guidelines from the Translation and Cultural Adaptation group of the International Society for Pharmacoeconomics and Outcomes Research and involved the following phases: Preparation; the original VMIQ-2 was adapted slightly to provide additional information and simplified grammar. Forward translation; three native speakers resident in Saudi Arabia translated the original VMIQ-2 from English to Arabic, with instructions to preserve meaning (rather than translate literally) and cultural relevance. Reconciliation; the project manager (first author), the primary translator, and a physiotherapist reviewed the three independent translations to produce a reconciled first Arabic draft of the VMIQ-2A. Backward translation; a fourth translator (a native Arabic speaker fluent in English) translated the reconciled first Arabic draft literally back into English.
The project manager and two study authors compared the English back-translation to the original VMIQ-2 and produced the second Arabic draft. Cognitive debriefing; to assess participants’ understanding of the second Arabic draft, 7 native Arabic speakers resident in the UK completed the questionnaire, rated the clarity of the questions, identified difficult words or passages, and wrote their understanding of key terms in their own words. Following review of this feedback, a final Arabic version was created. 142 native Arabic speakers completed the questionnaire in community meeting places or at home; a subset of 44 participants completed the questionnaire a second time 1 week later. Results showed the translated questionnaire to be valid and reliable. Correlation coefficients indicated good test-retest reliability, and Cronbach’s alpha indicated high internal consistency. Construct validity was tested in two ways. First, imagery ability scores have been found to be invariant across gender; this result was replicated within the current study, assessed by an independent-samples t-test. Second, experienced sports participants have higher imagery ability than those less experienced; this result was also replicated within the current study, assessed by analysis of variance, supporting construct validity. The results provide preliminary evidence that the VMIQ-2A is reliable and valid for use with a general population of native Arabic speakers. Future research will include validation of the VMIQ-2A in a larger sample, and testing its validity in specific patient populations.
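The reliability statistics reported in this abstract (test-retest correlation and Cronbach's alpha) can be computed directly from item scores. The sketch below applies the standard formulas to simulated data rather than the study's actual responses; the sample size and item count merely echo the numbers in the abstract and carry no other meaning.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 44, 12  # hypothetical: 44 retest participants, 12 questionnaire items

# Simulated item scores: a shared latent imagery ability plus item-level noise,
# measured at two time points one week apart.
latent = rng.normal(0.0, 1.0, n)
items_t1 = latent[:, None] + rng.normal(0.0, 0.5, (n, k))
items_t2 = latent[:, None] + rng.normal(0.0, 0.5, (n, k))

def cronbach_alpha(items):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    k = items.shape[1]
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

alpha = cronbach_alpha(items_t1)  # internal consistency at time 1

# Test-retest reliability: correlation of total scores across the two sessions.
r = np.corrcoef(items_t1.sum(axis=1), items_t2.sum(axis=1))[0, 1]
```

With a strong shared latent component, both statistics come out high, which is the pattern the abstract reports for the VMIQ-2A.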

Keywords: motor imagery, physiotherapy, translation and validation, imagery ability

Procedia PDF Downloads 335
37595 Numerical Validation of Liquid Nitrogen Phase Change in a Star-Shaped Ambient Vaporizer

Authors: Yusuf Yilmaz, Gamze Gediz Ilis

Abstract:

Nitrogen, which has a boiling point of -195.8 oC at atmospheric pressure, is widely used in industry and has to be transported to the plant in liquid form. Ambient air vaporizers (AAV) are generally used for the vaporization of cryogenic liquids such as liquid nitrogen (LN2), liquid oxygen (LOX), liquefied natural gas (LNG), and liquid argon (LAR). An AAV is a group of star-shaped finned-tube vaporizers, and the design, in particular the shape of the fins, is one of the most important criteria for the performance of the vaporizer. In this study, the performance of an AAV working with liquid nitrogen was analyzed numerically for a star-shaped aluminum finned pipe. The numerical analysis was performed in order to determine the heat capacity of the vaporizer per meter of pipe length, so that the vaporizer capacity can be predicted for industrial applications. In order to validate the numerical solution, an experimental setup was constructed. The setup includes a liquid nitrogen tank at a pressure of 9 bar connected to the star-shaped aluminum finned-tube vaporizer. The inlet and outlet pressures and temperatures of the LN2 in the vaporizer were measured, and the mass flow rate of the LN2 was also recorded. The numerical solution was compared against these measured data, with the ambient conditions of the experiment imposed as boundary conditions on the numerical model. Surface tension and contact angle have a significant effect on the boiling of liquid nitrogen, so an average heat transfer coefficient including convective and nucleate boiling components has to be obtained for saturated flow boiling of liquid nitrogen in the finned tube. The Fluent CFD module was used for the simulation. The turbulent k-ε model was adopted for the liquid nitrogen flow, and the phase change was simulated using the evaporation-condensation approach implemented with user-defined functions (UDF).
The comparison of the numerical and experimental results is presented in this study, and the performance capacity of the star-shaped finned-pipe vaporizer is calculated. Based on this numerical analysis, the performance of the vaporizer per unit length can be predicted for industrial applications, and a suitable pipe length of the vaporizer can be determined for specific cases.

Keywords: liquid nitrogen, numerical modeling, two-phase flow, cryogenics

Procedia PDF Downloads 120
37594 The Validation of RadCalc for Clinical Use: An Independent Monitor Unit Verification Software

Authors: Junior Akunzi

Abstract:

In patient treatment planning quality assurance for 3D conformal radiotherapy (3D-CRT) and volumetric modulated arc therapy (VMAT or RapidArc), an independent monitor unit verification calculation (MUVC) is an indispensable part of the process. For 3D-CRT treatment planning, the MUVC can be performed manually by applying the standard ESTRO formalism. However, due to the complex shapes and the number of beams in advanced treatment techniques such as RapidArc, a manual independent MUVC is inadequate, and commercially available software such as RadCalc can be used instead. RadCalc (version 6.3, LifeLine Inc.) uses a simplified Clarkson algorithm to compute the dose contribution of individual RapidArc fields at the isocenter. The purpose of this project is the validation of RadCalc in 3D-CRT and RapidArc for treatment planning dosimetry quality assurance at the Centre Antoine Lacassagne (Nice, France). Firstly, the interfaces between RadCalc and our treatment planning systems (TPSs), Isogray (version 4.2) and Eclipse (version 13.6), were checked for data transfer accuracy. Secondly, we created test plans in both Isogray and Eclipse featuring open fields, wedged fields, and irregular MLC fields. These test plans were transferred from the TPSs to RadCalc and to the linac via Mosaiq (version 2.5), in accordance with the DICOM RT radiotherapy protocol. Measurements were performed in a water phantom using a PTW cylindrical semiflex ionisation chamber (0.3 cm³, type 31010) and compared with the TPS and RadCalc calculations. Finally, 30 3D-CRT plans and 40 RapidArc plans created from patient CT scans were recalculated using the CT scan of a solid PMMA water-equivalent phantom for 3D-CRT and the CT scan of the Octavius II phantom (PTW) for RapidArc.
Next, we measured the doses delivered to these phantoms for each plan with a 0.3 cm³ PTW 31010 cylindrical semiflex ionisation chamber (3D-CRT) and a 0.015 cm³ PTW PinPoint ionisation chamber (RapidArc). For our test plans, good agreement was found between calculation (RadCalc and TPSs) and measurement (mean: 1.3%; standard deviation: ± 0.8%). Regarding the patient plans, the measured doses were compared to the calculations in RadCalc and in our TPSs, and the RadCalc calculations were in turn compared to the Isogray and Eclipse ones. Agreement better than (2.8%; ± 1.2%) was found between RadCalc and the TPSs. As for the comparison between calculation and measurement, the agreement for all of our plans was better than (2.3%; ± 1.1%). The independent MU verification calculation software RadCalc has thus been validated for clinical use for both the 3D-CRT and RapidArc techniques. Perspectives for this project include the validation of RadCalc for the Tomotherapy machine installed at the Centre Antoine Lacassagne.
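The agreement figures quoted in this abstract are mean and standard deviation of the percent deviations between computed and measured doses. As a hedged illustration, with invented dose values rather than the study's measurements, such statistics can be obtained as follows:

```python
import numpy as np

# Hypothetical measured vs. independently computed doses (Gy) for a few plans;
# these numbers are invented for the example and are not the study's data.
measured = np.array([2.01, 1.98, 2.05, 1.95, 2.00])
calculated = np.array([2.00, 2.00, 2.00, 2.00, 2.00])

pct_diff = 100.0 * (calculated - measured) / measured  # signed percent deviation
mean_dev = pct_diff.mean()
sd_dev = pct_diff.std(ddof=1)

# Flag any plan whose deviation exceeds a tolerance, e.g. a 3% action level.
outside_tol = np.abs(pct_diff) > 3.0
```

Reporting results as "(mean%; ± SD%)", as the abstract does, corresponds to the pair (mean_dev, sd_dev), with out-of-tolerance plans triggering further investigation.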

Keywords: 3D conformal radiotherapy, intensity modulated radiotherapy, monitor unit calculation, dosimetry quality assurance

Procedia PDF Downloads 216
37593 Non-Linear Control Based on State Estimation for the Convoy of Autonomous Vehicles

Authors: M-M. Mohamed Ahmed, Nacer K. M’Sirdi, Aziz Naamane

Abstract:

In this paper, a longitudinal and lateral control approach based on a nonlinear observer is proposed for a convoy of autonomous vehicles following a desired trajectory. To the authors' best knowledge, this topic has not yet been sufficiently addressed in the literature on multi-vehicle control. The modeling of the convoy of vehicles is revisited using a robotics method for simulation purposes and control design. With these models, a sliding mode observer is proposed to estimate the states of each vehicle in the convoy from the available sensors, and a sliding mode control based on this observer is then used to control the longitudinal and lateral movement. The validation and performance evaluation are carried out using the well-known driving simulator Scanner-Studio, and results are presented for different maneuvers of a five-vehicle convoy.

Keywords: autonomous vehicles, convoy, non-linear control, non-linear observer, sliding mode

Procedia PDF Downloads 141
37592 The Development of Liquid Chromatography Tandem Mass Spectrometry Method for Citrinin Determination in Dry-Fermented Meat Products

Authors: Ana Vulic, Tina Lesic, Nina Kudumija, Maja Kis, Manuela Zadravec, Nada Vahcic, Tomaz Polak, Jelka Pleadin

Abstract:

Mycotoxins are toxic secondary metabolites produced by numerous types of molds. They can contaminate both food and feed, and therefore represent a serious public health concern. The production of dry-fermented meat products involves ripening, during which molds can overgrow the product surface, produce mycotoxins, and consequently contaminate the final product. Citrinin is a mycotoxin produced mainly by Penicillium citrinum. Data on citrinin occurrence in both food and feed are limited; therefore, there is a need for research on citrinin occurrence in these types of meat products. An LC-MS/MS method for citrinin determination was developed and validated. Sample preparation was performed using immunoaffinity columns, which yielded clean sample extracts. Method validation included the determination of the limit of detection (LOD), the limit of quantification (LOQ), recovery, linearity, and the matrix effect, in accordance with the latest validation guidance. The determined LOD and LOQ were 0.60 µg/kg and 1.98 µg/kg, respectively, demonstrating good method sensitivity. The method was tested for linearity over the calibration range of 1 µg/L to 10 µg/L. The recovery was 100.9%, while the matrix effect was 0.7%. This method was employed in the analysis of 47 samples of dry-fermented sausages collected from local households. Citrinin was not detected in any of these samples, probably because of the short ripening period of the tested sausages, which lasts at most three months. The developed method will be used to test other types of traditional dry-cured products, such as prosciuttos, whose surface is usually more heavily overgrown by molds due to the longer ripening period.
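For context, LOD and LOQ are commonly estimated from the calibration curve as 3.3·s/S and 10·s/S, where s is the residual standard deviation and S the slope of the calibration line. The sketch below applies these textbook formulas to an invented calibration data set; it is not the authors' validation data, and the concentrations and responses are arbitrary.

```python
import numpy as np

# Hypothetical calibration: citrinin standard concentrations (µg/L) vs. peak area.
conc = np.array([1.0, 2.0, 4.0, 6.0, 8.0, 10.0])
rng = np.random.default_rng(2)
response = 1000.0 * conc + 50.0 + rng.normal(0.0, 40.0, conc.size)  # simulated signal

# Least-squares calibration line.
slope, intercept = np.polyfit(conc, response, 1)

# Residual standard deviation about the regression (ddof=2: slope + intercept fitted).
residuals = response - (slope * conc + intercept)
sd = residuals.std(ddof=2)

# LOD and LOQ in concentration units (µg/L).
lod = 3.3 * sd / slope
loq = 10.0 * sd / slope
```

The same ratio of residual scatter to calibration slope, evaluated on the real instrument data, is what yields figures such as the 0.60 µg/kg LOD and 1.98 µg/kg LOQ reported above (after conversion to a per-kilogram-of-sample basis).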

Keywords: citrinin, dry-fermented meat products, LC-MS/MS, mycotoxins

Procedia PDF Downloads 123