Search results for: Statistical Approach
16540 Application of a Hybrid Modified Blade Element Momentum Theory/Computational Fluid Dynamics Approach for Wind Turbine Aerodynamic Performance Prediction
Authors: Samah Laalej, Abdelfattah Bouatem
Abstract:
In the field of wind turbine blades, it is complicated to evaluate aerodynamic performance through experimental measurements, as this requires considerable time and resources. Therefore, in this paper, a hybrid BEM-CFD numerical technique is developed to predict the power and aerodynamic forces acting on the blades. A computational fluid dynamics (CFD) simulation was conducted to calculate the drag and lift forces in Ansys software using the k-ω turbulence model. An enhanced BEM code was then created to predict the power output generated by the wind turbine using the aerodynamic properties extracted from the CFD approach. The numerical approach was compared against and validated with experimental data. The power curves calculated from this hybrid method were in good agreement with experimental measurements over the full velocity range. Keywords: blade element momentum, aerodynamic forces, wind turbine blades, computational fluid dynamics approach
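As a rough illustration of how a BEM code consumes CFD-derived lift and drag coefficients, the sketch below sums blade-element torque contributions to estimate power. It is a minimal sketch under strong simplifying assumptions (constant coefficients along the span, uniform element width, no induction-factor iteration or tip losses), not the enhanced BEM code described in the abstract.

```python
import math

def bem_power(cl, cd, wind_speed, omega, radii, chord, n_blades=3, rho=1.225):
    """Simplified blade-element power estimate: sums tangential force
    contributions along the blade span, ignoring induction-factor
    iteration and tip losses."""
    dr = radii[1] - radii[0]                     # uniform element width
    torque = 0.0
    for r in radii:
        v_tan = omega * r                        # tangential speed of the element
        w = math.hypot(wind_speed, v_tan)        # relative wind speed
        phi = math.atan2(wind_speed, v_tan)      # inflow angle
        # tangential force coefficient built from CFD-derived lift/drag
        ct = cl * math.sin(phi) - cd * math.cos(phi)
        torque += n_blades * 0.5 * rho * w**2 * chord * ct * r * dr
    return torque * omega                        # power in watts
```

In a full BEM code the axial and tangential induction factors would be iterated per element; here the lift and drag coefficients are simply taken as given from the CFD step.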
Procedia PDF Downloads 67
16539 The Effect of Different Strength Training Methods on Muscle Strength, Body Composition and Factors Affecting Endurance Performance
Authors: Shaher A. I. Shalfawi, Fredrik Hviding, Bjornar Kjellstadli
Abstract:
The main purpose of this study was to measure the effect of two different strength training methods on muscle strength, muscle mass, fat mass and endurance factors. Fourteen physical education students agreed to participate in this study. The participants were randomly divided into three groups: a traditional training group (TTG), a cluster training group (CTG) and a control group (CG). TTG consisted of 4 participants with age (mean ± SD) 22.3 ± 1.5 years, body mass 79.2 ± 15.4 kg and height 178.3 ± 11.9 cm. CTG consisted of 5 participants aged 22.2 ± 3.5 years, body mass 81.0 ± 24.0 kg and height 180.2 ± 12.3 cm. CG consisted of 5 participants aged 22 ± 2.8 years, body mass 77 ± 19 kg and height 174 ± 6.7 cm. The participants underwent a hypertrophy strength training program twice a week for 8 weeks, consisting of 4 sets of 10 reps at 70% of one-repetition maximum (1RM) using the barbell squat and barbell bench press. The CTG performed 2 x 5 reps with 10 s recovery between repetitions and 50 s recovery between sets, while the TTG performed 4 sets of 10 reps with 90 s recovery between sets. Pre- and post-tests were administered to assess body composition (weight, muscle mass, and fat mass), 1RM (bench press and barbell squat) and a laboratory endurance test (Bruce protocol). Instruments used to collect the data were a Tanita BC-601 scale (Tanita, Illinois, USA), a Woodway treadmill (Woodway, Wisconsin, USA) and a Vyntus CPX breath-to-breath system (Jaeger, Hoechberg, Germany). Analysis was conducted on all measured variables, including time to peak VO2, peak VO2, heart rate (HR) at peak VO2, respiratory exchange ratio (RER) at peak VO2, and number of breaths per minute. The results indicate an increase in 1RM performance after 8 weeks of training. The change in 1RM squat was TTG = 30 ± 3.8 kg, CTG = 28.6 ± 8.3 kg and CG = 10.3 ± 13.8 kg. Similarly, the change in 1RM bench press was TTG = 9.8 ± 2.8 kg, CTG = 7.4 ± 3.4 kg and CG = 4.4 ± 3.4 kg.
The within-group analysis of the oxygen consumption measured during the incremental exercise indicated that the TTG had a statistically significant increase only in their RER, from 1.16 ± 0.04 to 1.23 ± 0.05 (P < 0.05). The CTG had a statistically significant improvement in their HR at peak VO2, from 186 ± 24 to 191 ± 12 beats per minute (P < 0.05), and in their RER at peak VO2, from 1.11 ± 0.06 to 1.18 ± 0.05 (P < 0.05). Finally, the CG had a statistically significant increase only in their RER at peak VO2, from 1.11 ± 0.07 to 1.21 ± 0.05 (P < 0.05). The between-group analysis showed no statistically significant differences between the groups in any of the variables measured during the incremental exercise, including changes in muscle mass, fat mass, and weight (kg). The results indicate a similar effect of hypertrophy strength training on untrained subjects irrespective of the training method used. Because there were no notable changes in body-composition measures, the results suggest that the improvements in performance observed in all groups are most probably due to neuromuscular adaptation to training. Keywords: hypertrophy strength training, cluster set, Bruce protocol, peak VO2
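Within-group pre/post comparisons of the kind reported above are typically computed as paired t-tests. A minimal sketch of the statistic, with made-up illustrative values rather than the study's data:

```python
import math

def paired_t(pre, post):
    """Paired t statistic for pre- vs post-training measurements of one
    group; compare against the t critical value with df = n - 1."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)   # sample variance
    se = math.sqrt(var / n)                               # standard error of the mean difference
    return mean / se
```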
Procedia PDF Downloads 250
16538 Characteristics of Wood Plastics Nano-Composites Made of Agricultural Residues and Urban Recycled Polymer Materials
Authors: Amir Nourbakhsh Habibabadi, Alireza Ashori
Abstract:
Context: The growing concern over the management of plastic waste and the high demand for wood-based products have led to the development of wood-plastic composites. Agricultural residues, which are abundantly available, can be used as a source of lignocellulosic fibers in the production of these composites. The use of recycled polymers and nanomaterials is also a promising approach to enhance the mechanical and physical properties of the composites. Research Aim: The aim of this study was to investigate the feasibility of using recycled high-density polyethylene (rHDPE), polypropylene (rPP), and agricultural residue fibers for manufacturing wood-plastic nano-composites. The effects of these materials on the mechanical properties of the composites, specifically tensile and flexural strength, were studied. Methodology: The study utilized an experimental approach in which extruders and hot presses were used to fabricate the composites. Five types of cellulosic residue fibers (bagasse, corn stalk, rice straw, sunflower, and canola stem), three types of nanomaterials (carbon nanotubes, nano silica, and nanoclay), and a coupling agent, used to chemically bind the wood/polymer fibers, chemicals, and reinforcement, were employed. The mechanical properties of the composites were then analyzed. Findings: The study found that composites made with rHDPE provided moderately superior tensile and flexural properties compared to rPP samples. The addition of agricultural residues to several types of wood-plastic nano-composites significantly improved their bending and tensile properties, with bagasse having the most significant advantage over the other lignocellulosic materials. The use of recycled polymers, agricultural residues, and nano-silica resulted in the composites with the best strength properties.
Theoretical Importance: The study's findings suggest that using agricultural fiber residues as reinforcement in wood/plastic nanocomposites is a viable approach to improve the mechanical properties of the composites. Additionally, the study highlights the potential of using recycled polymers in the development of value-added products without compromising the product's properties. Data Collection and Analysis Procedures: The study collected data on the mechanical properties of the composites using tensile and flexural tests. Statistical analyses were performed to determine the significant effects of the various materials used. Question Addressed: Can agricultural residues and recycled polymers be used to manufacture wood-plastic nano-composites with enhanced mechanical properties? Conclusion: The study demonstrates the feasibility of using agricultural residues and recycled polymers in the production of wood-plastic nano-composites. The addition of these materials significantly improved the mechanical properties of the composites, with bagasse being the most effective agricultural residue. The study's findings suggest that composites made from recycled materials can offer value-added products without sacrificing performance. Keywords: polymer, composites, wood, nano
Procedia PDF Downloads 71
16537 Revisiting Domestication and Foreignisation Methods: Translating the Quran by the Hybrid Approach
Authors: Aladdin Al-Tarawneh
Abstract:
The Quran, as the sacred book of Islam, considered the literal word of God (Allah) in Arabic, has been widely translated into many languages; however, the foreignising, or literal, approach excessively stains the quality and discredits the final product in the eyes of its receptors. Such an approach fails to capture the intended meaning of the Quran and to communicate it in any language. Therefore, this study proposes a different approach that combines existing methods within a hybrid model. Indeed, this study challenges the binary adherence that prevails in Translation Studies (TS) in general and in the translation of the Quran in particular. Drawing on the fact that the meaning of the Quran can be communicated in any language, and that a translation is not itself sacred, this paper approaches the translation of the Quran by blending different methods, such as domestication and foreignisation, in a systematic way, avoiding the binary choice made by many translators. To reach this aim, the paper has a conceptual part that seeks to elucidate and clarify the main methods employed in TS, and to criticise and modify them in order to propose the new hybrid approach (the hybrid model) for translating the Quran – that is, the deductive method. To support and validate the outcome of the previous part, a comparative model is employed in order to highlight the differences between the suggested translation and other widely used ones – that is, the inductive method. By applying this methodology, the paper shows that the foreignising approach is deficient in communicating the original meaning of the Quran.
In conclusion, the paper suggests that producing a Quran translation has to take into account the adoption of many techniques to express the meaning of the Quran as understood in the original, and to offer this understanding in English in the most native-like manner to serve the intended target readers. Keywords: Quran translation, hybrid approach, domestication, foreignization, hybrid model
Procedia PDF Downloads 163
16536 The Assessment of Forest Wood Biomass Potential in Terms of Sustainable Development
Authors: Julija Konstantinavičienė, Vlada Vitunskienė
Abstract:
The role of sustainable biomass, including wood biomass, is becoming more important because of the European Green Deal. The new EU Forest Strategy is a flagship element of the European Green Deal and a key action of the EU Biodiversity Strategy for 2030. The first measure of this strategy is promoting sustainable forest management, including encouraging the sustainable use of wood-based resources. The first aim of this research was to develop and present a new approach to the concept of forest wood biomass potential in terms of sustainable development, distinguishing theoretical, technical and sustainable potential and detailing its constraints. The second aim was to prepare a methodology outline for the assessment of sustainable forest wood biomass potential and to check this methodology empirically, considering economic, social and ecological constraints. The basic methodologies of the research were a review of research (combining semi-systematic and integrative review methodologies), the rapid assessment method and statistical data analysis. The developed methodology for assessing forest wood potential in terms of sustainable development can be used in Lithuania and in other countries, and will allow this potential to be compared at different temporal and spatial levels. Application of the methodology can serve the development of new national strategies for the wood sector. Keywords: assessment, constraints, forest wood biomass, methodology, potential, sustainability
Procedia PDF Downloads 123
16535 Combining Corpus Linguistics and Critical Discourse Analysis to Study Power Relations in Hindi Newspapers
Authors: Vandana Mishra, Niladri Sekhar Dash, Jayshree Charkraborty
Abstract:
This paper focuses on the application of corpus linguistics techniques to the critical discourse analysis (CDA) of Hindi newspapers. While corpus linguistics is the study of language as expressed in corpora (samples) of 'real world' text, CDA is an interdisciplinary approach to the study of discourse that views language as a form of social practice. CDA has mainly been studied from a qualitative perspective; however, recent studies have begun combining corpus linguistics with CDA to analyze large volumes of text for the study of existing power relations in society. The corpus under study is also of a sizable amount (1 million words of Hindi newspaper texts), and its analysis requires an alternative analytical procedure. We have therefore combined the quantitative approach, i.e. the use of corpus techniques, with CDA's traditional qualitative analysis. In this context, we have focused on keyword analysis, sorting concordance lines of the selected keywords, and calculating collocates of the keywords. We have made use of the WordSmith tool for all these analyses. The analysis starts with identifying the keywords in the political news corpus when compared with the main news corpus. The keywords are extracted from the corpus based on their keyness, calculated through statistical tests such as the chi-squared test and the log-likelihood test on the frequent words of the corpus. Some of the top occurring keywords are मोदी (Modi), भाजपा (BJP), कांग्रेस (Congress), सरकार (Government) and पार्टी (Political party). This is followed by the concordance analysis of these keywords, which generates thousands of lines, from which we select a few and examine them in light of our objective. We have also calculated the collocates of the keywords based on their Mutual Information (MI) score. Both concordance and collocation help to identify lexical patterns in the political texts.
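The keyness measure mentioned above can be computed with Dunning's log-likelihood statistic. A minimal sketch, comparing a word's frequency in a study corpus (e.g. political news) against a reference corpus (the illustrative frequencies are invented, not drawn from the Hindi corpus):

```python
import math

def log_likelihood(freq_a, total_a, freq_b, total_b):
    """Dunning log-likelihood keyness of a word occurring freq_a times in
    a study corpus of total_a tokens versus freq_b times in a reference
    corpus of total_b tokens.  Larger values mean stronger keyness."""
    # Expected frequencies under the null hypothesis of equal relative frequency
    e_a = total_a * (freq_a + freq_b) / (total_a + total_b)
    e_b = total_b * (freq_a + freq_b) / (total_a + total_b)
    ll = 0.0
    if freq_a:
        ll += freq_a * math.log(freq_a / e_a)
    if freq_b:
        ll += freq_b * math.log(freq_b / e_b)
    return 2.0 * ll
```

The resulting value is compared against chi-squared critical values (e.g. 3.84 at p < 0.05 for one degree of freedom) to decide whether a word counts as a keyword.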
Finally, all these quantitative results derived from the corpus techniques are interpreted subjectively, in accordance with CDA theory, to examine the ways in which political news discourse produces social and political inequality, power abuse or domination. Keywords: critical discourse analysis, corpus linguistics, Hindi newspapers, power relations
Procedia PDF Downloads 225
16534 Learning Performance of Sports Education Model Based on Self-Regulated Learning Approach
Authors: Yi-Hsiang Pan, Ching-Hsiang Chen, Wei-Ting Hsu
Abstract:
The purpose of this study was to compare the learning effects of the sports education model (SEM) with those of the traditional teaching model (TTM) in physical education classes, in terms of students' learning motivation, action control, learning strategies, and learning performance. A quasi-experimental design was utilized, and participants included two physical educators and four classes with a total of 94 students in grades 5 and 6 of elementary schools. Two classes implemented the SEM (n = 47, male = 24, female = 23; age = 11.89, SD = 0.78) and two classes implemented the TTM (n = 47, male = 25, female = 22; age = 11.77, SD = 0.66). Data were collected from these participants using a self-report questionnaire (comprising a learning motivation scale, an action control scale, and a learning strategy scale) and a game performance assessment instrument, and multivariate analysis of covariance was used for the statistical analysis. The findings revealed that the SEM was significantly better than the TTM in promoting students' learning motivation, action control, learning strategies, and game performance. It was concluded that the SEM could promote the mechanics of students' self-regulated learning process and thereby improve students' movement performance. Keywords: self-regulated learning theory, learning process, curriculum model, physical education
Procedia PDF Downloads 344
16533 TomoTherapy® System Repositioning Accuracy According to Treatment Localization
Authors: Veronica Sorgato, Jeremy Belhassen, Philippe Chartier, Roddy Sihanath, Nicolas Docquiere, Jean-Yves Giraud
Abstract:
We analyzed the image-guided radiotherapy method used by the TomoTherapy® System (Accuray Corp.) for patient repositioning in clinical routine. The TomoTherapy® System computes X, Y, Z and roll displacements to match the reference CT, on which the dosimetry has been performed, with the pre-treatment MV CT. The accuracy of the repositioning method was studied according to the treatment localization. For this, a database of 18774 treatment sessions performed during 2 consecutive years (2016-2017) was used. The database includes the X, Y, Z and roll displacements proposed by the TomoTherapy® System as well as the manual correction of these proposals applied by the radiation therapist. This manual correction aims to further improve the repositioning based on the clinical situation and depends on the structures surrounding the target tumor tissue. The statistical analysis performed on the database aims to define repositioning limits to be used as a safety and guidance tool for the manual adjustment implemented by the radiation therapist. This tool will serve not only to flag potential repositioning errors but also to further improve patient positioning for optimal treatment. Keywords: accuracy, IGRT MVCT, image-guided radiotherapy megavoltage computed tomography, statistical analysis, tomotherapy, localization
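One simple way to derive repositioning limits of the kind described is to bound each displacement axis by the mean ± k standard deviations over historical sessions; the abstract does not specify the statistical rule actually used, so this is only an illustrative sketch with invented values:

```python
from statistics import mean, stdev

def repositioning_limits(displacements, k=2.0):
    """Warning limits for one displacement axis (X, Y, Z or roll) as
    mean ± k standard deviations over past sessions; a proposed shift
    outside this interval would flag a potential repositioning error."""
    m, s = mean(displacements), stdev(displacements)
    return m - k * s, m + k * s
```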
Procedia PDF Downloads 226
16532 Enhancing Students’ Achievement, Interest and Retention in Chemistry through an Integrated Teaching/Learning Approach
Authors: K. V. F. Fatokun, P. A. Eniayeju
Abstract:
This study concerns the effects of a concept mapping-guided discovery integrated teaching approach on the learning style and achievement of chemistry students. The sample comprised 162 senior secondary school (SS 2) students drawn from two science schools in Nasarawa State with equivalent pre-test mean scores of 9.68 and 9.49. Five instruments were developed and validated, while the sixth was adopted by the investigator for the study. Four null hypotheses were tested at the α = 0.05 level of significance. Chi-square analysis showed a significant shift in students’ learning style from accommodating and diverging to converging and assimilating when exposed to the concept mapping-guided discovery approach. The t-test and ANOVA results also showed that students in the experimental group achieved better and retained the content learnt better. Results of Scheffe’s test for multiple comparisons showed that boys in the experimental group performed better than girls. It is therefore concluded that the concept mapping-guided discovery integrated approach should be used in secondary schools to successfully teach electrochemistry. It is strongly recommended that chemistry teachers be encouraged to adopt this method for teaching difficult concepts. Keywords: integrated teaching approach, concept mapping-guided discovery, achievement, retention, learning styles and interest
Procedia PDF Downloads 329
16531 Evaluation of Hydrogen Particle Volume on Surfaces of Selected Nanocarbons
Authors: M. Ziółkowska, J. T. Duda, J. Milewska-Duda
Abstract:
This paper describes an approach to modeling adsorption phenomena aimed at specifying the adsorption mechanisms on localized or nonlocalized adsorbent sites when applied to nanocarbons. The concept comes from the fundamental thermodynamic description of adsorption equilibrium and is based on numerical calculations of the volume of hydrogen particles adsorbed on the surface of selected nanocarbons: a single-walled nanotube and a nanocone. This approach makes it possible to obtain information on the adsorption mechanism and, as a consequence, to select an appropriate mathematical adsorption model, thus allowing for a more reliable identification of the material's porous structure. The theoretical basis of the approach is discussed, and newly derived results of the numerical calculations are presented for the selected nanocarbons. Keywords: adsorption, mathematical modeling, nanocarbons, numerical analysis
Procedia PDF Downloads 269
16530 Body Mass Index and Dietary Habits among Nursing College Students Living in the University Residence in Kirkuk City, Iraq
Authors: Jenan Shakoor
Abstract:
Obesity prevalence is increasing worldwide. University life is a challenging period, especially for students who have to leave their familiar surroundings and settle in a new environment. The current study aimed to assess diet and exercise habits and their association with body mass index (BMI) among nursing college students living at the Kirkuk University residence. This was a descriptive study. A non-probability (purposive) sample of 101 students living in the Kirkuk University residence was recruited during the period from 15th November 2015 to 5th May 2016. A questionnaire was constructed for the purpose of the study, consisting of four parts: the demographic characteristics of the study sample, eating habits, eating at college, and healthy habits. The data were collected by interviewing the study sample, and weight and height were measured by a trained researcher at the college. Descriptive statistical analysis was undertaken. Data were prepared, organized and entered into a computer file; the Statistical Package for the Social Sciences (SPSS 20) was used for data analysis. A p value ≤ 0.05 was accepted as statistically significant. A total of 63 (62.4%) of the sample were aged 20-21, with a mean age of 22.1 (SD ± 0.653). A third of the sample, 38 (37.6%), were from level four at college, 67 (66.3%) were female, and 46 (45.5%) of participants were from a middle socio-economic status. 14 (13.9%) of the study sample were overweight (BMI = 25-29.9 kg/m2) and 6 (5.9%) were obese (BMI ≥ 30 kg/m2), compared to 73 (72.3%) of normal weight (BMI = 18.5-24.9 kg/m2). With regard to eating habits and exercise, 42 (41.6%) of the students rarely ate breakfast, 79 (78.2%) ate lunch at the university residence, 77 (76.2%) reported rarely doing exercise, and 62 (61.4%) slept for less than eight hours.
No significant association was found between the variables age, sex, college level and socio-economic status and BMI, while there was a significant association between eating lunch at the university and BMI (p = 0.03). No significant association was found between eating habits, healthy habits and BMI. The prevalence of overweight and obesity among the study sample was 19.8%, with female students being more obese than males. Further studies are needed to identify BMI among residence students in other colleges and to increase the awareness of undergraduate students of healthy food habits. Keywords: body mass index, diet, obesity, university residence
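For reference, the BMI cut-offs used in the study (normal 18.5-24.9, overweight 25-29.9, obese ≥ 30 kg/m²) match the standard WHO classification and can be expressed as a small helper; the function name is ours, not from the study:

```python
def bmi_category(weight_kg, height_m):
    """Standard WHO BMI categories (kg/m^2), as used in the study."""
    bmi = weight_kg / height_m ** 2
    if bmi < 18.5:
        return "underweight"
    if bmi < 25.0:
        return "normal"
    if bmi < 30.0:
        return "overweight"
    return "obese"
```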
Procedia PDF Downloads 220
16529 JREM: An Approach for Formalising Models in the Requirements Phase with JSON and NoSQL Databases
Authors: Aitana Alonso-Nogueira, Helia Estévez-Fernández, Isaías García
Abstract:
This paper presents an approach to reducing some of the current flaws in the requirements phase of the software development process. It takes the software requirements of an application, builds a conceptual model of them and formalizes it within JSON documents. This formal model is lodged in a document-oriented NoSQL database, namely MongoDB, because of its advantages in flexibility and efficiency. In addition, this paper underlines the contributions of the detailed approach and shows some applications and benefits for future work in the field of automatic code generation using model-driven engineering tools. Keywords: conceptual modelling, JSON, NoSQL databases, requirements engineering, software development
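To illustrate what a requirement formalized as a JSON document might look like, here is a hypothetical sketch; the actual JREM document schema is not given in the abstract, and the field names below are assumptions for illustration:

```python
import json

# Hypothetical shape of one formalized requirement; not the JREM schema.
requirement = {
    "id": "REQ-001",
    "type": "functional",
    "description": "The system shall allow users to reset their password.",
    "actors": ["user", "auth-service"],
    "priority": "high",
    "traces_to": [],
}

doc = json.dumps(requirement)   # serialize for storage or exchange
restored = json.loads(doc)      # round-trips losslessly
```

With MongoDB, the dictionary itself could be stored directly (e.g. via a pymongo collection), since BSON documents map naturally onto JSON objects, which is part of the flexibility the paper cites.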
Procedia PDF Downloads 379
16528 A Crowdsourced Homeless Data Collection System and Its Econometric Analysis: Strengthening Inclusive Public Administration Policies
Authors: Praniil Nagaraj
Abstract:
This paper proposes a method to collect homeless data using crowdsourcing and presents an approach to analyze the data, demonstrating its potential to strengthen existing and future policies aimed at promoting socio-economic equilibrium. This paper's contributions can be categorized into three main areas. Firstly, a unique method for collecting homeless data is introduced, utilizing a user-friendly smartphone app (currently available for Android). The app enables the general public to quickly record information about homeless individuals, including the number of people and details about their living conditions. The collected data, including date, time, and location, is anonymized and securely transmitted to the cloud. It is anticipated that an increasing number of users motivated to contribute to society will adopt the app, thus expanding the data collection efforts. Duplicate data is addressed through simple classification methods, and historical data is utilized to fill in missing information. The second contribution of this paper is the description of data analysis techniques applied to the collected data. By combining this new data with existing information, statistical regression analysis is employed to gain insights into various aspects, such as distinguishing between unsheltered and sheltered homeless populations, as well as examining their correlation with factors like unemployment rates, housing affordability, and labor demand. Initial data is collected in San Francisco, while pre-existing information is drawn from three cities: San Francisco, New York City, and Washington D.C., facilitating the conduction of simulations. The third contribution focuses on demonstrating the practical implications of the data processing results. The challenges faced by key stakeholders, including charitable organizations and local city governments, are taken into consideration. Two case studies are presented as examples. 
The first case study explores improving the efficiency of the distribution of food and necessities, as well as medical assistance, driven by charitable organizations. The second case study examines the correlation between micro-geographic budget expenditure by local city governments and homeless information to justify budget allocation and expenditures. The ultimate objective of this endeavor is to enable the continuous enhancement of the quality of life of the underprivileged. It is hoped that through increased crowdsourcing of data from the public, the Generosity Curve and the Need Curve will intersect, leading to a better world for all. Keywords: crowdsourcing, homelessness, socio-economic policies, statistical analysis
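At its simplest, the statistical regression analysis described above reduces to an ordinary least-squares fit of one variable against another. A minimal sketch (the variable pairing in the docstring is illustrative, not the study's actual dataset):

```python
def linear_fit(x, y):
    """Ordinary least-squares fit of y = a + b*x, e.g. y = sheltered
    homeless count against x = unemployment rate (illustrative pairing)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx              # slope
    a = my - b * mx            # intercept
    return a, b
```

Multivariate versions (several predictors such as housing affordability and labor demand at once) follow the same least-squares principle with a design matrix.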
Procedia PDF Downloads 48
16527 Pregnancy and Birth Experience, Opinions regarding the Delivery Method of the Patients' Vaginal Deliveries
Authors: Umran Erciyes, Filiz Okumus
Abstract:
The purpose of this study was to determine the factors that impact the pregnancy and birth experience, and the opinions regarding delivery type, of puerperants after vaginal birth. This descriptive study includes 349 patients who gave birth vaginally in a hospital in İstanbul between May and November 2014. After birth, we interviewed these women face to face. A descriptive information form and the Perception of Birth Scale were used as data collection tools. SPSS (Statistical Package for the Social Sciences) was used for statistical analysis. The average age of the patients was 27.13, and the average scale score was 76.93 ± 20.22. The patients were primary school graduates and did not have jobs, and they reported that their income was equal to their expenses. More than half of the women had received no education before birth; among those who had, few had been educated on overcoming pain during labor. As the time spent in the hospital for the birth increased, the mothers' perception of birth was affected negatively. 86.8% of participants had an assisted delivery. Spontaneous vaginal birth had positive effects on birth perception. Establishing vascular access, induction of labor, performing an enema, restriction of oral intake and movement, fundal pressure, episiotomy, and failure to perform skin-to-skin contact with the baby after birth had adverse effects on birth perceptions. Keywords: antenatal care, birth experience, perception of birth, vaginal birth
Procedia PDF Downloads 437
16526 Waste Identification Diagrams Effectiveness: A Case Study in the Manaus Industrial Pole
Authors: José Dinis-Carvalho, Levi Guimarães, Celina Leão, Rui Sousa, Rosa Eliza Vieira, Larissa Thomaz, Kelliane Guerreiro
Abstract:
This research paper investigates the efficacy of waste identification diagrams (WIDs) as a tool for waste reduction and management within the Manaus Industrial Pole. The study focuses on assessing the practical application and effectiveness of WIDs in identifying, categorizing, and mitigating various forms of waste generated across industrial processes. Employing a mixed-methods approach, including a qualitative questionnaire applied to 5 companies and quantitative data analysis with SPSS statistical software, the research evaluates the implementation and impact of WIDs on waste reduction practices in select industries within the Manaus Industrial Pole. The findings contribute to understanding the utility of WIDs as a proactive strategy for waste management, offering insights into their potential for fostering sustainable practices and promoting environmental stewardship in industrial settings. The study also discusses challenges, best practices, and recommendations for optimizing the utilization of WIDs in industrial waste management, thereby addressing the broader implications for sustainable industrial development. Keywords: waste identification diagram, value stream mapping, overall equipment effectiveness, lean manufacturing
Procedia PDF Downloads 56
16525 Next Generation Radiation Risk Assessment and Prediction Tools Generation Applying AI-Machine (Deep) Learning Algorithms
Authors: Selim M. Khan
Abstract:
Indoor air quality is strongly influenced by the presence of radioactive radon (222Rn) gas. Indeed, exposure to high 222Rn concentrations is unequivocally linked to DNA damage and lung cancer and is a worsening issue in North American and European built environments, having increased over time within newer housing stocks as a function of as yet unclear variables. Indoor air radon concentration can be influenced by a wide range of environmental, structural, and behavioral factors. As some of these factors are quantitative while others are qualitative, no single statistical model can determine indoor radon levels precisely while simultaneously considering all these variables across a complex and highly diverse dataset. The ability of AI-machine (deep) learning to simultaneously analyze multiple quantitative and qualitative features makes it suitable for predicting radon with a high degree of precision. Using Canadian and Swedish long-term indoor air radon exposure data, we are using artificial deep neural network models with random weights and polynomial statistical models in MATLAB to assess and predict radon health risk to humans as a function of geospatial, human behavioral, and built environmental metrics. Our initial artificial neural network model with random weights, run with sigmoid activation, tested different combinations of variables and showed the highest prediction accuracy (>96%) within a reasonable number of iterations. Here, we present details of these emerging methods and discuss their strengths and weaknesses compared to the traditional artificial neural network and statistical methods commonly used to predict indoor air quality in different countries. We propose an artificial deep neural network with random weights as a highly effective method for assessing and predicting indoor radon. Keywords: radon, radiation protection, lung cancer, AI-machine deep learning, risk assessment, risk prediction, Europe, North America
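A single forward pass of a one-hidden-layer network with sigmoid activation, the activation named above, can be sketched as follows. The weights are random placeholders and the feature vector is invented for illustration; the authors' actual models are built in MATLAB and trained on real exposure data.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, w_hidden, w_out):
    """Single forward pass of a one-hidden-layer network with sigmoid
    activations; the output always lies in (0, 1)."""
    hidden = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in w_hidden]
    return sigmoid(sum(w * h for w, h in zip(w_out, hidden)))

rng = random.Random(42)
x = [0.5, -1.2, 0.3]  # invented, scaled input features (e.g. geospatial/behavioral)
w_hidden = [[rng.uniform(-1.0, 1.0) for _ in x] for _ in range(4)]  # random weights
w_out = [rng.uniform(-1.0, 1.0) for _ in range(4)]
prediction = forward(x, w_hidden, w_out)
```

In a random-weights scheme only the output layer (or a simple readout) would typically be fitted, while the hidden weights stay fixed at their random values.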
Procedia PDF Downloads 98
16524 Bayesian Parameter Inference for Continuous Time Markov Chains with Intractable Likelihood
Authors: Randa Alharbi, Vladislav Vyshemirsky
Abstract:
Systems biology is an important scientific field that focuses on studying the behaviour of biological systems. Modelling is required to produce a detailed description of the elements of a biological system, their function, and their interactions. A well-designed model requires selecting a suitable mechanism which can capture the main features of the system, defining its essential components, and representing an appropriate law that can define the interactions between those components. Complex biological systems exhibit stochastic behaviour; thus, probabilistic models are suitable for describing and analysing them. The Continuous-Time Markov Chain (CTMC) is one such probabilistic model, describing the system as a set of discrete states with continuous-time transitions between them. The system is then characterised by a set of probability distributions that describe the transition from one state to another at a given time. The evolution of these probabilities through time is governed by the chemical master equation, which is analytically intractable but can be simulated. Uncertain parameters of such a model can be inferred using methods of Bayesian inference. Yet, inference in such a complex system is challenging, as it requires evaluating the likelihood, which is intractable in most cases. There are different statistical methods that allow simulating from the model despite the intractability of the likelihood. Approximate Bayesian computation is a common approach for tackling inference, relying on simulation of the model to approximate the intractable likelihood. Particle Markov chain Monte Carlo (PMCMC) is another approach, based on using sequential Monte Carlo to estimate the intractable likelihood. However, both methods are computationally expensive. In this paper, we discuss the efficiency and possible practical issues of each method, taking their computational time into account.
We demonstrate likelihood-free inference by analysing a model of the Repressilator using both methods. A detailed investigation is performed to quantify the difference between these methods in terms of efficiency and computational cost.Keywords: approximate Bayesian computation (ABC), continuous-time Markov chains, sequential Monte Carlo, particle Markov chain Monte Carlo (PMCMC)
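The likelihood-free idea behind ABC can be sketched on a toy CTMC. The example below uses a simple linear birth-death process (not the Repressilator) simulated exactly with the Gillespie algorithm; prior draws of the birth rate are accepted whenever the simulated end-state lands close to the observed one, so the intractable likelihood is never evaluated. All rates, horizons, and tolerances are invented for illustration.

```python
import numpy as np

def gillespie_birth_death(birth, death, x0=10, t_end=3.0, rng=None):
    """Exact stochastic simulation (Gillespie) of a linear birth-death CTMC."""
    if rng is None:
        rng = np.random.default_rng()
    x, t = x0, 0.0
    while x > 0:
        total = (birth + death) * x          # total event rate in state x
        t += rng.exponential(1.0 / total)    # waiting time to next event
        if t > t_end:
            break
        x += 1 if rng.uniform() < birth / (birth + death) else -1
    return x

def abc_rejection(observed, n_sims=1000, tol=3, seed=0):
    """ABC rejection: draw rates from the prior, simulate the CTMC, and
    accept draws whose simulated summary is within tol of the data."""
    rng = np.random.default_rng(seed)
    accepted = [b for b in rng.uniform(0.0, 1.0, n_sims)     # uniform prior
                if abs(gillespie_birth_death(b, 0.3, rng=rng) - observed) <= tol]
    return np.array(accepted)

obs = gillespie_birth_death(0.5, 0.3, rng=np.random.default_rng(42))  # "data"
posterior = abc_rejection(obs)
print(len(posterior), round(posterior.mean(), 2))
```

The accepted draws form an approximate posterior sample; the computational cost discussed in the abstract comes precisely from the many forward simulations this requires.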
Procedia PDF Downloads 20616523 Effects of Process Parameter Variation on the Surface Roughness of Rapid Prototyped Samples Using Design of Experiments
Authors: R. Noorani, K. Peerless, J. Mandrell, A. Lopez, R. Dalberto, M. Alzebaq
Abstract:
Rapid prototyping (RP) is an additive manufacturing technology used in industry that works by systematically depositing layers of working material to construct larger, computer-modeled parts. A key challenge associated with this technology is that RP parts often feature undesirable levels of surface roughness for certain applications. To combat this phenomenon, an experimental technique called Design of Experiments (DOE) can be employed during the growth procedure to statistically analyze which RP growth parameters are most influential on part surface roughness. Utilizing DOE to identify such factors is important because it is a technique that can optimize a manufacturing process, saving time and money while increasing product quality. In this study, a four-factor/two-level DOE experiment was performed to investigate the effect of temperature, layer thickness, infill percentage, and infill speed on the surface roughness of RP prototypes. Samples were grown using the sixteen possible growth combinations associated with a four-factor/two-level study, and the surface roughness data were then gathered for each set of factors. After applying DOE statistical analysis to these data, it was determined that layer thickness played the most significant role in the prototype surface roughness.Keywords: rapid prototyping, surface roughness, design of experiments, statistical analysis, factors and levels
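The main-effect arithmetic behind a four-factor/two-level DOE can be sketched in a few lines: code each factor at ±1, run all 16 combinations, and estimate each main effect as the difference between the mean response at the high and low levels. The response below is synthetic (layer thickness is made dominant by construction), so this illustrates the calculation, not the study's data.

```python
import numpy as np
from itertools import product

# Coded 2^4 full factorial: each factor at levels -1/+1 (hypothetical
# factor names mirroring the study)
factors = ["temp", "layer", "infill_pct", "infill_speed"]
design = np.array(list(product([-1, 1], repeat=4)))   # 16 runs

# Synthetic roughness response: layer thickness dominates by construction
rng = np.random.default_rng(0)
roughness = (10 + 4.0 * design[:, 1] + 0.5 * design[:, 0]
             + rng.normal(0, 0.2, 16))

# Main effect of a factor = mean(high-level runs) - mean(low-level runs)
effects = {f: roughness[design[:, i] == 1].mean()
              - roughness[design[:, i] == -1].mean()
           for i, f in enumerate(factors)}
dominant = max(effects, key=lambda f: abs(effects[f]))
print(dominant, round(effects[dominant], 2))
```

With a ±1 coding, the main effect of a factor is twice its regression coefficient, which is why the dominant factor stands out directly from these contrasts.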
Procedia PDF Downloads 26216522 Premalignant and Malignant Lesions of Uterine Polyps: Analysis at a University Hospital
Authors: Manjunath A. P., Al-Ajmi G. M., Al Shukri M., Girija S
Abstract:
Introduction: This study aimed to compare the ability of hysteroscopy and ultrasonography to diagnose uterine polyps, and to correlate the ultrasonographic and hysteroscopic findings with various clinical factors and with the histopathology of uterine polyps. Methods: This is a retrospective study conducted at the Department of Obstetrics and Gynaecology at Sultan Qaboos University Hospital from 2014 to 2019. All women undergoing hysteroscopy for suspected uterine polyps were included. All relevant data were obtained from the electronic patient record and analysed using SPSS. Results: A total of 77 eligible women were analysed. The mean age of the patients was 40 years. The clinical risk factors obesity, hypertension, and diabetes mellitus showed no statistically significant association with the presence of uterine polyps (p-value > 0.05). Although 20 women (52.6%) with uterine polyps had a thickened endometrium (>11 mm), there was no statistical association (p-value > 0.05). The sensitivity and specificity of ultrasonography in the detection of uterine polyps were 39% and 65%, respectively, whereas for hysteroscopy they were 89% and 20%, respectively. The prevalence of malignant and premalignant lesions was 1.85% and 7.4%, respectively. Conclusion: This study found that obesity, hypertension, and diabetes mellitus were not associated with the presence of uterine polyps. There was no association between a thick endometrium and uterine polyps. The sensitivity is higher for hysteroscopy, whereas the specificity is higher for sonography, in detecting uterine polyps. The prevalence of malignancy in uterine polyps was very low.Keywords: endometrial polyps, hysteroscopy, ultrasonography, premalignant, malignant
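The sensitivity and specificity figures quoted above follow directly from a 2x2 confusion matrix. A minimal sketch, using hypothetical counts chosen only to reproduce the ultrasonography percentages (these are not the study's raw data):

```python
def sens_spec(tp, fp, fn, tn):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for illustration only
sensitivity, specificity = sens_spec(tp=39, fp=35, fn=61, tn=65)
print(round(sensitivity, 2), round(specificity, 2))  # 0.39 0.65
```

The same two formulas, applied to the hysteroscopy cross-tabulation, yield the 89% and 20% reported above.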
Procedia PDF Downloads 13016521 Towards Learning Query Expansion
Authors: Ahlem Bouziri, Chiraz Latiri, Eric Gaussier
Abstract:
The steady growth in the size of textual document collections is a key progress driver for modern information retrieval techniques, whose effectiveness and efficiency are constantly challenged. Given a user query, the number of retrieved documents can be overwhelmingly large, hampering their efficient exploitation by the user. In addition, retaining only relevant documents in a query answer is of paramount importance for effectively meeting the user's needs. In this situation, the query expansion technique offers an interesting solution for obtaining a complete answer while preserving the quality of the retained documents. This mainly relies on an accurate choice of the terms added to the initial query. Interestingly, query expansion takes advantage of large text volumes by extracting statistical information about index-term co-occurrences and using it to make user queries better fit the real information needs. In this respect, a promising track consists in applying data mining methods to extract dependencies between terms, namely a generic basis of association rules between terms. The key feature of our approach is a better trade-off between the size of the mining result and the conveyed knowledge. Thus, faced with the huge number of derived association rules, and in order to select the optimal combination of query terms from the generic basis, we propose to model the problem as a classification problem and solve it using a learning algorithm such as SVM or k-means. For this purpose, we first generate a training set using a genetic-algorithm-based approach that explores the association rules space in order to find an optimal set of expansion terms improving the MAP of the search results. The experiments were performed on the SDA 95 collection, a data collection for information retrieval. The results were better in terms of both MAP and NDCG.
The main observation is that hybridizing text mining techniques and query expansion in an intelligent way allows us to incorporate the good features of both. As this is a preliminary attempt in this direction, there is large scope for enhancing the proposed method.Keywords: supervised learning, classification, query expansion, association rules
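The co-occurrence statistics underlying term association rules can be sketched on a toy index. The confidence of a rule a → b is the fraction of documents containing a that also contain b; candidate expansion terms for a query can then be ranked by it. The tiny corpus below is invented for illustration and is far simpler than a generic basis of association rules.

```python
from itertools import combinations
from collections import Counter

# Toy "index": each document reduced to its set of index terms
docs = [
    {"wind", "turbine", "blade"},
    {"wind", "turbine", "power"},
    {"wind", "energy", "power"},
    {"solar", "energy", "power"},
]

pair_counts = Counter()
term_counts = Counter()
for d in docs:
    term_counts.update(d)
    pair_counts.update(combinations(sorted(d), 2))   # unordered co-occurrences

def confidence(a, b):
    """Confidence of the rule a -> b: P(b in doc | a in doc)."""
    pair = tuple(sorted((a, b)))
    return pair_counts[pair] / term_counts[a]

# Candidate expansion terms for the query {"wind"}, ranked by confidence
candidates = {t for d in docs if "wind" in d for t in d} - {"wind"}
ranked = sorted(sorted(candidates),
                key=lambda t: confidence("wind", t), reverse=True)
print(ranked[0], round(confidence("wind", ranked[0]), 2))
```

In the approach above, such ranked candidates would feed the classification step rather than being added to the query directly.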
Procedia PDF Downloads 32516520 On Pooling Different Levels of Data in Estimating Parameters of Continuous Meta-Analysis
Authors: N. R. N. Idris, S. Baharom
Abstract:
A meta-analysis may be performed using aggregate data (AD) or individual patient data (IPD). In practice, studies may be available at both the IPD and AD levels. In this situation, both the IPD and AD should be utilised in order to maximize the available information. The statistical advantages of combining studies from different levels have not been fully explored. This study aims to quantify the statistical benefits of including available IPD when conducting a conventional summary-level meta-analysis. Simulated meta-analyses were used to assess the influence of the levels of data on the overall meta-analysis estimates based on IPD-only, AD-only, and the combination of IPD and AD (mixed data, MD), under different study scenarios. The percentage relative bias (PRB), root-mean-square error (RMSE), and coverage probability were used to assess the efficiency of the overall estimates. The results demonstrate that available IPD should always be included in a conventional meta-analysis using summary-level data, as doing so significantly increases the accuracy of the estimates. On the other hand, if more than 80% of the available data are at the IPD level, including the AD does not significantly improve the accuracy of the estimates. Additionally, combining the IPD and AD moderates the bias of the treatment-effect estimates, as the IPD tends to overestimate the treatment effects, while the AD tends to produce underestimated effect estimates. These results may provide some guidance in deciding whether significant benefit is gained by pooling the two levels of data when conducting meta-analysis.Keywords: aggregate data, combined-level data, individual patient data, meta-analysis
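The three performance criteria (PRB, RMSE, and coverage probability) can be computed with a short helper. The sketch below applies them to synthetic replications of an unbiased pooled estimate, so the expected values are roughly 0% bias, an RMSE equal to the standard error, and 95% coverage; it illustrates the metrics, not the paper's simulation design.

```python
import numpy as np

def performance_metrics(estimates, ses, true_value):
    """Percentage relative bias, RMSE, and 95% CI coverage probability
    of a set of simulated estimates with standard errors `ses`."""
    estimates, ses = np.asarray(estimates), np.asarray(ses)
    prb = 100 * (estimates.mean() - true_value) / true_value
    rmse = np.sqrt(np.mean((estimates - true_value) ** 2))
    lo, hi = estimates - 1.96 * ses, estimates + 1.96 * ses
    coverage = np.mean((lo <= true_value) & (true_value <= hi))
    return prb, rmse, coverage

# Synthetic replications of a pooled treatment-effect estimate
rng = np.random.default_rng(0)
true_theta = 0.5
est = rng.normal(true_theta, 0.1, size=2000)   # unbiased estimator
se = np.full(2000, 0.1)                        # known standard error
prb, rmse, cov = performance_metrics(est, se, true_theta)
print(round(prb, 1), round(rmse, 2), round(cov, 2))
```

Repeating this for IPD-only, AD-only, and mixed-data estimators is exactly the kind of comparison the abstract summarises.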
Procedia PDF Downloads 37516519 Thermal Behavior of a Ventilated Façade Using Perforated Ceramic Bricks
Authors: Helena López-Moreno, Antoni Rodríguez-Sánchez, Carmen Viñas-Arrebola, Cesar Porras-Amores
Abstract:
The ventilated façade has great advantages compared to traditional façades, as it reduces air-conditioning thermal loads through the stack effect induced by solar radiation in the air chamber. Optimizing energy consumption with a ventilated façade is possible not only in newly built buildings but also in existing ones, opening the field to energy retrofitting works. In this sense, the following three façade prototypes were designed, built, and analyzed in this research: a non-ventilated façade (NVF), a slightly ventilated façade (SLVF), and a strongly ventilated façade (STVF). The construction characteristics of the three façades are based on the Spanish building regulation, the "Technical Building Code". The façades were monitored with type-K thermocouples on a representative summer day in Madrid (Spain). Moreover, a repeated-measures analysis of variance (ANOVA) studying the thermal lag in the ventilated and non-ventilated façades was designed. Results show that the STVF façade presents higher levels of thermal inertia, as the thermal lag is reduced by up to 100% (daily mean) compared with the non-ventilated façade. In addition, the statistical analysis proves that increasing the ventilation hole size in the STVF façade does not significantly improve the thermal lag (p > 0.05) compared with the SLVF façade.Keywords: ventilated façade, energy efficiency, thermal behavior, statistical analysis
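One simple way to quantify the thermal lag measured in such a monitoring campaign is as the delay between the outdoor and indoor daily temperature maxima. A minimal sketch on a synthetic day with a 120-minute delay; the sampling interval and temperature waves are invented, not the thermocouple data from the study:

```python
import numpy as np

def thermal_lag(outdoor, indoor, dt_minutes=10):
    """Thermal lag estimated as the delay between the outdoor and
    indoor daily temperature maxima."""
    return (indoor.argmax() - outdoor.argmax()) * dt_minutes

# Synthetic monitoring day: indoor wave damped and delayed by 120 minutes
t = np.arange(0, 24 * 60, 10)                         # one sample per 10 min
outdoor = 25 + 8 * np.sin(2 * np.pi * t / (24 * 60))
indoor = 24 + 3 * np.sin(2 * np.pi * (t - 120) / (24 * 60))
print(thermal_lag(outdoor, indoor))                   # minutes
```

Computing this per façade and per day yields the repeated-measures data an ANOVA like the one above would compare.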
Procedia PDF Downloads 49316518 An Appraisal of the Utilization of the New International Academy of Cytology Yokohama Standardized Reporting System: A Study of Diagnostic Accuracy and Calculation of the Risk of Malignancy Along with Histopathological Correlation in Fine-Needle Aspiration
Authors: Deepika Gupta, Namita Bhutani, Mithlesh Bhargav
Abstract:
Objective: Breast cancer is the most commonly encountered lesion in females after non-melanoma skin malignancies. The triple assessment is an important approach to the pre-operative evaluation of breast lesions in developing countries. The objectives of the present study were to determine the diagnostic accuracy and calculate the ROM, along with cyto-histopathological correlation, incorporating the new IAC reporting system. Material and Methods: A total of 940 FNAC slides were retrieved from December 2019 to December 2020 and categorized according to the new IAC system. The diagnostic accuracy and the ROM, along with the cyto-histopathological correlation, were determined. Results: All the breast FNAC lesions were categorized from C1 to C5. Of the 940 cases, 358 had cyto-histopathological correlation. The ROM ranged from 0% to 99%. All the statistical parameters were calculated, along with the diagnostic accuracy, which was 97%. Conclusion: The new IAC standardized reporting system for breast FNAC encourages the utilization of rapid, accurate, and low-cost diagnostic tests and broadens the understanding and application of breast FNAC.Keywords: accuracy, fine needle aspiration cytology, IAC, risk of malignancy, Yokohama system
Procedia PDF Downloads 1116517 Group Sequential Covariate-Adjusted Response Adaptive Designs for Survival Outcomes
Authors: Yaxian Chen, Yeonhee Park
Abstract:
Driven by evolving FDA recommendations, modern clinical trials demand innovative designs that strike a balance between statistical rigor and ethical considerations. Covariate-adjusted response-adaptive (CARA) designs bridge this gap by utilizing patient attributes and responses to skew treatment allocation in favor of the treatment that is best for an individual patient's profile. However, existing CARA designs for survival outcomes often hinge on specific parametric models, constraining their applicability in clinical practice. In this article, we address this limitation by introducing a CARA design for survival outcomes (CARAS) based on the Cox model and a variance estimator. This method addresses issues of model misspecification and enhances the flexibility of the design. We also propose a group sequential overlap-weighted log-rank test to preserve the type I error rate in the context of group sequential trials, and we use extensive simulation studies to demonstrate the clinical benefit, statistical efficiency, and robustness to model misspecification of the proposed method compared with traditional randomized controlled trial designs and response-adaptive randomization designs.Keywords: Cox model, log-rank test, optimal allocation ratio, overlap weight, survival outcome
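The allocation-skewing idea behind response-adaptive designs can be sketched with a simplified binary-outcome analogue, not the paper's survival-outcome method: after each patient, the allocation probability is steered toward the RSIHR target, which assigns patients in proportion to the square roots of the estimated success probabilities. The success rates and prior pseudo-counts below are invented.

```python
import numpy as np

def rsihr_target(p_a, p_b):
    """RSIHR target allocation to arm A for binary outcomes: proportional
    to the square roots of the success probabilities."""
    return np.sqrt(p_a) / (np.sqrt(p_a) + np.sqrt(p_b))

def simulate_cara_trial(p_a=0.7, p_b=0.4, n=2000, seed=0):
    """Sequentially allocate n patients, re-estimating each arm's success
    rate and updating the allocation probability after every outcome."""
    rng = np.random.default_rng(seed)
    stats = {"A": [1, 2], "B": [1, 2]}   # [successes, totals] pseudo-counts
    n_a = 0
    for _ in range(n):
        target = rsihr_target(stats["A"][0] / stats["A"][1],
                              stats["B"][0] / stats["B"][1])
        arm = "A" if rng.uniform() < target else "B"
        outcome = rng.uniform() < (p_a if arm == "A" else p_b)
        stats[arm][0] += outcome
        stats[arm][1] += 1
        n_a += arm == "A"
    return n_a / n

share_a = simulate_cara_trial()
print(round(share_a, 2))
```

The better arm ends up receiving more than half of the patients, which is the ethical gain such designs trade against randomization simplicity; CARA designs additionally condition the target on patient covariates.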
Procedia PDF Downloads 6516516 Resistance and Sub-Resistances of RC Beams Subjected to Multiple Failure Modes
Authors: F. Sangiorgio, J. Silfwerbrand, G. Mancini
Abstract:
Geometric and mechanical properties all influence the resistance of RC structures and may, for certain combinations of property values, increase the risk of a brittle failure of the whole system. This paper presents a statistical and probabilistic investigation of the resistance of RC beams designed according to Eurocodes 2 and 8 and subjected to multiple failure modes, under both the natural variation of material properties and the uncertainty associated with cross-section and transverse-reinforcement geometry. A full probabilistic model based on the JCSS Probabilistic Model Code is derived. Different beams are studied through material nonlinear analysis via Monte Carlo simulations. The resistance model is consistent with Eurocode 2. Both a multivariate statistical evaluation and a data-clustering analysis of the outcomes are then performed. Results show that the ultimate load behaviour of RC beams subjected to flexural and shear failure modes is mainly influenced by the combination of the mechanical properties of the longitudinal reinforcement and stirrups and the tensile strength of concrete, the latter of which appears to affect the overall response of the system in a nonlinear way. The model uncertainty of the resistance model used in the analysis undoubtedly plays an important role in interpreting the results.Keywords: modelling, Monte Carlo simulations, probabilistic models, data clustering, reinforced concrete members, structural design
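The Monte Carlo core of such a reliability study reduces to sampling the resistance R and the load effect E from their assumed distributions and estimating the failure probability P(R < E). The distributions and parameters below are illustrative placeholders, not the JCSS models used in the paper:

```python
import numpy as np

# Monte Carlo estimate of the failure probability of a member whose
# resistance R and load effect E are random (illustrative distributions)
rng = np.random.default_rng(0)
n = 200_000
R = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=n)  # resistance (kNm)
E = rng.normal(loc=200.0, scale=30.0, size=n)              # load effect (kNm)
p_fail = np.mean(R < E)
print(f"p_fail = {p_fail:.4f}")
```

In a full study, each resistance sample would come from a material nonlinear analysis of one random realisation of the beam, rather than from a closed-form distribution.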
Procedia PDF Downloads 47216515 Churn Prediction for Savings Bank Customers: A Machine Learning Approach
Authors: Prashant Verma
Abstract:
Commercial banks are facing immense pressure, including financial disintermediation, interest rate volatility, and digital modes of finance. Retaining an existing customer is 5 to 25 times less expensive than acquiring a new one. This paper explores customer churn prediction based on various statistical and machine learning models and uses under-sampling to improve the predictive power of these models. The results show that, of the various machine learning models, Random Forest, which predicts churn with 78% accuracy, has been found to be the most powerful model for this scenario. Customer vintage, customer age, average balance, occupation code, population code, average withdrawal amount, and average number of transactions were found to be the variables with high predictive power for the churn prediction model. The model can be deployed by commercial banks to prevent customer churn, so that they may retain the funds kept by savings bank (SB) customers. The article suggests a customized campaign to be initiated by commercial banks to avoid SB customer churn. Hence, by providing better customer satisfaction and experience, commercial banks can limit customer churn and maintain their deposits.Keywords: savings bank, customer churn, customer retention, random forests, machine learning, under-sampling
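The under-sampling step used to sharpen the churn models can be sketched without any ML library: majority-class rows are randomly dropped until the two classes are balanced, and a classifier (e.g., a random forest) would then be trained on the balanced set. The toy data below are invented for illustration.

```python
import numpy as np

def undersample(X, y, seed=0):
    """Randomly drop majority-class rows until both classes have the
    same size -- random under-sampling for imbalanced training data."""
    rng = np.random.default_rng(seed)
    idx0, idx1 = np.flatnonzero(y == 0), np.flatnonzero(y == 1)
    if len(idx0) > len(idx1):
        idx0 = rng.choice(idx0, size=len(idx1), replace=False)
    else:
        idx1 = rng.choice(idx1, size=len(idx0), replace=False)
    keep = np.concatenate([idx0, idx1])
    rng.shuffle(keep)
    return X[keep], y[keep]

# Imbalanced toy data: 950 retained vs 50 churned customers
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 5))
y = np.r_[np.zeros(950, dtype=int), np.ones(50, dtype=int)]
Xb, yb = undersample(X, y)
print(len(yb), yb.mean())   # balanced set, half churners
```

Balancing the classes this way keeps the classifier from defaulting to "no churn" on a dataset where churners are rare, at the cost of discarding part of the majority class.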
Procedia PDF Downloads 14416514 A PROMETHEE-BELIEF Approach for Multi-Criteria Decision Making Problems with Incomplete Information
Abstract:
Multi-criteria decision aid methods consider decision problems where numerous alternatives are evaluated on several criteria. These methods are designed to deal with perfect information. However, in practice, this information requirement is clearly too strict. In fact, the imperfect data provided by more or less reliable decision makers usually affect decision results, since any decision is closely linked to the quality and availability of information. In this paper, a PROMETHEE-BELIEF approach is proposed to support multi-criteria decisions based on incomplete information. This approach solves problems with an incomplete decision matrix and unknown weights within the PROMETHEE method. On the basis of belief function theory, our approach first determines the distributions of belief masses based on PROMETHEE's net flows and then calculates the weights. Subsequently, it aggregates the mass distributions associated with each criterion using Murphy's modified combination rule in order to infer a global belief structure. The final ranking of actions is obtained via the pignistic probability transformation. A real-world case study concerning the location of a treatment center for waste from healthcare activities with infectious risk in central Tunisia is studied to illustrate the detailed process of the PROMETHEE-BELIEF approach.Keywords: belief function theory, incomplete information, multiple criteria analysis, PROMETHEE method
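PROMETHEE's net flows, which the proposed approach feeds into the belief-mass construction, can be computed compactly. The sketch below uses the "usual" preference function (criterion-wise preference is 1 when one alternative strictly beats another, 0 otherwise); the alternatives, scores, and weights are invented for illustration.

```python
import numpy as np

def promethee_net_flows(scores, weights):
    """PROMETHEE II net flows with the 'usual' preference function:
    the aggregated preference of a over b sums the weights of the
    criteria on which a strictly beats b; net flow = positive - negative."""
    n = scores.shape[0]
    pref = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                pref[i, j] = weights[scores[i] > scores[j]].sum()
    phi_plus = pref.sum(axis=1) / (n - 1)    # how strongly i dominates
    phi_minus = pref.sum(axis=0) / (n - 1)   # how strongly i is dominated
    return phi_plus - phi_minus

# 3 candidate sites evaluated on 3 criteria (all maximised); weights sum to 1
scores = np.array([[7.0, 5.0, 9.0],
                   [6.0, 8.0, 4.0],
                   [8.0, 6.0, 5.0]])
weights = np.array([0.5, 0.3, 0.2])
phi = promethee_net_flows(scores, weights)
print(phi.argmax())   # index of the best-ranked alternative
```

In the full approach, these flows are not used to rank directly but to build belief mass distributions, which are then combined and converted through the pignistic transformation.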
Procedia PDF Downloads 16716513 An Improved Dynamic Window Approach with Environment Awareness for Local Obstacle Avoidance of Mobile Robots
Authors: Baoshan Wei, Shuai Han, Xing Zhang
Abstract:
Local obstacle avoidance is critical for mobile robot navigation, and it is a challenging task to ensure path optimality and safety in cluttered environments. In this paper, we propose an Environment-Aware Dynamic Window Approach to cope with this issue. The method integrates environment characterization into the Dynamic Window Approach (DWA) through two strategies. The local-goal strategy guides the robot to move through openings before approaching the final goal, which solves the local-minima problem of DWA. The adaptive control strategy enables the robot to adjust its state according to the environment, which improves path safety compared with DWA. Moreover, the evaluation shows that the paths generated by the proposed algorithm are safer and smoother than those of state-of-the-art algorithms.Keywords: adaptive control, dynamic window approach, environment aware, local obstacle avoidance, mobile robots
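A minimal version of the underlying DWA loop helps make the proposed extensions concrete: sample velocity pairs reachable within one control period, roll each out as a constant-velocity arc, discard colliding trajectories, and score the rest on heading, clearance, and speed. This is a bare-bones sketch with invented weights and limits, not the authors' environment-aware variant.

```python
import numpy as np

def dwa_choose(v0, w0, goal, obstacles, dt=0.1, horizon=1.0,
               a_max=0.5, alpha_max=1.0, clearance_min=0.3):
    """One step of a minimal DWA: sample (v, w) pairs inside the dynamic
    window, forward-simulate each, and keep the best admissible one."""
    best, best_score = (0.0, 0.0), -np.inf
    for v in np.linspace(max(0.0, v0 - a_max * dt), v0 + a_max * dt, 5):
        for w in np.linspace(w0 - alpha_max * dt, w0 + alpha_max * dt, 7):
            x = y = th = 0.0
            collided = False
            for _ in range(int(round(horizon / dt))):
                th += w * dt
                x += v * np.cos(th) * dt
                y += v * np.sin(th) * dt
                if min(np.hypot(x - ox, y - oy)
                       for ox, oy in obstacles) < clearance_min:
                    collided = True          # inadmissible trajectory
                    break
            if collided:
                continue
            heading = -np.hypot(goal[0] - x, goal[1] - y)  # closer is better
            clearance = min(np.hypot(x - ox, y - oy) for ox, oy in obstacles)
            score = 1.0 * heading + 0.2 * min(clearance, 1.0) + 0.1 * v
            if score > best_score:
                best_score, best = score, (v, w)
    return best

# Goal straight ahead, one obstacle well off the path: the robot
# should accelerate to the top of its dynamic window and go straight
v, w = dwa_choose(v0=0.3, w0=0.0, goal=(2.0, 0.0), obstacles=[(1.0, 1.0)])
print(round(v, 2))
```

The local-goal strategy described above would replace `goal` with a nearby opening, and the adaptive control strategy would tune the window limits and scoring weights to the environment.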
Procedia PDF Downloads 15916512 Effect of Be, Zr, and Heat Treatment on Mechanical Behavior of Cast Al-Mg-Zn-Cu Alloys (7075)
Authors: Mahmoud M. Tash
Abstract:
The present study was undertaken to investigate the effect of aging parameters (time and temperature) on the mechanical properties of Be- and/or Zr-treated Al-Mg-Zn (7075) alloys. Ultimate tensile strength, 0.5% offset yield strength, and % elongation measurements were carried out on specimens prepared from cast and heat-treated 7075 alloys containing Be and/or Zr. Different aging treatments were carried out on the solution-heat-treated (SHT) specimens: natural and artificial aging at room temperature, 120 °C, 150 °C, 180 °C, and 220 °C for different periods of time. Duplex aging was also performed for the SHT condition (pre-aging at different times and temperatures followed by high-temperature aging). Ultimate tensile strength, yield strength, and % elongation results as a function of the different aging parameters are analysed. A statistical design of experiments (DOE) approach using a fractional factorial design is applied to acquire an understanding of the effects of these variables and their interactions on the mechanical properties of Be- and/or Zr-treated 7075 alloys. Mathematical models are developed to relate the alloy mechanical properties to the different aging parameters.Keywords: casting, aging treatment, mechanical properties, Al-Mg-Zn alloys, Be- and/or Zr-treatment, experimental correlation
Procedia PDF Downloads 36516511 Applying Sliding Autonomy for a Human-Robot Team on USARSim
Authors: Fang Tang, Jacob Longazo
Abstract:
This paper describes a sliding autonomy approach for coordinating a team of robots that assists a human operator in accomplishing tasks while adapting to new or unexpected situations by requesting help from the operator. While sliding autonomy has been well studied in the context of controlling a single robot, much work remains to apply it to a multi-robot team, especially a human-robot team. Our approach builds a hierarchical sliding control structure with components that support human-robot collaboration. We validated our approach in the USARSim simulation and demonstrated that the human-robot team's overall performance can be improved under sliding autonomy control.Keywords: sliding autonomy, multi-robot team, human-robot collaboration, USARSim
Procedia PDF Downloads 546