Search results for: parameter estimation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3754

1174 Comparative Fragility Analysis of Shallow Tunnels Subjected to Seismic and Blast Loads

Authors: Siti Khadijah Che Osmi, Mohammed Ahmad Syed

Abstract:

Underground structures are crucial components that require detailed analysis and design. Tunnels, for instance, are constructed extensively as transportation infrastructure and utility networks, especially in urban environments. Considering their prime importance to the economy and public safety, which cannot be compromised, any instability in these tunnels will be highly detrimental to their performance. Recent experience suggests that tunnels become vulnerable during earthquake and blast scenarios. However, only a limited number of studies have been carried out to understand the dynamic response and performance of underground tunnels under such unpredictable extreme hazards. In view of the importance of enhancing the resilience of these structures, the overall aim of the study is to evaluate the probabilistic future performance of shallow tunnels subjected to seismic and blast loads by developing a detailed fragility analysis. Critical non-linear time-history numerical analyses were carried out using the finite element software Midas GTS NX, taking into consideration structural typology, ground motion and explosive characteristics, the effect of soil conditions and other associated uncertainties on tunnel integrity, which may ultimately lead to catastrophic failure of the structures. The proposed fragility curves for both extreme loadings are discussed and compared, providing significant information on the performance of the tunnel under extreme hazards that may be beneficial for future risk assessment and loss estimation.
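Fragility curves of this kind are typically expressed as a lognormal cumulative distribution of the intensity measure. The sketch below shows how such curves can be evaluated; the median capacities and dispersions are purely illustrative placeholders, not values from this study.

```python
# Hypothetical sketch of a lognormal fragility curve, as commonly used in
# seismic/blast fragility analysis; medians and dispersions are illustrative.
import numpy as np
from scipy.stats import norm

def fragility(im, median, beta):
    """P(damage state exceeded | intensity measure im) for a lognormal model."""
    return norm.cdf(np.log(im / median) / beta)

pga = np.linspace(0.05, 2.0, 50)                       # peak ground acceleration (g)
p_moderate = fragility(pga, median=0.45, beta=0.55)    # assumed damage state
p_extensive = fragility(pga, median=0.90, beta=0.60)   # assumed damage state

for g, p1, p2 in zip(pga[::10], p_moderate[::10], p_extensive[::10]):
    print(f"PGA={g:.2f} g  P(moderate)={p1:.2f}  P(extensive)={p2:.2f}")
```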

Keywords: fragility analysis, seismic loads, shallow tunnels, blast loads

Procedia PDF Downloads 343
1173 Variability of Climatic Elements in Nigeria Over Recent 100 Years

Authors: T. Salami, O. S. Idowu, N. J. Bello

Abstract:

Climatic variability is an essential issue when dealing with climate change. The variability of key climate parameters helps to determine how variable the climatic condition of a region will be. The most important of these climatic variables, which largely determine the climatic condition in an area, are temperature and precipitation. This research deals with long-term climatic variability in Nigeria. Variables examined in this analysis include near-surface temperature, near-surface minimum temperature, maximum temperature, relative humidity, vapour pressure, precipitation, wet-day frequency and cloud cover, using data covering 1901-2010. Regression and EOF analyses were carried out. Results show that the annual average, minimum and maximum near-surface temperatures all increased gradually from 1901 to 2010, in both the wet season and the dry season. The linear trends in minimum near-surface temperature are significant for the annual, wet-season and dry-season means. Moreover, the decrease in the diurnal temperature range over the recent 100 years implies that the minimum near-surface temperature has increased more than the maximum. Both precipitation and wet-day frequency declined over the analysis period, demonstrating that Nigeria has become drier in terms of rainfall. Temperature and precipitation variability has become very high during these periods, especially in the northern areas. Areas that had excessive rainfall were confronted with flooding and related issues, while areas that had less precipitation were confronted with drought. More practical issues will be presented.
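The two analysis steps named here, a least-squares linear trend per climate series and an EOF decomposition, can be sketched as follows; the synthetic series and station field below merely stand in for the 1901-2010 records.

```python
# Illustrative sketch of the analysis steps named in the abstract: a linear
# trend per series and an EOF decomposition via SVD, on synthetic data.
import numpy as np

years = np.arange(1901, 2011)
n = years.size
rng = np.random.default_rng(0)
tmin = 22.0 + 0.012 * (years - 1901) + rng.normal(0, 0.3, n)   # synthetic series

# Linear trend (deg C per decade) by ordinary least squares
slope, intercept = np.polyfit(years, tmin, 1)
print(f"Minimum temperature trend: {slope * 10:.3f} deg C per decade")

# EOF analysis: anomalies of several stations -> SVD gives spatial patterns
field = rng.normal(size=(n, 8))                # rows: years, cols: stations
anom = field - field.mean(axis=0)
u, s, vt = np.linalg.svd(anom, full_matrices=False)
explained = s**2 / np.sum(s**2)
print("Variance explained by first EOF:", round(explained[0], 3))
```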

Keywords: climate, variability, flooding, excessive rainfall

Procedia PDF Downloads 382
1172 Validation of SWAT Model for Prediction of Water Yield and Water Balance: Case Study of Upstream Catchment of Jebba Dam in Nigeria

Authors: Adeniyi G. Adeogun, Bolaji F. Sule, Adebayo W. Salami, Michael O. Daramola

Abstract:

Estimation of water yield and water balance in a river catchment is critical to the sustainable management of water resources at the watershed level in any country. Therefore, in the present study, the Soil and Water Assessment Tool (SWAT) interfaced with a Geographical Information System (GIS) was applied as a tool to predict the water balance and water yield of a catchment area in Nigeria. The catchment area, which covers 12,992 km², is located upstream of the Jebba hydropower dam in the north-central part of Nigeria. In this study, data on the observed flow were collected and compared with the flow simulated using SWAT. The correlation between the two data sets was evaluated using statistical measures such as the Nash-Sutcliffe Efficiency (NSE) and the coefficient of determination (R²). The model output shows a good agreement between the observed and simulated flow, as indicated by NSE and R² values greater than 0.7 for both the calibration and validation periods. A total of 42,733 mm of water was predicted by the calibrated model as the water yield potential of the basin for the simulation period 1985 to 2010. This performance suggests that SWAT could be a promising tool for predicting water balance and water yield in the sustainable management of water resources. In addition, SWAT could be applied to other basins in Nigeria as a decision support tool for sustainable water management.
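The two goodness-of-fit measures used, NSE and R², can be computed directly from paired observed and simulated flows; the flow values in this sketch are placeholders, not data from the study.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe Efficiency."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def r_squared(obs, sim):
    """Coefficient of determination from the Pearson correlation."""
    r = np.corrcoef(obs, sim)[0, 1]
    return r ** 2

observed  = [120.0, 95.0, 60.0, 180.0, 210.0, 75.0]   # placeholder flows (m3/s)
simulated = [110.0, 100.0, 65.0, 170.0, 205.0, 80.0]
print(f"NSE = {nse(observed, simulated):.2f}, R2 = {r_squared(observed, simulated):.2f}")
```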

Keywords: GIS, modeling, sensitivity analysis, SWAT, water yield, watershed level

Procedia PDF Downloads 437
1171 Measurement of the Quadriceps Angle with Respect to Various Body Parameters in Arab Countries

Authors: Ramada R. Khasawneh, Mohammed Z. Allouh, Ejlal Abu-El Rub

Abstract:

The quadriceps angle (Q angle), formed between the quadriceps muscles and the patellar tendon, is considered clinically to be a very important parameter that reflects the biomechanical effect of the quadriceps muscle on the knee, and it is also regarded as a crucial factor for the proper posture and movement of the patella. This study was conducted to measure the normal range of Q angle values in Arab nationals and to determine the correlation between Q angle values and several body parameters, including gender, height, weight, dominant side, and the condylar distance of the femur. The study included 500 healthy Arab students from Yarmouk University and Jordan University of Science and Technology. The Q angle of the volunteers was measured using a universal manual goniometer with the subjects in the upright weight-bearing position. It was found that the Q angle was greater in women than in men. The analysis of the data revealed a non-significant increase in the Q angle on the dominant side. In addition, the Q angle was significantly higher in taller people of both sexes. However, the Q angle did not show any considerable correlation with weight in the study population; conversely, a link with the condylar distance of the femur was observed in both sexes. It was also noticed that the Q angle increased remarkably when the condylar distance increased. Consequently, gender, height, and condylar distance were significant factors influencing the Q angle in our study sample, whereas weight and dominance did not show any influence on the values.

Keywords: Q angle, Jordanian, anatomy, condylar distance

Procedia PDF Downloads 144
1170 Estimation of Asphalt Pavement Surfaces Using Image Analysis Technique

Authors: Mohammad A. Khasawneh

Abstract:

Asphalt concrete pavements gradually lose their skid resistance, causing safety problems especially under wet conditions and high driving speeds. In order to replicate the actual field polishing and wearing process of asphalt pavement surfaces in a laboratory setting, several laboratory-scale accelerated polishing devices were developed by different agencies. To mimic the actual process, friction and texture measuring devices are needed to quantify surface deterioration at different polishing intervals that reflect different stages of the pavement life. The test could still be considered lengthy and, to some extent, labor-intensive. Therefore, there is a need for another method that can assist in investigating the bituminous pavement surface characteristics in a practical and time-efficient test procedure. The purpose of this paper is to utilize a well-developed image analysis technique to characterize asphalt pavement surfaces without the need for conventional friction and texture measuring devices, in an attempt to shorten and simplify the polishing procedure in the lab. Promising findings showed the possibility of using image analysis in lieu of the labor-intensive and inherently variable friction and texture measurements. It was found that the exposed aggregate surface area of asphalt specimens made from limestone and gravel aggregates provided solid evidence of the validity of this method in describing asphalt pavement surfaces. Image analysis results correlated well with the British Pendulum Number (BPN), Polish Value (PV) and Mean Texture Depth (MTD) values.

Keywords: friction, image analysis, polishing, statistical analysis, texture

Procedia PDF Downloads 304
1169 Unsupervised Feature Learning by Pre-Route Simulation of Auto-Encoder Behavior Model

Authors: Youngjae Jin, Daeshik Kim

Abstract:

This paper describes cycle-accurate simulation results for weight values learned by an auto-encoder behavior model in terms of pre-route simulation. Given these results, we visualized the first-layer representations with natural images. Many common deep learning threads have focused on learning high-level abstractions of unlabeled raw data by unsupervised feature learning. However, in the process of handling such a huge amount of data, the computational complexity and run time of the learning methods limited advanced research. These limitations came from the fact that these algorithms were computed using only single-core CPUs. For this reason, parallel hardware such as FPGAs was seen as a possible solution to overcome these limitations. We adopted and simulated a ready-made auto-encoder to design a behavior model in Verilog HDL before designing the hardware. With the auto-encoder behavior model pre-route simulation, we obtained cycle-accurate results for the parameters of each hidden layer using MODELSIM. Cycle-accurate results are a very important factor in designing parallel digital hardware. Finally, this paper shows an appropriate operation of the behavior-model-based pre-route simulation. Moreover, we visualized the learned latent representations of the first hidden layer with the Kyoto natural image dataset.
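For reference, the weight learning that the behavior model reproduces can be sketched in software as a minimal single-hidden-layer auto-encoder; the patch size, hidden-unit count and learning rate below are illustrative choices, not those of the paper.

```python
# Minimal numpy auto-encoder trained on random "image patches", sketching the
# kind of first-layer weight learning the behavior model simulates.
import numpy as np

rng = np.random.default_rng(1)
X = rng.random((1000, 64))                  # 1000 patches of 8x8 "pixels"
n_in, n_hidden, lr = 64, 25, 0.1

W1 = rng.normal(0, 0.1, (n_in, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.1, (n_hidden, n_in)); b2 = np.zeros(n_in)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for epoch in range(200):
    h = sigmoid(X @ W1 + b1)                # encode
    out = sigmoid(h @ W2 + b2)              # decode
    err = out - X                           # reconstruction error
    grad_out = err * out * (1 - out)
    grad_h = (grad_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ grad_out / len(X); b2 -= lr * grad_out.mean(axis=0)
    W1 -= lr * X.T @ grad_h / len(X);   b1 -= lr * grad_h.mean(axis=0)

print("final reconstruction MSE:", round(float((err ** 2).mean()), 4))
# The columns of W1 are the first-layer features one would visualize.
```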

Keywords: auto-encoder, behavior model simulation, digital hardware design, pre-route simulation, unsupervised feature learning

Procedia PDF Downloads 445
1168 Potassium Acetate - Coconut Shell Activated Carbon for Adsorption of Benzene and Toluene: Equilibrium and Kinetic Studies

Authors: Jibril Mohammed, Usman Dadum Hamza, Abdulsalam Surajudeen, Baba Yahya Danjuma

Abstract:

Considerable concerns have been raised over the presence of volatile organic compounds (VOCs) in water. In this study, coconut shell based activated carbon was produced through chemical activation with potassium acetate (PAAC) for the adsorption of benzene and toluene. The porous carbons were characterized using Fourier transform infrared spectroscopy (FTIR), thermogravimetric analysis (TGA), scanning electron microscopy (SEM), proximate analysis, ultimate analysis and nitrogen adsorption tests. Adsorption of benzene and toluene on the porous carbons was conducted at varying concentrations (50-250 mg/l). The prepared adsorbent, with a high BET surface area of 622 m²/g and a highly heteroporous structure, gave good removal efficiencies of 79% and 82% for benzene and toluene, respectively, at a 32% yield. Equilibrium data were fitted to the Langmuir, Freundlich and Temkin isotherms, with all models giving R² > 0.94. The equilibrium data were best represented by the Langmuir isotherm, with maximum adsorption capacities of 192 mg/g and 227 mg/g for benzene and toluene, respectively. The Webber and Chakkravorti equilibrium parameter (RL) values lie between 0 and 1, confirming the favourability of the Langmuir model. The adsorption kinetics were found to follow the pseudo-second-order kinetic model. The PAAC produced can be used effectively to mitigate the environmental pollution problems posed by VOCs through a sustainable process.
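The Langmuir fit and the separation factor RL = 1/(1 + KL·C0) reported here can be reproduced along the following lines; the equilibrium data in the sketch are placeholders in the 50-250 mg/l range, not the measured values.

```python
# Sketch of fitting the Langmuir isotherm q = qm*KL*Ce/(1+KL*Ce) and computing
# the separation factor RL = 1/(1+KL*C0); all data points are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(ce, qm, kl):
    return qm * kl * ce / (1.0 + kl * ce)

ce = np.array([5.0, 15.0, 40.0, 80.0, 130.0])      # equilibrium conc. (mg/l)
qe = np.array([55.0, 110.0, 150.0, 175.0, 185.0])  # uptake (mg/g), illustrative

(qm, kl), _ = curve_fit(langmuir, ce, qe, p0=[200.0, 0.05])
c0 = np.array([50.0, 100.0, 150.0, 200.0, 250.0])  # initial concentrations
rl = 1.0 / (1.0 + kl * c0)
print(f"qm = {qm:.1f} mg/g, KL = {kl:.3f} l/mg")
print("RL values (0 < RL < 1 indicates favourable adsorption):", np.round(rl, 3))
```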

Keywords: adsorption, equilibrium and kinetics studies, potassium acetate, water treatment

Procedia PDF Downloads 218
1167 Saltwater Intrusion Studies in the Cai River in the Khanh Hoa Province, Vietnam

Authors: B. Van Kessel, P. T. Kockelkorn, T. R. Speelman, T. C. Wierikx, C. Mai Van, T. A. Bogaard

Abstract:

Saltwater intrusion is a common problem in estuaries around the world, as it can hinder the freshwater supply of coastal zones. This problem is likely to grow due to climate change and sea-level rise. The influence of these factors on saltwater intrusion was investigated for the Cai River in the Khanh Hoa province in Vietnam. In addition, the Cai River has high seasonal fluctuations in discharge, leading to increased saltwater intrusion during the dry season. Sea-level rise, river discharge changes, river mouth widening and a proposed saltwater intrusion prevention dam can all influence saltwater intrusion but have not been quantified for the Cai River estuary. This research used both an analytical and a numerical model to investigate the effect of the aforementioned factors. The analytical model was based on the model proposed by Savenije and was calibrated using limited in situ data. The numerical model was a 3D hydrodynamic model built with the Delft3D4 software. Both the analytical and numerical models agreed with in situ data, mostly for tidally averaged data. Both models indicated a roughly similar dependence on discharge and agreed that this parameter had the most severe influence on the modeled saltwater intrusion. Especially for discharges below 10 m³/s, the saltwater was predicted to reach further than 10 km upstream. In the models, both sea-level rise and river widening mainly resulted in salinity increments of up to 3 kg/m³ in the middle part of the river. The predicted sea-level rise in 2070 was simulated to lead to an increase of 0.5 km in saltwater intrusion length. Furthermore, the effect of the saltwater intrusion dam appeared significant in the model used, but only for the highest position of the gate.

Keywords: Cai River, hydraulic models, river discharge, saltwater intrusion, tidal barriers

Procedia PDF Downloads 109
1166 Characteristics and Item Parameters Fitness on Chemistry Teacher-Made Test Instrument

Authors: Rizki Nor Amelia, Farida A. Setiawati

Abstract:

This study aimed to: (1) describe the characteristics of a teacher-made test instrument used to measure students' ability in chemistry, and (2) assess the compatibility between the difficulty levels set by the teachers and the difficulty levels obtained from the empirical results. Based on these objectives, this was a descriptive study. The analysis used the Rasch model and Chi-square statistics. The Rasch model analysis was based on the response patterns of high school students to the teacher-made test instrument for the chemistry subject in the 2015/2016 academic year in Yogyakarta. The sample of this research comprised 358 students selected by a cluster random sampling technique. The analysis showed that: (1) the teacher-made test instrument has a medium mean difficulty level and is capable of measuring ability in the interval -0.259 ≤ θ ≤ 0.659 logits, with the maximum of the test information function, 18.187, obtained at an ability of +0.2 logits; (2) all items categorized as either easy or difficult by the Rasch model matched the teachers' judgment, while of the 37 items categorized as medium by the Rasch model, 8.10% and 10.81% were categorized by the teachers as easy and difficult items, respectively, with the others categorized as medium. Overall, the distribution of difficulty levels formulated by the teachers does not match the distribution of difficulty levels based on the empirical results.
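In the Rasch model, the probability of a correct response is P(θ, b) = exp(θ − b) / (1 + exp(θ − b)) and the test information function is the sum of P(1 − P) over items. The sketch below uses illustrative item difficulties, not the estimated ones, to show how the information peak reported above is located.

```python
# Sketch of the Rasch-model quantities referred to in the abstract: item
# response probability and the test information function I(theta) = sum P(1-P).
import numpy as np

def p_correct(theta, b):
    return 1.0 / (1.0 + np.exp(-(theta - b)))

difficulties = np.array([-1.2, -0.5, -0.1, 0.0, 0.3, 0.7, 1.1])  # assumed b_i
thetas = np.linspace(-3, 3, 121)
info = np.array([np.sum(p_correct(t, difficulties) *
                        (1 - p_correct(t, difficulties))) for t in thetas])

best = thetas[np.argmax(info)]
print(f"Test information peaks at theta = {best:.2f} logits with I = {info.max():.3f}")
```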

Keywords: chemistry, items parameter fitness, Rasch model, teacher-made test

Procedia PDF Downloads 237
1165 Optimization of Economic Order Quantity of Multi-Item Inventory Control Problem through Nonlinear Programming Technique

Authors: Prabha Rohatgi

Abstract:

To obtain efficient control over the huge inventory of drugs in the pharmacy department of any hospital, the medicines are generally first categorized on the basis of their cost using 'ABC' (Always Better Control) analysis and then categorized on the basis of their criticality using 'VED' (Vital, Essential, Desirable) analysis for prioritization. About one-third of the annual expenditure of a hospital is spent on medicines. To minimize the inventory investment, the hospital management may wish to keep the medicines inventory low, as medicines are perishable items. The main aim of every hospital is to provide better services to patients under limited resources. To achieve a satisfactory level of health care services for outdoor patients, a hospital has to keep an eye on the wastage of medicines, because expired medicines cause a great loss of money from a budget that is limited and allocated for a particular period of time. The objective of this study is to identify the categories of medicines requiring intensive managerial control. In this paper, to minimize the total inventory cost and the cost associated with the wastage of money due to the expiry of medicines, an inventory control model is used as an estimation tool and a nonlinear programming technique is then applied under a limited budget and a fixed number of orders to be placed in a limited time period. Numerical computations are given and show that, by using scientific methods in hospital services, inventory can be managed more effectively under limited resources and better health care services can be provided. The secondary data were collected from a hospital to provide empirical evidence.
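A minimal version of the constrained multi-item EOQ formulation described here can be written as a nonlinear program; the demands, costs and budget below are hypothetical, and the constraint bounds the average inventory investment.

```python
# Sketch of a multi-item EOQ problem under a budget constraint, solved as a
# nonlinear program with scipy; all figures are illustrative placeholders.
import numpy as np
from scipy.optimize import minimize

demand  = np.array([1200.0, 800.0, 500.0])   # annual demand per drug
order_c = np.array([50.0, 40.0, 60.0])       # ordering cost per order
hold_c  = np.array([2.0, 3.0, 1.5])          # holding cost per unit per year
unit_c  = np.array([10.0, 25.0, 8.0])        # purchase price per unit
budget  = 3000.0                             # limit on average inventory value

def total_cost(q):                           # ordering + holding cost
    return np.sum(demand / q * order_c + q / 2.0 * hold_c)

cons = ({"type": "ineq",                     # budget on average stock value
         "fun": lambda q: budget - np.sum(unit_c * q / 2.0)},)
res = minimize(total_cost, x0=np.full(3, 100.0),
               bounds=[(1.0, None)] * 3, constraints=cons)
print("Constrained EOQs:", np.round(res.x, 1), " total cost:", round(res.fun, 1))
```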

Keywords: ABC-VED inventory classification, multi item inventory problem, nonlinear programming technique, optimization of EOQ

Procedia PDF Downloads 254
1164 Big Data in Telecom Industry: Effective Predictive Techniques on Call Detail Records

Authors: Sara ElElimy, Samir Moustafa

Abstract:

Mobile network operators are starting to face many challenges in the digital era, especially with high demands from customers. Since mobile network operators are considered a source of big data, traditional techniques are not effective in the new era of big data, the Internet of Things (IoT) and 5G; as a result, handling different big datasets effectively becomes a vital task for operators with the continuous growth of data and the move from long term evolution (LTE) to 5G. So, there is an urgent need for effective big data analytics to predict future demands, traffic, and network performance in order to fulfil the requirements of the fifth generation of mobile network technology. In this paper, we introduce data science techniques using machine learning and deep learning algorithms: the autoregressive integrated moving average (ARIMA), Bayesian-based curve fitting, and a recurrent neural network (RNN) are employed in a data-driven application for mobile network operators. The main framework of these models includes identification of the parameters of each model, estimation, prediction, and a final data-driven application of this prediction to business and network performance applications. These models are applied to the Telecom Italia Big Data Challenge call detail records (CDRs) datasets. The performance of these models, evaluated using specific well-known criteria, shows that ARIMA (the machine learning-based model) is more accurate as a predictive model on such a dataset than the RNN (the deep learning model).
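The ARIMA step of this framework (identification of the order, estimation, and prediction) can be sketched on an aggregated CDR traffic series as follows; the synthetic hourly series and the (p, d, q) order are illustrative rather than tuned to the Telecom Italia data.

```python
# Sketch of the ARIMA step on a synthetic aggregated CDR traffic series.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(7)
hours = pd.date_range("2013-11-01", periods=24 * 14, freq="H")
traffic = 100 + 30 * np.sin(2 * np.pi * hours.hour / 24) + rng.normal(0, 5, len(hours))
series = pd.Series(traffic, index=hours)

model = ARIMA(series, order=(2, 1, 2)).fit()   # identification -> estimation
forecast = model.forecast(steps=24)            # prediction for the next day
print(forecast.head())
```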

Keywords: big data analytics, machine learning, CDRs, 5G

Procedia PDF Downloads 138
1163 Prevalence of Workplace Bullying in Hong Kong: A Latent Class Analysis

Authors: Catalina Sau Man Ng

Abstract:

Workplace bullying is generally defined as a form of direct and indirect maltreatment at work, including harassing, offending, socially isolating someone or negatively affecting someone's work tasks. Workplace bullying is unfortunately commonplace around the world, which makes it a social phenomenon worth researching. However, the measurements and estimation methods of workplace bullying appear to be diverse across studies, leading to dubious results. Hence, this paper attempts to examine the prevalence of workplace bullying in Hong Kong using the latent class analysis approach. It is often argued that the traditional classification of workplace bullying into the dichotomous 'victims' and 'non-victims' may not fully represent the complex phenomenon of bullying. By treating workplace bullying as one latent variable and examining the potential categorical distribution within that latent variable, a more thorough understanding of workplace bullying in real-life situations may be provided. As a result, this study adopts a latent class analysis method, which has previously been shown to have higher construct and predictive validity. In the present study, a representative sample of 2814 employees (male: 54.7%, female: 45.3%) in Hong Kong was recruited. The participants were asked to fill in a self-reported questionnaire which included measures such as the Chinese Workplace Bullying Scale (CWBS) and the Chinese version of the Depression Anxiety Stress Scale (DASS). It is estimated that four latent classes will emerge: 'non-victims', 'seldom bullied', 'sometimes bullied', and 'victims'. The results for each latent class and the implications of the study will also be discussed in this working paper.
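Latent class analysis of binary bullying-exposure items can be viewed as fitting a mixture of independent Bernoulli variables by EM; the sketch below simulates responses and recovers class proportions, with four classes chosen only to mirror the classes hypothesised above.

```python
# Sketch of a latent class model for binary items fitted by EM as a mixture of
# independent Bernoulli variables; the responses are simulated, not study data.
import numpy as np

rng = np.random.default_rng(42)
n, items, k = 2814, 9, 4
true_p = rng.uniform(0.05, 0.9, (k, items))
z = rng.integers(0, k, n)
X = (rng.random((n, items)) < true_p[z]).astype(float)   # simulated responses

pi = np.full(k, 1.0 / k)                     # class proportions
p = rng.uniform(0.2, 0.8, (k, items))        # item-endorsement probabilities
for _ in range(200):                         # EM iterations
    log_lik = (X @ np.log(p).T) + ((1 - X) @ np.log(1 - p).T) + np.log(pi)
    log_lik -= log_lik.max(axis=1, keepdims=True)
    resp = np.exp(log_lik)
    resp /= resp.sum(axis=1, keepdims=True)  # E-step: class responsibilities
    pi = resp.mean(axis=0)                   # M-step
    p = np.clip((resp.T @ X) / resp.sum(axis=0)[:, None], 1e-4, 1 - 1e-4)

print("Estimated class proportions:", np.round(np.sort(pi)[::-1], 3))
```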

Keywords: latent class analysis, prevalence, survey, workplace bullying

Procedia PDF Downloads 329
1162 Unlocking the Puzzle of Borrowing Adult Data for Designing Hybrid Pediatric Clinical Trials

Authors: Rajesh Kumar G

Abstract:

A challenging aspect of any clinical trial is to carefully plan the study design to meet the study objective in an optimum way and to validate the assumptions made during protocol design. When it is a pediatric study, there is the added challenge of stringent guidelines and difficulty in recruiting the necessary subjects. Unlike adult trials, there is not much historical data available for pediatrics, which is required to validate assumptions when planning pediatric trials. Typically, pediatric studies are initiated as soon as approval is obtained for a drug to be marketed for adults, so with the historical information from the adult study and the available pediatric pilot study data or simulated pediatric data, the pediatric study can be well planned. Generalizing a historical adult study to a new pediatric study is a tedious task; however, it is possible by integrating various statistical techniques and utilizing the advantages of a hybrid study design, which helps to achieve the study objective more smoothly even in the presence of many constraints. This research paper explains how a hybrid study design can be planned together with an integrated technique (SEV) for planning the pediatric study. In brief, the SEV technique (Simulation, Estimation using borrowed adult data and Bayesian methods, and Validation) incorporates simulating the planned study data and obtaining the desired estimates to validate the assumptions. This method of validation can be used to improve the accuracy of the data analysis, ensuring that results are as valid and reliable as possible, which allows informed decisions to be made well ahead of study initiation. Based on the collected data, this technique allows insight to be gained into best practices when using data from a historical study and simulated data alike.

Keywords: adaptive design, simulation, borrowing data, Bayesian model

Procedia PDF Downloads 75
1161 Application of Nonparametric Geographically Weighted Regression to Evaluate the Unemployment Rate in East Java

Authors: Sifriyani Sifriyani, I Nyoman Budiantara, Sri Haryatmi, Gunardi Gunardi

Abstract:

East Java Province ranks first among Indonesian provinces in the number of regencies and cities and has the largest population. In 2015, the population reached 38,847,561, a figure that reflects very high population growth. High population growth is feared to lead to an increase in the level of unemployment. In this study, the researchers mapped and modeled the unemployment rate with six variables assumed to influence it. Modeling was done by the nonparametric geographically weighted regression method with a truncated spline approach. This method was chosen because the spline method is flexible and such models tend to find their own estimate from the data. In this modeling there are knot points, the points at which the behaviour of the data changes. The optimum knot points were selected at the minimum value of the Generalized Cross Validation (GCV) criterion. Based on the research, six variables were found to affect the level of unemployment in East Java: the percentage of the population educated above high school, the rate of economic growth, the population density, the investment ratio to the total labor force, the regional minimum wage, and the ratio of the number of large- and medium-scale industries to the work force. The nonparametric geographically weighted regression model with a truncated spline approach had a coefficient of determination of 98.95% and an MSE of 0.0047.

Keywords: East Java, nonparametric geographically weighted regression, spatial, spline approach, unemployed rate

Procedia PDF Downloads 320
1160 Modelling Biological Treatment of Dye Wastewater in SBR Systems Inoculated with Bacteria by Artificial Neural Network

Authors: Yasaman Sanayei, Alireza Bahiraie

Abstract:

This paper presents a systematic methodology based on the application of artificial neural networks to a sequencing batch reactor (SBR). The SBR is a fill-and-draw biological wastewater treatment technology, which is especially suited for nutrient removal. Treating reactive dye with Sphingomonas paucimobilis bacteria in a sequencing batch reactor is a novel approach to dye removal. The influent COD, MLVSS, and reaction time were selected as the process inputs and the effluent COD and BOD as the process outputs. The best possible result for the discrete pole parameter was a = 0.44. In order to adjust the parameters of the ANN, the Levenberg-Marquardt (LM) algorithm was employed. The results predicted by the model were compared to the experimental data and showed a high correlation, with R² > 0.99 and a low mean absolute error (MAE). The results from this study reveal that the developed model is accurate and efficacious in predicting the COD and BOD parameters of the dye-containing wastewater treated by the SBR. The proposed modeling approach can be applied to other industrial wastewater treatment systems to predict effluent characteristics. Note that SBRs are normally operated with constant predefined durations of the stages, resulting in less efficient operation. Data obtained from the on-line electronic sensors installed in the SBR and from the quality control laboratory analysis have been used to develop the optimal architectures of two different ANNs. The results have shown that the developed models can be used as efficient and cost-effective predictive tools for the system analysed.
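A minimal software counterpart to this setup maps the three inputs (influent COD, MLVSS, reaction time) to an effluent quality output with a small neural network. scikit-learn does not provide Levenberg-Marquardt training, so the sketch below substitutes its 'lbfgs' solver, and the data are synthetic rather than from the SBR experiments.

```python
# Sketch of an ANN predicting an effluent quality value from influent COD,
# MLVSS and reaction time, on synthetic data with an assumed relationship.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)
X = rng.uniform([200, 1000, 2], [800, 4000, 24], size=(150, 3))  # COD, MLVSS, time
y = 0.25 * X[:, 0] - 0.01 * X[:, 1] - 2.0 * X[:, 2] + rng.normal(0, 5, 150)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(8,), solver="lbfgs",
                                   max_iter=2000, random_state=0))
model.fit(X, y)
print("R^2 on training data:", round(model.score(X, y), 3))
```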

Keywords: artificial neural network, COD removal, SBR, Sphingomonas paucimobilis

Procedia PDF Downloads 412
1159 The Signaling Power of ESG Accounting in Sub-Sahara Africa: A Dynamic Model Approach

Authors: Haruna Maama

Abstract:

Environmental, social and governance (ESG) reporting is gaining considerable attention despite being voluntary. Meanwhile, providing ESG reporting consumes resources, which raises the question of its value relevance. The study examined the impact of ESG reporting on the market value of listed firms in SSA, based on the annual and integrated reports of 276 listed sub-Saharan Africa (SSA) firms. The integrated reporting scores of the firms were analysed using a content analysis method. A multiple regression estimation technique using a GMM approach was employed for the analysis. The results revealed that ESG has a positive relationship with firms' market value, suggesting that investors are interested in the ESG information disclosure of firms in SSA. This suggests that extensive ESG disclosures are attempts by firms to obtain the approval of powerful social, political and environmental stakeholders, especially institutional investors. Furthermore, the market value evidence is consistent with signalling theory, which postulates that firms provide integrated reports as a signal to influence the behaviour of stakeholders. This finding reflects the value investors place on social, environmental and governance disclosures, which affirms the view that conventional investors care about the social, environmental and governance issues of their potential or existing investee firms. Overall, the evidence is consistent with the prediction of signalling theory. In the context of this theory, integrated reporting is seen as part of firms' overall competitive strategy to influence investors' behaviour. The findings of this study make unique contributions to knowledge and practice in corporate reporting.

Keywords: environmental accounting, ESG accounting, signalling theory, sustainability reporting, sub-saharan Africa

Procedia PDF Downloads 75
1158 Using Fuzzy Logic Decision Support System to Predict the Lifted Weight for Students at Weightlifting Class

Authors: Ahmed Abdulghani Taha, Mohammad Abdulghani Taha

Abstract:

This study aims to use the body fat percentage (%BF) and body mass index (BMI) as input parameters in a fuzzy logic decision support system to properly predict the lifted weight for students in a weightlifting class according to their abilities, instead of the traditional manner. The sample included 53 male students (age = 21.38 ± 0.71 yrs, height (Hgt) = 173.17 ± 5.28 cm, body weight (BW) = 70.34 ± 7.87 kg, body mass index (BMI) = 23.42 ± 2.06 kg.m-2, fat mass (FM) = 9.96 ± 3.15 kg and fat percentage (%BF) = 13.98 ± 3.51%) who took the weightlifting class for credit and varied in BW, Hgt, BMI and FM. BMI and %BF were taken as the input parameters of the fuzzy logic system, whereas the output parameter was the lifted weight (LW). There were statistical differences between LW values before and after using fuzzy logic (diff. 3.55 ± 2.21, P > 0.001). The percentages of the LW categories proposed by fuzzy logic were 3.77% of students to lift 1.0 fold of their body weight, 50.94% to lift 0.95 fold, 33.96% to lift 0.9 fold, 3.77% to lift 0.85 fold, and 7.55% to lift 0.8 fold of their body weight. The study concluded that the characteristic changes in body composition experienced by students undergoing weightlifting could be utilized side by side with the fuzzy logic decision support system to determine workloads consistent with the abilities of the students.
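The inference idea can be sketched by hand with triangular membership functions for BMI and %BF feeding a few rules whose outputs are the body-weight fractions listed above (0.8-1.0 BW); the membership breakpoints and rules below are illustrative assumptions, not the study's actual rule base.

```python
# Hand-rolled fuzzy inference sketch: BMI and %BF memberships drive rules whose
# weighted outputs give the suggested lifted-weight fraction of body weight.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function."""
    return max(0.0, min((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)))

def lifted_fraction(bmi, bf):
    lean  = min(tri(bmi, 16, 20, 24), tri(bf, 5, 10, 15))
    avg   = min(tri(bmi, 21, 24, 27), tri(bf, 12, 16, 20))
    heavy = min(tri(bmi, 25, 30, 36), tri(bf, 18, 25, 35))
    weights = np.array([lean, avg, heavy])
    outputs = np.array([1.00, 0.90, 0.80])      # fraction of body weight
    return float(np.dot(weights, outputs) / (weights.sum() + 1e-9))

bw, bmi, bf = 70.3, 23.4, 14.0                  # sample-average style inputs
print(f"Suggested lift: {lifted_fraction(bmi, bf) * bw:.1f} kg")
```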

Keywords: fuzzy logic, body mass index, body fat percentage, weightlifting

Procedia PDF Downloads 428
1157 Uncertainty Assessment in Building Energy Performance

Authors: Fally Titikpina, Abderafi Charki, Antoine Caucheteux, David Bigaud

Abstract:

The building sector is one of the largest energy consumers, accounting for about 40% of the final energy consumption in the European Union. Ensuring building energy performance is a scientific, technological and sociological matter. To assess a building's energy performance, the consumption predicted or estimated during the design stage is compared with the measured consumption when the building is operational. When evaluating this performance, many buildings show significant differences between the calculated and measured consumption. In order to assess the performance accurately and ensure the thermal efficiency of the building, it is necessary to evaluate the uncertainties involved not only in measurement but also those induced by the propagation of dynamic and static input data in the model being used. The evaluation of measurement uncertainty is based on both the knowledge about the measurement process and the input quantities which influence the result of the measurement. Measurement uncertainty can be evaluated within the framework of conventional statistics, as presented in the Guide to the Expression of Uncertainty in Measurement (GUM), as well as by Bayesian Statistical Theory (BST). Another choice is the use of numerical methods like Monte Carlo Simulation (MCS). In this paper, we propose to evaluate the uncertainty associated with the use of a simplified model for the estimation of the energy consumption of a given building. A detailed review and discussion of these three approaches (GUM, MCS and BST) is given. An office building has been monitored and multiple sensors have been mounted at candidate locations to obtain the required data. The monitored zone is composed of six offices and has an overall floor area of 102 m². Temperature data, electrical and heating consumption, window opening and occupancy rate are the features used in this research work.
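As an illustration of the MCS route, input uncertainties can be propagated through a deliberately simplified steady-state heating model; the model form Q = U·A·(Tin − Tout)·t and all input distributions below are assumptions made for the sketch, not the building model used in the paper.

```python
# Monte Carlo uncertainty propagation through a simplified heating model.
import numpy as np

rng = np.random.default_rng(5)
n = 100_000
U    = rng.normal(1.1, 0.10, n)      # W/m2K, overall heat-loss coefficient
A    = rng.normal(102.0, 2.0, n)     # m2, monitored floor/envelope area
Tin  = rng.normal(20.5, 0.3, n)      # deg C, sensor uncertainty
Tout = rng.normal(5.0, 1.0, n)       # deg C, weather data uncertainty
hours = 24 * 30                      # one month

Q = U * A * (Tin - Tout) * hours / 1000.0        # kWh
mean, std = Q.mean(), Q.std(ddof=1)
lo, hi = np.percentile(Q, [2.5, 97.5])
print(f"Energy use: {mean:.0f} kWh, u = {std:.0f} kWh, 95% interval [{lo:.0f}, {hi:.0f}]")
```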

Keywords: building energy performance, uncertainty evaluation, GUM, bayesian approach, monte carlo method

Procedia PDF Downloads 457
1156 Optimized Real Ground Motion Scaling for Vulnerability Assessment of Building Considering the Spectral Uncertainty and Shape

Authors: Chen Bo, Wen Zengping

Abstract:

Based on the results of previous studies, we focus on real ground motion selection and scaling methods for structural performance-based seismic evaluation using nonlinear dynamic analysis. The input earthquake ground motions should be determined appropriately to make them compatible with the site-specific hazard level considered. Thus, an optimized selection and scaling method is established that uses not only a Monte Carlo simulation method to create stochastic simulated spectra, considering the multivariate lognormal distribution of the target spectrum, but also a spectral shape parameter. Its application to structural fragility analysis is demonstrated through case studies. Compared to the previous scheme with no consideration of the uncertainty of the target spectrum, the method shown here ensures that the selected records are in good agreement with the median value, standard deviation and spectral correlation of the target spectrum, and it better reveals the uncertainty of the site-specific hazard level. Meanwhile, it can help improve computational efficiency and matching accuracy. Given the important influence of the target spectrum's uncertainty on structural seismic fragility analysis, this work can provide a reasonable and reliable basis for structural seismic evaluation under a scenario earthquake environment.

Keywords: ground motion selection, scaling method, seismic fragility analysis, spectral shape

Procedia PDF Downloads 291
1155 Non-Methane Hydrocarbons Emission during the Photocopying Process

Authors: Kiurski S. Jelena, Aksentijević M. Snežana, Kecić S. Vesna, Oros B. Ivana

Abstract:

The proliferation of electronic equipment in the photocopying environment has not only improved work efficiency but has also changed indoor air quality. Considering the number of photocopiers employed, indoor air quality might be worse than in general office environments. Determining the contribution of any type of equipment to indoor air pollution is a complex matter. Non-methane hydrocarbons are known to play an important role in air quality due to their high reactivity. The presence of hazardous pollutants in indoor air was detected in a photocopying shop in Novi Sad, Serbia. Air samples were collected and analyzed for five days, during the 8-hr working time in three time intervals, at three different sampling points. Using a multiple linear regression model and the software package STATISTICA 10, the concentrations of occupational hazards and microclimate parameters were mutually correlated. Based on the obtained multiple coefficients of determination (0.3751, 0.2389, and 0.1975), a weak positive correlation between the observed variables was determined. Small values of the F parameter indicated that there was no statistically significant difference between the concentration levels of non-methane hydrocarbons and the microclimate parameters. The results showed that the variables could be represented by the general regression model y = b0 + b1xi1 + b2xi2. The obtained regression equations allow the quantitative agreement between the variations of the variables to be measured, and thus more accurate knowledge of their mutual relations to be obtained.
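A minimal version of fitting the stated model y = b0 + b1·xi1 + b2·xi2 by ordinary least squares is sketched below; the NMHC concentrations and the choice of temperature and relative humidity as the two microclimate regressors are illustrative assumptions.

```python
# Ordinary least squares fit of y = b0 + b1*x1 + b2*x2 on made-up data.
import numpy as np

rng = np.random.default_rng(11)
temp = rng.uniform(22, 28, 15)              # x1: temperature (deg C)
rh   = rng.uniform(35, 60, 15)              # x2: relative humidity (%)
nmhc = 0.8 + 0.05 * temp + 0.01 * rh + rng.normal(0, 0.05, 15)

X = np.column_stack([np.ones_like(temp), temp, rh])
coef, *_ = np.linalg.lstsq(X, nmhc, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((nmhc - pred) ** 2) / np.sum((nmhc - nmhc.mean()) ** 2)
print("b0, b1, b2 =", np.round(coef, 3), " R^2 =", round(r2, 3))
```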

Keywords: non-methane hydrocarbons, photocopying process, multiple regression analysis, indoor air quality, pollutant emission

Procedia PDF Downloads 376
1154 The Effects of Time and Cyclic Loading to the Axial Capacity for Offshore Pile in Shallow Gas

Authors: Christian H. Girsang, M. Razi B. Mansoor, Noorizal N. Huang

Abstract:

An offshore platform was installed in 1977 about 260 km offshore of West Malaysia at a water depth of 73.6 m. Twelve (12) piles were installed, of which four (4) are skirt piles. The piles have a 1.219 m outside diameter and a wall thickness of 31 mm and were driven to 109 m below the seabed. Deterministic analyses of the pile capacity under axial loading were conducted using the current API (American Petroleum Institute) method and four (4) CPT-based methods: the ICP (Imperial College Pile) method, the NGI (Norwegian Geotechnical Institute) method, the UWA (University of Western Australia) method and the Fugro method. A statistical analysis of the model uncertainty associated with each pile capacity method was performed. Two cases were analysed: Pile 1 and the piles other than Pile 1, where Pile 1 is the pile that was most affected by shallow gas problems. Using the mean estimate of soil properties, the five (5) methods used for deterministic estimation of axial pile capacity in compression predict an axial capacity of 28 to 42 MN for Pile 1 and 32 to 49 MN for the piles other than Pile 1. These values refer to the static capacity shortly after pile installation; they do not include the effects of cyclic loading during the design storm or of time after installation on the axial pile capacity. On average, the axial pile capacity is expected to have increased by about 40% because of ageing since the installation of the platform in 1977. On the other hand, cyclic loading effects during the design storm may reduce the axial capacity of the piles by around 25%. The study concluded that all piles have a sufficient safety factor when the pile ageing and cyclic loading effects are considered, as all safety factors are above 2.0 for maximum operating and storm loads.
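As a back-of-the-envelope illustration of how these two factors combine, the sketch below applies the roughly +40% ageing gain and -25% cyclic reduction quoted above to a lower-bound static capacity; the storm load used to form the safety factor is a placeholder, not a value from the study.

```python
# Simple adjustment of a static pile capacity for ageing and cyclic loading.
static_capacity_mn = 28.0          # lower-bound prediction for Pile 1 (MN)
ageing_gain = 1.40                 # ~40% increase since 1977 installation
cyclic_reduction = 0.75            # ~25% reduction during the design storm

design_capacity = static_capacity_mn * ageing_gain * cyclic_reduction
storm_load_mn = 12.0               # hypothetical factored storm load
print(f"Adjusted capacity: {design_capacity:.1f} MN, "
      f"safety factor: {design_capacity / storm_load_mn:.2f}")
```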

Keywords: axial capacity, cyclic loading, pile ageing, shallow gas

Procedia PDF Downloads 342
1153 The Effect of Alkaline Treatment on Tensile Strength and Morphological Properties of Kenaf Fibres for Yarn Production

Authors: A. Khalina, K. Shaharuddin, M. S. Wahab, M. P. Saiman, H. A. Aisyah

Abstract:

This paper investigates the effect of alkali treatment on the mechanical properties of kenaf (Hibiscus cannabinus) fibre for the development of yarn. Two different fibre sources were used for the yarn production. Kenaf fibres were treated with sodium hydroxide (NaOH) at concentrations of 3, 6, 9, and 12% prior to the fibre opening process and tested for their tensile strength and Young's modulus. Then, the selected fibres were introduced to a fibre opener at three different opening processing parameters, namely the speeds of the roller feeder, small drum, and big drum. The fibre diameter, surface morphology, and fibre durability towards the machine were characterized. The results show that the NaOH concentration has a strong effect on the fibre mechanical properties. The tensile strength and modulus of the treated fibres of both types improved significantly compared to the untreated fibres, especially at 6% NaOH, which was found to be the optimum concentration for the alkaline treatment. The untreated fibres and the fibres treated with 6% NaOH were then introduced to the fibre opener, and it was found that the treated fibre produced a larger fibre diameter with better surface morphology than the untreated fibre. A higher speed during opening was found to produce a higher yield of opened kenaf fibres.

Keywords: alkaline treatment, kenaf fibre, tensile strength, yarn production

Procedia PDF Downloads 245
1152 Developing Pavement Maintenance Management System (PMMS) for Small Cities, Aswan City Case Study

Authors: Ayman Othman, Tallat Ali

Abstract:

A pavement maintenance management system (PMMS) was developed for the city of Aswan, as a model of a small city, to provide the road maintenance department in Aswan city with the capabilities for comprehensive planning of the maintenance activities needed to bring the internal pavement network to the desired physical condition in view of maintenance budget constraints. The developed system consists of three main stages. The first is the inventory and condition survey stage, where the internal pavement network of Aswan city was inventoried and its actual condition was rated in segments of 100 meters length. The second is the analysis stage, where the pavement condition index (PCI) was calculated and the most appropriate maintenance actions were assigned to each segment. The total maintenance budget was also estimated, and parameter-based ranking criteria were developed to prioritize maintenance activities when the available maintenance budget is not sufficient. Finally comes the packaging stage, where the approved maintenance budget is packaged into maintenance projects for field implementation. System results indicate that the maintenance budget output by the system is very reasonable and that the output maintenance programs agree to a great extent with the actual maintenance needs of the network. The condition survey of the Aswan city road network showed that roughness is the most dominant distress. In general, the road network can be considered to be in a fairly reasonable condition; however, the developed PMMS needs to be officially adopted to maintain the road network in a desirable condition and to prevent further deterioration.

Keywords: pavement, maintenance, management, system, distresses, survey, ranking

Procedia PDF Downloads 246
1151 Predictive Analysis of Chest X-rays Using NLP and Large Language Models with the Indiana University Dataset and Random Forest Classifier

Authors: Azita Ramezani, Ghazal Mashhadiagha, Bahareh Sanabakhsh

Abstract:

This study researches the combination of Random Forest classifiers with large language models (LLMs) and natural language processing (NLP) to improve diagnostic accuracy in chest X-ray analysis using the Indiana University dataset. Utilizing advanced NLP techniques, the research preprocesses textual data from radiological reports to extract key features, which are then merged with image-derived data. This enriched dataset is analyzed with Random Forest classifiers to predict specific clinical results, focusing on the identification of health issues and the estimation of case urgency. The findings reveal that the combination of NLP, LLMs, and machine learning increases not only diagnostic precision but also reliability, especially in quickly identifying critical conditions. Achieving an accuracy of 99.35%, the model shows significant advancements over conventional diagnostic techniques. The results emphasize the large potential of machine learning in medical imaging, suggesting that these technologies could greatly enhance clinician judgment and patient outcomes by offering quicker and more precise diagnostic approximations.
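A minimal sketch of the text side of such a pipeline is shown below: report text is vectorised with TF-IDF and classified with a Random Forest. The toy reports and labels are placeholders for the Indiana University dataset, and the image-feature merging and LLM steps are omitted.

```python
# Text-only sketch: TF-IDF features from radiology-style reports feed a
# Random Forest classifier; reports and labels are illustrative placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import make_pipeline

reports = [
    "No acute cardiopulmonary abnormality. Lungs are clear.",
    "Right lower lobe opacity concerning for pneumonia.",
    "Heart size normal, no pleural effusion or pneumothorax.",
    "Diffuse interstitial markings, possible pulmonary edema.",
]
labels = [0, 1, 0, 1]                       # 0 = normal, 1 = abnormal

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2), min_df=1),
                    RandomForestClassifier(n_estimators=200, random_state=0))
clf.fit(reports, labels)
print(clf.predict(["Left basal opacity suspicious for pneumonia."]))
```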

Keywords: natural language processing (NLP), large language models (LLMs), random forest classifier, chest X-ray analysis, medical imaging, diagnostic accuracy, Indiana University dataset, machine learning in healthcare, predictive modeling, clinical decision support systems

Procedia PDF Downloads 42
1150 Estimation of Twist Loss in the Weft Yarn during Air-Jet Weft Insertion

Authors: Muhammad Umair, Yasir Nawab, Khubab Shaker, Muhammad Maqsood, Adeel Zulfiqar, Danish Mahmood Baitab

Abstract:

Fabric is a flexible woven material consisting of a network of natural or artificial fibers, often referred to as thread or yarn. Today fabrics are produced by weaving, braiding, knitting, tufting and non-woven processes. Weaving is a method of fabric production in which warp and weft yarns are interlaced perpendicular to each other. There is an infinite number of ways of interlacing warp and weft yarn, and each way produces a different fabric structure. The yarns parallel to the machine direction are called warp yarns and the yarns perpendicular to the machine direction are called weft or filling yarns. Air-jet weaving is a modern method of weft insertion, and air-jet looms are considered high-speed looms. The twist loss during air-jet weft insertion affects the yarn strength. The aim of this study was to investigate the change in twist of the weft yarn during air-jet weft insertion. A total of 8 samples were produced using 1/1 plain and 3/1 twill weave designs with two fabric widths under the same loom settings. Two different types of yarn, cotton and a PC blend, were used. The effect of material type, weave design and fabric width on the twist change of the weft yarn was measured and discussed. The twist in the different types of weft yarn and weave designs was measured and compared with the twist of the yarn before weft insertion, from which the twist loss was determined. Wider fabric leads to higher twist loss in the yarn.

Keywords: air jet loom, twist per inch, twist loss, weft yarn

Procedia PDF Downloads 401
1149 Fast Generation of High-Performance Driveshafts: A Digital Approach to Automated Linked Topology and Design Optimization

Authors: Willi Zschiebsch, Alrik Dargel, Sebastian Spitzer, Philipp Johst, Robert Böhm, Niels Modler

Abstract:

In this article, we investigate an approach that digitally links individual development process steps by using the drive shaft of an aircraft engine as a representative example of a fiber polymer composite. Such high-performance, lightweight composite structures have many adjustable parameters that influence the mechanical properties. Only a combination of optimal parameter values can lead to energy efficient lightweight structures. The development tools required for the Engineering Design Process (EDP) are often isolated solutions, and their compatibility with each other is limited. A digital framework is presented in this study, which allows individual specialised tools to be linked via the generated data in such a way that automated optimization across programs becomes possible. This is demonstrated using the example of linking geometry generation with numerical structural analysis. The proposed digital framework for automated design optimization demonstrates the feasibility of developing a complete digital approach to design optimization. The methodology shows promising potential for achieving optimal solutions in terms of mass, material utilization, eigenfrequency, and deformation under lateral load with less development effort. The development of such a framework is an important step towards promoting a more efficient design approach that can lead to stable and balanced results.

Keywords: digital linked process, composite, CFRP, multi-objective, EDP, NSGA-2, NSGA-3, TPE

Procedia PDF Downloads 75
1148 Bile Salt Induced Microstructural Changes of Gemini Surfactant Micelles

Authors: Vijaykumar Patel, P. Bahadur

Abstract:

The microstructural evolution of cationic gemini surfactant 12-4-12 micelles in the presence of bile salts has been investigated using different techniques. A negative value of the interaction parameter evaluated from surface tension measurements is a signature of strong synergistic interaction between the oppositely charged surfactants. Both bile salts compete with each other in inducing the micellar transition of 12-4-12 micelles, depending on their hydrophobicity. Viscosity measurements disclose that the loading of bile salts induces morphological changes in 12-4-12 micelles; sodium deoxycholate is more efficient in altering the aggregation behaviour of 12-4-12 micelles than sodium cholate and produces a pronounced increase in viscosity and micellar growth, which is suppressed at elevated temperatures. The remarkable growth of 12-4-12 micelles in the presence of sodium deoxycholate at low pH has been ascribed to the solubilization of bile acids formed in acidic medium. The size and shape of the 12-4-12/bile salt mixed micelles obtained from small-angle neutron scattering experiments are explicated on the basis of the hydrophobicity of the bile salts. The location of the bile salts in the micelle was determined from nuclear Overhauser effect spectroscopy. The present study characterizes 12-4-12 gemini-bile salt mixed systems, which significantly enriches our knowledge, and such a structural transition provides an opportunity to use these bioamphiphiles as delivery vehicles and in some pharmaceutical formulations.

Keywords: gemini surfactants, bile salts, SANS (small angle neutron scattering), NOESY (nuclear overhauser effect spectroscopy)

Procedia PDF Downloads 149
1147 Agriculture Water Quality Evaluation in Mining Basin

Authors: Ben Salah Nahla

Abstract:

The water problem in Tunisia affects both quality and quantity. Tunisia is in a situation of water shortage, estimated at 4.6 Mm³/year. Moreover, the quality of water in Tunisia is also mediocre: in fact, 50% of the water has a high salinity (> 1.5 g/l). Several parameters affect water quality, such as sodium and fluoride, and an excess of these parameters may induce human health problems. Furthermore, the mining basin area has a problem of industrial waste, which may affect the quality of the groundwater. Therefore, the purpose of this work is to assess the water quality in the mining basin and the impact of fluorine. For this research, water samples were collected in the field and specific water analyses were carried out in the laboratory. Sampling was carried out on eight boreholes in the mining region. In the following, we examine the water composition and its physical and chemical quality. A physical-chemical analysis of water from a survey of the mining area of Tunisia was performed and showed an excess of the following items: fluorine, sodium and sulfate. Many chemicals may be present in water; however, only a small number of them are of immediate concern for health in all circumstances. Fluorine (F) is one particular chemical that is considered necessary for the human body, but an excess of it causes serious diseases. Sodium fluoride and sodium silicofluoride are more soluble and may spread in animals and plants, where their toxicity to organisms is greatest. The more complex compounds such as cryolite and fluorite, which are almost insoluble, are more stable and less toxic. Thereafter, we study the problem of excess fluorine in the water. Water intended for human consumption must always comply with the limits for microbiological and physical-chemical quality parameters defined by the European standard (1.5 mg/l) and the Tunisian standard (2 mg/l).

Keywords: water, minier basin, fluorine, silicofluoride

Procedia PDF Downloads 581
1146 The Efficiency of AFLP and ISSR Markers in Genetic Diversity Estimation and Gene Pool Classification of Iranian Landrace Bread Wheat (Triticum Aestivum L.) Germplasm

Authors: Reza Talebi

Abstract:

Wheat (Triticum aestivum) is one of the most important food staples in Iran. Understanding the genetic variability among landrace wheat germplasm is important for breeding. Landraces endemic to Iran are a genetic resource that is distinct from other wheat germplasm. In this study, 60 Iranian landrace wheat accessions were characterized with AFLP and ISSR markers. Twelve AFLP primer pairs detected 128 polymorphic bands among the sixty genotypes. The mean polymorphism rate based on the AFLP data was 31%; however, a wide polymorphism range among primer pairs was observed (22-40%). The polymorphic information content (PIC value), calculated to assess the informativeness of each marker, ranged from 0.28 to 0.4, with a mean of 0.37. According to the AFLP molecular data, cluster analysis grouped the genotypes into five distinct clusters. ISSR markers generated 68 bands (an average of 6 bands per primer), of which 31 were polymorphic (45%) across the 60 wheat genotypes. The polymorphism information content (PIC) value for the ISSR markers ranged from 0.14 to 0.48 with an average of 0.33. Based on the data obtained by ISSR-PCR, cluster analysis grouped the genotypes into three distinct clusters. Both AFLP and ISSR markers showed a high level of genetic diversity in the Iranian landrace wheat accessions, which have maintained a relatively constant level of genetic diversity over recent years.
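The percentage polymorphism and PIC statistics reported here can be computed from band presence/absence scores; the sketch below uses simulated dominant-marker data and the two-allele simplification PIC = 1 − p² − q² (equivalently 2pq), one common convention for AFLP/ISSR bands, as an illustrative assumption.

```python
# Marker summary statistics on simulated presence/absence data for 60 accessions.
import numpy as np

rng = np.random.default_rng(9)
bands = (rng.random((60, 12)) < rng.uniform(0.1, 0.9, 12)).astype(int)

present = bands.mean(axis=0)                      # band frequency per locus
polymorphic = (present > 0.0) & (present < 1.0)
print(f"Polymorphic bands: {polymorphic.sum()} of {bands.shape[1]} "
      f"({100 * polymorphic.mean():.0f}%)")

pic = 1.0 - (present ** 2 + (1.0 - present) ** 2)  # two-allele simplification
print("Mean PIC:", round(float(pic.mean()), 2))
```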

Keywords: wheat, genetic diversity, AFLP, ISSR

Procedia PDF Downloads 450
1145 Prevalence of Selected Cardiovascular Risk Factors and Obesity among University of Venda Staff

Authors: Avhasei Dorothy Rasifudi, Josephine Mandizha

Abstract:

Cardiovascular risk factors continue to be the leading cause of death in the majority of developed and developing countries. In 2011, the World Health Organization reported that every year an estimated 17 million people globally die of CVD, representing 30% of all global deaths, particularly from heart attacks and strokes. The purpose of the study was to determine and describe the prevalence of selected cardiovascular risk factors among University of Venda staff. A cross-sectional study was conducted among 100 staff aged 20-65 years. The anthropometric measurements were conducted in accordance with the standardized procedures advocated by the International Society for the Advancement of Kinanthropometry. Weight, height, waist circumference and hip circumference were measured for the calculation of body mass index and waist-hip ratio. Blood pressure was measured using a Heine cuff and sphygmomanometer. A questionnaire was administered to gather demographic details and the cardiovascular risk factors of hypertension and obesity. Data were analyzed using means and standard deviations. A parametric t-test was applied to test for differences between sexes, with statistical significance set at p ≤ 0.05. The prevalence of hypertension was 23%, with the highest prevalence among those aged 40 years and above. Factors found to be significantly associated with hypertension were gender, age, physical inactivity and family history. The prevalence of obesity was 43%, with the highest prevalence among those aged 40 years and above. The factors associated with obesity were diet, age and physical activity. The prevalences of hypertension and obesity in the study were high.

Keywords: cardiovascular, prevalence, risk factors, staff

Procedia PDF Downloads 293