Search results for: degradation models

7862 Comparison of Data Mining Models to Predict Future Bridge Conditions

Authors: Pablo Martinez, Emad Mohamed, Osama Mohsen, Yasser Mohamed

Abstract:

Highway and bridge agencies, such as the Ministry of Transportation in Ontario, use the Bridge Condition Index (BCI), defined as the weighted condition of all bridge elements, to determine the rehabilitation priorities for their bridges. Accurate forecasting of the BCI is therefore essential for bridge rehabilitation budget planning. The large amount of bridge condition data accumulated over many years makes traditional mathematical models infeasible as analysis methods. This research study focuses on investigating different classification models developed to predict the bridge condition index in the province of Ontario, Canada, based on publicly available data for 2,800 bridges over a period of more than 10 years. Data preparation is a key factor in developing acceptable classification models, even with the simplest one, the k-NN model. All the models were tested, compared, and statistically validated via cross-validation and t-tests. A simple k-NN model showed reasonable results (within 0.5% relative error) when predicting the bridge condition in the coming year.
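
As an illustration of the workflow described above, the sketch below shows a k-NN predictor evaluated with 10-fold cross-validation; the file name and feature columns are hypothetical, and the BCI is treated here as a numeric target for simplicity rather than the paper's exact classification setup.

```python
# Minimal sketch (not from the paper): k-NN prediction of next-year BCI with
# cross-validation, assuming a tabular dataset with hypothetical column names.
import pandas as pd
from sklearn.model_selection import cross_val_score, KFold
from sklearn.neighbors import KNeighborsRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("ontario_bridges.csv")                       # hypothetical file
X = df[["bci_current", "age", "deck_length", "span_count"]]   # hypothetical features
y = df["bci_next_year"]                                       # hypothetical target

model = make_pipeline(StandardScaler(), KNeighborsRegressor(n_neighbors=5))
scores = cross_val_score(model, X, y,
                         cv=KFold(n_splits=10, shuffle=True, random_state=0),
                         scoring="neg_mean_absolute_percentage_error")
print("mean relative error:", -scores.mean())
```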

Keywords: asset management, bridge condition index, data mining, forecasting, infrastructure, knowledge discovery in databases, maintenance, predictive models

Procedia PDF Downloads 185
7861 Analytical Study on Threats to Wetland Ecosystems and Their Solutions in the Framework of the Ramsar Convention

Authors: Ehsan Daryadel, Farhad Talaie

Abstract:

Wetlands are among the most important ecosystems on Earth. Nevertheless, various challenges threaten these ecosystems and disrupt their ecological character, and the effects of human-induced threats are the most devastating. Following the mass degradation of wetlands during the 1970s, the Ramsar Convention on Wetlands (Ramsar, Iran, 1971) was concluded to conserve wetlands of international importance and to prevent the destruction and degradation of such ecosystems through the wise use of wetlands, as a means to achieve sustainable development all over the world. Therefore, in this paper, efforts have been made to analyze threats to wetlands and then investigate solutions in the framework of the Ramsar Convention. Finally, in order to make these mechanisms operational, this study concludes that all states should make their best effort to improve and restore global wetlands through the preservation of environmental standards and close cooperation, and also by taking effective joint measures with other states.

Keywords: Ramsar Convention, threats, wetland ecosystems, wise use

Procedia PDF Downloads 395
7860 Social Entrepreneurship on Islamic Perspective: Identifying Research Gap

Authors: Mohd Adib Abd Muin, Shuhairimi Abdullah, Azizan Bahari

Abstract:

Problem: The research problem is the lack of a social entrepreneurship model that focuses on the Islamic perspective. Objective: The objective of this paper is to analyse existing models of social entrepreneurship and to identify the research gap concerning the Islamic perspective in those models. Research Methodology: The research methods used in this study are a literature review and a comparative analysis of 6 existing models of social entrepreneurship. Finding: The analysis of the 6 existing models of social entrepreneurship shows that they do not emphasize the Islamic perspective.

Keywords: social entrepreneurship, Islamic perspective, research gap, business management

Procedia PDF Downloads 351
7859 Simultaneous Determination of Methotrexate and Aspirin Using Fourier Transform Convolution Emission Data under Non-Parametric Linear Regression Method

Authors: Marwa A. A. Ragab, Hadir M. Maher, Eman I. El-Kimary

Abstract:

Co-administration of methotrexate (MTX) and aspirin (ASP) can cause a pharmacokinetic interaction and a subsequent increase in blood MTX concentrations, which may increase the risk of MTX toxicity. Therefore, it is important to develop a sensitive, selective, accurate, and precise method for their simultaneous determination in urine. A new hybrid chemometric method has been applied to the emission response data of the two drugs. A spectrofluorimetric method for the determination of MTX through measurement of its acid-degradation product, 4-amino-4-deoxy-10-methylpteroic acid (4-AMP), was developed. Moreover, the acid-catalyzed degradation reaction enables the spectrofluorimetric determination of ASP through the formation of its active metabolite, salicylic acid (SA). The proposed chemometric method convolves the emission data using 8-point sin xi polynomials (discrete Fourier functions) after derivative treatment of these emission data. The first- and second-derivative curves (D1 and D2) were obtained first, and then these curves were convolved to obtain the first and second derivatives under Fourier function curves (D1/FF and D2/FF). This new application was used to resolve the overlapped emission bands of the degradation products of both drugs, allowing their simultaneous indirect determination in human urine. Not only was this chemometric approach applied to the emission data, but the obtained data were also subjected to non-parametric linear regression analysis (Theil’s method). The proposed method was fully validated according to the ICH guidelines and yielded linearity ranges of 0.05-0.75 and 0.5-2.5 µg mL-1 for MTX and ASP, respectively. The non-parametric method was found to be superior to the parametric one in the simultaneous determination of MTX and ASP after the chemometric treatment of the emission spectra of their degradation products. The work combines the advantages of derivative and convolution using discrete Fourier functions with the reliability and efficacy of non-parametric data analysis. The achieved sensitivity, along with the low values of LOD (0.01 and 0.06 µg mL-1) and LOQ (0.04 and 0.2 µg mL-1) for MTX and ASP, respectively, obtained with the second derivative under Fourier functions (D2/FF), is promising and supports its application for monitoring the two drugs in patients’ urine samples.
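
The non-parametric calibration step can be illustrated as follows; this is a minimal sketch on synthetic data, not the authors' exact procedure, and the 8-point Fourier convolution is only approximated by a generic sine kernel.

```python
# Minimal sketch (assumptions, not the authors' exact procedure): derivative
# pre-treatment of an emission spectrum followed by Theil's (Theil-Sen)
# non-parametric regression for calibration.
import numpy as np
from scipy.signal import savgol_filter
from scipy.stats import theilslopes

wavelength = np.linspace(350, 500, 151)                 # nm, synthetic grid
spectrum = np.exp(-((wavelength - 420) / 15) ** 2)      # synthetic emission band

d1 = savgol_filter(spectrum, window_length=9, polyorder=3, deriv=1)  # first derivative
kernel = np.sin(np.pi * np.arange(1, 9) / 9)            # illustrative 8-point sine kernel
d1_ff = np.convolve(d1, kernel / kernel.sum(), mode="same")          # "D1/FF" analogue

# Calibration: non-parametric (Theil-Sen) fit of response vs. concentration
conc = np.array([0.05, 0.2, 0.35, 0.5, 0.75])           # µg/mL (MTX range from the abstract)
response = 1.8 * conc + np.random.default_rng(0).normal(0, 0.01, conc.size)
slope, intercept, lo, hi = theilslopes(response, conc)
print(f"Theil slope = {slope:.3f}, intercept = {intercept:.3f}")
```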

Keywords: chemometrics, emission curves, derivative, convolution, Fourier transform, human urine, non-parametric regression, Theil’s method

Procedia PDF Downloads 427
7858 A-Score, Distress Prediction Model with Earning Response during the Financial Crisis: Evidence from Emerging Market

Authors: Sumaira Ashraf, Elisabete G.S. Félix, Zélia Serrasqueiro

Abstract:

Traditional financial distress prediction models have performed well in predicting bankrupt and insolvent firms in developed markets. Previous studies focused particularly on the predictability of financial distress, financial failure, and bankruptcy of firms. This paper contributes to the literature by extending the definition of financial distress to include early warning signs related to quotation at face value, dividend/bonus declaration, annual general meetings, and listing fees. The study used five well-known distress prediction models to see whether they have the ability to predict early warning signs of financial distress. Results showed that the predictive ability of the models varies over time and decreases specifically for the sample with early warning signs of financial distress. Furthermore, the study checked for differences in the predictive ability of the models with respect to the financial crisis. The results indicate that the predictive ability of the traditional financial distress prediction models decreases for firms with early warning signs of financial distress and during times of financial crisis. The study developed a new model comprising significant variables from the five models and one new variable, earnings response. This new model outperforms the old distress prediction models before, during, and after the financial crisis. Thus, it can be used by researchers, organizations, and all other concerned parties to indicate early warning signs for emerging markets.
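
To make the modelling step concrete, a minimal logit sketch is shown below; the variable names and the simulated data are hypothetical and do not reproduce the paper's A-Score specification.

```python
# Minimal sketch (hypothetical variables, not the paper's A-Score specification):
# fitting a logit distress-prediction model that combines financial ratios with an
# earnings-response variable, as described above.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "working_capital_ta": rng.normal(0.10, 0.05, n),    # hypothetical ratios
    "retained_earnings_ta": rng.normal(0.20, 0.10, n),
    "ebit_ta": rng.normal(0.08, 0.04, n),
    "earnings_response": rng.normal(0.0, 1.0, n),       # the new variable from the abstract
})
# Synthetic distress labels: weaker earnings and earnings response raise distress odds
logit_index = -1.0 - 10.0 * df["ebit_ta"] - 0.5 * df["earnings_response"]
df["distress"] = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logit_index))).astype(int)

X = sm.add_constant(df.drop(columns="distress"))
result = sm.Logit(df["distress"], X).fit(disp=False)
print(result.summary())
```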

Keywords: financial distress, emerging market, prediction models, Z-Score, logit analysis, probit model

Procedia PDF Downloads 238
7857 Regeneration of Geological Models Using Support Vector Machine Assisted by Principal Component Analysis

Authors: H. Jung, N. Kim, B. Kang, J. Choe

Abstract:

History matching is a crucial procedure for predicting reservoir performance and making future decisions. However, it is difficult due to uncertainties in the initial reservoir models. It is therefore important to have reliable initial models for successful history matching of highly heterogeneous reservoirs such as channel reservoirs. In this paper, we propose a novel scheme for regenerating geological models using a support vector machine (SVM) and principal component analysis (PCA). First, we perform PCA to identify the main geological characteristics of the models. Through this procedure, the permeability values of each model are transformed into new parameters by principal components with eigenvalues of large magnitude. Secondly, the parameters are projected onto a two-dimensional plane by multi-dimensional scaling (MDS) based on Euclidean distances. Finally, we train an SVM classifier using the 20% of models whose well oil production rates (WOPR) are most similar or dissimilar to the true values (10% for each). The other 80% of models are then classified by the trained SVM, and we select the models on the side of low WOPR errors. One hundred channel reservoir models are initially generated by single normal equation simulation. By repeating the classification process, we can select models that have a geological trend similar to the true reservoir model. The average field of the selected models is utilized as a probability map for regeneration. Newly generated models preserve correct channel features and exclude wrong geological properties while maintaining suitable uncertainty ranges. History matching with the initial models cannot provide trustworthy results; it fails to find the correct geological features of the true model. However, history matching with the regenerated ensemble offers reliable characterization results by identifying the proper channel trend. Furthermore, it gives dependable predictions of future performance with reduced uncertainties. We propose a novel classification scheme which integrates PCA, MDS, and SVM for regenerating reservoir models. The scheme can easily sort out reliable models which have a channel trend similar to the reference in the lowered-dimension space.
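
A minimal sketch of the PCA-MDS-SVM screening loop described above is given below, with random arrays standing in for permeability fields and WOPR mismatches.

```python
# Minimal sketch (assumption-laden): PCA -> MDS -> SVM workflow for screening
# channel-reservoir realizations, using random data in place of permeability
# fields and WOPR errors.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import MDS
from sklearn.svm import SVC

rng = np.random.default_rng(0)
perm = rng.lognormal(mean=3.0, sigma=1.0, size=(100, 2500))   # 100 models, 50x50 grid
wopr_error = rng.uniform(0, 1, 100)                           # stand-in for WOPR mismatch

scores = PCA(n_components=10).fit_transform(np.log(perm))     # main geological features
xy = MDS(n_components=2, dissimilarity="euclidean", random_state=0).fit_transform(scores)

# Label the 10% most similar (0) and 10% most dissimilar (1) models, train on them
order = np.argsort(wopr_error)
train_idx = np.r_[order[:10], order[-10:]]
labels = np.r_[np.zeros(10), np.ones(10)]
clf = SVC(kernel="rbf").fit(xy[train_idx], labels)

selected = np.where(clf.predict(xy) == 0)[0]                  # models predicted "low error"
print("selected models:", selected[:10], "...")
```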

Keywords: history matching, principal component analysis, reservoir modelling, support vector machine

Procedia PDF Downloads 157
7856 Antioxidant Juice Prevents UV-Induced Skin Damage in Rats

Authors: S. P. Gomes, D. C. Goncalves, E. Ribeiro, M. C. L. Seelaender

Abstract:

Skin is susceptible to photodamage induced by exposure to sunlight, or ultraviolet (UV) radiation, which induces breakdown of the extracellular matrix, DNA degradation, skin cell lesions and apoptosis, and the development of cancer. Phytonutrients demonstrate protective effects against UV damage. The purpose of this study was to evaluate the effect of an antioxidant juice (AJ) containing Brazilian natural products on skin damage. The juice was produced by Metabolics®. Male Wistar rats were divided into 4 groups: animals receiving the antioxidant juice (AJ), containing orange, carrot, honey, tomato extract, avocado, ginger and camu-camu (a Brazilian fruit and a major source of vitamin C), ad libitum for 21 days, or water (C), each subdivided into groups exposed or not exposed to UV radiation on 2 non-consecutive days, for five hours each day, after 15 days of juice supplementation. On the 22nd day, rats were killed by decapitation, and epithelium samples from the dorsal skin were removed, fixed in Bouin's solution, and embedded in paraffin. The sections were stained with hematoxylin and eosin or with Mallory and picrosirius red. Isolated DNA was submitted to electrophoresis (1.8% agarose gel, 0.5% ethidium bromide). UV radiation significantly induced sunburn of superficial epithelial cells in C; AJ treatment reduced this effect. Collagen changes were observed in the UV groups, yet AJ treatment prevented collagen degradation. UV radiation induced significant DNA degradation in C, which was prevented by AJ treatment. The antioxidant juice consumed chronically protected against acute skin damage.

Keywords: nutraceuticals, antioxidants, photoprotection, UV radiation

Procedia PDF Downloads 618
7855 Stability Analysis of Endemic State of Modelling the Effect of Vaccination and Novel Quarantine-Adjusted Incidence on the Spread of Newcastle Disease Virus

Authors: Nurudeen Oluwasola Lasisi, Abdulkareem Afolabi Ibrahim

Abstract:

Newcastle disease is an infection of domestic poultry and other bird species with virulent Newcastle disease virus (NDV). In this paper, we study the dynamics of a model of Newcastle disease virus (NDV) transmission using a novel quarantine-adjusted incidence. We compare vaccination, a linear incidence rate, and the novel quarantine-adjusted incidence rate in the models. The dynamics of the models yield disease-free and endemic equilibrium states. The effective reproduction numbers of the models are computed in order to measure the relative impact of individual or combined interventions for effective disease control. We establish the local and global stability of the endemic equilibrium states of the models and find that the endemic equilibrium states are globally asymptotically stable if the effective reproduction numbers of the model equations are greater than unity.
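
For reference, the threshold behaviour invoked above is usually stated as follows (a standard formulation, not the paper's specific system of equations):

```latex
% Standard threshold result assumed here (not the paper's specific system): the
% disease-free equilibrium E_0 and the endemic equilibrium E^* exchange
% stability at R_eff = 1.
\[
\begin{cases}
R_{\mathrm{eff}} \le 1: & E_0 \text{ is globally asymptotically stable (disease dies out)},\\[4pt]
R_{\mathrm{eff}} > 1:   & E^{*} \text{ exists and is globally asymptotically stable (disease persists)}.
\end{cases}
\]
```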

Keywords: effective reproduction number, endemic state, mathematical model, Newcastle disease virus, novel quarantine-adjusted incidence, stability analysis

Procedia PDF Downloads 242
7854 Reservoir Fluids: Occurrence, Classification, and Modeling

Authors: Ahmed El-Banbi

Abstract:

Several PVT models exist to represent how PVT properties are handled in sub-surface and surface engineering calculations for oil and gas production. The most commonly used models include black oil, modified black oil (MBO), and compositional models. These models are used in calculations that allow engineers to optimize and forecast well and reservoir performance (e.g., reservoir simulation, material balance, nodal analysis, surface facilities, etc.). The choice of model depends on the fluid type and the production process (e.g., depletion, water injection, gas injection, etc.). Based on close to 2,000 reservoir fluid samples collected from different basins and locations, this paper presents some conclusions on the occurrence of reservoir fluids. It also reviews the common methods used to classify reservoir fluid types. Based on new criteria related to the production behavior of different fluids and economic considerations, an updated classification of reservoir fluid types is presented in the paper. Recommendations on the use of different PVT models to simulate the behavior of different reservoir fluid types are discussed, and the requirements of each PVT model are highlighted. Available methods for the calculation of PVT properties from each model are also discussed. Practical recommendations and tips on how to control the calculations to achieve the most accurate results are given.

Keywords: PVT models, fluid types, PVT properties, fluids classification

Procedia PDF Downloads 66
7853 Modeling Curriculum for High School Students to Learn about Electric Circuits

Authors: Meng-Fei Cheng, Wei-Lun Chen, Han-Chang Ma, Chi-Che Tsai

Abstract:

Recent K–12 Taiwan Science Education Curriculum Guidelines emphasize the essential role of modeling curricula in science learning; however, few modeling curricula have been designed and adopted in current science teaching. Therefore, this study aims to develop a modeling curriculum on electric circuits, to investigate any learning difficulties students have with the modeling curriculum, and to further enhance modeling teaching. This study was conducted with 44 10th-grade students in Central Taiwan. Data collection included a Students' Understanding of Models in Science (SUMS) survey that explored the students' epistemology of scientific models and modeling, and a complex circuit problem to investigate the students' modeling abilities. Data analysis included the following: (1) Paired-sample t-tests were used to examine the improvement of students' modeling abilities and conceptual understanding before and after the curriculum was taught. (2) Paired-sample t-tests were also utilized to determine the students' modeling abilities before and after the modeling activities, and a Pearson correlation was used to understand the relationship between students' modeling abilities during the activities and on the posttest. (3) ANOVA was used during different stages of the modeling curriculum to investigate the differences between the students who developed microscopic models and those who developed macroscopic models after the modeling curriculum was taught. (4) Independent-sample t-tests were employed to determine whether the students who changed their models had significantly different understandings of scientific models than the students who did not change their models. The results revealed the following: (1) After the modeling curriculum was taught, the students had made significant progress in both their understanding of the science concepts and their modeling abilities. In terms of science concepts, this modeling curriculum helped the students overcome the misconception that electric current is reduced after flowing through light bulbs. In terms of modeling abilities, this modeling curriculum helped students employ macroscopic or microscopic models to explain the phenomena they observed. (2) Encouraging the students to explain scientific phenomena in response to different context prompts during the modeling process allowed them to convert their models to microscopic models, but it did not help them continuously employ microscopic models throughout the whole curriculum. The students finally employed microscopic models consistently when they had help visualizing the microscopic models. (3) During the modeling process, the students who revised their own models understood better than the students who did not revise their own models that models can be changed. Also, the students who revised their models to explain different scientific phenomena tended to regard models as explanatory tools. In short, this study explored different strategies to facilitate students' modeling processes as well as their difficulties with the modeling process. The findings can be used to design and teach modeling curricula and help students enhance their modeling abilities.
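
A minimal sketch of the paired t-test and correlation analysis mentioned above is shown below, using synthetic pre/post scores rather than the study's data.

```python
# Minimal sketch (synthetic scores, not the study's data): the paired t-test and
# Pearson correlation used to compare pre/post modeling-ability scores.
import numpy as np
from scipy.stats import ttest_rel, pearsonr

rng = np.random.default_rng(0)
pre = rng.normal(60, 10, 44)                    # 44 students, pretest scores
post = pre + rng.normal(8, 5, 44)               # posttest scores after the curriculum

t, p = ttest_rel(post, pre)
r, p_r = pearsonr(pre, post)
print(f"paired t = {t:.2f} (p = {p:.4f}); pre/post correlation r = {r:.2f}")
```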

Keywords: electric circuits, modeling curriculum, science learning, scientific model

Procedia PDF Downloads 455
7852 A Structuring and Classification Method for Assigning Application Areas to Suitable Digital Factory Models

Authors: R. Hellmuth

Abstract:

The method of factory planning has changed a great deal, especially when it comes to planning the factory building itself. Factory planning has the task of designing products, plants, processes, organization, areas, and the building of a factory. Regular restructuring is becoming more important in order to maintain the competitiveness of a factory. Restrictions on new areas, shorter life cycles of products and production technology, as well as a VUCA world (Volatility, Uncertainty, Complexity and Ambiguity) lead to more frequent restructuring measures within a factory. A digital factory model is the planning basis for rebuilding measures and becomes an indispensable tool. Furthermore, digital building models are increasingly being used in factories to support facility management and manufacturing processes. The main research question of this paper is therefore: what kind of digital factory model is suitable for the different areas of application during the operation of a factory? First, different types of digital factory models are investigated, and their properties and usability for various use cases are analysed. The scope of the investigation includes point cloud models, building information models, and photogrammetry models, as well as versions of these enriched with sensor data. It is investigated which digital models allow simple integration of sensor data and where the differences lie. Subsequently, possible application areas of digital factory models are determined by means of a survey, and the respective digital factory models are assigned to these application areas. Finally, an application case from maintenance is selected and implemented with the help of the appropriate digital factory model. It is shown how a completely digitalized maintenance process can be supported by a digital factory model through the provision of information. Among other purposes, the digital factory model is used for indoor navigation, information provision, and the display of sensor data. In summary, the paper presents a structuring of digital factory models that concentrates on the geometric representation of a factory building and its technical facilities. A practical application case is shown and implemented. Thus, the systematic selection of digital factory models with the corresponding application cases is evaluated.

Keywords: building information modeling, digital factory model, factory planning, maintenance

Procedia PDF Downloads 103
7851 Mediation Models in Triadic Relationships: Illness Narratives and Medical Education

Authors: Yoko Yamada, Chizumi Yamada

Abstract:

Narrative psychology is based on the dialogical relationship between self and other. The dialogue can consist of divided, competitive, or opposite communication between self and other. We constructed models of coexistent dialogue in which self and other were positioned side by side and communicated sympathetically. We propose new mediation models for narrative relationships. The mediation models are based on triadic relationships that incorporate a medium or a mediator along with self and other. We constructed three types of mediation model. In the first type, called the “Joint Attention Model”, self and other are positioned side by side and share attention with the medium. In the second type, the “Triangle Model”, an agent mediates between self and other. In the third type, the “Caring Model”, a caregiver stands beside the communication between self and other. We apply the three models to the illness narratives of medical professionals and patients. As these groups have different views and experiences of disease or illness, triadic mediation facilitates the ability to see things from the other person’s perspective and to bridge differences in people’s experiences and feelings. These models would be useful for medical education in various situations, such as in considering the relationships between senior and junior doctors and between old and young patients.

Keywords: illness narrative, mediation, psychology, model, medical education

Procedia PDF Downloads 405
7850 Contextual Toxicity Detection with Data Augmentation

Authors: Julia Ive, Lucia Specia

Abstract:

Understanding and detecting toxicity is an important problem in supporting safer human interactions online. Our work focuses on the important problem of contextual toxicity detection, where automated classifiers are tasked with determining whether a short textual segment (usually a sentence) is toxic within its conversational context. We use “toxicity” as an umbrella term for a number of variants commonly named in the literature, including hate, abuse, and offence, among others. Detecting toxicity in context is a non-trivial problem and has been addressed by very few previous studies. These previous studies analysed the influence of conversational context on human perception of toxicity in controlled experiments and concluded that humans rarely change their judgements in the presence of context. They also evaluated contextual detection models based on state-of-the-art Deep Learning and Natural Language Processing (NLP) techniques. Counterintuitively, they reached the general conclusion that computational models tend to suffer performance degradation in the presence of context. We challenge these empirical observations by devising better contextual predictive models that also rely on NLP data augmentation techniques to create larger and better data. In our study, we start by further analysing the human perception of toxicity in conversational data (i.e., tweets), in the absence versus the presence of context, in this case the previous tweets in the same conversational thread. We observed that the conclusions of previous work on human perception are mainly due to data issues: the contextual data available do not provide sufficient evidence that context is indeed important (even for humans). The data problem is common in current toxicity datasets: cases labelled as toxic are either obviously toxic (i.e., overt toxicity with swear words, racist terms, etc.), and thus context is not needed for a decision, or are ambiguous, vague, or unclear even in the presence of context; in addition, the data contain labelling inconsistencies. To address this problem, we propose to automatically generate contextual samples where toxicity is not obvious (i.e., covert cases) without context, or where different contexts can lead to different toxicity judgements for the same tweet. We generate toxic and non-toxic utterances conditioned on the context or on target tweets using a range of techniques for controlled text generation (e.g., Generative Adversarial Networks and steering techniques). Regarding the contextual detection models, we posit that their poor performance is due to limitations of both the data they are trained on (the same problems stated above) and the architectures they use, which are not able to leverage context in effective ways. To improve on that, we propose text classification architectures that take the hierarchy of conversational utterances into account. In experiments benchmarking our approach against previous models on existing and automatically generated data, we show that both data and architectural choices are very important. Our model achieves substantial performance improvements compared to baselines that are non-contextual, or contextual but agnostic of the conversation structure.
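
A minimal sketch of a hierarchy-aware classifier of the general kind argued for above is given below; it is an illustration only, not the authors' architecture, and uses a simple GRU encoder at both the word and utterance levels.

```python
# Minimal sketch (an illustration, not the authors' architecture): a hierarchical
# classifier that encodes each utterance with a word-level GRU and the
# conversation with an utterance-level GRU before classifying the target tweet.
import torch
import torch.nn as nn

class HierarchicalToxicityClassifier(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=128, hid_dim=128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.word_gru = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.utt_gru = nn.GRU(hid_dim, hid_dim, batch_first=True)
        self.classifier = nn.Linear(hid_dim, 2)          # toxic / non-toxic

    def forward(self, conv_tokens):
        # conv_tokens: (batch, n_utterances, n_tokens); last utterance is the target
        b, u, t = conv_tokens.shape
        emb = self.embedding(conv_tokens.view(b * u, t))
        _, h_word = self.word_gru(emb)                   # (1, b*u, hid)
        utt_vecs = h_word.squeeze(0).view(b, u, -1)      # one vector per utterance
        _, h_conv = self.utt_gru(utt_vecs)               # conversation-level state
        return self.classifier(h_conv.squeeze(0))        # logits for the target tweet

# Toy forward pass: batch of 2 conversations, 4 utterances of 12 tokens each
logits = HierarchicalToxicityClassifier()(torch.randint(1, 10000, (2, 4, 12)))
print(logits.shape)   # torch.Size([2, 2])
```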

Keywords: contextual toxicity detection, data augmentation, hierarchical text classification models, natural language processing

Procedia PDF Downloads 167
7849 Design and Study of a Parabolic Trough Solar Collector for Generating Electricity

Authors: A. A. A. Aboalnour, Ahmed M. Amasaib, Mohammed-Almujtaba A. Mohammed-Farah, Abdelhakam A. Noreldien

Abstract:

This paper presents a design and study of a Parabolic Trough Solar Collector (PTC). Mathematical models were used in this work to find the hourly direct and reflected solar radiation from the air layer on the surface of the earth, based on the total daily solar radiation on a horizontal surface. Mathematical models were also used to calculate the radiation on tilted surfaces. Most of the quantities obtained in this project serve as the prerequisite data required for several solar energy applications, thermal simulations, and solar power systems. In addition, mathematical models were used to study the flow of the fluid inside the tube (receiver) and the effect of direct and reflected solar radiation on the pressure, temperature, speed, kinetic energy, and forces of the fluid inside the tube. Finally, the mathematical models were used to study the PTC performance and estimate its thermal efficiency.
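
For reference, the radiation on a tilted surface is commonly estimated with an isotropic-sky expression of the following kind (a standard textbook formulation; the paper does not state which specific model it used):

```latex
% Isotropic-sky estimate of hourly radiation on a surface tilted by \beta
% (standard formulation; assumed here, not quoted from the paper):
\[
I_T \;=\; I_b R_b \;+\; I_d\,\frac{1+\cos\beta}{2} \;+\; \rho_g\,(I_b+I_d)\,\frac{1-\cos\beta}{2},
\qquad R_b=\frac{\cos\theta}{\cos\theta_z},
\]
where $I_b$ and $I_d$ are the hourly beam and diffuse components on the horizontal,
$\rho_g$ is the ground reflectance, $\theta$ the incidence angle on the tilted
surface, and $\theta_z$ the solar zenith angle.
```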

Keywords: CFD, experimental, mathematical models, parabolic trough, radiation

Procedia PDF Downloads 416
7848 Imputing Missing Data in Electronic Health Records: A Comparison of Linear and Non-Linear Imputation Models

Authors: Alireza Vafaei Sadr, Vida Abedi, Jiang Li, Ramin Zand

Abstract:

Missing data is a common challenge in medical research and can lead to biased or incomplete results. When data bias leaks into models, it further exacerbates health disparities; biased algorithms can lead to misclassification and reduced resource allocation and monitoring as part of prevention strategies for certain minorities and vulnerable segments of patient populations, which in turn further reduces the data footprint from the same populations – thus, a vicious cycle. This study compares the performance of six imputation techniques, grouped into linear and non-linear models, on two different real-world electronic health record (EHR) datasets representing 17,864 patient records. The mean absolute percentage error (MAPE) and root mean squared error (RMSE) are used as performance metrics, and the results show that the linear models outperformed the non-linear models on both metrics. These results suggest that linear models can sometimes be the optimal choice for imputing laboratory variables in terms of imputation efficiency and the uncertainty of predicted values.
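
The comparison described above can be sketched as follows on synthetic data, with a Bayesian-ridge imputer standing in for the linear group and a random-forest imputer for the non-linear group; the variable values and missingness pattern are hypothetical.

```python
# Minimal sketch (synthetic data): comparing a linear imputer (Bayesian ridge)
# with a non-linear one (random forest) using MAPE and RMSE.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.linear_model import BayesianRidge
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_percentage_error, mean_squared_error

rng = np.random.default_rng(0)
X_true = rng.normal(100, 15, size=(1000, 5))            # e.g., five lab variables
X_miss = X_true.copy()
mask = rng.uniform(size=X_true.shape) < 0.2             # 20% missing completely at random
X_miss[mask] = np.nan

for name, est in [("linear", BayesianRidge()),
                  ("non-linear", RandomForestRegressor(n_estimators=50, random_state=0))]:
    X_imp = IterativeImputer(estimator=est, random_state=0).fit_transform(X_miss)
    mape = mean_absolute_percentage_error(X_true[mask], X_imp[mask])
    rmse = mean_squared_error(X_true[mask], X_imp[mask], squared=False)
    print(f"{name}: MAPE = {mape:.3f}, RMSE = {rmse:.2f}")
```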

Keywords: EHR, machine learning, imputation, laboratory variables, algorithmic bias

Procedia PDF Downloads 80
7847 Improvement of Process Competitiveness Using Intelligent Reference Models

Authors: Julio Macedo

Abstract:

Several methodologies are now available for conceiving the improvements of a process so that it becomes competitive, for example total quality, process reengineering, six sigma, and the define-measure-analyze-improve-control (DMAIC) method. These improvements are of different natures and can be external to the process, which is represented by an optimization model or a discrete simulation model. In addition, there are several process stakeholders, and they have different desired performances for the process. Hence, the methodologies above lack a tool to aid in the conception of the required improvements. In order to fill this void, we suggest the use of intelligent reference models. A reference model is a set of qualitative differential equations and an objective function that minimizes the gap between the current and the desired performance indexes of the process. The reference models are intelligent, so when they receive the current state of the problematic process and the desired performance indexes, they generate the required improvements for the problematic process. The reference models are fuzzy cognitive maps augmented with an objective function and trained using the improvements implemented by high-performance firms. Experiments with a group of students show that the reference models allow them to conceive more improvements than students who do not use these models.
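
A minimal sketch of how such a reference model can be iterated is shown below, using a toy fuzzy cognitive map with hand-picked weights rather than a trained one.

```python
# Minimal sketch (toy weights, not a trained model): iterating a fuzzy cognitive
# map until its concept activations converge, then reading the gap between the
# current and desired performance indexes.
import numpy as np

def sigmoid(x, lam=1.0):
    return 1.0 / (1.0 + np.exp(-lam * x))

# Concepts: [process quality, cycle time, automation level, performance index]
W = np.array([[0.0, -0.4, 0.0,  0.6],    # quality reduces cycle time, raises performance
              [0.0,  0.0, 0.0, -0.5],    # long cycle time lowers performance
              [0.5, -0.3, 0.0,  0.0],    # automation improves quality, cuts cycle time
              [0.0,  0.0, 0.0,  0.0]])   # performance index is an output concept

state = np.array([0.4, 0.7, 0.2, 0.3])   # current state of the problematic process
desired_performance = 0.8

for _ in range(50):                       # iterate A(t+1) = f(A(t) + A(t) W)
    new_state = sigmoid(state + state @ W)
    if np.allclose(new_state, state, atol=1e-5):
        break
    state = new_state

print("converged activations:", np.round(state, 3))
print("performance gap:", round(desired_performance - state[3], 3))
```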

Keywords: continuous improvement, fuzzy cognitive maps, process competitiveness, qualitative simulation, system dynamics

Procedia PDF Downloads 83
7846 Fatigue Test and Stress-Life Analysis of Nanocomposite-Based Bone Fixation Device

Authors: Jisoo Kim, Min Su Lee, Sunmook Lee

Abstract:

Durability assessment of a nanocomposite-based bone fixation device was performed by flexural fatigue tests, in which the changes in the life cycles of nanocomposite samples synthesized by blending a bioabsorbable polymer (PLGA) and ceramic nanoparticles (β-TCP) at different ratios were monitored. The nanocomposite samples were kept in a constant temperature/humidity chamber at 37°C/50%RH for varied incubation periods so that they degraded under temperature/humidity stress. It was found that in the initial stage the life cycle increased with incubation time in the chamber, irrespective of sample composition, due to the annealing effect of the polymer. However, the life cycle became shorter as the incubation time increased further, due to the overall degradation of the nanocomposites. The life cycle of the nanocomposite sample with high ceramic content was shorter than that of the one with low ceramic content, which was attributed to the increased brittleness of the composite with high ceramic content. Changes in chemical properties were also monitored by FT-IR, which indicated that the degradation of the biodegradable polymer could be confirmed by the increased intensities of carboxyl and hydroxyl groups, since the hydrolysis of the ester bonds connecting two successive monomers yields carboxyl end groups and hydroxyl groups.

Keywords: bioabsorbable polymer, bone fixation device, ceramic nanoparticles, durability assessment, fatigue test

Procedia PDF Downloads 398
7845 Prediction of PM₂.₅ Concentration in Ulaanbaatar with Deep Learning Models

Authors: Suriya

Abstract:

Rapid socio-economic development and urbanization have led to an increasingly serious air pollution problem in Ulaanbaatar (UB), the capital of Mongolia. PM₂.₅ pollution has become the most pressing aspect of UB air pollution. Therefore, monitoring and predicting PM₂.₅ concentrations in UB is of great significance for the health of the local people and for environmental management. As of yet, very few studies have used models to predict PM₂.₅ concentrations in UB. Using data from 0:00 on June 1, 2018, to 23:00 on April 30, 2020, we propose two deep learning models based on Bayesian-optimized LSTM (Bayes-LSTM) and CNN-LSTM. We utilized hourly observed data, including Himawari-8 (H8) aerosol optical depth (AOD), meteorology, and PM₂.₅ concentration, as input for the prediction of PM₂.₅ concentrations. The correlation strengths between meteorology, AOD, and PM₂.₅ were analyzed using the gray correlation analysis method; the performance improvement obtained by including AOD as an input was tested; and the performance of the models was evaluated using the mean absolute error (MAE) and root mean square error (RMSE). The prediction accuracies of both the Bayes-LSTM and CNN-LSTM deep learning models improved when AOD was included as an input parameter. The improvement in prediction accuracy of the CNN-LSTM model was particularly pronounced in the non-heating season; in the heating season, the prediction accuracy of the Bayes-LSTM model improved slightly, while that of the CNN-LSTM model decreased slightly. We propose two novel deep learning models for PM₂.₅ concentration prediction in UB, the Bayes-LSTM and CNN-LSTM deep learning models. This work pioneers the use of AOD data from H8 and demonstrates that including AOD input data improves the performance of the two proposed deep learning models.
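
A minimal sketch of a CNN-LSTM of the general kind described above is shown below; the hyperparameters and the random training arrays are placeholders, not the study's configuration or data.

```python
# Minimal sketch (architecture only, hypothetical hyperparameters): a CNN-LSTM
# taking 24 hourly steps of AOD, meteorology and past PM2.5 as input.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

n_steps, n_features = 24, 6     # 24 hours; e.g., AOD + 4 meteorological vars + PM2.5

model = keras.Sequential([
    layers.Input(shape=(n_steps, n_features)),
    layers.Conv1D(32, kernel_size=3, activation="relu"),
    layers.MaxPooling1D(pool_size=2),
    layers.LSTM(64),
    layers.Dense(1),             # next-hour PM2.5 concentration
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])

# Toy training data standing in for the 2018-2020 UB dataset
X = np.random.rand(500, n_steps, n_features).astype("float32")
y = np.random.rand(500, 1).astype("float32")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
model.summary()
```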

Keywords: deep learning, AOD, PM₂.₅, prediction, Ulaanbaatar

Procedia PDF Downloads 41
7844 Statistical Analysis for Overdispersed Medical Count Data

Authors: Y. N. Phang, E. F. Loh

Abstract:

Many researchers have suggested the use of zero-inflated Poisson (ZIP) and zero-inflated negative binomial (ZINB) models for modeling over-dispersed medical count data with extra variation caused by excess zeros and unobserved heterogeneity. Studies indicate that ZIP and ZINB always provide a better fit than the standard Poisson and negative binomial models for over-dispersed medical count data. In this study, we propose the use of zero-inflated inverse trinomial (ZIIT), zero-inflated Poisson inverse Gaussian (ZIPIG), and zero-inflated strict arcsine (ZISA) models for modeling over-dispersed medical count data. These proposed models are not widely used by researchers, especially in the medical field. The results show that these three suggested models can serve as alternative models for over-dispersed medical count data, which is supported by their application to a real-life medical data set. The inverse trinomial, Poisson inverse Gaussian, and strict arcsine distributions are discrete distributions with a cubic variance function of the mean. Therefore, ZIIT, ZIPIG, and ZISA are able to accommodate data with excess zeros and very heavy tails. They are recommended for modeling over-dispersed medical count data when ZIP and ZINB are inadequate.
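
For the ZIP and ZINB baselines mentioned above, a minimal sketch with simulated counts is shown below; the proposed ZIIT, ZIPIG and ZISA models are not available in standard libraries and are therefore not shown.

```python
# Minimal sketch (simulated counts): fitting the ZIP and ZINB baselines with
# statsmodels; the proposed ZIIT, ZIPIG and ZISA models are not shown here.
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import (ZeroInflatedPoisson,
                                              ZeroInflatedNegativeBinomialP)

rng = np.random.default_rng(0)
n = 1000
x = rng.normal(size=n)
counts = rng.poisson(np.exp(0.5 + 0.8 * x))
counts[rng.uniform(size=n) < 0.3] = 0                 # inflate zeros

exog = sm.add_constant(x)
zip_fit = ZeroInflatedPoisson(counts, exog, exog_infl=np.ones((n, 1))).fit(disp=False)
zinb_fit = ZeroInflatedNegativeBinomialP(counts, exog, exog_infl=np.ones((n, 1))).fit(disp=False)
print("ZIP AIC:", round(zip_fit.aic, 1), " ZINB AIC:", round(zinb_fit.aic, 1))
```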

Keywords: zero inflated, inverse trinomial distribution, Poisson inverse Gaussian distribution, strict arcsine distribution, Pearson’s goodness of fit

Procedia PDF Downloads 535
7843 Optimal Portfolio of Multi-service Provision based on Stochastic Model Predictive Control

Authors: Yifu Ding, Vijay Avinash, Malcolm McCulloch

Abstract:

With the proliferation of decentralized energy systems, the UK power system allows small-scale entities such as microgrids (MGs) to tender multiple energy services, including energy arbitrage and frequency responses (FRs). However, such operation requires balancing uncertain renewable generation and loads in real time, and the MG has to fulfill the provision requirements of the contracted services continuously during the agreed time window; otherwise, it will be penalized for under-delivered provision. To hedge against risks due to uncertainties and maximize the economic benefits, we propose a stochastic model predictive control (SMPC) framework to optimize MG operation for multi-service provision. Distinguished from previous works, we include a detailed economic degradation model of the lithium-ion battery to quantify the costs of the different service provisions and to accurately describe the changing dynamics of the battery. Considering branches of load and generation scenarios and battery aging, we formulate a risk-averse cost function using conditional value at risk (CVaR), which aims to achieve the maximum expected net revenue while avoiding severe losses. The framework is demonstrated on a case study of a PV-battery grid-tied microgrid in the UK with real-life data. To highlight its performance, the framework is compared with the case without the degradation model and with the deterministic formulation.
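
The risk-averse objective referred to above is typically built on the standard Rockafellar-Uryasev form of CVaR, shown here as background rather than as the paper's exact objective:

```latex
% Rockafellar-Uryasev formulation of CVaR at level \beta for a loss L
% (standard background, not the paper's exact objective):
\[
\mathrm{CVaR}_\beta(L) \;=\; \min_{\alpha\in\mathbb{R}}
\left\{ \alpha + \frac{1}{1-\beta}\,\mathbb{E}\!\left[(L-\alpha)^{+}\right] \right\},
\qquad (x)^{+} = \max(x,0).
\]
% A risk-averse MPC cost can then trade off expectation against tail risk, e.g.
\[
J \;=\; (1-\lambda)\,\mathbb{E}[L] \;+\; \lambda\,\mathrm{CVaR}_\beta(L),
\qquad \lambda\in[0,1].
\]
```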

Keywords: model predictive control (MPC), battery degradation, frequency response, microgrids

Procedia PDF Downloads 118
7842 Dehalogenation of Aromatic Compounds in Wastewater by Bacterial Cultures

Authors: Anne Elain, Magali Le Fellic

Abstract:

Halogenated Aromatic Compounds (HAC) are major organic pollutants that are detected in several environmental compartments as a result of their widespread use as solvents, pesticides, and other industrial chemicals. The degradation of HAC simultaneously at low temperature and under saline conditions would be useful for the remediation of polluted sites. Hence, microbial processes based on the metabolic activities of anaerobic bacteria are especially attractive from an economic and environmental point of view: metabolites are generally less toxic, less likely to bioaccumulate, and more susceptible to further degradation. Studies on biological reductive dehalogenation have largely been restricted to chlorinated compounds, while relatively few have focussed on other HAC, i.e., fluorinated, brominated, or iodinated compounds. The objectives of the present work were to investigate the biodegradation of a mixture of triiodoaromatic molecules in industrial wastewater by an enriched bacterial consortium. Biodegradation of the mixture was studied during batch experiments in an anaerobic reactor. The degree of mineralization and the recovery of halogen were monitored by HPLC-UV, TOC analysis, and potentiometric titration. Providing ethanol as an electron donor was found to stimulate the anaerobic reductive dehalogenation of HAC, with a deiodination rate of up to 12.4 mg.L-1 per day. Sodium chloride, even at high concentration (10 mM), was found to have no influence on the degradation rates or on microbial viability. An analysis of the 16S rDNA (MicroSeq®) revealed that at least 6 bacteria were predominant in the enrichment, including Pseudomonas aeruginosa, Pseudomonas monteilii, Kocuria rhizophila, Ochrobacterium anthropi, Ralstonia pickettii and Rhizobium rhizogenes.

Keywords: halogenated aromatics, anaerobic biodegradation, deiodination, bacterial consortium

Procedia PDF Downloads 172
7841 The Strengths and Limitations of the Statistical Modeling of Complex Social Phenomenon: Focusing on SEM, Path Analysis, or Multiple Regression Models

Authors: Jihye Jeon

Abstract:

This paper analyzes the conceptual frameworks of three statistical methods: multiple regression, path analysis, and structural equation modeling. When establishing a research model for the statistical modeling of complex social phenomena, it is important to know the strengths and limitations of these three statistical models. This study explores the character, strengths, and limitations of each modeling approach and suggests some strategies for accurately explaining or predicting the causal relationships among variables. In particular, common research-modeling mistakes in studies of depression and mental health are discussed.

Keywords: multiple regression, path analysis, structural equation models, statistical modeling, social and psychological phenomenon

Procedia PDF Downloads 643
7840 Evaluation of Football Forecasting Models: 2021 Brazilian Championship Case Study

Authors: Flavio Cordeiro Fontanella, Asla Medeiros e Sá, Moacyr Alvim Horta Barbosa da Silva

Abstract:

In the present work, we analyse the performance of football results forecasting models. To do so, we collected data from eight different forecasting models during the 2021 Brazilian football season. First, we guide the analysis through visual representations of the data, designed to highlight the most prominent features and enhance the interpretation of differences and similarities between the models. We propose using a 2-simplex triangle to investigate visual patterns in the results forecasting models. Next, we compute the expected points for every team playing in the championship and compare them to the final league standings, revealing interesting contrasts between actual and expected performances. Then, we evaluate the forecasts’ accuracy using the Ranked Probability Score (RPS); the model comparison accounts for tiny scale differences that may become consistent over time. Finally, we observe that the Wisdom of Crowds principle can be appropriately applied in this context, leading to a discussion of how results forecasts are used in practice. This paper’s primary goal is to encourage discussion of football forecast performance. We hope to accomplish it by presenting appropriate criteria and easy-to-understand visual representations that can point out the relevant factors of the subject.
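
The RPS used above can be computed as follows for a single match over the ordered outcomes (home win, draw, away win):

```python
# Minimal sketch: the Ranked Probability Score for one match forecast over the
# ordered outcomes (home win, draw, away win). Lower is better.
import numpy as np

def ranked_probability_score(probs, outcome_index):
    """probs: forecast probabilities over ordered outcomes; outcome_index: observed class."""
    probs = np.asarray(probs, dtype=float)
    observed = np.zeros_like(probs)
    observed[outcome_index] = 1.0
    cum_diff = np.cumsum(probs) - np.cumsum(observed)
    return np.sum(cum_diff ** 2) / (len(probs) - 1)

# Example: a forecast of 50% home win, 30% draw, 20% away win; the match ended in a draw
print(ranked_probability_score([0.5, 0.3, 0.2], outcome_index=1))   # ~0.145
```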

Keywords: accuracy evaluation, Brazilian championship, football results forecasts, forecasting models, visual analysis

Procedia PDF Downloads 92
7839 Modelling the Impacts of Geophysical Parameters on Deforestation and Forest Degradation in Pre- and Post-Logging Ban Periods in the Hindu Kush Himalayas

Authors: Alam Zeb, Glen W. Armstrong, Muhammad Qasim

Abstract:

Loss of forest cover is one of the most important land cover changes and has been of great concern to policy makers. This study quantified forest cover changes over the pre-logging-ban (1973-1993) and post-logging-ban (1993-2015) periods to examine the role of geophysical factors and spatial attributes of land in the two periods. We show that despite a complete ban on green felling, forest cover decreased by 28% and was mostly converted to rangeland. Nevertheless, the logging ban was completely effective in controlling agricultural expansion. The binary logistic regression revealed that south-facing aspects at low elevation witnessed more deforestation in the pre-ban period than in the post-ban period. In contrast to deforestation, forest degradation was more prominent on northern aspects at higher elevation during the policy period. Agricultural expansion was widespread in low-elevation flat areas with gentle slopes, while during the policy period agricultural contraction in the form of regeneration was observed in low-elevation areas on north-facing slopes. All proximity variables, except distance to the administrative boundary, showed a similar trend across the two periods and were important explanatory variables in understanding forest and agriculture expansion. The changes in the determinants of forest and agriculture expansion and contraction over the two periods might be attributed to the influence of policy and a general decrease in resource availability.

Keywords: forest conservation, wood harvesting ban, logistic regression, deforestation, forest degradation, agriculture expansion, Chitral, Pakistan

Procedia PDF Downloads 226
7838 Bioremediation of Phenanthrene by Monocultures and Mixed Culture Bacteria Isolated from Contaminated Soil

Authors: A. Fazilah, I. Darah, I. Noraznawati

Abstract:

Three different bacteria capable of degrading phenanthrene were isolated from a hydrocarbon-contaminated site. In this study, the phenanthrene-degrading activity of defined monocultures and of a mixed culture was determined; the isolates were identified as Acinetobacter sp. P3d, Bacillus sp. P4a and Pseudomonas sp. P6. All of the bacteria were able to grow in a minimal salt medium saturated with phenanthrene as the sole source of carbon and energy. The phenanthrene degradation efficiencies of different combinations (consortia) of these bacteria were investigated, and their phenanthrene degradation was evaluated by gas chromatography. Among the monocultures, Pseudomonas sp. P6 exhibited 58.71% degradation compared to Acinetobacter sp. P3d and Bacillus sp. P4a, which showed 56.97% and 53.05%, respectively, after 28 days of cultivation. All consortia showed high phenanthrene elimination: 95.64%, 79.37%, 87.19%, and 79.21% for Consortia A, B, C and D, respectively. The results indicate that all of the bacteria isolated may effectively degrade the target chemical and have promising applications in the bioremediation of hydrocarbon-contaminated soil.

Keywords: phenanthrene, consortia, Acinetobacter sp. P3d, Bacillus sp. P4a, Pseudomonas sp. P6

Procedia PDF Downloads 292
7837 Statistical Models and Time Series Forecasting on Crime Data in Nepal

Authors: Dila Ram Bhandari

Abstract:

Throughout the 20th century, new governments were created in which identities such as ethnicity, religion, language, caste, community, and tribe played a part in the development of constitutions and the legal systems of victim and criminal justice. South Asian nations have recently been plagued by acute issues of extremism, poverty, environmental degradation, cybercrime, human rights violations, and crime against, and victimization of, both individuals and groups. A massive number of crimes are committed every day, and these frequent crimes have made the lives of ordinary citizens restless. Crime is one of the major threats to society and to civilization and can create societal disturbance. Old-style crime-solving practices are unable to live up to the requirements of the current crime situation. Crime analysis is one of the most important activities of the majority of intelligence and law enforcement organizations all over the world. The South Asia region lacks a regional coordination mechanism, unlike the Central Asia and Asia-Pacific regions, to facilitate criminal intelligence sharing and operational coordination related to organized crime, including illicit drug trafficking and money laundering. There have been numerous conversations in recent years about using data mining technology to combat crime and terrorism. The Data Detective program from the software company Sentient uses data mining techniques to support the police (Sentient, 2017). The goal of this internship was to test out several predictive model solutions and choose the most effective and promising one. First, extensive literature reviews on data mining, crime analysis, and crime data mining were conducted. Sentient offered a 7-year archive of crime statistics that was aggregated daily to produce a univariate dataset. Moreover, a daily incidence-type aggregation was performed to produce a multivariate dataset. Each solution's forecast period lasted seven days. The experiments were split into two main groups: statistical models and neural network models. For the crime data, neural networks fared better than statistical models. This study gives a general review of the applied statistical and neural network models. A comparative analysis of all the models on a comparable dataset provides a detailed picture of each model's performance on the available data and its generalizability. The studies clearly demonstrated that, in comparison to other models, Gated Recurrent Units (GRU) produced better predictions. The crime records for 2005-2019 were collected from the Nepal Police headquarters and analysed in R. In conclusion, a gated recurrent unit implementation could help police in predicting crime. Hence, time series analysis using a GRU could be a prospective additional feature in Data Detective.
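
A minimal sketch of a GRU forecaster of the kind found to perform best is shown below; the synthetic Poisson series and hyperparameters are placeholders, not the Nepal Police data or the study's configuration.

```python
# Minimal sketch (illustrative only): a GRU forecaster for a univariate daily
# crime-count series, of the kind compared against statistical baselines above.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

window = 30                                               # use the past 30 days to predict the next day
series = np.random.poisson(20, 2000).astype("float32")    # stand-in for daily crime counts

X = np.stack([series[i:i + window] for i in range(len(series) - window)])[..., None]
y = series[window:]

model = keras.Sequential([
    layers.Input(shape=(window, 1)),
    layers.GRU(32),
    layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=64, verbose=0)
forecast = model.predict(series[-window:].reshape(1, window, 1), verbose=0)[0, 0]
print("next-day forecast:", forecast)
```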

Keywords: time series analysis, forecasting, ARIMA, machine learning

Procedia PDF Downloads 163
7836 Statistical Channel Modeling for Multiple-Input-Multiple-Output Communication System

Authors: M. I. Youssef, A. E. Emam, M. Abd Elghany

Abstract:

The performance of wireless communication systems is affected mainly by the environment of the associated channel, which is characterized by dynamic and unpredictable behavior. In this paper, different statistical earth-satellite channel models are studied, with emphasis on two main models: first, the Rice-Log-normal model, chosen because it represents an environment including the shadowing and multipath components that affect the propagated signal along its path; and second, a three-state model that takes into account different fading conditions (clear area, moderate shadowing, and heavy shadowing). The provided models are based on AWGN, Rician, Rayleigh, and log-normal distributions, and their Probability Density Functions (PDFs) are presented. The transmission system Bit Error Rate (BER), Peak-to-Average Power Ratio (PAPR), and channel capacity are measured and analyzed against the fading models. These simulations are implemented using the MATLAB tool, and the results show the performance of the transmission system over the different channel models.
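
A minimal sketch of how envelope samples for the underlying distributions can be drawn is shown below; the K-factor and shadowing spread are illustrative values, and the full three-state switching model is not reproduced.

```python
# Minimal sketch (standard distributions, not the paper's full three-state model):
# drawing envelope samples for Rayleigh and Rician fading with log-normal shadowing.
import numpy as np

rng = np.random.default_rng(0)
n = 100000

# Rayleigh: no line-of-sight component
rayleigh = np.abs((rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2))

# Rician with K-factor (LOS power / scattered power), here K = 5 dB
K = 10 ** (5 / 10)
los = np.sqrt(K / (K + 1))
scatter = np.sqrt(1 / (K + 1)) * (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)
rician = np.abs(los + scatter)

# Log-normal shadowing with 4 dB standard deviation
shadow = 10 ** (rng.normal(0, 4, size=n) / 20)

# Rice-Log-normal composite envelope (multiplicative shadowing of the Rician envelope)
rice_lognormal = rician * shadow
print("mean envelopes:", rayleigh.mean(), rician.mean(), rice_lognormal.mean())
```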

Keywords: fading channels, MIMO communication, RNS scheme, statistical modeling

Procedia PDF Downloads 144
7835 Simulative Study of the Influence of Degraded Twin-Tube Shock Absorbers on the Lateral Forces of Vehicle Axles

Authors: Tobias Schramm, Günther Prokop

Abstract:

Degraded vehicle shock absorbers represent a risk to road safety. The exact effect of degraded vehicle dampers on road safety is still the subject of research. This work is intended to contribute to estimating the effect of degraded twin-tube dampers of passenger cars on road safety. An axle model was built using a damper model to simulate different degradation levels. To parameterize the model, a realistic parameter space was estimated based on test rig measurements and database analyses, which is intended to represent the vehicle fleet in Germany. Within the parameter space, simulations of the axle model were carried out, which calculated the transmittable lateral forces of the various axle configurations as a function of vehicle speed, road surface, damper condition and axle parameters. A degraded damper has the greatest effect on the transmittable lateral forces at high speeds and in poor road conditions. If a vehicle is traveling at a speed of 100 kph on a Class D road, a degraded damper reduces the transmittable lateral forces of an axle by 20 % on average. For individual parameter configurations, this value can rise to 50 %. The axle parameters that most influence the effect of a degraded damper are the vertical stiffness of the tire, the unsprung mass and the stabilizer stiffness of the axle.

Keywords: vehicle dynamics, vehicle simulation, vehicle component degradation, shock absorber model, shock absorber degradation

Procedia PDF Downloads 108
7834 Isolation and Molecular Identification of Two Fungal Strains Capable of Degrading Hydrocarbon Contaminants in the Saudi Arabian Environment

Authors: Amr A. EL Hanafy, Yasir Anwar, Saleh A. Mohamed, Saleh Mohamed Saleh Al-Garni, Jamal S. M. Sabir, Osama A. H. Abu Zinadah, Mohamed Morsi Ahmed

Abstract:

In the vicinity of the Red Sea, about 15 fungal species were isolated from oil-contaminated sites. On the basis of their aptitude to degrade crude oil and a DCPIP assay, two fungal isolates were selected from among the 15 oil-degrading strains. Analysis of ITS-1 and ITS-2 sequences and amplicon pyrosequencing studies of fungal diversity revealed that these strains belong to Penicillium and Aspergillus species. The two strains that proved most efficient in degrading crude oil were Aspergillus niger (54%) and Penicillium commune (48%). Subsequent to two weeks of cultivation in BHS medium, the degradation rates were recorded using spectrophotometry and GC-MS. Hence, it is clear that these fungal strains have degradation capability and can be utilized for cleaning up the Saudi Arabian environment.

Keywords: fungal strains, hydrocarbon contaminants, molecular identification, biodegradation, GC-MS

Procedia PDF Downloads 518
7833 Radical Degradation of Acetaminophen with Peroxymonosulfate-Based Oxidation Processes

Authors: Chaoqun Tan, Naiyun Gao, Xiaoyan Xin

Abstract:

Peroxymonosulfate (PMS)-based oxidation processes, as an alternative to hydrogen peroxide-based oxidation processes, are increasingly popular because of the reactive radical species (SO4-•, OH•) produced in these systems. Magnetic nano-scaled Fe3O4 particles and the ferrous anion (Fe2+) were studied for the activation of PMS for the degradation of acetaminophen (APAP) in water. The Fe3O4 MNPs were found to effectively catalyze PMS for APAP degradation, and the reactions followed a pseudo-first-order kinetic pattern well (R2>0.95). In contrast, the degradation of APAP in the PMS-Fe2+ system proceeds through two stages: a fast stage and a much slower stage. Within 5 min, approximately 7% and 18% of 10 ppm APAP was degraded by 0.2 mM PMS in the Fe3O4 (0.8 g/L) and Fe2+ (0.1 mM) activation processes, respectively. However, as the reaction proceeded to 120 min, approximately 75% and 35% of the APAP was removed in the Fe3O4 and Fe2+ activation processes, respectively. Within 120 min, the mineralization of APAP was about 7.5% and 5.0% (initial APAP of 10 ppm and [PMS]0 of 0.2 mM) in the Fe3O4-PMS and Fe2+-PMS systems, while the mineralization could be greatly increased to about 31% and 40%, respectively, as [PMS]0 increased to 2.0 mM. Finally, the production of reactive radical species was validated directly by electron paramagnetic resonance (EPR) tests with 0.1 M 5,5-dimethyl-1-pyrroline N-oxide (DMPO). Plausible mechanisms for the radical generation from Fe3O4 and Fe2+ activation of PMS are proposed based on the results of the radical identification tests. The results demonstrate that Fe3O4 MNP-activated PMS and Fe2+-activated PMS systems are promising technologies for treating water pollution caused by contaminants such as pharmaceuticals. The Fe3O4-PMS system is more suitable for slow remediation, while the Fe2+-PMS system is more suitable for fast remediation.
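
A minimal sketch of fitting the pseudo-first-order rate constant reported above is shown below, using synthetic concentration data rather than the experimental measurements.

```python
# Minimal sketch (synthetic data): fitting the pseudo-first-order rate constant k
# from C(t) = C0 * exp(-k t), the kinetic pattern reported above (R2 > 0.95).
import numpy as np
from scipy.optimize import curve_fit

t = np.array([0, 5, 15, 30, 60, 90, 120], dtype=float)                          # min
C = 10 * np.exp(-0.011 * t) + np.random.default_rng(0).normal(0, 0.1, t.size)   # ppm APAP

def first_order(t, C0, k):
    return C0 * np.exp(-k * t)

(C0_fit, k_fit), _ = curve_fit(first_order, t, C, p0=(10, 0.01))
residuals = C - first_order(t, C0_fit, k_fit)
r2 = 1 - np.sum(residuals**2) / np.sum((C - C.mean())**2)
print(f"k = {k_fit:.4f} 1/min, R2 = {r2:.3f}")
```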

Keywords: acetaminophen, peroxymonosulfate, radicals, electron paramagnetic resonance (EPR)

Procedia PDF Downloads 344