Search results for: statistical modelling
5022 Sleep Apnea Hypopnea Syndrome Diagnosis Using Advanced ANN Techniques
Authors: Sachin Singh, Thomas Penzel, Dinesh Nandan
Abstract:
Accurate identification of Sleep Apnea Hypopnea Syndrome (SAHS) is a difficult problem for human experts because of variability among persons and unwanted noise. This paper proposes the diagnosis of SAHS using airflow, ECG, pulse, and SaO2 signals. The features of each of these signal types are extracted using statistical methods and ANN learning methods. These extracted features are used to approximate the patient's Apnea Hypopnea Index (AHI) using sample signals in the model. Advanced signal processing is also applied to the snore sound signal to locate snore events, and the SaO2 signal is used to confirm whether a detected snore event is true or noise. Finally, the AHI is calculated from the true snore events detected. Experimental results show that both sensitivity and specificity can reach 96% for AHI greater than or equal to 5.
Keywords: neural network, AHI, statistical methods, autoregressive models
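As a minimal illustration of the evaluation step reported above, the sketch below computes sensitivity and specificity for the AHI ≥ 5 decision from paired model-estimated and reference AHI values; the function name and example arrays are hypothetical, not the authors' code.

```python
# Minimal sketch (not the authors' code): sensitivity and specificity of an
# AHI >= 5 classification, given model-estimated and reference AHI values.
import numpy as np

def sahs_screen_metrics(ahi_pred, ahi_ref, cutoff=5.0):
    """Return (sensitivity, specificity) for the AHI >= cutoff decision."""
    pred_pos = np.asarray(ahi_pred) >= cutoff
    true_pos = np.asarray(ahi_ref) >= cutoff
    tp = np.sum(pred_pos & true_pos)
    tn = np.sum(~pred_pos & ~true_pos)
    fp = np.sum(pred_pos & ~true_pos)
    fn = np.sum(~pred_pos & true_pos)
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    specificity = tn / (tn + fp) if (tn + fp) else float("nan")
    return sensitivity, specificity

# Hypothetical example values, for illustration only.
sens, spec = sahs_screen_metrics([2.1, 7.4, 12.0, 4.9], [1.8, 8.0, 15.2, 5.5])
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```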
Procedia PDF Downloads 119
5021 Towards Law Data Labelling Using Topic Modelling
Authors: Daniel Pinheiro Da Silva Junior, Aline Paes, Daniel De Oliveira, Christiano Lacerda Ghuerren, Marcio Duran
Abstract:
The Courts of Accounts are institutions responsible for overseeing and pointing out irregularities in Public Administration expenses. They face a high demand for processes to be analyzed, whose decisions must be grounded on severity laws. Despite the large number of processes, several cases report similar subjects. Thus, previous decisions on already analyzed processes can serve as precedents for current processes that refer to similar topics. Identifying similar topics is an open, yet essential task for identifying similarities between processes. Since the actual number of topics is considerably large, it is tedious and error-prone to identify topics using a purely manual approach. This paper presents a tool based on Machine Learning and Natural Language Processing to assist in building a labeled dataset. The tool relies on Topic Modelling with Latent Dirichlet Allocation to find the topics underlying a document, followed by the Jensen-Shannon distance metric to generate a probability of similarity between document pairs. Furthermore, in a case study with a corpus of decisions of the Rio de Janeiro State Court of Accounts, it was noted that data pre-processing plays an essential role in modeling relevant topics. Also, the combination of topic modeling and a distance metric calculated over documents represented in the generated topic space proved useful in helping to construct a labeled base of similar and non-similar document pairs.
Keywords: courts of accounts, data labelling, document similarity, topic modeling
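A rough sketch of the described pipeline, assuming scikit-learn's LDA implementation and SciPy's Jensen-Shannon distance; the toy corpus and the use of 1 minus the distance as a similarity score are illustrative choices, not the authors' tool.

```python
# Illustrative sketch: per-document LDA topic distributions, then the
# Jensen-Shannon distance between document pairs as a similarity score.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from scipy.spatial.distance import jensenshannon

docs = [
    "audit of public works contract irregularities",      # toy corpus that
    "irregular public works contract and overpricing",    # stands in for
    "pension fund accounting and retirement decisions",   # court decisions
]

counts = CountVectorizer(stop_words="english").fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
theta = lda.fit_transform(counts)                 # document-topic matrix
theta = theta / theta.sum(axis=1, keepdims=True)  # ensure rows are distributions

# Pairwise Jensen-Shannon distance (0 = identical topic mix, 1 = disjoint);
# 1 - distance is used here as a rough similarity score.
for i in range(len(docs)):
    for j in range(i + 1, len(docs)):
        d = jensenshannon(theta[i], theta[j], base=2)
        print(f"docs ({i},{j}): similarity ~ {1 - d:.2f}")
```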
Procedia PDF Downloads 179
5020 Statistical Modeling for Permeabilization of a Novel Yeast Isolate for β-Galactosidase Activity Using Organic Solvents
Authors: Shweta Kumari, Parmjit S. Panesar, Manab B. Bera
Abstract:
The hydrolysis of lactose using β-galactosidase is one of the most promising biotechnological applications and has a wide range of potential uses in food processing industries. However, due to the intracellular location of the yeast enzyme and expensive extraction methods, the industrial application of enzymatic hydrolysis processes is being hampered. The use of permeabilization techniques can help to overcome the problems associated with enzyme extraction and purification from yeast cells and to develop an economically viable process for the utilization of whole-cell biocatalysts in food industries. In the present investigation, standardization of the permeabilization process of a novel yeast isolate was carried out using a statistical modeling approach known as Response Surface Methodology (RSM) to achieve maximal β-galactosidase activity. The optimum operating conditions for the permeabilization process obtained by RSM were a 1:1 ratio of toluene (25%, v/v) and ethanol (50%, v/v), a temperature of 25.0 °C, and a treatment time of 12 min, which displayed an enzyme activity of 1.71 IU/mg DW.
Keywords: β-galactosidase, optimization, permeabilization, response surface methodology, yeast
Procedia PDF Downloads 256
5019 A Hybrid Model of Structural Equation Modelling-Artificial Neural Networks: Prediction of Influential Factors on Eating Behaviors
Authors: Maryam Kheirollahpour, Mahmoud Danaee, Amir Faisal Merican, Asma Ahmad Shariff
Abstract:
Background: The presence of nonlinearity among the risk factors of eating behavior causes bias in prediction models. The accuracy of estimating eating behavior risk factors in the primary prevention of obesity has been established. Objective: The aim of this study was to explore the potential of a hybrid model of structural equation modeling (SEM) and Artificial Neural Networks (ANN) to predict eating behaviors. Methods: Partial Least Squares SEM (PLS-SEM) and a hybrid model (SEM-Artificial Neural Networks, SEM-ANN) were applied to evaluate the factors affecting eating behavior patterns among university students. 340 university students participated in this study. The PLS-SEM analysis was used to check the effect of the emotional eating scale (EES), body shape concern (BSC), and body appreciation scale (BAS) on different categories of eating behavior patterns (EBP). Then, the hybrid model was built using a multilayer perceptron (MLP) with a feedforward network topology. Moreover, Levenberg-Marquardt, a supervised learning algorithm, was applied as the learning method for MLP training. The tangent sigmoid function was used for the input layer, while a linear function was applied for the output layer. The coefficient of determination (R²) and mean square error (MSE) were calculated. Results: The hybrid model proved superior to the PLS-SEM method. Using the hybrid model, the optimal network was obtained at MLP 3-17-8, the R² of the model increased by 27%, and the MSE decreased by 9.6%. Moreover, it was found which of these factors significantly affected healthy and unhealthy eating behavior patterns. The p-value was reported to be less than 0.01 for most of the paths. Conclusion/Importance: Thus, a hybrid approach can be suggested as a significant methodological contribution from a statistical standpoint, and it can be implemented as software able to predict models with the highest accuracy.
Keywords: hybrid model, structural equation modeling, artificial neural networks, eating behavior patterns
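A hedged sketch of the ANN stage only: an MLP with one hidden layer of 17 units (cf. the reported 3-17-8 topology) regressing a synthetic eating-behaviour score on three latent construct scores assumed to come from the PLS-SEM step. scikit-learn provides no Levenberg-Marquardt solver, so 'lbfgs' is used as a stand-in, and all data below are synthetic.

```python
# Sketch of the ANN stage of a SEM-ANN hybrid; data and target are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(340, 3))                   # hypothetical EES, BSC, BAS latent scores
y = X @ np.array([0.5, 0.3, -0.4]) + 0.1 * rng.normal(size=340)   # synthetic EBP score

mlp = MLPRegressor(hidden_layer_sizes=(17,), activation="tanh",
                   solver="lbfgs", max_iter=2000, random_state=0).fit(X, y)
pred = mlp.predict(X)
print("R2 =", round(r2_score(y, pred), 3), " MSE =", round(mean_squared_error(y, pred), 3))
```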
Procedia PDF Downloads 156
5018 Using Statistical Significance and Prediction to Test Long/Short Term Public Services and Patients' Cohorts: A Case Study in Scotland
Authors: Raptis Sotirios
Abstract:
Health and social care (HSc) services planning and scheduling are facing unprecedented challenges due to pandemic pressure and also suffer from unplanned spending that has been negatively impacted by the global financial crisis. Data-driven approaches can help to improve policies and to plan and design service provision schedules, using algorithms that assist healthcare managers in facing unexpected demands with fewer resources. The paper discusses services packing using statistical significance tests and machine learning (ML) to evaluate demand similarity and coupling. This is achieved by predicting the range of the demand (class) using ML methods such as CART, random forests (RF), and logistic regression (LGR). The Chi-squared and Student's t significance tests are used on data spanning 39 years of HSc services delivered in Scotland. The demands are probabilistically associated through statistical hypotheses that assume, as the null hypothesis, that the target service's demands are statistically dependent on other demands. This linkage can be confirmed or not by the data. Complementarily, ML methods are used to linearly predict the target demands from the statistically found associations and to extend the linear dependence of the target's demand to independent demands, thus forming groups of services. Statistical tests confirm the ML couplings, making the prediction also statistically meaningful and proving that a target service can be matched reliably to other services, while ML shows that these indicated relationships can also be linear ones. Zero padding was used for missing years' records and illustrated such relationships better, both for limited years and over the entire span, offering long-term data visualizations, while limited-year groups explained how well patient numbers can be related in short periods or can change over time as opposed to behaviors across more years. The prediction performance of the associations is measured using Receiver Operating Characteristic (ROC) AUC and ACC metrics as well as the statistical tests, Chi-squared and Student's t. Co-plots and comparison tables for RF, CART, and LGR, as well as p-values and Information Exchange (IE), are provided, showing the specific behavior of the ML methods and of the statistical tests, and the behavior under different learning ratios. The impact of k-NN and of cross-correlation and C-Means first groupings is also studied over limited years and the entire span. It was found that CART was generally behind RF and LGR, but in some interesting cases, LGR reached an AUC = 0, falling below CART, while the ACC was as high as 0.912, showing that ML methods can be confused by padding or by data irregularities or outliers. On average, 3 linear predictors were sufficient, LGR was found to compete well with RF, and CART followed with the same performance at higher learning ratios. Services were packed only when the significance level (p-value) of their association coefficient was more than 0.05. Social-factor relationships were observed between home care services and treatment of old people, birth weights, alcoholism, drug abuse, and emergency admissions. The work found that different HSc services can be well packed as plans of limited years, across various service sectors and learning configurations, as confirmed using statistical hypotheses.
Keywords: class, cohorts, data frames, grouping, prediction, probability, services
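A rough sketch of the two-pronged approach described above, assuming synthetic annual demand records and illustrative service names: a Chi-squared test of association between binned demands, followed by AUC/ACC for CART, random forest, and logistic regression predicting the target demand class.

```python
# Synthetic illustration: statistical linkage (Chi-squared) plus ML prediction
# of a target service's demand class from other services' demands.
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, accuracy_score

rng = np.random.default_rng(1)
n = 390                                   # e.g. 39 years x 10 synthetic regional records
home_care = rng.poisson(50, n)
emergency = rng.poisson(30, n) + (0.2 * home_care).astype(int)
old_people = home_care + rng.poisson(10, n)           # target service demand
df = pd.DataFrame({"home_care": home_care,
                   "emergency_admissions": emergency,
                   "old_people_treatment": old_people})
df["target_class"] = (df["old_people_treatment"] > df["old_people_treatment"].median()).astype(int)

# Statistical linkage: Chi-squared test on binned demands of two services.
table = pd.crosstab(pd.qcut(df["home_care"], 3, duplicates="drop"),
                    pd.qcut(df["old_people_treatment"], 3, duplicates="drop"))
chi2, p, _, _ = chi2_contingency(table)
print(f"chi2={chi2:.1f}, p-value={p:.4f}")

# ML linkage: predict the target demand class from the other services' demands.
X_tr, X_te, y_tr, y_te = train_test_split(df[["home_care", "emergency_admissions"]],
                                          df["target_class"], random_state=0)
for name, clf in [("CART", DecisionTreeClassifier(max_depth=3)),
                  ("RF", RandomForestClassifier(n_estimators=100)),
                  ("LGR", LogisticRegression(max_iter=1000))]:
    clf.fit(X_tr, y_tr)
    proba = clf.predict_proba(X_te)[:, 1]
    print(name, "AUC:", round(roc_auc_score(y_te, proba), 3),
          "ACC:", round(accuracy_score(y_te, clf.predict(X_te)), 3))
```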
Procedia PDF Downloads 234
5017 The Effectiveness of Energy Index Technique in Bearing Condition Monitoring
Authors: Faisal Alshammari, Abdulmajid Addali, Mosab Alrashed, Taihiret Alhashan
Abstract:
The application of acoustic emission techniques is gaining popularity, as they can monitor the condition of gears and bearings and detect early symptoms of a defect in the form of pitting, wear, and flaking of surfaces. Early detection of these defects is essential, as it helps to avoid major failures and the associated catastrophic consequences. Signal processing techniques are required for early defect detection; in this article, a time-domain technique called the Energy Index (EI) is used. This article presents an investigation into the Energy Index's effectiveness in detecting early-stage defect initiation and deterioration, and compares it with the common r.m.s. index, kurtosis, and the Kolmogorov-Smirnov statistical test. It is concluded that EI is a more effective technique for monitoring defect initiation and development than the other statistical parameters.
Keywords: acoustic emission, signal processing, kurtosis, Kolmogorov-Smirnov test
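An illustrative computation, assuming one common formulation of the Energy Index (the ratio of a segment's mean-square value to the mean-square value of the whole record); the synthetic burst, segment count, and baseline signal are placeholders rather than the authors' data.

```python
# Sketch: segment-wise Energy Index compared with overall r.m.s., kurtosis,
# and a Kolmogorov-Smirnov test against a healthy baseline signal.
import numpy as np
from scipy.stats import kurtosis, ks_2samp

rng = np.random.default_rng(0)
healthy = rng.normal(0, 1.0, 20000)                   # baseline AE record
faulty = healthy.copy()
faulty[10000:10050] += rng.normal(0, 8.0, 50)         # synthetic burst (incipient defect)

def energy_index(signal, n_segments=100):
    """Mean-square value of each segment divided by the whole record's mean square."""
    segs = np.array_split(signal, n_segments)
    overall_ms = np.mean(signal ** 2)
    return np.array([np.mean(s ** 2) / overall_ms for s in segs])

ei = energy_index(faulty)
print("max EI:", round(ei.max(), 2))                  # the burst segment stands out
print("r.m.s.:", round(np.sqrt(np.mean(faulty ** 2)), 3))
print("kurtosis:", round(kurtosis(faulty), 3))
print("KS test vs healthy, p =", ks_2samp(faulty, healthy).pvalue)
```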
Procedia PDF Downloads 366
5016 Wildlife Habitat Corridor Mapping in Urban Environments: A GIS-Based Approach Using Preliminary Category Weightings
Authors: Stefan Peters, Phillip Roetman
Abstract:
The global loss of biodiversity is threatening the benefits nature provides to human populations, has become a more pressing issue than climate change, and requires immediate attention. While there have been successful global agreements for environmental protection, such as the Montreal Protocol, these are rare, and we cannot rely on them solely. Thus, it is crucial to take national and local actions to support biodiversity. Australia is one of the 17 countries in the world with a high level of biodiversity, and its cities are vital habitats for endangered species, with more of them found in urban areas than in non-urban ones. However, the protection of biodiversity in metropolitan Adelaide has been inadequate, with over 130 species disappearing since European colonization in 1836. In this research project, we conceptualized, developed, and implemented a framework for wildlife Habitat Hotspot and Habitat Corridor modelling in an urban context using geographic data and GIS modelling and analysis. We used detailed topographic and other geographic data provided by a local council, including spatial and attributive properties of trees, parcels, water features, vegetated areas, roads, verges, traffic, and census data. Weighted factors considered in our raster-based Habitat Hotspot model include parcel size, parcel shape, population density, canopy cover, habitat quality, and proximity to habitats and water features. Weighted factors considered in our raster-based Habitat Corridor model include habitat potential (resulting from the Habitat Hotspot model), verge size, road hierarchy, road widths, human density, and the presence of remnant indigenous vegetation species. We developed a GIS model, using Python scripting and ArcGIS Pro ModelBuilder, to establish an automated, reproducible, and adjustable geoprocessing workflow adaptable to any study area of interest. Our habitat hotspot and corridor modelling framework allows existing habitat hotspots and wildlife habitat corridors to be determined and mapped. Our research was applied to the case study of Burnside, a local council in Adelaide, Australia, which encompasses an area of 30 km². We applied end-user expertise-based category weightings to refine our models and optimize the use of our habitat map outputs towards informing local strategic decision-making.
Keywords: biodiversity, GIS modeling, habitat hotspot, wildlife corridor
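A simplified sketch of the weighted-overlay idea behind the Habitat Hotspot surface, in plain NumPy; the layer names, weights, and toy raster grid are placeholders, and the actual workflow was built with Python scripting in ArcGIS Pro ModelBuilder.

```python
# Weighted raster overlay: normalize each factor layer, then combine with weights.
import numpy as np

def normalise(layer):
    """Rescale a raster layer to 0-1 so that weights are comparable."""
    lo, hi = np.nanmin(layer), np.nanmax(layer)
    return (layer - lo) / (hi - lo) if hi > lo else np.zeros_like(layer)

shape = (200, 200)                        # toy raster grid
rng = np.random.default_rng(0)
layers = {                                # hypothetical factor rasters
    "canopy_cover": rng.random(shape),
    "parcel_size": rng.random(shape),
    "proximity_to_water": rng.random(shape),
    "population_density": rng.random(shape),
}
weights = {                               # assumed preliminary category weightings
    "canopy_cover": 0.35, "parcel_size": 0.25,
    "proximity_to_water": 0.25, "population_density": -0.15,   # negative: density suppresses habitat
}

hotspot = sum(weights[k] * normalise(v) for k, v in layers.items())
print("hotspot score range:", round(hotspot.min(), 2), "to", round(hotspot.max(), 2))
```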
Procedia PDF Downloads 115
5015 An Analytical Approach of Computational Complexity for the Method of Multifluid Modelling
Authors: A. K. Borah, A. K. Singh
Abstract:
In this paper we deal with the building blocks of the computer simulation of multiphase flows. The whole simulation procedure can be viewed as two super-procedures: the implementation of the VOF method and the solution of the Navier-Stokes equations. Moreover, a sequential code for a Navier-Stokes solver has been studied.
Keywords: bi-conjugate gradient stabilized (Bi-CGSTAB), ILUT function, Krylov subspace, multifluid flows, preconditioner, SIMPLE algorithm
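A hedged sketch of the linear-algebra kernel named in the keywords: solving a sparse system with Bi-CGSTAB preconditioned by an incomplete LU factorisation. SciPy's spilu stands in for an ILUT-style preconditioner here, and the toy Poisson-like matrix is only a placeholder for a pressure-correction system.

```python
# Bi-CGSTAB with an incomplete-LU preconditioner on a toy sparse system.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import bicgstab, spilu, LinearOperator

n = 100                                               # toy 1-D Poisson-like system
A = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)

ilu = spilu(A, drop_tol=1e-4, fill_factor=10)         # threshold-based incomplete LU
M = LinearOperator((n, n), matvec=ilu.solve)          # preconditioner as a linear operator

x, info = bicgstab(A, b, M=M)
print("converged" if info == 0 else f"info={info}",
      "| residual norm:", np.linalg.norm(A @ x - b))
```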
Procedia PDF Downloads 528
5014 Statistical Correlation between Logging-While-Drilling Measurements and Wireline Caliper Logs
Authors: Rima T. Alfaraj, Murtadha J. Al Tammar, Khaqan Khan, Khalid M. Alruwaili
Abstract:
OBJECTIVE/SCOPE: Caliper logging data provide critical information about wellbore shape and deformations, such as stress-induced borehole breakouts or washouts. Multiarm mechanical caliper logs are often run using wireline, which can be time-consuming, costly, and/or challenging to run in certain formations. To minimize rig time and improve operational safety, it is valuable to develop analytical solutions that can estimate caliper logs using available Logging-While-Drilling (LWD) data without the need to run wireline caliper logs. As a first step, the objective of this paper is to perform statistical analysis using an extensive dataset to identify important physical parameters that should be considered in developing such analytical solutions. METHODS, PROCEDURES, PROCESS: Caliper logs and LWD data from eleven wells, with a total of more than 80,000 data points, were obtained and imported into data analytics software for analysis. Several parameters were selected to test their relationship with the measured maximum and minimum caliper logs. These parameters include gamma ray, porosity, shear and compressional sonic velocities, bulk densities, and azimuthal density. The data of the eleven wells were first visualized and cleaned. Using the analytics software, several analyses were then performed, including the computation of Pearson's correlation coefficients to show the statistical relationship between the selected parameters and the caliper logs. RESULTS, OBSERVATIONS, CONCLUSIONS: The results of this statistical analysis showed that some parameters correlate well with the caliper log data. For instance, the bulk density and azimuthal directional densities showed Pearson's correlation coefficients in the range of 0.39 to 0.57, which were relatively high when compared to the correlation coefficients of caliper data with other parameters. Other parameters such as porosity exhibited extremely low correlation coefficients with the caliper data. Various crossplots and visualizations of the data were also produced to gain further insights from the field data. NOVEL/ADDITIVE INFORMATION: This study offers a unique and novel look into the relative importance of and correlation between different LWD measurements and wireline caliper logs via an extensive dataset. The results pave the way for a more informed development of new analytical solutions for estimating the size and shape of the wellbore in real time while drilling, using LWD data.
Keywords: LWD measurements, caliper log, correlations, analysis
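A minimal sketch of the correlation step, assuming a pandas data frame of LWD curves and a maximum-caliper column; the column names and synthetic values below only mimic the kind of moderate correlation reported.

```python
# Pearson correlation coefficients between LWD curves and the maximum caliper reading.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 1000
logs = pd.DataFrame({
    "bulk_density": rng.normal(2.45, 0.1, n),
    "gamma_ray": rng.normal(60, 15, n),
    "porosity": rng.normal(0.15, 0.05, n),
})
# Synthetic caliper loosely tied to density, mimicking a moderate correlation.
logs["caliper_max"] = 8.5 + 2.0 * (2.5 - logs["bulk_density"]) + rng.normal(0, 0.2, n)

corr = logs.corr(method="pearson")["caliper_max"].drop("caliper_max")
print(corr.round(2))
```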
Procedia PDF Downloads 121
5013 The Relationships between Market Orientation and Competitiveness of Companies in Banking Sector
Authors: Patrik Jangl, Milan Mikuláštík
Abstract:
The objective of the paper is to measure and compare the market orientation of Swiss and Czech banks, as well as to examine statistically the degree of influence it has on the competitiveness of the institutions. The analysis of market orientation is based on the collection, analysis, and correct interpretation of the data. Descriptive analysis of market orientation describes the current situation. Research into the relationship between competitiveness and market orientation in the sector of big international banks is proposed, with the expectation of a strong relationship. In part, the work served as a reconfirmation of the suitability of classic methodologies for measuring banks' market orientation. Two types of data were gathered: firstly, by measuring the subjectively perceived market orientation of a company, and secondly, by quantifying its competitiveness. All data were collected from a sample of small, mid-sized, and large banks. We used numerical secondary data from the international statistical financial Bureau van Dijk BANKSCOPE database. Statistical analysis led to the following results. Assuming classical market orientation measures to be scientifically justified, Czech banks are statistically less market-oriented than Swiss banks. Secondly, among small Swiss banks that are not broadly internationally active, only a weak relationship exists between market orientation measures and market-share-based competitiveness measures. Thirdly, among all Swiss banks, a strong relationship exists between market orientation measures and market-share-based competitiveness measures. The above results imply the existence of a strong relationship for this measure in the sector of big international banks. A strong statistical relationship has also been proven to exist between market orientation measures and the equity/total assets ratio in Switzerland.
Keywords: market orientation, competitiveness, marketing strategy, measurement of market orientation, relation between market orientation and competitiveness, banking sector
Procedia PDF Downloads 476
5012 Assessment of Air Pollutant Dispersion and Soil Contamination: The Critical Role of MATLAB Modeling in Evaluating Emissions from the Covanta Municipal Solid Waste Incineration Facility
Authors: Jadon Matthias, Cindy Dong, Ali Al Jibouri, Hsin Kuo
Abstract:
The environmental impact of emissions from the Covanta Waste-to-Energy facility in Burnaby, BC, was comprehensively evaluated, focusing on the dispersion of air pollutants and the subsequent assessment of heavy metal contamination in surrounding soils. A Gaussian Plume Model, implemented in MATLAB, was used to simulate the dispersion of key pollutants in order to understand their atmospheric behaviour and potential deposition patterns. The MATLAB code developed for this study enhanced the accuracy of pollutant concentration predictions and provided capabilities for visualizing pollutant dispersion in 3D plots. Furthermore, the code could predict the maximum concentration of pollutants at ground level, eliminating the need to use the Ranchoux model for predictions. Complementing the modelling approach, empirical soil sampling and analysis were conducted to evaluate heavy metal concentrations in the vicinity of the facility. This integrated methodology underscored the importance of computational modelling in air pollution assessment and highlighted the necessity of soil analysis to obtain a holistic understanding of environmental impacts. The findings emphasized the effectiveness of current emission controls while advocating for ongoing monitoring to safeguard public health and environmental integrity.
Keywords: air emissions, Gaussian Plume Model, MATLAB, soil contamination, air pollution monitoring, waste-to-energy, pollutant dispersion visualization, heavy metal analysis, environmental impact assessment, emission control effectiveness
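A hedged Python re-sketch of a Gaussian Plume calculation of the kind described (the study itself used MATLAB). The dispersion coefficients use simple power-law fits assumed here for a neutral stability class, and the emission rate, wind speed, and stack height are placeholder values.

```python
# Ground-level Gaussian plume concentration along the centreline of a hypothetical stack.
import numpy as np

def gaussian_plume(Q, u, H, x, y, z):
    """Gaussian plume concentration (g/m^3) at (x, y, z).
    Q: emission rate (g/s), u: wind speed (m/s), H: effective stack height (m)."""
    sigma_y = 0.08 * x * (1 + 0.0001 * x) ** -0.5       # assumed neutral-class fit
    sigma_z = 0.06 * x * (1 + 0.0015 * x) ** -0.5
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2)) +
                np.exp(-(z + H)**2 / (2 * sigma_z**2)))  # ground-reflection term
    return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

x = np.linspace(100, 5000, 200)
c = gaussian_plume(Q=50.0, u=4.0, H=70.0, x=x, y=0.0, z=0.0)
print(f"max ground-level concentration ~ {c.max():.2e} g/m^3 at x ~ {x[np.argmax(c)]:.0f} m")
```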
Procedia PDF Downloads 17
5011 On-Line Data-Driven Multivariate Statistical Prediction Approach to Production Monitoring
Authors: Hyun-Woo Cho
Abstract:
Detection of incipient abnormal events in production processes is important for improving the safety and reliability of manufacturing operations and reducing losses caused by failures. The construction of calibration models for predicting faulty conditions is essential in making decisions on when to perform preventive maintenance. This paper presents a multivariate calibration monitoring approach based on the statistical analysis of process measurement data. The calibration model is used to predict faulty conditions from historical reference data. This approach utilizes variable selection techniques, and the predictive performance of several prediction methods is evaluated using real data. The results show that the calibration model based on a supervised probabilistic model yielded the best performance in this work. By adopting a proper variable selection scheme in calibration models, the prediction performance can be improved by excluding non-informative variables from the model-building steps.
Keywords: calibration model, monitoring, quality improvement, feature selection
Procedia PDF Downloads 356
5010 Metrology-Inspired Methods to Assess the Biases of Artificial Intelligence Systems
Authors: Belkacem Laimouche
Abstract:
With the field of artificial intelligence (AI) experiencing exponential growth, fueled by technological advancements that pave the way for increasingly innovative and promising applications, there is an escalating need to develop rigorous methods for assessing their performance in pursuit of transparency and equity. This article proposes a metrology-inspired statistical framework for evaluating bias and explainability in AI systems. Drawing from the principles of metrology, we propose a pioneering approach, using a concrete example, to evaluate the accuracy and precision of AI models, as well as to quantify the sources of measurement uncertainty that can lead to bias in their predictions. Furthermore, we explore a statistical approach for evaluating the explainability of AI systems based on their ability to provide interpretable and transparent explanations of their predictions.
Keywords: artificial intelligence, metrology, measurement uncertainty, prediction error, bias, machine learning algorithms, probabilistic models, interlaboratory comparison, data analysis, data reliability, measurement of bias impact on predictions, improvement of model accuracy and reliability
Procedia PDF Downloads 105
5009 Modelling and Simulating CO2 Electro-Reduction to Formic Acid Using Microfluidic Electrolytic Cells: The Influence of Bi-Sn Catalyst and 1-Ethyl-3-Methyl Imidazolium Tetra-Fluoroborate Electrolyte on Cell Performance
Authors: Akan C. Offong, E. J. Anthony, Vasilije Manovic
Abstract:
A modified steady-state numerical model is developed for the electrochemical reduction of CO2 to formic acid. The numerical model achieves a current density (CD) of ~60 mA/cm², a faradaic efficiency (FE) of ~98%, and a conversion of ~80% for CO2 electro-reduction to formic acid in a microfluidic cell. The model integrates charge and species transport, mass conservation, and momentum with electrochemistry. Specifically, the influences of a Bi-Sn based nanoparticle catalyst (on the cathode surface) at different mole fractions and of the 1-ethyl-3-methyl imidazolium tetra-fluoroborate ([EMIM][BF4]) electrolyte on CD, FE, and CO2 conversion to formic acid are studied. The reaction is carried out at a constant concentration of electrolyte (85% v/v [EMIM][BF4]). Based on the mass transfer characteristics analysis (concentration contours), the 0.5:0.5 Bi-Sn catalyst mole ratio displays the highest CO2 mole consumption in the cathode gas channel. After validation against experimental data (polarisation curves) from the literature, extensive simulations reveal the performance measures: CD, FE, and CO2 conversion. Increasing the negative cathode potential increases the current densities for both formic acid and H2 formation. However, H2 formation is minimal as a result of insufficient hydrogen ions in the ionic liquid electrolyte. Moreover, the limited hydrogen ions have a negative effect on the formic acid CD. As the CO2 flow rate increases, CD, FE, and CO2 conversion increase.
Keywords: carbon dioxide, electro-chemical reduction, ionic liquids, microfluidics, modelling
Procedia PDF Downloads 146
5008 Geostatistical and Geochemical Study of the Aquifer System Waters Complex Terminal in the Valley of Oued Righ-Arid Area Algeria
Authors: Asma Bettahar, Imed Eddine Nezli, Sameh Habes
Abstract:
Groundwater resources in the Oued Righ valley, like those of the eastern basin of the Algerian Sahara, are contained in two major superposed aquifers: the Intercalary Continental (IC) and the Terminal Complex (TC). From a qualitative point of view, various studies have highlighted that the waters of this region show excessive mineralization, including the waters of the Terminal Complex (average EC equal to 5854.61 µS/cm). The present article is a statistical approach using two complementary multivariate methods, principal component analysis (ACP) and agglomerative hierarchical clustering (CAH), applied to the analytical data of the waters of the multilayered Terminal Complex aquifer of the Oued Righ valley. The approach is to establish a correlation between the chemical composition of the water and the lithological nature of the formations of the different aquifer levels, and to predict possible connections between groundwater layers. The results show that the mineralization of the water is of geological origin and reflects the composition of the layers that make up the Terminal Complex.
Keywords: complex terminal, mineralization, oued righ, statistical approach
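An illustrative sketch of the ACP/CAH combination, assuming scikit-learn's PCA and agglomerative (Ward) clustering applied to standardized major-ion concentrations; the hydrochemical table below is synthetic and the EC proxy is only a placeholder.

```python
# PCA of major-ion concentrations followed by hierarchical clustering of samples.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import AgglomerativeClustering

rng = np.random.default_rng(0)
samples = pd.DataFrame({
    ion: rng.lognormal(mean=3.0, sigma=0.5, size=60)
    for ion in ["Ca", "Mg", "Na", "K", "Cl", "SO4", "HCO3"]
})
samples["EC"] = samples.sum(axis=1) * 4 + rng.normal(0, 50, 60)   # crude EC proxy

X = StandardScaler().fit_transform(samples)
pca = PCA(n_components=2).fit(X)
scores = pca.transform(X)
print("explained variance ratio:", pca.explained_variance_ratio_.round(2))

labels = AgglomerativeClustering(n_clusters=3, linkage="ward").fit_predict(scores)
print("samples per water group:", np.bincount(labels))
```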
Procedia PDF Downloads 387
5007 The Essence of Culture and Religion in Creating Disaster Resilient Societies through Corporate Social Responsibility
Authors: Repaul Kanji, Rajat Agrawal
Abstract:
In this era, where issues like climate change and disasters are topics of discussion at national and international forums, humanity very often questions the causative role of corporates in such events. It is beyond doubt that rapid industrialisation and development have taken a toll in the form of climate change and even disasters in some cases. Thus, the demand that corporates fulfil their responsibilities in the form of rescue and relief in times of disaster, rehabilitation, and even mitigation and preparedness to adapt to the oncoming changes is obvious. But how can the responsibilities of corporates be channelised to ensure all this, i.e., to develop a resilient society? More than that, which factors, when emphasised, can lead to the holistic development of society? To answer this query, an extensive literature review was carried out to identify several enablers, such as the legislation of a nation, the role of brand and reputation, ease of doing Corporate Social Responsibility, the mission and vision of an organisation, and religion and culture, as tools for building disaster resilience. A questionnaire survey, interviews with experts and academicians, followed by interpretive structural modelling (ISM), were used to construct a multi-hierarchy model depicting the contextual relationships among the identified enablers. The study revealed that culture and religion are the most powerful drivers, affecting the other enablers either directly or indirectly. Taking cognisance of the fact that an idea of separation between religion and the workplace (business) resides subconsciously within society, the study tries to interpret the outcome of the ISM through the lenses of past research (The Integrating Box) and explores how it can be leveraged to build a resilient society.
Keywords: corporate social responsibility, interpretive structural modelling, disaster resilience and risk reduction, the integration box (TIB)
Procedia PDF Downloads 209
5006 Optimization of Media for Enhanced Fermentative Production of Mycophenolic Acid by Penicillium brevicompactum
Authors: Shraddha Digole, Swarali Hingse, Uday Annapure
Abstract:
Mycophenolic acid (MPA) is an immunosuppressant produced by Penicillium sp. A Box-Behnken statistical experimental design was employed to optimize the conditions of Penicillium brevicompactum NRRL 2011 for mycophenolic acid (MPA) production. Initially, optimization of various physicochemical parameters and media components was carried out using a one-factor-at-a-time approach, and significant factors were screened by a Taguchi L-16 orthogonal array design. The Taguchi design indicated that glucose, KH2PO4, and MgSO4 had significant effects on MPA production. These variables were selected for further optimization studies using a Box-Behnken design. The optimised fermentation conditions, namely glucose (60 g/L), glycine (28 g/L), L-leucine (1.5 g/L), KH2PO4 (3 g/L), and MgSO4.7H2O (1.5 g/L), increased the production of MPA from 170 mg/L to 1032.54 mg/L. Analysis of variance (ANOVA) showed a high value of the coefficient of determination R² (0.9965), indicating good agreement between experimental and predicted values and proving the validity of the statistical model.
Keywords: Box-Behnken design, fermentation, mycophenolic acid, Penicillium brevicompactum
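A hedged sketch of the response-surface step: fitting a second-order polynomial to designed runs and inspecting R² and the ANOVA table with statsmodels. The design points (a simple 3-level factorial stands in for the actual Box-Behnken layout) and the MPA responses are synthetic; only the workflow mirrors the description above.

```python
# Quadratic response-surface fit with ANOVA on synthetic design-of-experiments data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
levels = [-1, 0, 1]                                     # coded factor levels
runs = pd.DataFrame([(g, k, m) for g in levels for k in levels for m in levels],
                    columns=["glucose", "kh2po4", "mgso4"])
runs["mpa"] = (900 + 80 * runs.glucose + 40 * runs.kh2po4 + 20 * runs.mgso4
               - 60 * runs.glucose**2 - 30 * runs.kh2po4**2
               + rng.normal(0, 15, len(runs)))          # synthetic response (mg/L)

model = smf.ols("mpa ~ glucose + kh2po4 + mgso4 + I(glucose**2) + I(kh2po4**2) "
                "+ I(mgso4**2) + glucose:kh2po4 + glucose:mgso4 + kh2po4:mgso4",
                data=runs).fit()
print("R2 =", round(model.rsquared, 4))
print(sm.stats.anova_lm(model, typ=2).round(3))
```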
Procedia PDF Downloads 452
5005 Estimating Interdependence of Social Statuses in a Cooperative Breeding Birds through Mathematical Modelling
Authors: Sinchan Ghosh, Fahad Al Basir, Santanu Ray, Sabyasachi Bhattacharya
Abstract:
Cooperatively breeding birds have two major ranks among sexually mature individuals. The breeders mate and produce offspring, while the non-breeding helpers increase the chick production rate through help in mate-finding and allo-parenting. However, the chicks also cooperate to raise their younger siblings through warming, defending, and food sharing. Although the existing literature describes the evolution of allo-parenting in birds, it does not differentiate the significance of allo-parenting by sexually immature and mature helpers separately. This study addresses the contribution of both immature and mature helpers to the total sustainable bird population in a breeding site, using the Blue-tailed Bee-eater as a test-bed species. To serve this purpose, a mathematical model has been built considering each social status and the chicks as separate but interacting compartments. Also, to observe the dynamics of each social status with changing prey abundance, a prey population has been introduced as an additional compartment. The model was analyzed for stability conditions and was validated using field data. A simulation experiment was then performed to observe the change in equilibria with varying helping rates from both types of helpers. The results from the simulation experiment suggest that the cooperatively breeding population changes its population sizes significantly with a change in the helping rate of the sexually immature helpers. On the other hand, the mature helpers do not contribute to the stability of the population equilibrium as much as the immature helpers.
Keywords: Blue-tailed Bee-eater, altruism, mathematical ethology, behavioural modelling
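A hedged toy version of a compartment model in the spirit described above, with breeders, helpers, chicks, and a prey pool, and helper effort boosting chick production. The functional forms, parameter values, and initial conditions are assumptions for illustration, not the authors' fitted model.

```python
# Toy compartment model: breeders (B), helpers (H), chicks (C), prey (P).
import numpy as np
from scipy.integrate import solve_ivp

def bird_system(t, y, help_rate=0.4):
    B, H, C, P = y
    prey_intake = 0.02 * P / (1.0 + 0.01 * P)            # saturating prey limitation
    births = 0.8 * B * prey_intake * (1.0 + help_rate * H / (B + 1e-9))
    dB = 0.1 * C - 0.05 * B                               # chicks maturing into breeders
    dH = 0.05 * C - 0.08 * H                              # chicks becoming helpers
    dC = births - 0.15 * C - 0.1 * C                      # maturation + mortality
    dP = 0.5 * P * (1 - P / 500.0) - 0.3 * (B + H + C) * prey_intake
    return [dB, dH, dC, dP]

sol = solve_ivp(bird_system, (0, 200), [20, 10, 30, 300])
B, H, C, P = sol.y[:, -1]
print(f"late-time state: breeders={B:.1f}, helpers={H:.1f}, chicks={C:.1f}, prey={P:.1f}")
```

Re-running the solver over a grid of help_rate values would reproduce, in miniature, the kind of sensitivity-to-helping-rate experiment the abstract describes.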
Procedia PDF Downloads 162
5004 Degumming of Eri Silk Fabric with Ionic Liquid
Authors: Shweta K. Vyas, Rakesh Musale, Sanjeev R. Shukla
Abstract:
Eri silk is a non-mulberry silk that is obtained without killing the silkworms, and hence it is also known as Ahimsa silk. In the present study, the results of degumming eri silk with alkaline peroxide have been compared with those obtained using the ionic liquid (IL) 1-butyl-3-methylimidazolium chloride, [BMIM]Cl. Experiments were designed to find the optimum processing parameters for degumming of eri silk by response surface methodology. The statistical software Design-Expert 6.0 was used for regression analysis and graphical analysis of the responses obtained by running the set of designed experiments. Analysis of variance (ANOVA) was used to estimate the statistical parameters. A polynomial equation of quadratic order was employed to fit the experimental data. The model quality and model terms were evaluated by the F-test. Three-dimensional surface plots were prepared to study the effect of the variables on the different responses. The optimum conditions for IL treatment were selected from the predicted combinations, and the experiments were repeated under these conditions to determine reproducibility.
Keywords: silk degumming, ionic liquid, response surface methodology, ANOVA
Procedia PDF Downloads 593
5003 The Quality of Business Relationships in the Tourism System: An Imaginary Organisation Approach
Authors: Armando Luis Vieira, Carlos Costa, Arthur Araújo
Abstract:
The tourism system can be viewed as a network of relationships amongst business partners in which the success of each actor will ultimately be determined by the success of the whole network. Especially since the publication of Gummesson's (1996) 'theory of imaginary organisations', which suggests that organisational effectiveness largely depends on managing relationships and sharing resources and activities, relationship quality (RQ) has been increasingly recognised as a main source of value creation and competitive advantage. However, there is still ambiguity around this topic, and managers and researchers have recurrently reported the need to better understand and capitalise on the quality of interactions with business partners. This research aims at testing an RQ model from a relational, imaginary organisation approach. Two mail surveys provide the perceptions of 725 hotel representatives about their business relationships with tour operators, and of 1,224 corporate client representatives about their business relationships with hotels (21.9% and 38.8% response rates, respectively). The analysis contributes to enhancing our understanding of the linkages between RQ and its determinants, and identifies the role of their dimensions. Structural equation modelling results highlight trust as the dominant dimension, the crucial role of commitment and satisfaction, and suggest customer orientation as a complementary building block. Findings also emphasise problem-solving behaviour and selling orientation as the most relevant dimensions of customer orientation. The comparison of the two 'dyads' deepens the discussion and enriches the suggested theoretical and managerial guidelines concerning the contribution of quality relationships to business performance.
Keywords: corporate clients, destination competitiveness, hotels, relationship quality, structural equations modelling, tour operators
Procedia PDF Downloads 393
5002 The Relationship between Land Use Factors and Feeling of Happiness at the Neighbourhood Level
Authors: M. Moeinaddini, Z. Asadi-Shekari, Z. Sultan, M. Zaly Shah
Abstract:
Happiness can be related to everything that provides a feeling of satisfaction or pleasure. This study considers the relationship between land use factors and the feeling of happiness at the neighbourhood level. Land use variables (beautiful and attractive neighbourhood design, availability and quality of shopping centres, sufficient recreational spaces and facilities, and sufficient daily service centres) are used as independent variables, and the happiness score is used as the dependent variable in this study. In addition to the land use variables, socio-economic factors (gender, race, marital status, employment status, education, and income) are also considered as independent variables. This study uses the Oxford happiness questionnaire to estimate the happiness scores of more than 300 people living in six neighbourhoods. The neighbourhoods were selected randomly from Skudai neighbourhoods in Johor, Malaysia. The land use data were obtained by adding related questions to the Oxford happiness questionnaire. The strength of the relationships in this study is found using generalised linear modelling (GLM). The findings of this research indicate that an increase in the feeling of happiness is correlated with increasing income, a more beautiful and attractive neighbourhood design, sufficient shopping centres, recreational spaces, and daily service centres. The results show that all land use factors in this study have a significant relationship with happiness, but only income, among the socio-economic factors, affects happiness significantly. Therefore, land use factors can affect happiness in Skudai more than socio-economic factors.
Keywords: neighbourhood land use, neighbourhood design, happiness, socio-economic factors, generalised linear modelling
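A minimal sketch of the modelling step, assuming a Gaussian-family GLM of the happiness score on land-use and socio-economic predictors via statsmodels; the data frame, coefficients, and variable names below are synthetic paraphrases of the abstract, not the survey data.

```python
# Gaussian-family GLM of a synthetic happiness score on land-use and income predictors.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "income": rng.normal(3000, 800, n),
    "attractive_design": rng.integers(1, 6, n),          # 1-5 Likert-type ratings
    "shopping_quality": rng.integers(1, 6, n),
    "recreational_space": rng.integers(1, 6, n),
    "daily_services": rng.integers(1, 6, n),
})
df["happiness"] = (3.0 + 0.0003 * df.income + 0.15 * df.attractive_design
                   + 0.1 * df.shopping_quality + rng.normal(0, 0.5, n))

glm = smf.glm("happiness ~ income + attractive_design + shopping_quality "
              "+ recreational_space + daily_services",
              data=df, family=sm.families.Gaussian()).fit()
print(glm.params.round(4))
print(glm.pvalues.round(3))
```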
Procedia PDF Downloads 149
5001 Analyzing the Support to Fisheries in the European Union: Modelling Budgetary Transfers in Wild Fisheries
Authors: Laura Angulo, Petra Salamon, Martin Banse, Frederic Storkamp
Abstract:
Fisheries subsidies focus on reducing management costs or delivering income benefits to fishers. In 2015, total fishery budgetary transfers in 31 OECD countries represented 35% of their total landing value. However, subsidies to fishing have adverse effects on trade, and it has been claimed that they may contribute directly to overfishing. Therefore, this paper analyses to what extent fisheries subsidies may 1) influence capture production facing quotas and 2) affect price dynamics. The study uses the fish module in AGMEMOD (Agriculture Member States Modelling; for details see Chantreuil et al. (2012)), which covers eight fish categories (cephalopods; crustaceans; demersal marine fish; pelagic marine fish; molluscs excl. cephalopods; other marine finfish species; freshwater and diadromous fish) for EU member states and other selected countries, developed under the SUCCESS project. This model incorporates transfer payments directly linked to fisheries operational costs. As aquaculture and wild fisheries are not included within the WTO Agreement on Agriculture, data on fisheries subsidies are obtained from the OECD Fisheries Support Estimates (FSE) database, which provides statistics on budgetary transfers to the fisheries sector. Since support has been moving from budgetary transfers to the General Services Support Estimate in recent years, subsidies may not have substantial effects on capture production. Nevertheless, they would still show the impact across countries and fish categories within the European Union.
Keywords: AGMEMOD, budgetary transfers, EU Member States, fish model, fisheries support estimate
Procedia PDF Downloads 248
5000 Modelling the Antecedents of Supply Chain Enablers in Online Groceries Using Interpretive Structural Modelling and MICMAC Analysis
Authors: Rose Antony, Vivekanand B. Khanapuri, Karuna Jain
Abstract:
Online groceries have transformed the way supply chains are managed. These chains face numerous challenges in terms of product wastage, low margins, a long time to break even, and low market penetration, to mention a few. E-grocery chains need to overcome these challenges in order to survive the competition. The purpose of this paper is to carry out a structural analysis of the enablers in e-grocery chains by applying Interpretive Structural Modelling (ISM) and MICMAC analysis in the Indian context. The research design is descriptive-explanatory in nature. The enablers have been identified from the literature and through semi-structured interviews conducted among managers with relevant experience in e-grocery supply chains. The experts were contacted through professional/social networks using a purposive snowball sampling technique. The interviews were transcribed, and manual coding was carried out using open and axial coding methods. The key enablers are categorized into themes, and the contextual relationships between these and the performance measures are sought from industry veterans. Using ISM, a hierarchical model of the enablers is developed, and MICMAC analysis identifies their driving and dependence powers. Based on the driving-dependence powers, the enablers are categorized into four clusters, namely independent, autonomous, dependent, and linkage. The analysis found that information technology (IT) and manpower training act as key enablers towards reducing lead time and enhancing online service quality. Many of the enablers fall under the linkage cluster, viz., frequent software updating, branding, the number of delivery boys, order processing, benchmarking, product freshness, and customized applications for different stakeholders, depicting these as critical in online food/grocery supply chains. Considering the perishable nature of the product being handled, the impact of the enablers on product quality is also identified. Hence, the study serves as a tool to identify and prioritize the vital enablers in the e-grocery supply chain. The work is perhaps unique in that it identifies the complex relationships among the supply chain enablers for fresh food in e-groceries and links them to the performance measures. It contributes to the knowledge of supply chain management in general and e-retailing in particular. The approach focuses on fresh food supply chains in the Indian context and hence will be applicable in the context of developing economies, where supply chains are evolving.
Keywords: interpretive structural modelling (ISM), India, online grocery, retail operations, supply chain management
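A hedged sketch of the ISM/MICMAC mechanics: building the final reachability matrix from a binary contextual-relation matrix by transitive closure, then computing driving and dependence powers and assigning the four clusters. The enabler list, initial relations, and cluster threshold are illustrative assumptions, not the study's elicited matrix.

```python
# ISM reachability matrix via transitive closure, then MICMAC driver/dependence powers.
import numpy as np

enablers = ["IT", "manpower_training", "branding", "order_processing", "product_freshness"]
# ssim[i, j] = 1 means enabler i influences enabler j (assumed relations, incl. self-relations).
ssim = np.array([
    [1, 1, 1, 1, 1],
    [0, 1, 0, 1, 1],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 1, 1],
    [0, 0, 1, 0, 1],
])

# Transitive closure (Warshall) yields the final reachability matrix.
reach = ssim.astype(bool)
for k in range(len(enablers)):
    reach = reach | (reach[:, [k]] & reach[[k], :])

driving = reach.sum(axis=1)      # row sums: driving power
dependence = reach.sum(axis=0)   # column sums: dependence power
mid = len(enablers) / 2          # simple midpoint threshold for the four quadrants
for name, d, p in zip(enablers, driving, dependence):
    cluster = ("linkage" if d > mid and p > mid else
               "independent" if d > mid else
               "dependent" if p > mid else "autonomous")
    print(f"{name:18s} driving={d} dependence={p} -> {cluster}")
```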
Procedia PDF Downloads 204
4999 Genetic Variation of Autosomal STR Loci from Unrelated Individual in Iraq
Authors: H. Imad, Q. Cheah, J. Mohammad, O. Aamera
Abstract:
The aim of this study is twofold: the first objective is to determine the genetic structure of the Iraqi population, and the second is to evaluate the importance of these loci for forensic genetic purposes. FTA® technology (FTA™ paper DNA extraction) was utilized to extract DNA. Twenty STR loci (D3S1358, D13S317, Penta E, D16S539, D18S51, D2S1338, CSF1PO, Penta D, THO1, vWA, D21S11, D7S820, TPOX, D8S1179, FGA, D2S1338, D5S818, D6S1043, D12S391, D19S433) and Amelogenin were amplified using the PowerPlex® 21 kit. PCR products were detected on a 3730xl Genetic Analyzer, and the data were then analyzed with PowerStats v1.2. Based on the allelic frequencies, several statistical parameters of genetic and forensic efficiency have been estimated. These include homozygosity and heterozygosity, the effective number of alleles (n), the polymorphism information content (PIC), the power of discrimination (DP), and the power of exclusion (PE). The power of discrimination values for all tested loci ranged from 75% to 96%; therefore, these loci can be safely used to establish a DNA-based database for the Iraqi population.
Keywords: autosomal STR, genetic variation, Middle and South of Iraq, statistical parameters
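An illustrative computation of the forensic parameters named above from the allele frequencies of a single locus. The frequency values are hypothetical, the PIC follows the Botstein et al. (1980) formula, the power of discrimination is 1 minus the matching probability under Hardy-Weinberg genotype frequencies, and the power of exclusion uses a commonly cited formula assumed here.

```python
# Forensic efficiency parameters from (hypothetical) allele frequencies of one STR locus.
import numpy as np
from itertools import combinations

p = np.array([0.05, 0.10, 0.20, 0.30, 0.25, 0.10])    # hypothetical allele frequencies
assert np.isclose(p.sum(), 1.0)

homozygosity = np.sum(p ** 2)
heterozygosity = 1.0 - homozygosity
n_eff = 1.0 / homozygosity                             # effective number of alleles

# Polymorphism information content (Botstein et al. 1980).
pic = 1.0 - np.sum(p ** 2) - sum(2 * (pi ** 2) * (pj ** 2) for pi, pj in combinations(p, 2))

# Power of discrimination: 1 minus the matching probability under HWE genotypes.
genotype_freqs = [pi ** 2 for pi in p] + [2 * pi * pj for pi, pj in combinations(p, 2)]
power_disc = 1.0 - np.sum(np.square(genotype_freqs))

# Power of exclusion (assumed formula: het^2 * (1 - 2 * het * hom^2)).
power_excl = heterozygosity ** 2 * (1 - 2 * heterozygosity * homozygosity ** 2)

print(f"Het={heterozygosity:.3f}  n_eff={n_eff:.2f}  PIC={pic:.3f}  "
      f"PD={power_disc:.3f}  PE={power_excl:.3f}")
```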
Procedia PDF Downloads 385
4998 A Review on Water Models of Surface Water Environment
Authors: Shahbaz G. Hassan
Abstract:
Water quality models are very important for predicting changes in surface water quality for environmental management. The aim of this paper is to give an overview of water quality models and to provide directions for selecting models in specific situations. Water quality models include models based on a mechanistic approach, while other models simulate water quality without considering a mechanism. Mechanistic models can be widely applied and have capabilities for long-term simulation, but with high complexity. Therefore, more space is devoted to explaining the principles and application experience of mechanistic models. Mechanistic models make certain assumptions about rivers, lakes, and estuaries, which limits their application range; this paper introduces the principles and applications of water quality models based on these three scenarios. On the other hand, empirical models are easier to compute and are not limited by geographical conditions, but they cannot be used with confidence to simulate long-term changes. This paper divides the empirical models into two broad categories according to the difference in mathematical algorithm: models based on artificial intelligence and models based on statistical methods.
Keywords: empirical models, mathematical, statistical, water quality
Procedia PDF Downloads 265
4997 A Comparative Time-Series Analysis and Deep Learning Projection of Innate Radon Gas Risk in Canadian and Swedish Residential Buildings
Authors: Selim M. Khan, Dustin D. Pearson, Tryggve Rönnqvist, Markus E. Nielsen, Joshua M. Taron, Aaron A. Goodarzi
Abstract:
Accumulation of radioactive radon gas in indoor air poses a serious risk to human health by increasing the lifetime risk of lung cancer, and radon is classified by IARC as a category one carcinogen. Radon exposure risks are a function of geologic, geographic, design, and human behavioural variables and can change over time. Using time series and deep machine learning modelling, we analyzed long-term radon test outcomes as a function of building metrics from 25,489 Canadian and 38,596 Swedish residential properties constructed between 1945 and 2020. While Canadian and Swedish properties built between 1970 and 1980 are comparable (96–103 Bq/m³), innate radon risks subsequently diverge, rising in Canada and falling in Sweden, such that 21st-century Canadian houses show 467% greater average radon (131 Bq/m³) relative to Swedish equivalents (28 Bq/m³). These trends are consistent across housing types and regions within each country. The introduction of energy efficiency measures within Canadian and Swedish building codes coincided with opposing radon level trajectories in each nation. Deep machine learning modelling predicts that, without intervention, average Canadian residential radon levels will increase to 176 Bq/m³ by 2050, emphasizing the importance and urgency of future building code intervention to achieve systemic radon reduction in Canada.
Keywords: radon health risk, time-series, deep machine learning, lung cancer, Canada, Sweden
Procedia PDF Downloads 85
4996 Transport Related Air Pollution Modeling Using Artificial Neural Network
Authors: K. D. Sharma, M. Parida, S. S. Jain, Anju Saini, V. K. Katiyar
Abstract:
Air quality models form one of the most important components of an urban air quality management plan. Various statistical modeling techniques (regression, multiple regression, and time series analysis) have been used to predict air pollution concentrations in the urban environment. These models calculate pollution concentrations from observed traffic, meteorological, and pollution data after an appropriate relationship has been obtained empirically between these parameters. Artificial neural networks (ANN) are increasingly used as an alternative tool for modeling pollutants from vehicular traffic, particularly in urban areas. In the present paper, an attempt has been made to model traffic air pollution, specifically CO concentration, using neural networks. For CO concentration, two scenarios were considered: first, with only classified traffic volume as input, and second, with both classified traffic volume and meteorological variables. The results showed that CO concentration can be predicted with good accuracy using an artificial neural network (ANN).
Keywords: air quality management, artificial neural network, meteorological variables, statistical modeling
Procedia PDF Downloads 524
4995 The Role of Parents on Fear Acquisition of Children in COVID-19 Pandemic
Authors: Begum Serim-Yildiz
Abstract:
The aim of this study is to examine the role of parents' emotional and behavioral reactions in children's fears during the COVID-19 pandemic, considering Rachman's three pathways theory. For this purpose, a phenomenological qualitative study was conducted. Thirteen participants living with their children were recruited through criterion and snowball sampling. In semi-structured interviews, parents were asked about their own and their children's behavioral and emotional reactions during the COVID-19 pandemic, and they were expected to give detailed information about their children's fears before and during the pandemic. First, parents were asked about their own behavioral and emotional reactions. As behavioral reactions, precautions taken by parents to protect the rest of the family from the negative physical and emotional impact of the pandemic were mentioned, while emotional reactions were described as the acquisition of negative emotions like fear, anxiety, and worry. Second, parents were asked about their children's behavioral and emotional reactions. Some parents talked about positive behavioral changes, such as gaining self-control, while others described negative behavioral changes, such as increased time spent with technological tools. Regarding emotional changes, all of the parents reported at least one negative emotion. All of the parents stated that their children had COVID-19-related fears. According to the parents' accounts, children's fears in the pandemic were examined in two dimensions. Fears directly related to COVID-19 were fear of viruses/microbes, illness or death of someone in the family, and death. Fears indirectly related to COVID-19 were fear of going out, sleeping alone at night, separation, touching objects outside the home, and cold. Considering the existing literature and based on the findings of this study, it can be concluded that children's modelling experiences have an impact on the acquisition of negative emotions, especially fear; therefore, preventive interventions involving caregivers should be provided by mental health professionals working with children.
Keywords: children's fears, COVID-19 pandemic, modelling experiences, parents' reactions
Procedia PDF Downloads 166
4994 Evaluation of Diagnosis Performance Based on Pairwise Model Construction and Filtered Data
Authors: Hyun-Woo Cho
Abstract:
It is quite important to utilize timely and intelligent production monitoring and diagnosis of industrial processes in terms of quality and safety issues. Compared with the monitoring task, fault diagnosis is the task of finding the process variables responsible for causing a specific fault in the process. It can help process operators investigate and eliminate root causes more effectively and efficiently. This work focused on the active use of a nonlinear statistical technique combined with a preprocessing method in order to implement practical real-time fault identification schemes for data-rich cases. To compare its performance with existing identification schemes, a case study on a benchmark process was performed under several scenarios. The results showed that the proposed fault identification scheme produced more reliable diagnosis results than linear methods. In addition, the use of the filtering step improved the identification results for complicated processes with massive data sets.
Keywords: diagnosis, filtering, nonlinear statistical techniques, process monitoring
Procedia PDF Downloads 244
4993 A Ground Observation Based Climatology of Winter Fog: Study over the Indo-Gangetic Plains, India
Authors: Sanjay Kumar Srivastava, Anu Rani Sharma, Kamna Sachdeva
Abstract:
Every year, fog formation over the Indo-Gangetic Plains (IGP) of the Indian region during the winter months of December and January is believed to create numerous hazards, inconvenience, and economic loss for the inhabitants of this densely populated region of the Indian subcontinent. The aim of the paper is to analyze the spatial and temporal variability of winter fog over the IGP. Long-term ground observations of visibility and other meteorological parameters (1971-2010) have been analyzed to understand the formation of the fog phenomenon and its relevance during the peak winter months of January and December over the IGP of India. In order to examine the temporal variability, time series and trend analyses were carried out using the Mann-Kendall statistical test. Trend analysis performed using the Mann-Kendall test accepts the alternative hypothesis at the 95% confidence level, indicating that a trend exists. Kendall's tau statistic showed a positive correlation between time and fog frequency. Further, the Theil-Sen median slope estimate showed that the magnitude of the trend is positive. The magnitude is higher in January compared to December for the entire IGP, except over the western IGP, where it is high in December. Decade-wise time series analysis revealed that there has been a continuous increase in fog days. A net overall increase of 99% was observed over the IGP in the last four decades. Diurnal variability and average daily persistence were computed using descriptive statistical techniques. Geo-statistical analysis was carried out to understand the spatial variability of fog; it revealed that the IGP is a highly fog-prone zone, with fog occurring on more than 66% of days during the study period. Diurnal variability indicates that the peak occurrence of fog is between 06:00 and 10:00 local time, and average daily fog persistence extends to 5 to 7 hours during the peak winter season. The results offer a new perspective for taking proactive measures to reduce the irreparable damage that could be caused by changing trends of fog.
Keywords: fog, climatology, Mann-Kendall test, trend analysis, spatial variability, temporal variability, visibility
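A minimal sketch of the trend tests described, assuming SciPy's kendalltau against time as a Mann-Kendall-style trend test (the classic MK variance correction for ties is not applied) and theilslopes for the median slope; the annual fog-day series is synthetic and only illustrates the workflow.

```python
# Kendall tau trend test and Theil-Sen slope on a synthetic annual fog-day series.
import numpy as np
from scipy.stats import kendalltau, theilslopes

years = np.arange(1971, 2011)
rng = np.random.default_rng(0)
fog_days = 20 + 0.6 * (years - 1971) + rng.normal(0, 4, years.size)   # upward trend + noise

tau, p_value = kendalltau(years, fog_days)
slope, intercept, lo, hi = theilslopes(fog_days, years)

print(f"Kendall tau={tau:.2f}, p={p_value:.4f}")        # p < 0.05 -> reject 'no trend'
print(f"Theil-Sen slope={slope:.2f} fog days/year (95% CI {lo:.2f} to {hi:.2f})")
```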
Procedia PDF Downloads 242