Search results for: discrete choice models
7775 Dynamic Web-Based 2D Medical Image Visualization and Processing Software
Authors: Abdelhalim. N. Mohammed, Mohammed. Y. Esmail
Abstract:
Over recent decades, medical imaging was dominated by costly film media for the review and archival of medical investigations; however, developments in network technologies and the broad acceptance of the Digital Imaging and Communications in Medicine (DICOM) standard enabled an alternative approach based on the World Wide Web. Web technologies have been used successfully in telemedicine applications, and the combination of web technologies with DICOM was used to design a web-based, open-source DICOM viewer. The web server allows query and retrieval of images, and the images are viewed and manipulated inside a web browser without any pre-installed software. The dynamic page for medical image visualization and processing was created using JavaScript and HTML5. The XAMPP 'Apache' server was used to create a local web server for testing and deployment of the dynamic site. The web-based viewer was connected to multiple devices through a local area network (LAN) to distribute images inside healthcare facilities. The system offers several advantages over conventional picture archiving and communication systems (PACS): it is easy to install and maintain, platform independent, allows images to be displayed and manipulated efficiently, and is user-friendly and easy to integrate with existing systems that already make use of web technologies. A wavelet-based image compression technique was applied, in which the 2-D discrete wavelet transform decomposes the image, and the wavelet coefficients are thresholded and entropy encoded before transmission to reduce transmission time and storage cost. Compression performance was estimated using image quality metrics such as the mean square error (MSE), peak signal-to-noise ratio (PSNR), and compression ratio (CR), which reached 83.86% when the 'coif3' wavelet filter was used.
Keywords: DICOM, discrete wavelet transform, PACS, HIS, LAN
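A minimal sketch of the compression-and-metrics step described above, using PyWavelets; the decomposition level, threshold value, and synthetic test image are illustrative assumptions rather than details taken from the paper.

```python
# Hedged sketch: 2-D DWT with a 'coif3' filter, hard thresholding of detail
# coefficients, reconstruction, and MSE/PSNR metrics. Not the authors' code.
import numpy as np
import pywt

def compress_and_evaluate(image, wavelet="coif3", level=2, threshold=10.0):
    # Decompose the image into approximation and detail sub-bands
    coeffs = pywt.wavedec2(image.astype(float), wavelet, level=level)
    # Hard-threshold the detail sub-bands (approximation kept intact)
    thresholded = [coeffs[0]] + [
        tuple(pywt.threshold(band, threshold, mode="hard") for band in detail)
        for detail in coeffs[1:]
    ]
    # Reconstruct and compute quality metrics
    recon = pywt.waverec2(thresholded, wavelet)[: image.shape[0], : image.shape[1]]
    mse = np.mean((image.astype(float) - recon) ** 2)
    psnr = 10 * np.log10(255.0 ** 2 / mse) if mse > 0 else np.inf
    # Crude proxy for compression: fraction of detail coefficients zeroed out
    total = sum(b.size for d in thresholded[1:] for b in d)
    zeros = sum(np.count_nonzero(b == 0) for d in thresholded[1:] for b in d)
    return recon, mse, psnr, zeros / total

if __name__ == "__main__":
    demo = np.random.randint(0, 256, (128, 128))  # stand-in for a DICOM frame
    _, mse, psnr, zero_fraction = compress_and_evaluate(demo)
    print(f"MSE={mse:.2f}, PSNR={psnr:.2f} dB, zeroed detail coeffs={zero_fraction:.2%}")
```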
Procedia PDF Downloads 160
7774 Role of Physical Properties of Maize Grains Towards Resistance to Sitotroga Cerealella (OLIV.) (Gelechiidae: Lepidoptera) in No Choice
Authors: Sohail Ahmed, Ahmad Raza
Abstract:
Physical properties of maize grains were correlated with life-history parameters of Sitotroga cerealella (Oliv.) (Gelechiidae: Lepidoptera) in a no-choice test to determine relative resistance among different varieties. Eight maize varieties/lines (EV-6089, Sahiwal-2002, Golden, 34N43, EV-1098, Sultan, China-1, EV-20), seven yellow and one white, were obtained from the Maize and Millet Research Institute, Yousaf Wala, Sahiwal, Punjab, Pakistan. Freshly laid (one-day-old) eggs of S. cerealella were obtained and cultured on a susceptible maize variety for two generations before being shifted to the test varieties. Results showed that maximum moth emergence (10.33), fecundity (35.66), hatching (87.66%), moth weight (5.05 mg), development time (36.0 days), damage (93.35%), and grain weight loss (38.84%) were found in the varieties 34N43 and Golden, Sultan, Sahiwal-2002, 34N43, EV-6089, 34N43, and EV-1089, respectively. These varieties differed significantly from the others in these parameters (P<0.05). Hardness index, grain weight, and bulk density showed both positive and negative correlations with the biological parameters of S. cerealella, percent grain damage, and weight loss. The possible involvement of these grain properties in the resistance of maize grains to S. cerealella is discussed.
Keywords: Sitotroga cerealella, hardness index, grain damage, maize, varieties
Procedia PDF Downloads 387
7773 Parental Diet Effects on Offspring Body Size and Pathogen Resistance in Bactrocera tryoni
Authors: Hue Dinh, Binh Nguyen, Vivian Mendez, Phillip W. Taylor, Fleur Ponton
Abstract:
A better understanding of how parental diet affects offspring traits is an important ecological and evolutionary question. In this study, we explored how maternal diet influences offspring physiology and resistance to infection using Bactrocera tryoni (Q-fly) as a model system. Female Q-flies were fed one of six single diets varying in their yeast-to-sugar ratio, yielding six protein-to-carbohydrate ratios. As controls, we used females that were given a choice between yeast and sugar. Males were reared on a choice diet and allowed to mate with females 14 days post-emergence. Results showed that while maternal diet does not influence offspring developmental time, it has a strong effect on larval body weight. Mothers fed either a high-protein or a high-sugar diet produced larger progeny. By challenging offspring with the bacterium Serratia marcescens, we found that female offspring from mothers fed a high-sugar diet survived the infection better than those from mothers fed a low-sugar diet. In contrast, male offspring produced by mothers fed a high-protein diet showed better resistance to the infection than those produced by mothers fed a low-protein diet. These results suggest sex-dependent transgenerational effects of maternal nutrition on offspring physiology and immunity.
Keywords: bacterial infection, Bactrocera tryoni, maternal diet, offspring, Serratia marcescens
Procedia PDF Downloads 143
7772 A Comparative Study on ANN, ANFIS and SVM Methods for Computing Resonant Frequency of A-Shaped Compact Microstrip Antennas
Authors: Ahmet Kayabasi, Ali Akdagli
Abstract:
In this study, three robust prediction methods, namely the artificial neural network (ANN), adaptive neuro-fuzzy inference system (ANFIS), and support vector machine (SVM), were used for computing the resonant frequency of A-shaped compact microstrip antennas (ACMAs) operating in the UHF band. Firstly, the resonant frequencies of 144 ACMAs with various dimensions and electrical parameters were simulated with the help of IE3D™, which is based on the method of moments (MoM). The ANN, ANFIS, and SVM models for computing the resonant frequency were then built from the simulation data. 124 simulated ACMAs were utilized for training, and the remaining 20 ACMAs were used for testing the ANN, ANFIS, and SVM models. The performance of the ANN, ANFIS, and SVM models is compared for the training and test processes. The average percentage errors (APE) of the computed resonant frequencies in training were obtained as 0.457%, 0.399%, and 0.600% for the ANN, ANFIS, and SVM, respectively. The constructed models were then tested, and APE values of 0.601% for the ANN, 0.744% for the ANFIS, and 0.623% for the SVM were achieved. The results show that the ANN, ANFIS, and SVM methods can be successfully applied to compute the resonant frequency of ACMAs, since they are useful and versatile methods that yield accurate results.
Keywords: A-shaped compact microstrip antenna, artificial neural network (ANN), adaptive neuro-fuzzy inference system (ANFIS), support vector machine (SVM)
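A hedged sketch of the model-comparison workflow described above, using scikit-learn stand-ins (MLPRegressor for the ANN, SVR for the SVM; ANFIS has no standard scikit-learn implementation and is omitted). The synthetic data, hyperparameters, and the APE helper are illustrative assumptions, not the paper's IE3D simulation set.

```python
# Sketch of training/testing two regressors with a 124/20 split and scoring
# them by average percentage error (APE), as in the study's evaluation.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(size=(144, 4))             # stand-in for antenna dimensions/parameters
y = 1.0 + 2.0 * X[:, 0] - 0.5 * X[:, 1]    # stand-in for simulated resonant frequency (GHz)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=20, random_state=0)

def ape(y_true, y_pred):
    """Average percentage error used to score the models."""
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

models = {
    "ANN": MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=0),
    "SVM": SVR(kernel="rbf", C=10.0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, "train APE:", round(ape(y_tr, model.predict(X_tr)), 3),
          "test APE:", round(ape(y_te, model.predict(X_te)), 3))
```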
Procedia PDF Downloads 441
7771 Tram Track Deterioration Modeling
Authors: Mohammad Yousefikia, Sara Moridpour, Ehsan Mazloumi
Abstract:
Understanding track geometry deterioration decisively influences the optimization of track maintenance operations. Effectively managing this deterioration in an increasingly utilized system with limited financial resources is a significant challenge. This paper provides a review of degradation models relevant to railroad tracks. Furthermore, given the lack of long-term information on the condition development of tram infrastructure, it presents the methodology that will be used to derive degradation models from data on the Melbourne tram network.
Keywords: deterioration modeling, asset management, railway, tram
Procedia PDF Downloads 379
7770 Modeling of Diurnal Pattern of Air Temperature in a Tropical Environment: Ile-Ife and Ibadan, Nigeria
Authors: Rufus Temidayo Akinnubi, M. O. Adeniyi
Abstract:
Existing diurnal air temperature models simulate night-time air temperature over Nigeria with high biases. An improved parameterization is presented for modeling the diurnal pattern of air temperature (Ta), applicable to the calculation of turbulent heat fluxes in global climate models, based on surface-layer observations from the Nigeria Micrometeorological Experimental site (NIMEX). Five diurnal Ta models for estimating hourly Ta from daily maximum, daily minimum, and daily mean air temperature were validated using the root-mean-square error (RMSE), mean bias error (MBE), and scatter graphs. The original Fourier series model showed better performance for unstable air temperature parameterizations, while the stable Ta was strongly overestimated with a large error. The model was improved by including the atmospheric cooling rate, which accounts for the temperature inversion that occurs under nocturnal boundary layer conditions. The MBE and RMSE estimated by the modified Fourier series model were reduced by 4.45 °C and 3.12 °C during the transitional period from dry to wet stable atmospheric conditions. The modified Fourier series model gave a good estimation of the diurnal pattern of Ta when compared with other existing models for a tropical environment.
Keywords: air temperature, mean bias error, Fourier series analysis, surface energy balance
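A minimal, hedged sketch of a Fourier-series diurnal temperature model of the kind evaluated above: hourly air temperature expressed as a truncated Fourier series anchored on the daily mean, minimum, and maximum. The two-harmonic form, the 15:00 warm peak, and the example values are illustrative assumptions, not the NIMEX parameterization itself.

```python
# Sketch of estimating hourly Ta from daily statistics and scoring it with
# MBE/RMSE against a reference series (reference here is synthetic).
import numpy as np

def diurnal_ta(hour, t_mean, t_min, t_max, peak_hour=15.0):
    """Hourly air temperature from daily statistics via a two-harmonic Fourier series."""
    amplitude = (t_max - t_min) / 2.0
    phase = 2.0 * np.pi * (hour - peak_hour) / 24.0
    # First harmonic carries the main day/night cycle; a small second harmonic
    # sharpens the afternoon peak relative to the night-time minimum.
    return t_mean + amplitude * (np.cos(phase) + 0.1 * np.cos(2.0 * phase))

hours = np.arange(24)
ta = diurnal_ta(hours, t_mean=26.0, t_min=21.0, t_max=33.0)

reference = ta + np.random.default_rng(1).normal(0.0, 0.5, size=24)
mbe = np.mean(ta - reference)
rmse = np.sqrt(np.mean((ta - reference) ** 2))
print(f"MBE={mbe:.2f} degC, RMSE={rmse:.2f} degC")
```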
Procedia PDF Downloads 230
7769 An Estimating Equation for Survival Data with a Possibly Time-Varying Covariates under a Semiparametric Transformation Models
Authors: Yemane Hailu Fissuh, Zhongzhan Zhang
Abstract:
An estimating equation technique is an alternative to the widely used maximum likelihood methods and eases some of the complexity caused by time-varying covariates. When both time-varying covariates and left truncation are considered in the model, maximum likelihood estimation procedures become much more burdensome and complex. To ease this complexity, this study proposes modified estimating equations, which have received considerable attention from researchers, under a semiparametric transformation model. The purpose of this article was to develop the modified estimating equation under a flexible and general class of semiparametric transformation models for left-truncated and right-censored survival data with time-varying covariates. Besides the commonly applied Cox proportional hazards model, such problems can also be analyzed with a general class of semiparametric transformation models to estimate the effect of treatment, given possibly time-varying covariates, on the survival time. The consistency and asymptotic properties of the estimators were derived via the expectation-maximization (EM) algorithm. The finite-sample behavior of the estimators for the proposed model was illustrated via simulation studies and the Stanford heart transplant data. To sum up, the bias for covariates was adjusted by estimating the density function of the truncation time variable, and the effect of possibly time-varying covariates was then evaluated in some special semiparametric transformation models.
Keywords: EM algorithm, estimating equation, semiparametric transformation models, time-to-event outcomes, time-varying covariate
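A standard way of writing the class of semiparametric transformation models referred to above; this is a common textbook formulation, not necessarily the exact specification adopted by the authors.

```latex
% Linear transformation model (time-fixed covariates) and its time-varying
% extension via the conditional cumulative hazard.
\[
  H(T) = -\beta^{\top} Z + \varepsilon ,
\]
where $H(\cdot)$ is an unspecified monotone increasing function, $\beta$ is the
regression parameter, and $\varepsilon$ has a known distribution. Equivalently,
with possibly time-varying covariates $Z(\cdot)$,
\[
  \Lambda\bigl(t \mid Z\bigr)
    = G\!\left( \int_{0}^{t} e^{\beta^{\top} Z(s)} \, d\Lambda_{0}(s) \right),
\]
where $\Lambda_{0}$ is an unspecified baseline cumulative hazard and $G$ is a
known increasing transformation; $G(x)=x$ recovers the Cox proportional hazards
model and $G(x)=\log(1+x)$ the proportional odds model.
```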
Procedia PDF Downloads 152
7768 Evaluating Generative Neural Attention Weights-Based Chatbot on Customer Support Twitter Dataset
Authors: Sinarwati Mohamad Suhaili, Naomie Salim, Mohamad Nazim Jambli
Abstract:
Sequence-to-sequence (seq2seq) models augmented with attention mechanisms are playing an increasingly important role in automated customer service. These models, which are able to recognize complex relationships between input and output sequences, are crucial for optimizing chatbot responses. Central to these mechanisms are the neural attention weights that determine the focus of the model during sequence generation. Despite their widespread use, there remains a gap in the comparative analysis of different attention weighting functions within seq2seq models, particularly in the domain of chatbots using the Customer Support Twitter (CST) dataset. This study addresses this gap by evaluating four distinct attention-scoring functions (dot, multiplicative/general, additive, and an extended multiplicative function with a tanh activation parameter) in neural generative seq2seq models. Utilizing the CST dataset, these models were trained and evaluated over 10 epochs with the AdamW optimizer. Evaluation criteria included validation loss and BLEU scores under both greedy and beam search strategies with a beam size of k=3. Results indicate that the model with the tanh-augmented multiplicative function significantly outperforms its counterparts, achieving the lowest validation loss (1.136484) and the highest BLEU scores (0.438926 under greedy search, 0.443000 under beam search with k=3). These results emphasize the crucial influence of selecting an appropriate attention-scoring function on the performance of seq2seq models for chatbots. In particular, the model that integrates tanh activation proves to be a promising approach for improving the quality of chatbots in the customer support context.
Keywords: attention weight, chatbot, encoder-decoder, neural generative attention, score function, sequence-to-sequence
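A compact sketch of the four attention-scoring functions compared above, written for a single decoder state against a sequence of encoder states. The dimensions, random initialization, and the exact placement of the tanh in the extended multiplicative variant are illustrative assumptions.

```python
# Hedged sketch of dot, general (multiplicative), additive, and a
# tanh-augmented multiplicative attention score, followed by softmax weights.
import numpy as np

rng = np.random.default_rng(0)
d = 8                          # hidden size
H = rng.normal(size=(5, d))    # encoder hidden states (source length 5)
s = rng.normal(size=(d,))      # current decoder hidden state
W = rng.normal(size=(d, d))    # learned matrix for the general score
Wa = rng.normal(size=(d, 2 * d))
v = rng.normal(size=(d,))      # learned vector for the additive score

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

scores = {
    "dot":          H @ s,
    "general":      H @ (W @ s),
    "additive":     np.array([v @ np.tanh(Wa @ np.concatenate([s, h])) for h in H]),
    "general_tanh": np.tanh(H @ (W @ s)),   # extended multiplicative with tanh
}
for name, sc in scores.items():
    print(name, np.round(softmax(sc), 3))   # attention weights over the source
```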
Procedia PDF Downloads 78
7767 Analysis of the Contribution of Drude and Brendel Model Terms to the Dielectric Function
Authors: Christopher Mkirema Maghanga, Maurice Mghendi Mwamburi
Abstract:
Parametric modeling provides a means to understand the properties of materials more deeply. The Drude, Brendel, Lorentz, and OJL models incorporated in the SCOUT® software are some of the models used to study dielectric films. In our work, we utilized the Brendel and Drude models to extract the optical constants from spectroscopic data of fabricated undoped and niobium-doped titanium oxide thin films. The individual contributions of the two models were studied to establish how they influence the dielectric function, and the effect of dopants on their contributions was also analyzed. For the undoped films, the results indicate a minimal contribution from the Drude term due to the dielectric nature of the films. However, as doping levels increase, the rise in the concentration of free electrons favors the use of the Drude model. The Brendel model was confirmed to work well with dielectric films, the undoped titanium oxide films in our case.
Keywords: modeling, Brendel model, optical constants, titanium oxide, Drude model
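Hedged reference forms of the two model terms discussed above; these are standard textbook expressions, not parameter values fitted in the paper.

```latex
% Drude term (free-carrier response) and Brendel term (a Lorentz oscillator
% whose resonance frequency is smeared by a Gaussian distribution).
\[
  \varepsilon_{\mathrm{Drude}}(\omega)
    = \varepsilon_{\infty} - \frac{\omega_{p}^{2}}{\omega^{2} + i\gamma\omega},
\]
\[
  \chi_{\mathrm{Brendel}}(\omega)
    = \int_{-\infty}^{\infty}
      \frac{1}{\sqrt{2\pi}\,\sigma}
      \exp\!\left[-\frac{(x-\omega_{0})^{2}}{2\sigma^{2}}\right]
      \frac{\omega_{p}^{2}}{x^{2} - \omega^{2} - i\gamma\omega}\, dx ,
\]
where $\omega_{p}$ is the plasma frequency, $\gamma$ a damping constant,
$\omega_{0}$ the centre resonance frequency, and $\sigma$ the Gaussian width.
As doping raises the free-electron concentration, the Drude contribution grows
relative to the bound-oscillator (Brendel) contribution.
```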
Procedia PDF Downloads 183
7766 Improving Our Understanding of the in vivo Modelling of Psychotic Disorders
Authors: Zsanett Bahor, Cristina Nunes-Fonseca, Gillian L. Currie, Emily S. Sena, Lindsay D.G. Thomson, Malcolm R. Macleod
Abstract:
Psychosis is ranked as the third most disabling medical condition in the world by the World Health Organization. Despite a substantial amount of research in recent years, available treatments are not universally effective and have a wide range of adverse side effects. Since many clinical drug candidates are identified through in vivo modelling, a deeper understanding of these models, and of their strengths and limitations, might help us understand the reasons for difficulties in psychosis drug development. To provide an unbiased summary of the preclinical psychosis literature, we performed a systematic electronic search of PubMed for publications modelling a psychotic disorder in vivo, identifying 14,721 relevant studies. Double screening of 11,000 publications from this dataset has so far established 2,403 animal studies of psychosis, with the most commonly modelled condition being schizophrenia (95%). 61% of these models are induced using pharmacological agents. Across all models, only 56% of publications test a therapeutic treatment. We propose a systematic review of these studies to assess the prevalence of reporting of measures to reduce the risk of bias, and a meta-analysis to assess the internal and external validity of these animal models. Our findings are likely to be relevant to future preclinical studies of psychosis, as this generation of strong empirical evidence has the potential to identify weaknesses and areas for improvement and to suggest refinements of experimental design. Such a detailed understanding of the data that inform what we think we know will help improve the current attrition rate between bench and bedside in psychosis research.
Keywords: animal models, psychosis, systematic review, schizophrenia
Procedia PDF Downloads 290
7765 Multivariate Rainfall Disaggregation Using MuDRain Model: Malaysia Experience
Authors: Ibrahim Suliman Hanaish
Abstract:
Disaggregation of daily rainfall using stochastic models formulated on a multivariate approach (MuDRain) is discussed in this paper. Seven rain gauge stations in Peninsular Malaysia are considered, at distances from the reference station ranging from 4 km to 160 km. The hourly rainfall data used cover the period from 1973 to 2008, and July and November are taken as examples of dry and wet periods. The cross-correlation among the rain gauges is obtained either from the available hourly rainfall information at the neighboring stations or extracted from daily rainfall. This paper discusses the applicability of the MuDRain model for disaggregating daily rainfall to hourly rainfall for both sources of cross-correlation. The goodness of fit of the model was based on the reproduction of fitting statistics such as the means, variances, coefficients of skewness, lag-zero cross-correlation coefficients, and lag-one autocorrelation coefficients. It was found that the correlation coefficients extracted from daily rainfall are slightly higher than the correlations based on the available hourly rainfall, especially for neighboring stations not more than 28 km apart. The results also showed that the MuDRain model did not reproduce the statistics very well and that the synthetic hourly rainfall reproduced the actual hyetographs poorly. Meanwhile, a good fit was found between the distribution functions of the historical and synthetic hourly rainfall. These discrepancies are unavoidable because of the low cross-correlation of hourly rainfall. The overall performance indicated that the MuDRain model would not be an appropriate choice for disaggregating daily rainfall.
Keywords: rainfall disaggregation, multivariate disaggregation rainfall model, correlation, stochastic model
Procedia PDF Downloads 516
7764 Transport Emission Inventories and Medical Exposure Modeling: A Missing Link for Urban Health
Authors: Frederik Schulte, Stefan Voß
Abstract:
The adverse effects of air pollution on public health are an increasingly vital problem in planning for urban regions in many parts of the world. The issue is addressed from various angles and by distinct disciplines in research. Epidemiological studies model the relative increase of numerous diseases in response to an increment of different forms of air pollution. A significant share of air pollution in urban regions is related to transport emissions, which are often measured and stored in emission inventories. However, most approaches in transport planning, engineering, and the operational design of transport activities are restricted to general emission limits for specific air pollutants and do not consider more nuanced exposure models. We conduct an extensive literature review on exposure models and emission inventories used to study the health impact of transport emissions. Furthermore, we review methods applied in both domains and use emission inventory data of transportation hubs such as ports, airports, and urban traffic for an in-depth analysis of public health impacts deploying medical exposure models. The results reveal specific urban health risks related to transport emissions that may improve urban planning for environmental health by providing insights into actual health effects instead of referring only to general emission limits.
Keywords: emission inventories, exposure models, transport emissions, urban health
Procedia PDF Downloads 389
7763 Removal of Basic Yellow 28 Dye from Aqueous Solutions Using Plastic Wastes
Authors: Nadjib Dahdouh, Samira Amokrane, Elhadj Mekatel, Djamel Nibou
Abstract:
The removal of Basic Yellow 28 (BY28) from aqueous solutions by PMMA plastic wastes was investigated. The characteristics of the waste PMMA were determined by SEM, FTIR, and chemical composition analysis. The effects of solution pH, initial BY28 concentration C, solid/liquid ratio R, and temperature T were studied in batch experiments. The Freundlich and Langmuir models were applied to the adsorption process, and it was found that the equilibrium followed the Langmuir adsorption isotherm well. Kinetic models for the adsorption of BY28 on the PMMA were evaluated using the pseudo-first-order and pseudo-second-order kinetic models, and the models used were found to correlate with the experimental data. An intraparticle diffusion model was also applied in these experiments. The thermodynamic parameters, namely the enthalpy ∆H°, entropy ∆S°, and free energy ∆G° of adsorption of BY28 on PMMA, were determined. The negative values of the Gibbs free energy ∆G° indicate the spontaneity of the adsorption of BY28 by PMMA, the negative values of ∆H° reveal the exothermic nature of the process, and the negative values of ∆S° suggest the stability of BY28 on the surface of the waste PMMA.
Keywords: removal, waste PMMA, BY28 dye, equilibrium, kinetic study, thermodynamic study
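Standard forms of the equilibrium, kinetic, and thermodynamic relations named above; these are textbook expressions, and the paper's fitted parameter values are not reproduced here.

```latex
% Langmuir and Freundlich isotherms, pseudo-first/second-order kinetics, and
% the Gibbs free energy relations used to judge spontaneity.
\[
  q_{e} = \frac{q_{m} K_{L} C_{e}}{1 + K_{L} C_{e}} \quad \text{(Langmuir)},
  \qquad
  q_{e} = K_{F} C_{e}^{1/n} \quad \text{(Freundlich)},
\]
\[
  \ln\!\left(q_{e} - q_{t}\right) = \ln q_{e} - k_{1} t
  \quad \text{(pseudo-first-order)},
  \qquad
  \frac{t}{q_{t}} = \frac{1}{k_{2} q_{e}^{2}} + \frac{t}{q_{e}}
  \quad \text{(pseudo-second-order)},
\]
\[
  \Delta G^{\circ} = -RT \ln K ,
  \qquad
  \Delta G^{\circ} = \Delta H^{\circ} - T \Delta S^{\circ},
\]
where $q_{e}$ and $q_{t}$ are the amounts adsorbed at equilibrium and at time
$t$, $C_{e}$ is the equilibrium dye concentration, $q_{m}$, $K_{L}$, $K_{F}$,
$n$, $k_{1}$, and $k_{2}$ are model parameters, and $K$ is the equilibrium
constant.
```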
Procedia PDF Downloads 153
7762 Analysis on Prediction Models of TBM Performance and Selection of Optimal Input Parameters
Authors: Hang Lo Lee, Ki Il Song, Hee Hwan Ryu
Abstract:
Accurate prediction of TBM (tunnel boring machine) performance, which is needed for reliable estimation of the construction period and cost in the pre-construction stage, is very difficult. For this purpose, the aim of this study is to analyze the evaluation process of the various TBM performance prediction models published since 2000 and to select the optimal input parameters for the prediction model. A classification system for TBM performance prediction models and the applied methodology are proposed in this research, and the input and output parameters used in the prediction models are also presented. Based on these results, a statistical analysis is performed using data collected from a shield TBM tunnel in South Korea. By performing simple regression and residual analysis with the statistical program R, the optimal input parameters are selected. These results are expected to be used for the development of a TBM performance prediction model.
Keywords: TBM performance prediction model, classification system, simple regression analysis, residual analysis, optimal input parameters
Procedia PDF Downloads 309
7761 Investigation of Contact Pressure Distribution at Expanded Polystyrene Geofoam Interfaces Using Tactile Sensors
Authors: Chen Liu, Dawit Negussey
Abstract:
EPS (expanded polystyrene) geofoam, a lightweight material used in geotechnical applications, is made of pre-expanded resin beads that form fused cellular micro-structures. The strength and deformation properties of geofoam blocks are determined by unconfined compression of small test samples between rigid loading plates, where applied loads are presumed to be supported uniformly over the entire mating end areas. Predictions of field performance on the basis of such laboratory tests widely over-estimate actual post-construction settlements and exaggerate predictions of long-term creep deformations. This investigation examined the development of contact pressures at a large number of discrete points, at low and large strain levels, for different densities of geofoam. The development of pressure patterns for fine and coarse interface material textures, as well as for molding-skin and hot-wire-cut geofoam surfaces, was examined. The laboratory testing showed that I-Scan tactile sensors are useful for detailed observation of contact pressures at a large number of discrete points simultaneously. At a low strain level (1%), the lower-density EPS block shows smaller variations in localized stress distribution than the higher-density EPS. At a high strain level (10%), the dense geofoam reached the sensor cut-off limit. The imprint and pressure patterns for different interface textures can be distinguished with tactile sensing, and the pressure sensing system can be used in many fields with real-time pressure detection. The research findings provide a better understanding of EPS geofoam behavior for the improvement of design methods and the performance prediction of critical infrastructure, and they are anticipated to guide future improvements in the design and rapid construction of critical transportation infrastructure with geofoam in geotechnical applications.
Keywords: geofoam, pressure distribution, tactile pressure sensors, interface
Procedia PDF Downloads 173
7760 Statistical Assessment of Models for Determination of Soil–Water Characteristic Curves of Sand Soils
Authors: S. J. Matlan, M. Mukhlisin, M. R. Taha
Abstract:
Characterization of the engineering behavior of unsaturated soil depends on the soil-water characteristic curve (SWCC), a graphical representation of the relationship between water content or degree of saturation and soil suction. A reasonable description of the SWCC is thus important for accurate prediction of unsaturated soil parameters. The measurement procedures for determining the SWCC, however, are difficult, expensive, and time-consuming. During the past few decades, researchers have focused on developing empirical equations for predicting the SWCC, and a large number of empirical models have been suggested. One of the most crucial questions is how precisely existing equations can represent the SWCC. As different models have different ranges of capability, it is essential to evaluate the precision of the SWCC models used for each particular soil type for better SWCC estimation, and better estimation is expected to be achieved via a thorough statistical analysis of its distribution within a particular soil class. With this in view, a statistical analysis was conducted in order to evaluate the reliability of the SWCC prediction models against laboratory measurements. Optimization techniques were used to obtain the best fit of the model parameters in four forms of SWCC equation, using laboratory data for relatively coarse-textured (i.e., sandy) soil. The four most prominent SWCC models were evaluated and computed for each sample. The results show that the Brooks and Corey model is the most consistent in describing the SWCC for the sand soil type. The Brooks and Corey model predictions were also compatible with the samples evaluated in this study, which ranged from low to high soil water content.
Keywords: soil-water characteristic curve (SWCC), statistical analysis, unsaturated soil, geotechnical engineering
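A hedged sketch of fitting one of the SWCC equations assessed above, the Brooks and Corey model, to laboratory suction/saturation points with non-linear least squares. The data points below are invented for illustration; they are not the sandy-soil measurements from the study.

```python
# Sketch: fit the Brooks-Corey SWCC (effective saturation vs. suction) by
# non-linear least squares and report the fitted parameters and RMSE.
import numpy as np
from scipy.optimize import curve_fit

def brooks_corey(psi, psi_b, lam):
    """Effective saturation Se as a function of suction psi (Brooks-Corey form)."""
    psi = np.asarray(psi, dtype=float)
    return np.where(psi <= psi_b, 1.0, (psi_b / psi) ** lam)

# Illustrative suction (kPa) and effective-saturation pairs for a sandy soil
suction = np.array([1.0, 2.0, 4.0, 8.0, 16.0, 32.0, 64.0])
se_obs = np.array([1.00, 0.95, 0.70, 0.45, 0.28, 0.18, 0.11])

popt, _ = curve_fit(brooks_corey, suction, se_obs, p0=[2.0, 0.7], bounds=(0, np.inf))
psi_b, lam = popt
rmse = np.sqrt(np.mean((brooks_corey(suction, *popt) - se_obs) ** 2))
print(f"air-entry value psi_b = {psi_b:.2f} kPa, pore-size index lambda = {lam:.2f}, RMSE = {rmse:.3f}")
```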
Procedia PDF Downloads 338
7759 Data Poisoning Attacks on Federated Learning and Preventive Measures
Authors: Beulah Rani Inbanathan
Abstract:
In the present era, it is evident from numerous incidents that data privacy is being compromised in various ways. Conventional machine learning uses a centralized server: data are sent as input, analyzed by the algorithms on that server, and outputs are predicted. However, the user must send the data each time the algorithm is to analyze it and predict an output, and this transfer is prone to threats. Federated learning addresses this issue: only the models are updated and exchanged, while the data reside on the local machine and are never shared with the other local models. Nevertheless, even these local models are susceptible to data poisoning, as experiments by many researchers have made clear. This paper examines the ways in which data poisoning occurs and the settings in which it remains prevalent, including poisoning attacks on IoT devices, edge devices, autoregressive models, and industrial IoT systems, along with measures for evading such attacks in order to protect data that are personal, sensitive, or harmful when exposed.
Keywords: data poisoning, federated learning, Internet of Things, edge computing
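A small, hedged illustration of the federated setting described above: clients train locally, the server averages their updates (FedAvg-style), and one client performs a simple label-flipping data-poisoning attack. The one-parameter linear model, the toy data, and the median-aggregation defence are illustrative assumptions, not methods taken from the paper.

```python
# Sketch: FedAvg vs. median aggregation when one of four clients is poisoned.
import numpy as np

rng = np.random.default_rng(0)

def local_update(w, x, y, lr=0.1, steps=20):
    """A few steps of gradient descent on a 1-D linear model y ~ w * x."""
    for _ in range(steps):
        grad = np.mean((w * x - y) * x)
        w -= lr * grad
    return w

# Three honest clients whose data follow y = 2x; one poisoned client flips signs
clients = []
for _ in range(3):
    x = rng.uniform(-1, 1, 50)
    clients.append((x, 2.0 * x + rng.normal(0, 0.05, 50)))
x_bad = rng.uniform(-1, 1, 50)
clients.append((x_bad, -2.0 * x_bad))            # label-flipping poisoning

w_avg, w_med = 0.0, 0.0
for round_ in range(5):
    ws_avg = [local_update(w_avg, x, y) for x, y in clients]
    ws_med = [local_update(w_med, x, y) for x, y in clients]
    w_avg = float(np.mean(ws_avg))               # plain FedAvg aggregation
    w_med = float(np.median(ws_med))             # median aggregation as a simple defence
    print(f"round {round_}: FedAvg w = {w_avg:.2f}, median w = {w_med:.2f}")
```

With the honest target slope at 2, the poisoned client drags the FedAvg estimate away from it, while the median-based aggregate stays close, which is one simple way a preventive measure can be evaluated.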
Procedia PDF Downloads 87
7758 Curative Effect of Blumea lacera Leaves on Experimental Haemorrhoids in Rats
Authors: Priyanka Sharma, Tarkewshwar Dubey, Hemalatha Siva
Abstract:
Hemorrhoids are one of the most common anorectal diseases around the world. Several factors are involved in causing hemorrhoids, including irregular bowel function (constipation, diarrhea), exercise, gravity, a low-fiber diet, pregnancy, obesity, high abdominal pressure, prolonged sitting, genetic factors, and aging. Pain, bleeding, itching, swelling, and anal discharge are the symptoms of the disease. Due to the limited modern pharmacotherapeutic options available for treatment, herbal medicines remain the therapy of choice. Blumea lacera (Burm f.) DC., belonging to the Asteraceae family, is a common plain-land weed of Bangladesh that has traditionally been used for the treatment of hemorrhoids. Considering this, the present study aimed to validate the ethnomedicinal use of B. lacera leaves on experimental hemorrhoids in rats. The anti-hemorrhoid activity was assessed using croton oil-induced rat models. The parameters studied were TNF-α and IL-6 levels, Evans blue exudation, macroscopic severity score, recto-anal coefficient, and histomorphological scores. In vivo antioxidant parameters and histopathological studies were also assessed. All parameters exhibited significant anti-hemorrhoid activity, and the ethanolic extract of B. lacera (EBL) leaves at 400 mg/kg showed an ameliorative effect on croton oil-induced hemorrhoids. In conclusion, EBL exhibited a beneficial effect on croton oil-induced hemorrhoids, which validates its ethnomedicinal use in the treatment of piles.
Keywords: haemorrhoids, IL-6, piles, TNF-α
Procedia PDF Downloads 294
7757 Forecasting Unusual Infection of Patient Used by Irregular Weighted Point Set
Authors: Seema Vaidya
Abstract:
Association rule mining is a key issue in data mining. However, the standard models ignore differences among transactions, and weighted association rule mining does not work on databases with only binary attributes. This paper proposes a frequent pattern tree (FP-tree) structure, an extended prefix-tree structure for storing compressed, discriminative information about patterns, and develops an FP-tree-based mining method, the FP-growth algorithm, for mining the complete set of patterns by frequent pattern growth. The paper addresses the problem of discovering rare and weighted itemsets, i.e., the infrequent weighted itemset (IWI) mining problem. Two novel measures are proposed for the infrequent weighted itemset mining problem, and algorithms that perform IWI and minimal IWI mining are presented. Moreover, the rare itemsets are utilized in a decision-based framework. The general problem of inducing reliable diagnostic rules is difficult because, theoretically, no induction technique by itself can guarantee the correctness of the induced hypotheses; accordingly, this framework predicts the disorder from rare signs. An implementation study demonstrates that the proposed algorithm is effective and scalable for mining both long and short diagnostic rules and improves the results of predicting rare diseases of patients.
Keywords: association rule, data mining, IWI mining, infrequent item set, frequent pattern growth
Procedia PDF Downloads 399
7756 Lean Impact Analysis Assessment Models: Development of a Lean Measurement Structural Model
Authors: Catherine Maware, Olufemi Adetunji
Abstract:
The paper is aimed at developing a model to measure the impact of Lean manufacturing deployment on organizational performance. The model will help industry practitioners assess the impact of implementing Lean constructs on organizational performance. It will also harmonize measurement models of Lean performance with the house of Lean, which appears to have become the industry standard. The sheer number of measurement models for assessing the impact of Lean implementation makes it difficult for new adopters to select an appropriate assessment model or deployment methodology. A literature review is conducted to classify the Lean performance models, and Pareto analysis is used to select the Lean constructs for the development of the model. The model is further formalized through Structural Equation Modeling (SEM), which defines the underlying latent structure of a Lean system. The impact assessment measurement model developed can be used to measure Lean performance and can be adopted by different industries.
Keywords: impact measurement model, lean bundles, lean manufacturing, organizational performance
Procedia PDF Downloads 485
7755 Spatial Time Series Models for Rice and Cassava Yields Based on Bayesian Linear Mixed Models
Authors: Panudet Saengseedam, Nanthachai Kantanantha
Abstract:
This paper proposes a linear mixed model (LMM) with spatial effects to forecast rice and cassava yields in Thailand simultaneously. A multivariate conditional autoregressive (MCAR) model is assumed to represent the spatial effects. A Bayesian method is used for parameter estimation via Gibbs-sampling Markov chain Monte Carlo (MCMC). The model is applied to monthly rice and cassava yield data extracted from the Office of Agricultural Economics, Ministry of Agriculture and Cooperatives of Thailand. The results show that the proposed model performs better in most provinces, in both the fitting and validation parts, than the simple exponential smoothing and conditional autoregressive (CAR) models from our previous study.
Keywords: Bayesian method, linear mixed model, multivariate conditional autoregressive model, spatial time series
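One common way to write a linear mixed model with an MCAR spatial random effect for two crops; this is a hedged, generic formulation, and the authors' exact priors and covariate set are not reproduced here.

```latex
% Observation model for crop k (rice or cassava) in province i at month t,
% with spatial random effects following an (intrinsic) MCAR prior.
\[
  y_{ikt} = \mathbf{x}_{ikt}^{\top}\boldsymbol{\beta}_{k} + \phi_{ik} + \varepsilon_{ikt},
  \qquad \varepsilon_{ikt} \sim N(0,\sigma_{k}^{2}),
\]
where $\boldsymbol{\phi}_{i} = (\phi_{i1}, \phi_{i2})^{\top}$ are the spatial
random effects of province $i$, jointly given a multivariate conditional
autoregressive (MCAR) prior through the full conditional
\[
  \boldsymbol{\phi}_{i} \mid \boldsymbol{\phi}_{-i}
    \sim N\!\left(\frac{1}{n_{i}}\sum_{j \sim i} \boldsymbol{\phi}_{j},
                  \; \frac{1}{n_{i}}\boldsymbol{\Sigma}\right),
\]
with $j \sim i$ denoting neighbouring provinces, $n_{i}$ the number of
neighbours, and $\boldsymbol{\Sigma}$ a between-crop covariance matrix.
Posterior inference then proceeds by Gibbs sampling.
```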
Procedia PDF Downloads 395
7754 Magneto-Thermo-Mechanical Analysis of Electromagnetic Devices Using the Finite Element Method
Authors: Michael G. Pantelyat
Abstract:
Fundamentals of pure and applied research in the area of magneto-thermo-mechanical numerical analysis and design of innovative electromagnetic devices (modern induction heaters, novel thermoelastic actuators, rotating electrical machines, induction cookers, electrophysical devices) are elaborated. Mathematical models of magneto-thermo-mechanical processes in electromagnetic devices, taking into account the main interactions of the interrelated phenomena, are developed, and a graphical representation of the coupled (multiphysics) phenomena under consideration is proposed. In addition, numerical techniques for the solution of nonlinear problems are developed. On this basis, effective numerical algorithms for the solution of actual problems of practical interest are proposed, validated, and implemented in the applied 2D and 3D computer codes developed, and many applied problems regarding modern electrical engineering devices are numerically solved. Investigations of the influences of various interrelated physical phenomena (temperature dependences of material properties, thermal radiation, conditions of convective heat transfer, contact phenomena, etc.) on the accuracy of the electromagnetic, thermal, and structural analyses are conducted. Important practical recommendations on the choice of rational structures, materials, and operation modes of the electromagnetic devices under consideration are proposed and implemented in industry.
Keywords: electromagnetic devices, multiphysics, numerical analysis, simulation and design
Procedia PDF Downloads 386
7753 Oryzanol Recovery from Rice Bran Oil: Adsorption Equilibrium Models Through Kinetics Data Approachments
Authors: A.D. Susanti, W. B. Sediawan, S.K. Wirawan, Budhijanto, Ritmaleni
Abstract:
The oryzanol contained in rice bran oil (RBO) naturally has high antioxidant activity. It has several reported health properties and is of high interest in pharmacy, cosmetics, and nutrition. Because of the low concentration of oryzanol in crude RBO (0.9-2.9%), it needs to be further processed for practical usage, for example via an adsorption process. In this study, adsorption equilibrium models were investigated and adjusted through a kinetic-data approach. A mathematical model of the kinetics of batch adsorption for oryzanol separation from RBO was set up and then applied to the equilibrium results. The adsorbent particles used in this case are relatively small, so the concentration within the adsorbent is assumed to be uniform. Hence, the adsorption rate is controlled by the rate of oryzanol mass transfer from the bulk RBO fluid to the surface of the silica gel. In this approach, the rate of mass transfer is assumed to be proportional to the deviation of the concentration from its equilibrium state. The equilibrium models applied were the Langmuir, distribution coefficient, and Freundlich models, with parameter values obtained from the equilibrium results. It turned out that the models can quantitatively describe the experimental kinetics data and that adjusting the values of the equilibrium isotherm parameters significantly improves the accuracy of the model. The value of the mass transfer coefficient per unit adsorbent mass (kca) was then obtained by curve fitting.
Keywords: adsorption equilibrium, adsorption kinetics, oryzanol, rice bran oil
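A hedged sketch of the kinetic model described above: the adsorption rate is taken as proportional to the deviation of the adsorbed amount from its equilibrium value, with the equilibrium value given by an isotherm. The symbols are generic stand-ins, not the paper's fitted values.

```latex
% Linear-driving-force kinetics coupled to an equilibrium isotherm.
\[
  \frac{dq}{dt} = k_{c}a\,\bigl(q^{*} - q\bigr),
  \qquad
  q^{*} = \frac{q_{m} K_{L} C}{1 + K_{L} C} \;\;\text{(Langmuir)}
  \quad\text{or}\quad
  q^{*} = K_{F} C^{1/n} \;\;\text{(Freundlich)},
\]
where $q$ is the amount of oryzanol adsorbed per unit adsorbent mass at time
$t$, $q^{*}$ its equilibrium value at the bulk concentration $C$, and
$k_{c}a$ the mass transfer coefficient per unit adsorbent mass obtained by
curve fitting.
```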
Procedia PDF Downloads 323
7752 Vibration of a Beam on an Elastic Foundation Using the Variational Iteration Method
Authors: Desmond Adair, Kairat Ismailov, Martin Jaeger
Abstract:
Modelling of Timoshenko beams on elastic foundations has been widely used in the analysis of buildings, geotechnical problems, and railway and aerospace structures. For the elastic foundation, the most widely used models are one-parameter mechanical models or two-parameter models that include the continuity and cohesion of typical foundations, with the two-parameter model usually considered the better of the two. Knowledge of the free vibration characteristics of beams on an elastic foundation is necessary for optimal design solutions in many engineering applications, and in this work the efficient and accurate variational iteration method is developed and used to calculate the natural frequencies of a Timoshenko beam on a two-parameter foundation. The variational iteration method is a technique capable of dealing with some linear and non-linear problems in an easy and efficient way. The calculations are compared with those using a finite-element method and other analytical solutions, and it is shown that the results are accurate and are obtained efficiently. It is found that the effect of the two-parameter foundation is to increase the beam's natural frequencies, and this is thought to be because of the shear-layer stiffness, which has an effect on the elastic stiffness. By setting the two-parameter model's stiffness parameter to zero, it is possible to obtain a one-parameter foundation model, and so a comparison between the two foundation models is also made.
Keywords: Timoshenko beam, variational iteration method, two-parameter elastic foundation model
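The general correction functional of the variational iteration method, in the standard form usually quoted for an operator equation; this is a generic statement of the technique, not the specific functional derived for the Timoshenko beam on a two-parameter foundation in this work.

```latex
% Variational iteration method: successive corrections for Lu + Nu = g(x).
\[
  u_{n+1}(x) = u_{n}(x)
    + \int_{0}^{x} \lambda(\tau)\,
      \bigl[\, L u_{n}(\tau) + N \tilde{u}_{n}(\tau) - g(\tau) \,\bigr]\, d\tau ,
\]
where $L$ and $N$ are the linear and nonlinear parts of the governing operator,
$\lambda(\tau)$ is a Lagrange multiplier identified via variational theory, and
$\tilde{u}_{n}$ denotes a restricted variation. Successive iterates converge to
the solution, from which the natural frequencies are extracted.
```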
Procedia PDF Downloads 194
7751 Positive Bias and Length Bias in Deep Neural Networks for Premises Selection
Authors: Jiaqi Huang, Yuheng Wang
Abstract:
Premises selection, the task of selecting a set of axioms for proving a given conjecture, is a major bottleneck in automated theorem proving. An array of deep-learning-based methods has been established for premises selection, but perfect performance remains challenging. Our study examines the inaccuracy of deep neural networks in premises selection. Through training network models on encoded conjecture-axiom pairs from the Mizar Mathematical Library, two potential biases are found: the network models classify more premises as necessary than unnecessary, referred to as the 'positive bias', and the network models perform better at proving conjectures that are paired with more axioms, referred to as the 'length bias'. The 'positive bias' and 'length bias' discovered could inform the limitations of existing deep neural networks.
Keywords: automated theorem proving, premises selection, deep learning, interpreting deep learning
Procedia PDF Downloads 183
7750 Investigations on the Application of Avalanche Simulations: A Survey Conducted among Avalanche Experts
Authors: Korbinian Schmidtner, Rudolf Sailer, Perry Bartelt, Wolfgang Fellin, Jan-Thomas Fischer, Matthias Granig
Abstract:
This study focuses on the evaluation of snow avalanche simulations, based on a survey that was carried out among avalanche experts. In recent decades, the application of avalanche simulation tools has gained recognition within the realm of hazard management. Traditionally, avalanche runout models were used to predict extreme avalanche runout and prepare avalanche maps; this has changed rather dramatically with the application of numerical models. For safety applications such as road safety, simulation tools are now being coupled with real-time meteorological measurements to predict frequent avalanche hazard. That places new demands on model accuracy and requires the simulation of physical processes that previously could be ignored. These simulation tools are based on a deterministic description of the avalanche movement, allowing certain quantities of the avalanche flow (e.g., pressure, velocities, flow heights, runout lengths) to be predicted. Because of the highly variable regimes of flowing snow, no uniform rheological law describing the motion of an avalanche is known; therefore, analogies to the fluid-dynamical laws of other materials are drawn. To transfer these constitutive laws to snow flows, certain assumptions and adjustments have to be imposed. Besides these limitations, there are large uncertainties regarding the initial and boundary conditions. Further challenges arise when implementing the underlying flow-model equations in an algorithm executable by a computer: this implementation is constrained by the choice of adequate numerical methods and their computational feasibility, and the model development is thus compelled to introduce further simplifications and the related uncertainties. In the light of these issues, many questions arise about avalanche simulations, their assets and drawbacks, potentials for improvement, and their application in practice. To address these questions, a survey among experts in the field of avalanche science (e.g., researchers, practitioners, engineers) from various countries was conducted. In the questionnaire, special attention is drawn to the experts' opinions regarding the influence of certain variables on the simulation result, their uncertainty, and the reliability of the results; furthermore, it was tested to what degree a simulation result influences decision making in a hazard assessment. A discrepancy was found between the large uncertainty of the simulation input parameters and the relatively high reliability attributed to the results. This contradiction can be explained by taking into account how the experts employ the simulations: the credibility of the simulations is the result of a rather thorough simulation study in which different assumptions are tested and the results of different flow models are compared, along with the use of supplemental data such as chronicles, field observations, and silent witnesses, which are regarded as essential for the hazard assessment and for sanctioning simulation results. As the importance of avalanche simulations within hazard management grows along with their further development, studies focusing on modeling practice could contribute to a better understanding of how knowledge of the avalanche process can be gained by running simulations.
Keywords: expert interview, hazard management, modeling, simulation, snow avalanche
Procedia PDF Downloads 326
7749 The Impact of International Financial Reporting Standards (IFRS) Adoption on Performance’s Measure: A Study of UK Companies
Authors: Javad Izadi, Sahar Majioud
Abstract:
This study presents an approach to assessing the choice of performance measures by companies in the United Kingdom after the application of IFRS in 2005. The aim of this study is to investigate the effects of IFRS on the choice of performance evaluation methods for UK companies. Through an econometric model, we analyse the relationship between the dependent variable, firm performance, which is a nominal variable, and the independent variables. The independent variables are split into two main groups. The first is the group of accounting-based measures: earnings per share, return on assets, and return on equity. The second is the group of market-based measures: market value of property, plant and equipment; research and development; sales growth; market-to-book value; leverage; segment; and company size. The regression used is a multinomial logistic regression performed on a sample of 130 UK listed companies. Our findings show that, after IFRS adoption, companies give more importance to variables such as return on equity and sales growth when assessing their performance, whereas return on assets and the market-to-book value ratio do not have as much importance as before IFRS in evaluating company performance. There are also some variables that no longer have an impact on the performance measures, such as earnings per share. These findings are empirically important for businesses in subjects related to IFRS and company performance measurement.
Keywords: performance measure, nominal variable, econometric model, evaluation methods
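A hedged sketch of the kind of multinomial logistic regression described above: a nominal performance category regressed on accounting- and market-based measures with statsmodels. The synthetic data, the three-category outcome, and the variable names are illustrative assumptions, not the study's sample of 130 UK companies.

```python
# Sketch: multinomial logit of a nominal performance measure on firm variables.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 130
df = pd.DataFrame({
    "roe": rng.normal(0.10, 0.05, n),           # return on equity
    "sales_growth": rng.normal(0.05, 0.10, n),
    "leverage": rng.uniform(0.1, 0.9, n),
})
# Invented nominal performance measure with three categories (0, 1, 2)
latent = 5 * df["roe"] + 2 * df["sales_growth"] + rng.normal(0, 0.3, n)
df["performance"] = pd.cut(latent, bins=3, labels=False)

X = sm.add_constant(df[["roe", "sales_growth", "leverage"]])
model = sm.MNLogit(df["performance"], X).fit(disp=False)
print(model.summary())
```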
Procedia PDF Downloads 138
7748 Modified Clusterwise Regression for Pavement Management
Authors: Mukesh Khadka, Alexander Paz, Hanns de la Fuente-Mella
Abstract:
Typically, pavement performance models are developed in two steps: (i) pavement segments with similar characteristics are grouped together to form a cluster, and (ii) the corresponding performance models are developed using statistical techniques. A challenge is to select the characteristics that define the clusters and the segments associated with them. If inappropriate characteristics are used, clusters may include homogeneous segments with different performance behavior or heterogeneous segments with similar performance behavior. The prediction accuracy of performance models can be improved by grouping the pavement segments into more uniform clusters using both characteristics and a performance measure. This grouping is not always possible due to limited information, and it is impractical to include all the potentially significant factors because some of them are unobserved or difficult to measure. The historical performance of pavement segments could be used as a proxy to incorporate the effect of the missing potentially significant factors in the clustering process. The current state of the art proposes Clusterwise Linear Regression (CLR) to determine the pavement clusters and the associated performance models simultaneously; CLR incorporates the effect of significant factors as well as a performance measure. In this study, a mathematical program was formulated for CLR models including multiple explanatory variables. Pavement data collected recently over the entire state of Nevada were used, and the International Roughness Index (IRI) was used as the pavement performance measure because it serves as a unified standard that is widely accepted for evaluating pavement performance, especially in terms of riding quality. The results illustrate the advantage of using CLR. Previous studies have used CLR with experimental data, whereas this study uses actual field data collected across a variety of environmental, traffic, design, and construction and maintenance conditions.
Keywords: clusterwise regression, pavement management system, performance model, optimization
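A hedged sketch of the clusterwise linear regression (CLR) idea described above, solved heuristically by alternating between assigning each segment to the cluster whose regression fits it best and refitting each cluster's regression. The synthetic segments and the two-cluster setup are illustrative; the study itself formulates CLR as a mathematical program rather than this heuristic.

```python
# Sketch: alternating assignment/refit heuristic for clusterwise regression.
import numpy as np

rng = np.random.default_rng(0)
n, k = 200, 2
X = np.column_stack([np.ones(n), rng.uniform(0, 10, n)])   # intercept + e.g. pavement age
true_beta = np.array([[1.0, 0.5], [3.0, 1.5]])              # two deterioration regimes
groups = rng.integers(0, k, n)
y = np.einsum("ij,ij->i", X, true_beta[groups]) + rng.normal(0, 0.3, n)  # e.g. IRI

assign = rng.integers(0, k, n)                               # random initial clusters
for _ in range(20):
    betas = np.array([np.linalg.lstsq(X[assign == c], y[assign == c], rcond=None)[0]
                      for c in range(k)])
    resid = (y[:, None] - X @ betas.T) ** 2                  # squared residual per cluster
    new_assign = resid.argmin(axis=1)                        # reassign to best-fitting cluster
    if np.array_equal(new_assign, assign):
        break
    assign = new_assign

print("estimated cluster coefficients:\n", np.round(betas, 2))
```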
Procedia PDF Downloads 252
7747 Using the Bootstrap for Problems Statistics
Authors: Brahim Boukabcha, Amar Rebbouh
Abstract:
The bootstrap method, based on the idea of exploiting all the information provided by the initial sample, allows us to study the properties of estimators. In this article, we present a theoretical study of the different bootstrap methods and use the resampling technique in statistical inference to calculate the standard error of an estimator such as the mean and to determine a confidence interval for an estimated parameter. We apply these methods to regression models and the Pareto model, obtaining good approximations.
Keywords: bootstrap, standard error, bias, jackknife, mean, median, variance, confidence interval, regression models
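A minimal sketch of the nonparametric bootstrap procedure described above: resample the original sample with replacement, recompute the statistic, and use the bootstrap distribution for a standard error and a percentile confidence interval. The sample below is synthetic and for illustration only.

```python
# Sketch: bootstrap standard error and 95% percentile CI for the sample mean.
import numpy as np

rng = np.random.default_rng(0)
sample = rng.pareto(a=3.0, size=50) + 1.0      # illustrative Pareto-type sample
B = 2000                                        # number of bootstrap replicates

boot_means = np.array([
    rng.choice(sample, size=sample.size, replace=True).mean()
    for _ in range(B)
])

se = boot_means.std(ddof=1)                     # bootstrap standard error of the mean
ci_low, ci_high = np.percentile(boot_means, [2.5, 97.5])   # 95% percentile CI
print(f"mean = {sample.mean():.3f}, bootstrap SE = {se:.3f}, "
      f"95% CI = ({ci_low:.3f}, {ci_high:.3f})")
```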
Procedia PDF Downloads 381
7746 Framework for Developing Change Team to Maximize Change Initiative Success
Authors: Mohammad Z. Ansari, Lisa Brodie, Marilyn Goh
Abstract:
Change facilitators are individuals who utilize change philosophy to make a positive change to organizations. The application of change facilitators can be seen in various change models (Lewin, Lippitt, etc.), where the facilitators are considered internal or external consultants. Whilst most scholarly papers consider change facilitation as a consensus attempt to improve the organization, there is a lack of a framework that develops both the organization and the change facilitator, creating a self-sustaining change environment. This research paper introduces the development of a framework for change Leaders, Planners, and Executers (LPE), aimed at various organizational levels (process, departmental, and organisational). The LPE framework is derived by exploring the interrelated characteristics of the facilitator(s) and the organization through qualitative research, drawing on change management techniques and facilitator behavioural aspects from the existing change management models and organizational behaviour literature. The introduced framework assists in highlighting and identifying the most appropriate change team to successfully deliver a change initiative within any organization.
Keywords: change initiative, LPE framework, change facilitator(s), sustainable change
Procedia PDF Downloads 196