Search results for: virtual models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7601

3011 Research on Models and Selection of Entry Strategies for Catering Industry Based on the Evolutionary Game Theory

Authors: Jianxin Zhu, Na Liu

Abstract:

Entry strategies play a vital role in the development of new enterprises in the catering industry, and different entry strategies have different effects on a new enterprise's development. Based on research by scholars at home and abroad, and combining the characteristics of the catering industry, entry strategies are divided into low-price entry strategies and high-quality entry strategies. Facing the entry of a new enterprise, the strategies of incumbent enterprises are divided into response strategies and non-response strategies. This paper uses evolutionary game theory to study the strategic interaction mechanism between incumbent enterprises and new enterprises: which strategy will each of the two game subjects choose under different initial values and parameter values? Numerical simulation with MATLAB 2016 shows that the strategy choices of new and incumbent enterprises are influenced by more than one factor, and that the system has different evolutionary trends under different circumstances. Once the parameters are set, the strategy choices of the two subjects mainly depend on the net profit difference between the strategies.
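A minimal sketch of the two-population replicator dynamics underlying such an evolutionary game can be written as follows. The payoff values here are illustrative assumptions, not the paper's actual parameters: x is the share of entrants playing "low price" and y the share of incumbents playing "respond".

```python
# Hedged sketch: two-population replicator dynamics for an entrant/incumbent game.
# All payoff numbers are illustrative assumptions, not taken from the paper.

def replicator_step(x, y, payoffs, dt=0.01):
    """One Euler step. x: share of entrants playing 'low price';
    y: share of incumbents playing 'respond'."""
    a_lo_r, a_lo_n, a_hi_r, a_hi_n, b_r_lo, b_r_hi, b_n_lo, b_n_hi = payoffs
    u_lo = y * a_lo_r + (1 - y) * a_lo_n   # entrant payoff: low-price strategy
    u_hi = y * a_hi_r + (1 - y) * a_hi_n   # entrant payoff: high-quality strategy
    v_r = x * b_r_lo + (1 - x) * b_r_hi    # incumbent payoff: respond
    v_n = x * b_n_lo + (1 - x) * b_n_hi    # incumbent payoff: do not respond
    x_new = x + dt * x * (1 - x) * (u_lo - u_hi)
    y_new = y + dt * y * (1 - y) * (v_r - v_n)
    return x_new, y_new

def evolve(x0, y0, payoffs, steps=20000):
    x, y = x0, y0
    for _ in range(steps):
        x, y = replicator_step(x, y, payoffs)
    return x, y

# Assumed case: low price strictly dominates for the entrant,
# while responding is always costlier than not responding for the incumbent.
payoffs = (5.0, 6.0, 3.0, 4.0, 1.0, 2.0, 2.5, 3.0)
x_star, y_star = evolve(0.5, 0.5, payoffs)
```

Under these assumed payoffs the system evolves toward (low price, no response), illustrating how the sign of the net profit difference between strategies drives the evolutionary trend.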

Keywords: catering industry, entry strategy, evolutionary game, strategic interaction mechanism

Procedia PDF Downloads 112
3010 Early Impact Prediction and Key Factors Study of Artificial Intelligence Patents: A Method Based on LightGBM and Interpretable Machine Learning

Authors: Xingyu Gao, Qiang Wu

Abstract:

Patents play a crucial role in protecting innovation and intellectual property. Early prediction of the impact of artificial intelligence (AI) patents helps researchers and companies allocate resources and make better decisions. Understanding the key factors that influence patent impact can assist researchers in gaining a better understanding of the evolution of AI technology and innovation trends. Therefore, identifying highly impactful patents early and providing support for them holds immeasurable value in accelerating technological progress, reducing research and development costs, and mitigating market positioning risks. Despite the extensive research on AI patents, accurately predicting their early impact remains a challenge. Traditional methods often consider only single factors or simple combinations, failing to comprehensively and accurately reflect the actual impact of patents. This paper utilized the artificial intelligence patent database from the United States Patent and Trademark Office and the Lens.org patent retrieval platform to obtain specific information on 35,708 AI patents. Using six machine learning models, namely Multiple Linear Regression, Random Forest Regression, XGBoost Regression, LightGBM Regression, Support Vector Machine Regression, and K-Nearest Neighbors Regression, and using early indicators of patents as features, the paper comprehensively predicted the impact of patents from three aspects: technical, social, and economic. These aspects include the technical leadership of patents, the number of citations they receive, and their shared value. The SHAP (Shapley Additive exPlanations) metric was used to explain the predictions of the best model, quantifying the contribution of each feature to the model's predictions. The experimental results on the AI patent dataset indicate that, for all three target variables, LightGBM regression shows the best predictive performance.
Specifically, patent novelty has the greatest impact on predicting the technical impact of patents and has a positive effect. Additionally, the number of owners, the number of backward citations, and the number of independent claims are all crucial and have a positive influence on predicting technical impact. In predicting the social impact of patents, the number of applicants is considered the most critical input variable, but it has a negative impact on social impact. At the same time, the number of independent claims, the number of owners, and the number of backward citations are also important predictive factors, and they have a positive effect on social impact. For predicting the economic impact of patents, the number of independent claims is considered the most important factor and has a positive impact on economic impact. The number of owners, the number of sibling countries or regions, and the size of the extended patent family also have a positive influence on economic impact. The study primarily relies on data from the United States Patent and Trademark Office for artificial intelligence patents. Future research could consider more comprehensive data sources, including artificial intelligence patent data, from a global perspective. While the study takes into account various factors, there may still be other important features not considered. In the future, factors such as patent implementation and market applications may be considered as they could have an impact on the influence of patents.
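The gradient-boosting idea behind LightGBM can be illustrated with a hand-rolled version using regression stumps on a toy, synthetic "patent" dataset. The feature name and data below are assumptions for illustration only, not the study's data or the LightGBM API.

```python
# Illustrative sketch of gradient boosting with regression stumps,
# the core idea behind LightGBM. Data are synthetic, not from the paper.

def fit_stump(xs, residuals):
    """Find the single-feature split that best reduces squared error."""
    best = None
    for thr in xs:
        left = [r for x, r in zip(xs, residuals) if x <= thr]
        right = [r for x, r in zip(xs, residuals) if x > thr]
        if not left or not right:
            continue
        lmean = sum(left) / len(left)
        rmean = sum(right) / len(right)
        sse = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, thr, lmean, rmean)
    return best[1], best[2], best[3]

def boost(xs, ys, n_rounds=50, lr=0.3):
    """Fit an additive model of stumps to successive residuals."""
    pred = [0.0] * len(ys)
    stumps = []
    for _ in range(n_rounds):
        residuals = [y - p for y, p in zip(ys, pred)]
        thr, lmean, rmean = fit_stump(xs, residuals)
        stumps.append((thr, lr * lmean, lr * rmean))
        pred = [p + (lr * lmean if x <= thr else lr * rmean)
                for p, x in zip(pred, xs)]
    return stumps, pred

# Toy target: citation count grows with an assumed "novelty" feature.
novelty = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0]
citations = [1.0, 1.2, 1.1, 2.0, 2.2, 3.9, 4.1, 4.0, 5.8, 6.1]
stumps, fitted = boost(novelty, citations)
mse = sum((y - p) ** 2 for y, p in zip(citations, fitted)) / len(citations)
```

In a real setting one would of course use the LightGBM library itself and pass the fitted model to a SHAP explainer; this sketch only shows why each added tree reduces the residual error.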

Keywords: patent influence, interpretable machine learning, predictive models, SHAP

Procedia PDF Downloads 29
3009 Simulation of 1D Dielectric Barrier Discharge in Argon Mixtures

Authors: Lucas Wilman Crispim, Patrícia Hallack, Maikel Ballester

Abstract:

This work aims at modeling electric discharges in gas mixtures. The mathematical model mimics the ignition process in a commercial spark plug when a high voltage is applied to the plug terminals. A longitudinal, unidimensional Cartesian domain is chosen as the simulation region. Energy and mass transfer are considered in a macroscopic fluid representation, while energy transfer in molecular collisions and chemical reactions is contemplated at the microscopic level. The macroscopic model is represented by a set of uncoupled partial differential equations. Microscopic effects are studied within a discrete model for electronic and molecular collisions in the frame of ZDPlasKin, a plasma modeling numerical tool. The BOLSIG+ solver is employed to solve the electronic Boltzmann equation. An operator splitting technique is used to separate the microscopic and macroscopic models. The simulated gas is a mixture of neutral, excited, and ionized atomic argon. The spatial and temporal evolution of these species and of the temperature are presented and discussed.
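The operator-splitting idea can be sketched with a much simpler 1D model: a transport (diffusion) step and a local "chemistry" step advanced alternately. The grid sizes, rates, and the relaxation chemistry below are illustrative assumptions, not the paper's plasma model.

```python
# Hedged sketch of Lie operator splitting on a 1D grid:
# advance diffusion, then a local reaction step, alternately.
# All coefficients are illustrative assumptions.

def diffuse(u, d, dt, dx):
    """Explicit finite-difference diffusion step with zero-flux boundaries."""
    n = len(u)
    new = u[:]
    for i in range(n):
        left = u[i - 1] if i > 0 else u[i]
        right = u[i + 1] if i < n - 1 else u[i]
        new[i] = u[i] + d * dt / dx ** 2 * (left - 2 * u[i] + right)
    return new

def react(u, k, dt):
    """Local linear decay, standing in for the chemistry source term."""
    return [x * (1 - k * dt) for x in u]

def split_step(u, d, k, dt, dx):
    """One split step: transport first, then local chemistry."""
    return react(diffuse(u, d, dt, dx), k, dt)

u = [0.0] * 20
u[10] = 1.0                       # initial localized pulse
for _ in range(200):
    u = split_step(u, d=0.1, k=0.5, dt=0.01, dx=0.1)
total = sum(u)
```

Diffusion alone conserves the total (zero-flux boundaries), so after 200 steps the remaining mass is set entirely by the decay factor (1 - k*dt)^200, which is the kind of consistency check one applies before coupling in a real chemistry solver such as ZDPlasKin.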

Keywords: CFD, electronic discharge, ignition, spark plug

Procedia PDF Downloads 149
3008 Adsorption of Cerium as One of the Rare Earth Elements Using Multiwall Carbon Nanotubes from Aqueous Solution: Modeling, Equilibrium and Kinetics

Authors: Saeb Ahmadi, Mohsen Vafaie Sefti, Mohammad Mahdi Shadman, Ebrahim Tangestani

Abstract:

Carbon nanotubes have shown great potential for the removal of various inorganic and organic components due to properties such as large surface area and high adsorption capacity. Central composite design is a widely used method for determining optimal conditions. In addition, owing to their economic importance and wide application, the rare earth elements are significant target components. The adsorption of cerium (Ce(III)), as one of the Rare Earth Elements (REEs), on Multiwall Carbon Nanotubes (MWCNTs) has been studied. The optimization process was performed using Response Surface Methodology (RSM). The optimum conditions were a pH of 4.5, an initial Ce(III) concentration of 90 mg/l, and an MWCNT dosage of 80 mg. Under these conditions, the optimum adsorption percentage of Ce(III) was about 96%. Next, kinetic and isotherm studies were carried out at the obtained optimum conditions; the results showed that the pseudo-second-order model and the Langmuir isotherm fit the experimental data better than the other models.
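The pseudo-second-order fit named above is usually done through its linearized form, t/q_t = 1/(k*qe^2) + t/qe, so an ordinary least-squares line on (t, t/q_t) recovers qe and k. The sketch below generates synthetic data from assumed qe and k values (not the paper's cerium measurements) and recovers them.

```python
# Hedged sketch: linearized pseudo-second-order kinetic fit, t/q_t vs t.
# qe_true and k_true are assumed values used only to generate test data.

def linear_fit(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def pso_params(ts, qts):
    """Recover qe and k from t/q_t = 1/(k*qe^2) + t/qe."""
    a, b = linear_fit(ts, [t / q for t, q in zip(ts, qts)])
    qe = 1.0 / b
    k = 1.0 / (a * qe ** 2)
    return qe, k

qe_true, k_true = 45.0, 0.002
ts = [5, 10, 20, 40, 60, 90, 120]
# Exact pseudo-second-order uptake curve: q_t = k qe^2 t / (1 + k qe t)
qts = [k_true * qe_true ** 2 * t / (1 + k_true * qe_true * t) for t in ts]
qe_est, k_est = pso_params(ts, qts)
```

Because the synthetic data follow the model exactly, the fit returns the generating parameters; with real measurements the same regression gives the best-fit qe and k.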

Keywords: cerium, rare earth element, MWCNTs, adsorption, optimization

Procedia PDF Downloads 149
3007 Adsorption of Cd2+ from Aqueous Solutions Using Chitosan Obtained from a Mixture of Littorina littorea and Achatinoidea Shells

Authors: E. D. Paul, O. F. Paul, J. E. Toryila, A. J. Salifu, C. E. Gimba

Abstract:

Adsorption of Cd2+ ions from aqueous solution by chitosan, a natural polymer obtained from a mixture of the exoskeletons of Littorina littorea (periwinkle) and Achatinoidea (snail), was studied at varying adsorbent dose, contact time, metal ion concentration, temperature, and pH using the batch adsorption method. The equilibrium adsorption isotherms were determined between 298 K and 345 K. The adsorption data were fitted to the Langmuir, Freundlich, and pseudo-second-order kinetic models. It was found that the Langmuir isotherm model fitted the experimental data best, with a maximum monolayer adsorption of 35.1 mg kg⁻¹ at 308 K. The entropy and enthalpy of adsorption were -0.1121 kJ mol⁻¹K⁻¹ and -11.43 kJ mol⁻¹, respectively. The Freundlich adsorption model gave Kf and n values consistent with good adsorption. The pseudo-second-order reaction model gave a straight-line plot with a rate constant of 1.291 x 10⁻³ kg mg⁻¹ min⁻¹. The qe value was 21.98 mg kg⁻¹, indicating that the adsorption of cadmium ions by the chitosan composite followed pseudo-second-order kinetics.
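The Langmuir model referenced above predicts the equilibrium uptake q_e = q_max*K_L*C_e/(1 + K_L*C_e), and its dimensionless separation factor R_L = 1/(1 + K_L*C_0) signals favourable adsorption when 0 < R_L < 1. The q_max below is the abstract's 35.1 mg kg⁻¹; the K_L value and concentrations are illustrative assumptions.

```python
# Sketch of the Langmuir monolayer model and its separation factor.
# K_L = 0.05 and the concentration grid are assumed for illustration.

def langmuir_qe(ce, qmax, kl):
    """Equilibrium uptake at equilibrium concentration ce."""
    return qmax * kl * ce / (1.0 + kl * ce)

def separation_factor(c0, kl):
    """R_L in (0, 1) indicates favourable adsorption."""
    return 1.0 / (1.0 + kl * c0)

qmax, kl = 35.1, 0.05            # qmax (mg/kg) from the abstract; K_L assumed
ces = [5, 10, 20, 40, 80]
uptake = [langmuir_qe(c, qmax, kl) for c in ces]
rl = separation_factor(100.0, kl)
```

The uptake rises monotonically with concentration but saturates below q_max, which is exactly the monolayer behaviour the Langmuir fit captures.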

Keywords: adsorption, chitosan, littorina littorea, achatinoidea, natural polymer

Procedia PDF Downloads 386
3006 Modelling Fluoride Pollution of Groundwater Using Artificial Neural Network in the Western Parts of Jharkhand

Authors: Neeta Kumari, Gopal Pathak

Abstract:

Artificial neural networks have proved to be an efficient tool for non-parametric modeling of data in various applications where the output is non-linearly associated with the input. They are a preferred tool for many predictive data mining applications because of their power, flexibility, and ease of use. A standard feed-forward network (FFN) is used to predict the groundwater fluoride content. The ANN model is trained using the backpropagation algorithm with Tansig and Logsig activation functions and a varying number of neurons. The models are evaluated on the basis of statistical performance criteria such as Root Mean Squared Error (RMSE), regression coefficient (R2), bias (mean error), coefficient of variation (CV), Nash-Sutcliffe efficiency (NSE), and the index of agreement (IOA). The results of the study indicate that an ANN can be used for groundwater fluoride prediction with sufficiently good accuracy in limited-data situations in hard rock regions such as the western parts of Jharkhand.
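The evaluation criteria named above (RMSE, NSE, index of agreement) are straightforward to compute; a sketch on a toy set of observed versus ANN-predicted fluoride values follows. The numbers are illustrative, not the study's data.

```python
# Sketch of the statistical performance criteria: RMSE,
# Nash-Sutcliffe efficiency, and Willmott's index of agreement.
import math

def rmse(obs, pred):
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def nse(obs, pred):
    """Nash-Sutcliffe efficiency: 1 is perfect, 0 matches the observed mean."""
    mo = sum(obs) / len(obs)
    return 1.0 - (sum((o - p) ** 2 for o, p in zip(obs, pred))
                  / sum((o - mo) ** 2 for o in obs))

def ioa(obs, pred):
    """Willmott's index of agreement, bounded in [0, 1]."""
    mo = sum(obs) / len(obs)
    num = sum((o - p) ** 2 for o, p in zip(obs, pred))
    den = sum((abs(p - mo) + abs(o - mo)) ** 2 for o, p in zip(obs, pred))
    return 1.0 - num / den

obs = [0.8, 1.1, 1.5, 2.0, 2.6, 3.1]    # toy observed fluoride (mg/l)
pred = [0.9, 1.0, 1.6, 1.9, 2.5, 3.3]   # toy ANN predictions
r, n, i = rmse(obs, pred), nse(obs, pred), ioa(obs, pred)
```

Reporting several criteria together, as the study does, guards against a model that scores well on one metric (e.g., low RMSE) but poorly reproduces variability.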

Keywords: Artificial neural network (ANN), FFN (Feed-forward network), backpropagation algorithm, Levenberg-Marquardt algorithm, groundwater fluoride contamination

Procedia PDF Downloads 525
3005 On Reliability of a Credit Default Swap Contract during the EMU Debt Crisis

Authors: Petra Buzkova, Milos Kopa

Abstract:

The reliability of the credit default swap market was questioned repeatedly during the EMU debt crisis. This article examines whether that development influenced sovereign EMU CDS prices in general. We regress the CDS market price on a model risk-neutral CDS price obtained from an adopted reduced-form valuation model over the 2009-2013 period. We look for a break point in single-equation and multi-equation econometric models in order to show the changes in the relations between CDS market and model prices. Our results differ according to the risk profile of a country. We find that in the case of riskier countries, the relationship between the market and model price changed when market participants started to question the ability of CDS contracts to protect their buyers; specifically, it weakened after the change. In the case of less risky countries, the change happened earlier, and the effect of a weakened relationship is not observed.
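The Chow stability test mentioned in the keywords compares the pooled regression fit against separate fits on the two sub-periods. A sketch on synthetic data with a deliberate break (not actual CDS prices) follows; the break date and the piecewise relationship are assumptions for illustration.

```python
# Hedged sketch of the Chow break-point test for y = a + b*x.
# Data are synthetic: the model-price/market-price relation weakens
# after an assumed break at index 6.

def ols_ssr(xs, ys):
    """Residual sum of squares of a simple y = a + b*x regression."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

def chow_f(xs, ys, split, k=2):
    """Chow F statistic for a structural break after index `split`."""
    ssr_pooled = ols_ssr(xs, ys)
    ssr1 = ols_ssr(xs[:split], ys[:split])
    ssr2 = ols_ssr(xs[split:], ys[split:])
    n = len(xs)
    return ((ssr_pooled - ssr1 - ssr2) / k) / ((ssr1 + ssr2) / (n - 2 * k))

xs = list(range(12))
# Strong relation before the break, weak relation after it, plus small noise.
base = [1.0 * x for x in xs[:6]] + [0.2 * x + 5.0 for x in xs[6:]]
ys = [b + (0.1 if i % 2 == 0 else -0.1) for i, b in enumerate(base)]
f_stat = chow_f(xs, ys, split=6)
```

A large F statistic, as here, rejects parameter stability across the break, which is the kind of evidence the article uses to date the change in the market-model price relationship.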

Keywords: chow stability test, credit default swap, debt crisis, reduced form valuation model, seemingly unrelated regression

Procedia PDF Downloads 247
3004 Application of Artificial Neural Network for Prediction of Load-Haul-Dump Machine Performance Characteristics

Authors: J. Balaraju, M. Govinda Raj, C. S. N. Murthy

Abstract:

Every industry constantly looks to enhance its day-to-day production and productivity. This is possible only by maintaining the men and machinery at an adequate level. Prediction of performance characteristics plays an important role in the performance evaluation of equipment. Analytical and statistical approaches take somewhat longer to solve complex problems, such as performance estimation, than software-based approaches. Keeping this in view, the present study deals with Artificial Neural Network (ANN) modelling of a Load-Haul-Dump (LHD) machine to predict performance characteristics such as reliability, availability, and preventive maintenance (PM). A feed-forward back-propagation ANN trained with the Levenberg-Marquardt (LM) algorithm has been used for the modelling. The performance characteristics were computed using Isograph Reliability Workbench 13.0 software, and these computed values were validated against the predicted output responses of the ANN models. Further, recommendations based on the analysis are given to the industry for the improvement of equipment performance.
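The performance characteristics named above can be sketched with the standard textbook definitions: reliability under an exponential failure assumption, and steady-state availability from MTBF and MTTR. The input values are illustrative assumptions, not the study's LHD data.

```python
# Sketch of the basic reliability/availability relations for a machine.
# MTBF and MTTR values are assumed for illustration.
import math

def reliability(t, mtbf):
    """Probability of surviving to time t, assuming exponential failures."""
    return math.exp(-t / mtbf)

def availability(mtbf, mttr):
    """Steady-state availability = uptime share."""
    return mtbf / (mtbf + mttr)

mtbf, mttr = 120.0, 8.0          # hours, assumed
r_100 = reliability(100.0, mtbf)  # chance of running 100 h without failure
a = availability(mtbf, mttr)
```

These are the quantities a tool such as Isograph Reliability Workbench computes from failure and repair logs, and which the ANN in the study is trained to predict.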

Keywords: load-haul-dump, LHD, artificial neural network, ANN, performance, reliability, availability, preventive maintenance

Procedia PDF Downloads 129
3003 New Estimation in Autoregressive Models with Exponential White Noise by Using Reversible Jump MCMC Algorithm

Authors: Suparman Suparman

Abstract:

The white noise in an autoregressive (AR) model is often assumed to be normally distributed. In applications, however, the white noise frequently does not follow a normal distribution. This paper aims to estimate the parameters of an AR model with exponential white noise. A Bayesian method is adopted: a prior distribution for the parameters of the AR model is selected and combined with the likelihood function of the data to obtain a posterior distribution, from which a Bayesian estimator for the parameters is derived. Because the order of the AR model is itself treated as a parameter, this Bayesian estimator cannot be calculated explicitly. To resolve this problem, the reversible jump Markov Chain Monte Carlo (MCMC) method is adopted. As a result, the parameters of the AR model, including its order, can be estimated simultaneously.

Keywords: autoregressive (AR) model, exponential white noise, Bayesian, reversible jump Markov Chain Monte Carlo (MCMC)

Procedia PDF Downloads 343
3002 The GRIT Study: Getting Global Rare Disease Insights Through Technology Study

Authors: Aneal Khan, Elleine Allapitan, Desmond Koo, Katherine-Ann Piedalue, Shaneel Pathak, Utkarsh Subnis

Abstract:

Background: Disease management of metabolic, genetic disorders is long-term and can be cumbersome to patients and caregivers. Patient-Reported Outcome Measures (PROMs) have been a useful tool in capturing patient perspectives to help enhance treatment compliance and engagement with health care providers, reduce utilization of emergency services, and increase satisfaction with their treatment choices. Currently, however, PROMs are collected during infrequent and decontextualized clinic visits, which makes translation of patient experiences challenging over time. The GRIT study aims to evaluate a digital health journal application called Zamplo that provides a personalized health diary to record self-reported health outcomes accurately and efficiently in patients with metabolic, genetic disorders. Methods: This is a randomized controlled trial (RCT) (1:1) that assesses the efficacy of Zamplo to increase patient activation (primary outcome), improve healthcare satisfaction and confidence to manage medications (secondary outcomes), and reduce costs to the healthcare system (exploratory). Using standardized online surveys, assessments will be collected at baseline, 1 month, 3 months, 6 months, and 12 months. Outcomes will be compared between patients who were given access to the application versus those with no access. Results: Seventy-seven patients were recruited as of November 30, 2021. Recruitment for the study commenced in November 2020 with a target of n=150 patients. The accrual rate was 50% from those eligible and invited for the study, with the majority of patients having Fabry disease (n=48) and the remaining having Pompe disease and mitochondrial disease. Real-time clinical responses, such as pain, are being measured and correlated to disease-modifying therapies, supportive treatments like pain medications, and lifestyle interventions. Engagement with the application, along with compliance metrics of surveys and journal entries, are being analyzed. 
An interim analysis of the engagement data, along with preliminary findings from this pilot RCT and qualitative patient feedback, will be presented. Conclusions: The digital self-care journal provides a unique approach to disease management, giving patients direct access to their progress and allowing them to participate actively in their care. Findings from the study can help serve the virtual care needs of patients with metabolic genetic disorders in North America and worldwide.

Keywords: eHealth, mobile health, rare disease, patient outcomes, quality of life (QoL), pain, Fabry disease, Pompe disease

Procedia PDF Downloads 143
3001 A Magnetic Hydrochar Nanocomposite as a Potential Adsorbent of Emerging Pollutants

Authors: Aura Alejandra Burbano Patino, Mariela Agotegaray, Veronica Lassalle, Fernanda Horst

Abstract:

Water pollution is of worldwide concern due to the importance of water as an essential resource for life. Industrial and urban growth are anthropogenic activities that have caused an increase of undesirable compounds in water. In the last decade, emerging pollutants have become of great interest since, at very low concentrations (µg/L and ng/L), they exhibit a hazardous effect on wildlife, aquatic ecosystems, and human organisms. One group of emerging pollutants under study is pharmaceuticals. Their high consumption rate and inappropriate disposal have led to their detection in wastewater treatment plant influent, effluent, surface water, and drinking water. In consequence, numerous technologies have been developed to treat these pollutants efficiently. Adsorption appears as an easy and cost-effective technology. Among the most used adsorbents for emerging pollutant removal are carbon-based materials such as hydrochars. This study aims to employ a magnetic hydrochar nanocomposite as an adsorbent for diclofenac (DCF) removal. Kinetic models and the adsorption efficiency in real water samples were analyzed. For this purpose, a magnetic hydrochar nanocomposite was synthesized through the hydrothermal carbonization (HTC) technique combined with co-precipitation to add the iron-oxide-based magnetic component into the hydrochar. The hydrochar was obtained from sunflower husk residue as the precursor. TEM, TGA, FTIR, zeta potential as a function of pH, DLS, the BET technique, and elemental analysis were employed to characterize the material in terms of composition and chemical structure. Adsorption kinetics were carried out in distilled water and real water at room temperature, at pH 5.5 for distilled water and the natural pH for real water samples, with a 1:1 adsorbent:adsorbate dosage ratio, contact times from 10-120 minutes, and a 50% dosage concentration of DCF.
Results demonstrated that the magnetic hydrochar presents superparamagnetic properties with a saturation magnetization value of 55.28 emu/g. Besides, it is mesoporous, with a surface area of 55.52 m²/g. It is composed of magnetite nanoparticles incorporated into the hydrochar matrix, as proven by TEM micrographs, FTIR spectra, and zeta potential. On the other hand, kinetic studies of DCF adsorption were carried out, finding percent removal efficiencies of up to 85.34% after 80 minutes of contact time. In addition, after 120 minutes of contact time, desorption of the pollutant from active sites took place, which indicated that the material became saturated after that time. In real water samples, percent removal efficiencies decreased to 57.39%, ascribable to a possible mechanism of competitive adsorption of organic or inorganic compounds and ions for the active sites of the magnetic hydrochar. The main suggested adsorption mechanisms between the magnetic hydrochar and diclofenac include hydrophobic and electrostatic interactions as well as hydrogen bonds. It can be concluded that the magnetic hydrochar nanocomposite valorizes a by-product into an efficient adsorbent for the removal of DCF as a model emerging pollutant. These results are being complemented by modifying experimental variables such as the pollutant's initial concentration, the adsorbent:adsorbate dosage ratio, and temperature. Currently, adsorption assays of other emerging pollutants are being carried out.

Keywords: environmental remediation, emerging pollutants, hydrochar, magnetite nanoparticles

Procedia PDF Downloads 176
3000 Challenges and Lessons of Mentoring Processes for Novice Principals: An Exploratory Case Study of Induction Programs in Chile

Authors: Carolina Cuéllar, Paz González

Abstract:

Research has shown that school leadership has a significant indirect effect on students’ achievements. In Chile, evidence has also revealed that this impact is stronger in vulnerable schools. With the aim of strengthening school leadership, public policy has taken up the challenge of enhancing capabilities of novice principals through the implementation of induction programs, which include a mentoring component, entrusting the task of delivering these programs to universities. The importance of using mentoring or coaching models in the preparation of novice school leaders has been emphasized in the international literature. Thus, it can be affirmed that building leadership capacity through partnership is crucial to facilitate cognitive and affective support required in the initial phase of the principal career, gain role clarification and socialization in context, stimulate reflective leadership practice, among others. In Chile, mentoring is a recent phenomenon in the field of school leadership and it is even more new in the preparation of new principals who work in public schools. This study, funded by the Chilean Ministry of Education, sought to explore the challenges and lessons arising from the design and implementation of mentoring processes which are part of the induction programs, according to the perception of the different actors involved: ministerial agents, university coordinators, mentors and novice principals. The investigation used a qualitative design, based on a study of three cases (three induction programs). The sources of information were 46 semi-structured interviews, applied in two moments (at the beginning and end of mentoring). Content analysis technique was employed. Data focused on the uniqueness of each case and the commonalities within the cases. Five main challenges and lessons emerged in the design and implementation of mentoring within the induction programs for new principals from Chilean public schools. 
They comprised the need of (i) developing a shared conceptual framework on mentoring among the institutions and actors involved, which helps align the expectations for the mentoring component within the induction programs, along with assisting in establishing a theory of action of mentoring that is relevant to the public school context; (ii) recognizing through actions and decisions at different levels that the role of a mentor differs from the role of a principal, which challenges the idea that an effective principal will always be an effective mentor; (iii) improving mentors' selection and preparation processes through the definition of common guiding criteria to ensure that a mentor takes responsibility for developing critical judgment of novice principals, which implies not limiting the mentor's actions to assisting in the compliance of prescriptive practices and standards; (iv) generating common evaluative models with goals, instruments, and indicators consistent with the characteristics of mentoring processes, which helps to assess expected results and impact; and (v) including the design of a mentoring structure as an outcome of the induction programs, which helps sustain mentoring within schools as a collective professional development practice. Results showcased interwoven elements that entail continuous negotiations at different levels. Taking action will contribute to policy efforts aimed at professionalizing the leadership role in public schools.

Keywords: induction programs, mentoring, novice principals, school leadership preparation

Procedia PDF Downloads 110
2999 On the Influence of Sleep Habits for Predicting Preterm Births: A Machine Learning Approach

Authors: C. Fernandez-Plaza, I. Abad, E. Diaz, I. Diaz

Abstract:

Births occurring before the 37th week of gestation are considered preterm births. A threat of preterm labour is defined as the onset of regular uterine contractions, dilation, and cervical effacement between 23 and 36 gestation weeks. To the authors' best knowledge, the factors that determine the onset of birth are not yet completely defined. In particular, the influence of sleep habits on preterm births is weakly studied. The aim of this study is to develop a model to predict the factors affecting premature delivery in pregnancy, based on potential risk factors including those derived from sleep habits and light exposure at night (introduced as 12 variables obtained by a telephone survey using two questionnaires previously used by other authors). Thus, three groups of variables were included in the study (maternal, fetal, and sleep habits). The study was approved by the Research Ethics Committee of the Principality of Asturias (Spain). An observational, retrospective, and descriptive study was performed on 481 births between January 1, 2015 and May 10, 2016 in the Central University Hospital of Asturias (Spain). A statistical analysis using SPSS was carried out to compare qualitative and quantitative variables between preterm and term deliveries: the chi-square test was applied to qualitative variables and the t-test to quantitative variables. Statistically significant differences (p < 0.05) between preterm and term births were found for primiparity, multiparity, kind of conception, place of residence, premature rupture of membranes, and interruptions during the night. In addition to the statistical analysis, machine learning methods were tested to look for a prediction model. In particular, tree-based models were applied, as their trade-off between performance and interpretability is especially suitable for this study. C5.0, recursive partitioning, random forest, and tree bag models were analysed using the caret R package.
Cross-validation with 10 folds and parameter tuning to optimize the methods were applied. In addition, different noise reduction methods were applied to the initial data using the NoiseFiltersR package. The best performance was obtained by the C5.0 method, with an accuracy of 0.91, sensitivity of 0.93, specificity of 0.89, and precision of 0.91. Some well-known preterm birth factors were identified: cervix dilation, maternal BMI, premature rupture of membranes, and the nuchal translucency analysis in the first trimester. The model also identifies other new factors related to sleep habits, such as light through the window, bedtime on working days, usage of electronic devices before sleeping from Mondays to Fridays, or a change of sleeping habits reflected in the number of hours of sleep, in the depth of sleep, or in the lighting of the room. "IF dilation <= 2.95 AND usage of electronic devices before sleeping from Mondays to Fridays = YES AND change of sleeping habits = YES, THEN preterm" is one of the predicting rules obtained by C5.0. In this work, a model for predicting preterm births is developed, based on machine learning together with noise reduction techniques; the method maximizing the performance is selected. The model shows the influence of variables related to sleep habits on preterm prediction.
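The example C5.0 rule quoted above, and the reported performance metrics, can be sketched directly. The patient records and confusion-matrix counts below are invented for illustration (the counts are chosen only to reproduce metrics close to those reported, not taken from the study).

```python
# Sketch: encode the quoted C5.0 rule and compute standard
# classification metrics from a toy confusion matrix.

def rule_preterm(dilation, devices_weekdays, changed_habits):
    """Quoted rule: dilation <= 2.95 AND device use = YES AND changed habits = YES."""
    return dilation <= 2.95 and devices_weekdays and changed_habits

def metrics(tp, tn, fp, fn):
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    precision = tp / (tp + fp)
    return accuracy, sensitivity, specificity, precision

cases = [
    (2.0, True, True),    # all three conditions hold -> rule fires
    (3.5, True, True),    # dilation too large -> rule does not fire
    (2.5, False, True),   # no device use -> rule does not fire
]
fired = [rule_preterm(*c) for c in cases]
# Toy confusion matrix (assumed counts, not the study's data):
acc, sens, spec, prec = metrics(tp=93, tn=89, fp=11, fn=7)
```

Rule-based output like this is what makes C5.0 attractive here: the prediction for any patient can be traced back to explicit, clinically readable conditions.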

Keywords: machine learning, noise reduction, preterm birth, sleep habit

Procedia PDF Downloads 130
2998 Optimizing the Capacity of a Convolutional Neural Network for Image Segmentation and Pattern Recognition

Authors: Yalong Jiang, Zheru Chi

Abstract:

In this paper, we study the factors which determine the capacity of a Convolutional Neural Network (CNN) model and propose ways to evaluate and adjust the capacity of a CNN model to best match a specific pattern recognition task. Firstly, a scheme is proposed to adjust the number of independent functional units within a CNN model to make it better fit a task. Secondly, the number of independent functional units in the capsule network is adjusted to fit it to the training dataset. Thirdly, a method based on Bayesian GAN is proposed to enrich the variances in the current dataset to increase its complexity. Experimental results on the PASCAL VOC 2010 Person Part dataset and the MNIST dataset show that, in both conventional CNN models and capsule networks, the number of independent functional units is an important factor that determines the capacity of a network model. By adjusting the number of functional units, the capacity of a model can better match the complexity of a dataset.
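One concrete way to reason about capacity as a function of the number of functional units is to count trainable parameters per convolutional layer, (k*k*c_in + 1)*c_out, so the number of filters directly scales the parameter budget. The layer shapes below are illustrative assumptions, not the paper's architectures.

```python
# Sketch: counting trainable parameters of conv layers to compare
# the capacity of a "small" and a "large" model. Shapes are assumed.

def conv_params(k, c_in, c_out):
    """Weights plus one bias per output filter for a k x k convolution."""
    return (k * k * c_in + 1) * c_out

def model_params(layers):
    """layers: list of (kernel, c_in, c_out) tuples."""
    return sum(conv_params(*layer) for layer in layers)

small = [(3, 3, 16), (3, 16, 32)]
large = [(3, 3, 64), (3, 64, 128)]   # 4x the functional units per layer
p_small = model_params(small)
p_large = model_params(large)
```

Because filter counts enter multiplicatively in consecutive layers, quadrupling the units per layer grows the parameter count far more than fourfold, which is why unit count is such a strong capacity knob.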

Keywords: CNN, convolutional neural network, capsule network, capacity optimization, character recognition, data augmentation, semantic segmentation

Procedia PDF Downloads 135
2997 Fast and Accurate Model to Detect Ictal Waveforms in Electroencephalogram Signals

Authors: Piyush Swami, Bijaya Ketan Panigrahi, Sneh Anand, Manvir Bhatia, Tapan Gandhi

Abstract:

Visual inspection of electroencephalogram (EEG) signals to detect epileptic activity is a very challenging and time-consuming task, even for an expert neurophysiologist. The problem is most acute in under-developed and developing countries due to a shortage of skilled neurophysiologists. In the past, notable research efforts have gone into trying to automate the seizure detection process. However, high false-alarm rates and the complexity of the models developed so far have vastly limited their practical implementation. In this paper, we present a novel scheme for epileptic seizure detection using the empirical mode decomposition technique. The standard deviations of the intrinsic mode functions obtained were calculated and fed to a probability-density-based classifier to discriminate between non-ictal and ictal patterns in EEG signals. The model presented here demonstrated very high classification rates (> 97%) without compromising statistical performance. The computation time for each testing phase was also very low (< 0.029 s), which makes this model ideal for practical applications.

Keywords: electroencephalogram (EEG), epilepsy, ictal patterns, empirical mode decomposition

Procedia PDF Downloads 392
2996 A Multi-criteria Decision Support System for Migrating Legacies into Open Systems

Authors: Nasser Almonawer

Abstract:

Timely reaction to an evolving global business environment and volatile market conditions necessitates system and process flexibility, which in turn demands agile and adaptable architecture and a steady infusion of affordable new technologies. On the contrary, a large number of organizations utilize systems characterized by inflexible and obsolete legacy architectures. To effectively respond to the dynamic contemporary business environments, such architectures must be migrated to robust and modular open architectures. To this end, this paper proposes an integrated decision support system for a seamless migration to open systems. The proposed decision support system (DSS) integrates three well-established quantitative and qualitative decision-making models—namely, the Delphi method, Analytic Hierarchy Process (AHP) and Goal Programming (GP) to (1) assess risks and establish evaluation criteria; (2) formulate migration strategy and rank candidate systems; and (3) allocate resources among the selected systems.
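The AHP step of such a DSS can be sketched with the standard geometric-mean approximation of the priority vector plus a consistency check. The pairwise comparison values and criterion names below are illustrative assumptions, not the paper's actual criteria.

```python
# Hedged sketch of the AHP stage: derive criterion weights from a
# pairwise comparison matrix and check the consistency ratio.
import math

def ahp_weights(m):
    """Geometric-mean approximation of the principal eigenvector."""
    n = len(m)
    gm = [math.prod(row) ** (1.0 / n) for row in m]
    s = sum(gm)
    return [g / s for g in gm]

def consistency_ratio(m, w):
    """CR = CI / RI; CR < 0.1 is conventionally acceptable."""
    n = len(m)
    aw = [sum(m[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(aw[i] / w[i] for i in range(n)) / n   # estimated lambda_max
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]             # Saaty's random indices
    return ci / ri

# Assumed pairwise comparisons of three migration criteria:
# risk, cost, flexibility (1 = equal importance, 3/5 = moderate/strong).
m = [[1.0,     3.0, 5.0],
     [1 / 3.0, 1.0, 2.0],
     [1 / 5.0, 0.5, 1.0]]
w = ahp_weights(m)
cr = consistency_ratio(m, w)
```

The resulting weights would then feed the goal-programming stage as coefficients for ranking candidate systems and allocating resources.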

Keywords: decision support systems, open systems architecture, analytic hierarchy process (AHP), goal programming (GP), delphi method

Procedia PDF Downloads 16
2995 Conflict and Hunger Revisit: Evidences from Global Surveys, 1989-2020

Authors: Manasse Elusma, Thung-Hong Lin, Chun-yin Lee

Abstract:

The relationship between hunger and war or conflict remains open to discussion. Do wars or conflicts cause hunger and food scarcity, or is the reverse true? Although the world has become more peaceful and wealthier, some countries still suffer from hunger and food shortages. Eradicating hunger therefore calls for a more comprehensive understanding of the relationship between conflict and hunger. Several studies have been carried out to assess the importance of conflict or war for food security. Most of these studies, however, perform only descriptive analysis and largely use food security indicators instead of the global hunger index. Few studies have employed cross-country panel data to explicitly analyze the association between conflict and chronic hunger, including hidden hunger. Herein, this study addresses this knowledge gap. We combine global datasets to build a new panel dataset covering 143 countries from 1989 to 2020. The study examines the effect of conflict on hunger with fixed effect models, and the results show that an increase in conflict frequency deteriorates hunger. Peacebuilding efforts and war prevention initiatives are required to eradicate global hunger.
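The fixed-effect estimation described above can be sketched with the within (demeaning) transformation: demean each variable inside each country so that time-invariant country effects drop out, then run pooled OLS on the demeaned data. The panel below uses one regressor for clarity, and all values are hypothetical, not the study's data.

```python
def within_estimator(panel):
    # panel: {country: [(x, y), ...]} with x = conflict frequency,
    # y = hunger index (illustrative interpretation).
    xs, ys = [], []
    for obs in panel.values():
        mx = sum(x for x, _ in obs) / len(obs)
        my = sum(y for _, y in obs) / len(obs)
        for x, y in obs:
            # Within transformation: country fixed effects cancel here
            xs.append(x - mx)
            ys.append(y - my)
    # Pooled OLS slope on the demeaned data
    beta = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
    return beta

# Toy two-country panel: both countries show the same within-country slope
# even though their hunger levels differ (the fixed effect).
panel = {
    "A": [(0, 10), (2, 14), (4, 18)],
    "B": [(1, 30), (3, 34), (5, 38)],
}
beta = within_estimator(panel)
```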

Keywords: armed conflict, food scarcity, hidden hunger, hunger, malnutrition

Procedia PDF Downloads 153
2994 Effect of Footing Shape on Bearing Capacity and Settlement of Closely Spaced Footings on Sandy Soil

Authors: A. Shafaghat, H. Khabbaz, S. Moravej, Ah. Shafaghat

Abstract:

The bearing capacity of closely spaced shallow footings alters with their spacing and the shape of the footing. In this study, the bearing capacity and settlement of two adjacent footings constructed on a sand layer are investigated. The effect of different footing shapes, including square, circular, ring and strip, on sandy soil is captured in the calculations. The investigations are carried out numerically using PLAXIS-3D software and analytically employing conventional settlement equations. For this purpose, foundations are modelled in the program with practical dimensions and various spacing ratios ranging from 1 to 5. The spacing ratio is defined as the centre-to-centre distance divided by the width of the foundations (S/B). Overall, 24 models are analyzed, and the results are compared and discussed in detail. It can be concluded that the presence of an adjacent foundation reduces the bearing capacity of round-shaped footings, while it can increase the bearing capacity of rectangular footings at some specific spacings.

Keywords: bearing capacity, finite element analysis, loose sand, settlement equations, shallow foundation

Procedia PDF Downloads 243
2993 Performance of Slot-Entry Hybrid Worn Journal Bearing under Turbulent Lubrication

Authors: Nathi Ram, Saurabh K. Yadav

Abstract:

In turbomachinery, turbulent flow occurs due to the use of low-kinematic-viscosity lubricants at high velocities, a condition encountered in many industrial applications. In the present work, the performance of a symmetric slot-entry hybrid worn journal bearing under laminar and turbulent lubrication has been investigated. For turbulent lubrication, the Reynolds equation has been modified using the Constantinescu turbulence model, and the modified equation has been solved using the finite element method. The effect of turbulent lubrication on the bearing's performance is presented for the symmetric slot-entry hybrid worn journal bearing under both turbulent and laminar regimes. It has been observed that the stiffness and damping coefficients are higher for the bearing with a slot width ratio (SWR) of 0.25 than for the bearings with SWRs of 0.5 and 0.75 under the turbulent regime. Further, for a constant wear depth parameter, the stability threshold speed increases for the bearing operating at a slot width ratio of 0.25 under turbulent lubrication.
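In a commonly quoted form of the Constantinescu modification, the laminar factor 1/12 in the Reynolds equation is replaced by Reynolds-number-dependent coefficients 1/kx and 1/kz. The sketch below uses the frequently cited linearised coefficients; treat the exponents and constants as literature values to be verified against the paper's source, not as its exact formulation.

```python
def constantinescu_coefficients(re):
    # Commonly quoted Constantinescu turbulence coefficients:
    # in the turbulent Reynolds equation, 1/12 becomes 1/kx (circumferential)
    # and 1/kz (axial), with re the local Reynolds number.
    kx = 12.0 + 0.0136 * re ** 0.90
    kz = 12.0 + 0.0043 * re ** 0.96
    return kx, kz

# As re -> 0 the laminar values are recovered (kx = kz = 12),
# so the same discretised equation covers both regimes.
```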

Keywords: hydrostatic bearings, journal bearings, restrictors, turbulent flow models, finite element technique

Procedia PDF Downloads 150
2992 A Neural Network Modelling Approach for Predicting Permeability from Well Logs Data

Authors: Chico Horacio Jose Sambo

Abstract:

Recently, neural networks have gained popularity for solving complex nonlinear problems. Permeability is a fundamental reservoir characteristic that is anisotropic and distributed in a nonlinear manner. For this reason, permeability prediction from well log data is well suited to neural networks and other computer-based techniques. The main goal of this paper is to predict reservoir permeability from well log data by using a neural network approach. A multi-layered perceptron trained by the back propagation algorithm was used to build the predictive model. The performance of the model was measured by the correlation coefficient, evaluated on the training, testing, validation and complete data sets. The results show that the neural network was capable of reproducing permeability accurately in all cases: the calculated correlation coefficients for training, testing and validation were 0.96273, 0.89991 and 0.87858, respectively. The generalization of the results to other fields can be assessed after examining new data, and a regional study might make it possible to characterize reservoir properties with cheap and very quickly constructed models.
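A multi-layered perceptron trained by back propagation can be sketched in miniature as below. The single-input architecture, learning rate, and toy data are illustrative assumptions, not the paper's actual network or well-log features.

```python
import math
import random

def train_mlp(data, hidden=4, lr=0.2, epochs=3000, seed=1):
    # Minimal 1-input, 1-output MLP with one sigmoid hidden layer,
    # trained by plain stochastic backpropagation on squared error.
    rnd = random.Random(seed)
    w1 = [rnd.uniform(-1, 1) for _ in range(hidden)]
    b1 = [0.0] * hidden
    w2 = [rnd.uniform(-1, 1) for _ in range(hidden)]
    b2 = 0.0
    sig = lambda z: 1.0 / (1.0 + math.exp(-z))
    for _ in range(epochs):
        for x, y in data:
            h = [sig(w1[j] * x + b1[j]) for j in range(hidden)]
            out = sum(w2[j] * h[j] for j in range(hidden)) + b2
            err = out - y
            for j in range(hidden):
                # Gradient through the sigmoid, using pre-update w2
                gh = err * w2[j] * h[j] * (1 - h[j])
                w2[j] -= lr * err * h[j]
                w1[j] -= lr * gh * x
                b1[j] -= lr * gh
            b2 -= lr * err

    def predict(x):
        h = [sig(w1[j] * x + b1[j]) for j in range(hidden)]
        return sum(w2[j] * h[j] for j in range(hidden)) + b2

    return predict

# Hypothetical normalised log-derived feature x vs. permeability proxy y.
data = [(0.0, 0.1), (0.25, 0.3), (0.5, 0.55), (0.75, 0.7), (1.0, 0.9)]
predict = train_mlp(data)
```

The paper's evaluation metric, the correlation coefficient between predicted and measured permeability, would then be computed on held-out testing and validation splits rather than the training points shown here.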

Keywords: neural network, permeability, multilayer perceptron, well log

Procedia PDF Downloads 381
2991 Application of Artificial Neural Network Technique for Diagnosing Asthma

Authors: Azadeh Bashiri

Abstract:

Introduction: Lack of proper diagnosis and inadequate treatment of asthma lead to physical and financial complications. This study aimed to use data mining techniques to create a neural network intelligent system for the diagnosis of asthma. Methods: The study population comprises patients who had visited one of the lung clinics in Tehran. Data were analyzed using the SPSS statistical tool, and Pearson's chi-square coefficient was the basis for ranking the data. The neural network considered was trained using the back propagation learning technique. Results: According to the analysis performed by means of SPSS to select the top factors, 13 effective factors were selected. The data were combined in various forms to build different models for training and testing the networks, and in all modes the network was able to predict 100% of cases correctly. Conclusion: Using data mining methods before designing the system, in order to reduce the data dimensionality and choose the data optimally, leads to a more accurate system. Considering data mining approaches is therefore necessary given the nature of medical data.
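The chi-square ranking step can be sketched for a single binary feature as follows. The contingency counts are hypothetical, not the clinic's data; in practice the statistic would be computed for every candidate factor and the top-ranked ones retained.

```python
def chi_square(table):
    # Pearson chi-square statistic for a 2x2 contingency table.
    # Rows: feature present/absent; columns: asthma yes/no.
    row_totals = [sum(r) for r in table]
    col_totals = [sum(c) for c in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical counts: a symptom that tracks the diagnosis scores high,
# a barely related one scores near zero, which drives the ranking.
strong = chi_square([[40, 10], [15, 35]])
weak = chi_square([[25, 25], [24, 26]])
```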

Keywords: asthma, data mining, Artificial Neural Network, intelligent system

Procedia PDF Downloads 258
2990 Optimization of the Measure of Compromise as a Version of Sorites Paradox

Authors: Aleksandar Hatzivelkos

Abstract:

The term "compromise" is mostly used casually within social choice theory. It usually denotes a mere result of the social choice function, which omits its deeper meaning and ramifications. This paper is based on a mathematical model that describes a compromise as a version of the Sorites paradox. It introduces a formal definition of the d-measure of divergence from a compromise and models a notion of compromise that is often used only colloquially. Such a model of the vagueness phenomenon, which lies at the core of the notion of compromise, enables the introduction of new mathematical structures. In order to maximize compromise, different methods can be used. In this paper, we explore properties of a social welfare function TdM (from Total d-Measure), defined as the function which minimizes the total sum of d-measures of divergence over all possible linear orderings. We prove that TdM satisfies the strict Pareto principle and behaves well asymptotically. Furthermore, we show that for certain domain restrictions, TdM satisfies positive responsiveness and IIIA (intense independence of irrelevant alternatives), thus being equivalent to the Borda count on such domain restrictions. This result opens new opportunities in social choice, especially when there is an emphasis on compromise in the decision-making process.
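Since TdM coincides with the Borda count on the restricted domain, a minimal Borda count sketch helps fix ideas. The ballots below are illustrative; each ranks all candidates from most to least preferred.

```python
def borda(ballots, candidates):
    # With m candidates, position i on a ballot earns (m - 1 - i) points;
    # the candidate with the highest total wins.
    m = len(candidates)
    scores = {c: 0 for c in candidates}
    for ballot in ballots:
        for pos, c in enumerate(ballot):
            scores[c] += m - 1 - pos
    return scores

# Three illustrative ballots over candidates a, b, c.
ballots = [
    ["a", "b", "c"],
    ["a", "c", "b"],
    ["b", "a", "c"],
]
scores = borda(ballots, ["a", "b", "c"])
```

Candidate "a" wins here despite not being everyone's first choice, which is exactly the compromise-favouring behaviour the paper attributes to TdM on the restricted domain.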

Keywords: borda count, compromise, measure of divergence, minimization

Procedia PDF Downloads 111
2989 Runoff Simulation by Using WetSpa Model in Garmabrood Watershed of Mazandaran Province, Iran

Authors: Mohammad Reza Dahmardeh Ghaleno, Mohammad Nohtani, Saeedeh Khaledi

Abstract:

Hydrological models are applied to simulate and predict floods in watersheds. WetSpa is a distributed, continuous and physically based model with a daily or hourly time step that describes the precipitation, runoff and evapotranspiration processes for both simple and complex contexts. The model uses a modified rational method for runoff calculation, and runoff is routed along the flow path using the diffusion-wave equation, which depends on the slope, velocity and flow route characteristics. The Garmabrood watershed is located in Mazandaran province, Iran, between coordinates 53° 10´ 55" to 53° 38´ 20" E and 36° 06´ 45" to 36° 25´ 30" N. The area of the catchment is about 1133 km2, elevations range from 213 m at the outlet to 3136 m, and the average slope is 25.77%. Results of the simulations show a good agreement between calculated and measured hydrographs at the outlet of the basin. Based on the Nash-Sutcliffe model efficiency coefficient for the calibration period, the model estimated the daily hydrographs and maximum flow rates with accuracies of up to 61% and 83.17%, respectively.
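The Nash-Sutcliffe model efficiency coefficient used for calibration above is a standard goodness-of-fit measure for hydrographs; a value of 1 indicates a perfect match. The discharge series below are illustrative, not the Garmabrood data.

```python
def nash_sutcliffe(observed, simulated):
    # NSE = 1 - sum((Qobs - Qsim)^2) / sum((Qobs - mean(Qobs))^2)
    mean_obs = sum(observed) / len(observed)
    numerator = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    denominator = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - numerator / denominator

# Hypothetical daily discharge values (m^3/s) at the basin outlet.
obs = [5.0, 12.0, 30.0, 18.0, 9.0]
sim = [6.0, 11.0, 27.0, 19.0, 10.0]
nse = nash_sutcliffe(obs, sim)
```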

Keywords: watershed simulation, WetSpa, runoff, flood prediction

Procedia PDF Downloads 323
2988 Requirement Engineering and Software Product Line Scoping Paradigm

Authors: Ahmed Mateen, Zhu Qingsheng, Faisal Shahzad

Abstract:

Requirements Engineering (RE) is the part of the software development lifecycle in which the structure of the software is specified. Software product line development is a relatively new topic area within the domain of software engineering. It plays an important role in decision making and is ultimately helpful in a rising business environment for productive software development. Decisions are central to engineering processes and hold them together; it is argued that better decisions lead to better engineering, and achieving better decisions requires that they are understood in detail. In order to address these issues, companies are moving towards Software Product Line Engineering (SPLE), which helps in providing large varieties of products with minimum development effort and cost. This paper proposes a new framework for software product lines and compares it with other models. The results can help to understand the needs in SPL testing by identifying points that still require additional investigation. In future work, we will apply this model in a controlled environment to industrial SPL projects, which will open a new horizon for SPL process management testing strategies.

Keywords: requirements engineering, software product lines, scoping, process structure, domain specific language

Procedia PDF Downloads 216
2987 CE Method for Development of Japan's Stochastic Earthquake Catalogue

Authors: Babak Kamrani, Nozar Kishi

Abstract:

A stochastic catalog represents the events module of an earthquake loss estimation model. It includes a series of events with different magnitudes and corresponding frequencies/probabilities. To develop a stochastic catalog, random or uniform sampling methods are typically used to sample events from the seismicity model; covering the whole Magnitude Frequency Distribution (MFD) with these methods requires generating a huge number of events. The Characteristic Event (CE) method instead chooses the events based on the interests of the insurance industry. We divide the MFD of each source into bins chosen according to the probabilities of interest to the insurance industry. First, we collected the information for the available seismic sources, dividing them into fault sources, subduction sources, and events without a specific fault source. We then developed the MFD for each individual and areal source based on the seismicity of the sources, and calculated the CE magnitudes corresponding to the desired probabilities. To develop the stochastic catalog, we also introduced uncertainty into the locations of the events.
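The binning of a source's MFD can be sketched with a truncated Gutenberg-Richter relation, log10 N(m) = a - b·m, where N(m) is the annual rate of events with magnitude at least m. The a- and b-values and the bin width below are illustrative assumptions, not the catalog's actual parameters.

```python
def gr_rate(m, a=4.0, b=1.0):
    # Gutenberg-Richter annual rate of events with magnitude >= m
    # (a and b are illustrative values for one source).
    return 10 ** (a - b * m)

def characteristic_events(m_min, m_max, width):
    # Split the MFD into magnitude bins; each bin's occurrence rate is
    # the rate at its lower edge minus the rate at its upper edge, and
    # the bin is represented by one event at its central magnitude.
    events = []
    m = m_min
    while m < m_max - 1e-9:
        hi = min(m + width, m_max)
        rate = gr_rate(m) - gr_rate(hi)
        events.append({"magnitude": (m + hi) / 2, "annual_rate": rate})
        m = hi
    return events

catalog = characteristic_events(5.0, 8.0, 0.5)
```

With probability-driven (rather than uniform) bin edges, the same telescoping construction concentrates events at the return periods of interest to insurers.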

Keywords: stochastic catalogue, earthquake loss, uncertainty, characteristic event

Procedia PDF Downloads 283
2988 Comparison of Effect of Pre-Stressed Strand Diameters Providing Beam to Column Connection

Authors: Mustafa Kaya

Abstract:

In this study, the effect of the diameter of the pre-stressed strands providing the beam-to-column connections was investigated from both experimental and analytical aspects. In the experimental studies, the strength, stiffness, and energy dissipation capacities of precast specimens with pre-stressed strands of 12.70 mm and 15.24 mm diameter were compared with those of a reference specimen. The precast specimen with 15.24 mm strands reached 96% of the maximum strength of the reference specimen; the amount of energy it dissipated by the end of the test reached 48% of that dissipated by the reference specimen, and its stiffness at 1.5% drift reached 77% of the stiffness of the reference specimen at the same drift. Parallel results were obtained in the analytical studies in terms of strength and behavior, but the initial stiffness of the analytical models was lower than that of the test specimens.

Keywords: precast beam to column connection, moment resisting connection, post tensioned connections, finite element method

Procedia PDF Downloads 537
2985 Robust Optimisation Model and Simulation-Particle Swarm Optimisation Approach for Vehicle Routing Problem with Stochastic Demands

Authors: Mohanad Al-Behadili, Djamila Ouelhadj

Abstract:

In this paper, a specific type of vehicle routing problem under stochastic demand (SVRP) is considered. This problem is of great importance because it models many real-world vehicle routing applications. The paper uses a robust optimisation model to solve the problem, along with a novel Simulation-Particle Swarm Optimisation (Sim-PSO) approach based on the hybridization of the Monte Carlo simulation technique with the PSO algorithm. A comparative study of the proposed model and the Sim-PSO approach against other solution methods in the literature is given, including an Analysis of Variance (ANOVA) to show the ability of the model and solution method to solve the complicated SVRP. The experimental results show that the proposed model and the Sim-PSO approach have a significant impact on the obtained solution, providing better-quality solutions than well-known algorithms in the literature.
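The PSO half of the hybrid can be sketched in its standard global-best form: each particle tracks its personal best, the swarm tracks the global best, and velocities blend the two attractions. The inertia and acceleration constants and the sphere test function are generic illustrative choices; in the Sim-PSO of the paper, the objective would instead be a Monte Carlo estimate of expected routing cost.

```python
import random

def pso(f, dim, bounds, particles=20, iters=200, seed=7):
    # Minimal global-best particle swarm optimisation (minimisation).
    rnd = random.Random(seed)
    lo, hi = bounds
    xs = [[rnd.uniform(lo, hi) for _ in range(dim)] for _ in range(particles)]
    vs = [[0.0] * dim for _ in range(particles)]
    pbest = [list(x) for x in xs]
    pbest_val = [f(x) for x in xs]
    g = min(range(particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = list(pbest[g]), pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration constants
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                r1, r2 = rnd.random(), rnd.random()
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            val = f(xs[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = list(xs[i]), val
                if val < gbest_val:
                    gbest, gbest_val = list(xs[i]), val
    return gbest, gbest_val

# Toy stand-in objective: a sphere function (hypothetical surrogate
# for the simulated expected routing cost).
best, best_val = pso(lambda x: sum(v * v for v in x), dim=3, bounds=(-5, 5))
```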

Keywords: stochastic vehicle routing problem, robust optimisation model, Monte Carlo simulation, particle swarm optimisation

Procedia PDF Downloads 263
2984 The Predictive Value of Serum Bilirubin in the Post-Transplant De Novo Malignancy: A Data Mining Approach

Authors: Nasim Nosoudi, Amir Zadeh, Hunter White, Joshua Conrad, Joon W. Shim

Abstract:

De novo malignancy has become one of the major causes of death after transplantation, so early cancer diagnosis and detection can drastically improve survival rates post-transplantation. Most previous work focuses on using artificial intelligence (AI) to predict transplant success or failure outcomes. In this work, we focused on predicting de novo malignancy after liver transplantation using AI. We chose patients who developed malignancy after liver transplantation with no history of malignancy pre-transplant; their donors were cancer-free as well. We analyzed 254,200 patient profiles with post-transplant malignancy from the US Organ Procurement and Transplantation Network (OPTN). Several popular data mining methods were applied to the resultant dataset to build predictive models characterizing de novo malignancy after liver transplantation. The recipient's bilirubin, creatinine, weight, gender, number of days on the transplant waiting list, Epstein-Barr virus (EBV) status, international normalized ratio (INR), and ascites are among the most important factors affecting de novo malignancy after liver transplantation.

Keywords: De novo malignancy, bilirubin, data mining, transplantation

Procedia PDF Downloads 91
2983 The Predictive Significance of Metastasis Associated in Colon Cancer-1 (MACC1) in Primary Breast Cancer

Authors: Jasminka Mujic, Karin Milde-Langosch, Volkmar Mueller, Mirza Suljagic, Tea Becirevic, Jozo Coric, Daria Ler

Abstract:

MACC1 (metastasis associated in colon cancer-1) is a prognostic biomarker for tumor progression, metastasis, and survival in a variety of solid cancers. MACC1 also causes tumor growth in xenograft models and acts as a master regulator of the HGF/MET signaling pathway. In breast cancer, the expression of MACC1 determined by immunohistochemistry was significantly associated with positive lymph node status and advanced clinical stage. The aim of the present study was to further investigate the prognostic and predictive value of MACC1 expression in breast cancer using western blot analysis and immunohistochemistry. The results of our study show that high MACC1 expression in breast cancer is associated with shorter disease-free survival, especially in node-negative tumors. MACC1 might be a suitable biomarker for selecting patients with a higher probability of recurrence, who might benefit from adjuvant chemotherapy. Our results support a biologic role for MACC1 and potentially open the perspective of using it as a predictive biomarker for treatment decisions in breast cancer patients.

Keywords: breast cancer, biomarker, HGF/MET, MACC1

Procedia PDF Downloads 214
2982 Implementing Delivery Drones in Logistics Business Process: Case of Pharmaceutical Industry

Authors: Nikola Vlahovic, Blazenka Knezevic, Petra Batalic

Abstract:

In this paper, we present research on the feasibility of implementing unmanned aerial vehicles, also known as 'drones', in logistics. The research is based on available information about current initiatives and experiments in the commercial application of delivery drones. An overview of current pilot projects and the literature, as well as of the challenges identified, is compiled and presented. Based on these findings, we present a conceptual model of a business process that implements delivery drones in business-to-business logistic operations. The business scenario is based on a pharmaceutical supply chain. Simulation modeling is used to create models for running experiments and collecting performance data, and a comparative study of the presented conceptual model is given. The work outlines the main advantages and disadvantages of implementing unmanned aerial vehicles in delivery services as a supplementary distribution channel along the supply chain.

Keywords: business process, delivery drones, logistics, simulation modelling, unmanned aerial vehicles

Procedia PDF Downloads 381