Search results for: optimized summarization models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7825

7795 Improving the Biomechanical Resistance of a Treated Tooth via Composite Restorations Using Optimised Cavity Geometries

Authors: Behzad Babaei, B. Gangadhara Prusty

Abstract:

The objective of this study is to assess the hypotheses that a restored tooth with a class II occlusal-distal (OD) cavity can be strengthened by designing an optimized cavity geometry, as well as by selecting a composite restoration with optimized elastic moduli, when there is a sharp de-bonded edge at the interface of the tooth and restoration. Methods: A scanned human maxillary molar tooth was segmented into dentine and enamel parts. The dentine and enamel profiles were extracted and imported into a finite element (FE) software. The enamel rod orientations were estimated virtually. Fifteen models of the restored tooth with different occlusal cavity depths (1.5, 2, and 2.5 mm) and internal cavity angles were generated. By using a semi-circular stone part, a 400 N load was applied to two contact points of the restored tooth model. The junctions between the enamel, dentine, and restoration were considered perfectly bonded. All parts in the model were considered homogeneous, isotropic, and elastic. Quadrilateral and triangular elements were employed in the models. A mesh convergence analysis was conducted to verify that the element number did not influence the simulation results. According to the criterion of a 5% error in the stress, we found that a total of over 14,000 elements resulted in convergence of the stress. A Python script was employed to automatically assign moduli of 2-22 GPa (in increments of 4 GPa) to the composite restorations, 18.6 GPa to the dentine, and two different elastic moduli to the enamel (72 GPa in the enamel rods’ direction and 63 GPa in the perpendicular direction). Linear, homogeneous, and elastic material models were considered for the dentine, enamel, and composite restorations. 108 FEA simulations were successively conducted. Results: The internal cavity angle (α) significantly altered the peak maximum principal stress at the interface of the enamel and restoration. The strongest structures against the contact loads were observed in the models with α = 100° and 105°. Even when the enamel rods’ directional mechanical properties were disregarded, the models with α = 100° and 105° still exhibited the highest resistance against the mechanical loads. Regarding the effect of occlusal cavity depth, the models with 1.5 mm depth showed higher resistance to contact loads than the models with deeper cavities (2.0 and 2.5 mm). Moreover, composite moduli in the range of 10-18 GPa alleviated the stress levels in the enamel. Significance: For the class II OD cavity models in this study, the optimal geometries, composite properties, and occlusal cavity depths were determined. Designing the cavities with α ≥ 100° was significantly effective in minimizing peak stress levels. The composite restoration with optimized properties reduced the stress concentrations at critical points of the models. Additionally, when more enamel was preserved, a sturdier enamel-restoration interface against mechanical loads was observed.
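
The abstract describes a Python script that sweeps composite moduli across the generated FE models. The sketch below is a hedged illustration of such a parameter sweep, not the authors' script: run_fe_simulation() is a hypothetical placeholder for the actual FE solver call, and the internal-angle values are assumed.

```python
# Illustrative parameter sweep over the cavity/restoration combinations described
# in the abstract. run_fe_simulation() is a hypothetical placeholder, not a real API.
from itertools import product

cavity_depths_mm = [1.5, 2.0, 2.5]              # occlusal cavity depths from the abstract
internal_angles_deg = [90, 95, 100, 105, 110]   # assumed sweep of internal cavity angles
composite_moduli_gpa = range(2, 23, 4)          # 2-22 GPa in 4 GPa increments

DENTINE_E_GPA = 18.6                            # material constants from the abstract
ENAMEL_E_GPA = {"along_rods": 72.0, "perpendicular": 63.0}

def run_fe_simulation(depth, angle, composite_e):
    """Placeholder for the FE solver; would return peak maximum principal stress (MPa)."""
    raise NotImplementedError("hook up to the FE package used in the study")

results = {}
for depth, angle, e_comp in product(cavity_depths_mm, internal_angles_deg, composite_moduli_gpa):
    try:
        results[(depth, angle, e_comp)] = run_fe_simulation(depth, angle, e_comp)
    except NotImplementedError:
        break  # no solver attached in this sketch
```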

Keywords: dental composite restoration, cavity geometry, finite element approach, maximum principal stress

Procedia PDF Downloads 72
7794 A Method for Clinical Concept Extraction from Medical Text

Authors: Moshe Wasserblat, Jonathan Mamou, Oren Pereg

Abstract:

Natural Language Processing (NLP) has made a major leap in the last few years in practical integration into medical solutions; for example, extracting clinical concepts from medical texts such as medical condition, medication, treatment, and symptoms. However, training and deploying those models in real environments still demands a large amount of annotated data and NLP/Machine Learning (ML) expertise, which makes this process costly and time-consuming. We present a practical and efficient method for clinical concept extraction that requires neither costly labeled data nor ML expertise. The method includes three steps: Step 1 - the user injects a large in-domain text corpus (e.g., PubMed). Then, the system builds a contextual model containing vector representations of concepts in the corpus, in an unsupervised manner (e.g., Phrase2Vec). Step 2 - the user provides a seed set of terms representing a specific medical concept (e.g., for the concept of symptoms, the user may provide: ‘dry mouth,’ ‘itchy skin,’ and ‘blurred vision’). Then, the system matches the seed set against the contextual model and extracts the most semantically similar terms (e.g., additional symptoms). The result is a complete set of terms related to the medical concept. Step 3 - in production, there is a need to extract medical concepts from unseen medical text. The system extracts key-phrases from the new text, then matches them against the complete set of terms from step 2, and the most semantically similar ones are annotated with the same medical concept category. As an example, the seed symptom concepts would result in the following annotation: “The patient complains of fatigue [symptom], dry skin [symptom], and weight loss [symptom], which can be an early sign of diabetes.” Our evaluations show promising results for extracting concepts from medical corpora. The method allows medical analysts to easily and efficiently build taxonomies (in step 2) representing their domain-specific concepts, and automatically annotate a large number of texts (in step 3) for classification/summarization of medical reports.
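
A minimal sketch of the step-2 seed-set expansion, assuming phrase vectors have already been learned from an in-domain corpus with a Phrase2Vec-style model. The random vectors and the expand_concept helper are illustrative stand-ins, not the authors' implementation.

```python
# Seed-set expansion by cosine similarity against a phrase-vector "contextual model".
import numpy as np

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# toy stand-in for the contextual model: phrase -> vector
rng = np.random.default_rng(0)
phrase_vectors = {p: rng.normal(size=50) for p in
                  ["dry mouth", "itchy skin", "blurred vision",
                   "weight loss", "fatigue", "metformin", "insulin pump"]}

def expand_concept(seed_terms, model, top_k=3):
    """Return the phrases most similar (on average) to the seed set."""
    centroid = np.mean([model[t] for t in seed_terms if t in model], axis=0)
    candidates = [(p, cosine(v, centroid)) for p, v in model.items() if p not in seed_terms]
    candidates.sort(key=lambda x: x[1], reverse=True)
    return [p for p, _ in candidates[:top_k]]

symptoms = expand_concept(["dry mouth", "itchy skin", "blurred vision"], phrase_vectors)
print(symptoms)  # with real phrase vectors, additional symptom phrases would rank highest
```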

Keywords: clinical concepts, concept expansion, medical records annotation, medical records summarization

Procedia PDF Downloads 106
7793 Comparison of Two Neural Networks to Model Margarine Age and Predict Shelf-Life Using MATLAB

Authors: Phakamani Xaba, Robert Huberts, Bilainu Oboirien

Abstract:

The present study was aimed at developing and comparing two neural-network-based predictive models to predict the shelf-life/product age of South African margarine using free fatty acid (FFA), water droplet size (D3.3), water droplet distribution (e-sigma), moisture content, peroxide value (PV), anisidine value (AnV), and total oxidation (totox) value as input variables to the model. Brick margarine products of varying ages, ranging from fresh (week 0) to week 47, were sourced. The products had been stored at 10 and 25 °C and were characterized. JMP and MATLAB models to predict shelf-life/margarine age were developed, and their performances were compared. The key performance indicators used to evaluate the model performances were the correlation coefficient (CC), root mean square error (RMSE), and mean absolute percentage error (MAPE) relative to the actual data. The MATLAB-developed model showed a better performance in all three performance indicators: the correlation coefficient of the MATLAB model was 99.86% versus 99.74% for the JMP model, the RMSE was 0.720 compared to 1.005, and the MAPE was 7.4% compared to 8.571%. The MATLAB model was selected as the most accurate, and the number of hidden neurons/nodes was then optimized to develop a single predictive model. The optimized MATLAB model with 10 hidden neurons showed a better performance than the models with 1 and 5 hidden neurons. The developed models can be used by margarine manufacturers, food research institutions, researchers, etc., to predict shelf-life/margarine product age, optimize the addition of antioxidants, extend the shelf-life of products, and proactively troubleshoot problems related to changes which have an impact on the shelf-life of margarine without conducting expensive trials.
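
A hedged sketch of a comparable feed-forward regression network with the seven inputs named above, written with scikit-learn purely for illustration (the study itself used MATLAB and JMP); the data are synthetic, not the study's measurements.

```python
# Feed-forward network predicting margarine age (weeks) from seven oxidation/emulsion inputs.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, mean_absolute_percentage_error

rng = np.random.default_rng(1)
n = 200
X = rng.normal(size=(n, 7))          # FFA, D3.3, e-sigma, moisture, PV, AnV, totox (synthetic)
age_weeks = 24 + 10 * X[:, 0] - 5 * X[:, 4] + rng.normal(scale=2, size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, age_weeks, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)  # 10 hidden neurons, as in the study
model.fit(X_tr, y_tr)

pred = model.predict(X_te)
print("RMSE:", mean_squared_error(y_te, pred) ** 0.5)
print("MAPE:", mean_absolute_percentage_error(y_te, pred))
print("CC:  ", np.corrcoef(y_te, pred)[0, 1])
```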

Keywords: margarine shelf-life, predictive modelling, neural networks, oil oxidation

Procedia PDF Downloads 165
7792 Artificial Intelligence for Generative Modelling

Authors: Shryas Bhurat, Aryan Vashistha, Sampreet Dinakar Nayak, Ayush Gupta

Abstract:

As technology advances towards high computational resources, there is a paradigm shift in the usage of these resources to optimize the design process. This paper discusses the usage of ‘Generative Design using Artificial Intelligence’ to build better models that adapt operations like selection, mutation, and crossover to generate results. The human mind tends to think of the simplest approach while designing an object, whereas the intelligence learns from the past and designs complex, optimized CAD models. Generative design takes the boundary conditions and iterates over multiple solutions to arrive at a sturdy design with the most optimal parameters, saving huge amounts of time and resources. The new production techniques that are at our disposal allow us to use additive manufacturing, 3D printing, and other innovative manufacturing techniques to save resources and design artistically engineered CAD models. This paper also discusses the genetic algorithm and the non-domination technique for choosing the right results, using biomimicry of natural selection processes that have evolved over millions of years. The computer uses parametric models to generate newer models with an iterative approach and uses cloud computing to store these iterative designs. The later part of the paper compares generative design with the topology optimization technology that has previously been used to generate CAD models. Finally, this paper shows the performance of the algorithms and how they help in designing resource-efficient models.
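
A minimal genetic-algorithm sketch illustrating the selection, mutation, and crossover loop referenced above; the fitness function is a toy stand-in for an actual CAD/FE evaluation under boundary conditions and is not taken from the paper.

```python
# Toy genetic algorithm: evolve a small vector of design parameters.
import random

def fitness(x):
    # toy objective: designs close to an assumed optimum score higher
    target = [0.3, 0.7, 0.5]
    return -sum((xi - ti) ** 2 for xi, ti in zip(x, target))

def crossover(a, b):
    cut = random.randint(1, len(a) - 1)
    return a[:cut] + b[cut:]

def mutate(x, rate=0.1):
    return [min(1.0, max(0.0, xi + random.gauss(0, 0.1))) if random.random() < rate else xi
            for xi in x]

population = [[random.random() for _ in range(3)] for _ in range(40)]
for generation in range(100):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                      # selection: keep the fittest designs
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(len(population) - len(parents))]
    population = parents + children

print("best design parameters:", max(population, key=fitness))
```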

Keywords: genetic algorithm, biomimicry, generative modeling, non-domination techniques

Procedia PDF Downloads 118
7791 Development of Computational Approach for Calculation of Hydrogen Solubility in Hydrocarbons for Treatment of Petroleum

Authors: Abdulrahman Sumayli, Saad M. AlShahrani

Abstract:

For the hydrogenation process, knowing the solubility of hydrogen (H2) in hydrocarbons is critical to improving the efficiency of the process. We investigated the computation of H2 solubility in four heavy crude oil feedstocks using machine learning techniques. Temperature, pressure, and feedstock type were considered as the inputs to the models, while the hydrogen solubility was the sole response. Specifically, we employed three different models: Support Vector Regression (SVR), Gaussian Process Regression (GPR), and Bayesian Ridge Regression (BRR). To achieve the best performance, the hyper-parameters of these models were optimized using the whale optimization algorithm (WOA). We evaluated the models using a dataset of solubility measurements in various feedstocks, and we compared their performance based on several metrics. Our results show that the SVR model tuned with WOA (WOA-SVR) achieves the best performance overall, with an RMSE of 1.38 × 10⁻² and an R-squared of 0.991. These findings suggest that machine learning techniques can provide accurate predictions of hydrogen solubility in different feedstocks, which could be useful in the development of hydrogen-related technologies. In addition, the solubility of hydrogen in the four heavy oil fractions is estimated over temperature and pressure ranges of 150 °C–350 °C and 1.2 MPa–10.8 MPa, respectively.
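
A sketch of an SVR pipeline with temperature, pressure, and an encoded feedstock label as inputs. A randomized hyper-parameter search stands in for the whale optimization algorithm, and the data are synthetic, so neither mirrors the paper's actual tuning procedure or measurements.

```python
# SVR for H2 solubility with a randomized search as a stand-in for WOA tuning.
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import RandomizedSearchCV

rng = np.random.default_rng(2)
n = 300
T = rng.uniform(150, 350, n)          # temperature, °C
P = rng.uniform(1.2, 10.8, n)         # pressure, MPa
feed = rng.integers(0, 4, n)          # encoded feedstock type (0-3)
X = np.column_stack([T, P, feed])
y = 0.002 * P * (1 + 0.001 * T) + 0.01 * feed + rng.normal(scale=0.005, size=n)  # mock solubility

pipe = make_pipeline(StandardScaler(), SVR(kernel="rbf"))
search = RandomizedSearchCV(
    pipe,
    {"svr__C": np.logspace(-2, 3, 50),
     "svr__gamma": np.logspace(-3, 1, 50),
     "svr__epsilon": np.linspace(0.001, 0.05, 20)},
    n_iter=40, cv=5, random_state=0)
search.fit(X, y)
print("best params:", search.best_params_, "CV R^2:", round(search.best_score_, 3))
```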

Keywords: temperature, pressure variations, machine learning, oil treatment

Procedia PDF Downloads 42
7790 Sensitivity Based Robust Optimization Using 9 Level Orthogonal Array and Stepwise Regression

Authors: K. K. Lee, H. W. Han, H. L. Kang, T. A. Kim, S. H. Han

Abstract:

For the robust optimization of a manufacturing product design, there are design objectives that must be achieved, such as minimization of the mean and standard deviation of the objective functions within the required sensitivity constraints. The authors utilized the sensitivity of the objective functions and constraints with respect to the effective design variables to reduce the computational burden associated with the evaluation of the probabilities. The individual mean and sensitivity values could be estimated easily by using 9-level orthogonal array-based response surface models optimized by stepwise regression. The present study evaluates the proposed procedure on the robust optimization of rubber domes, which are commonly used for keyboard switches, using the 9-level orthogonal array and stepwise regression along with a desirability function. In addition, a new robust optimization process, i.e., I2GEO (Identify, Integrate, Generate, Explore and Optimize), was proposed on the basis of the robust optimization of the rubber domes. The optimized results from the response surface models and the results estimated by finite element analysis were consistent within a small margin of error. The standard deviation of the objective function decreased by 54.17% with the suggested sensitivity-based robust optimization. (Business for Cooperative R&D between Industry, Academy, and Research Institute funded by the Korea Small and Medium Business Administration in 2017, S2455569)
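
An illustrative forward stepwise regression for building a response-surface model from designed-experiment data; the design points, candidate terms, and stopping rule below are assumptions for the sketch, not the study's rubber-dome data or its exact procedure.

```python
# Forward stepwise selection of response-surface terms by cross-validated fit quality.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, size=(81, 4))               # 4 design variables, 81 synthetic design points
y = 3 * X[:, 0] + 2 * X[:, 1] ** 2 + rng.normal(scale=0.1, size=81)

# candidate terms: linear and quadratic effects of each design variable
terms = {f"x{i}": X[:, i] for i in range(4)}
terms.update({f"x{i}^2": X[:, i] ** 2 for i in range(4)})

selected, remaining, best_score = [], list(terms), -np.inf
while remaining:
    scores = {t: cross_val_score(LinearRegression(),
                                 np.column_stack([terms[s] for s in selected + [t]]),
                                 y, cv=5).mean()
              for t in remaining}
    t_best = max(scores, key=scores.get)
    if scores[t_best] <= best_score + 1e-4:        # stop when no meaningful improvement
        break
    selected.append(t_best); remaining.remove(t_best); best_score = scores[t_best]

print("selected response-surface terms:", selected)
```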

Keywords: objective function, orthogonal array, response surface model, robust optimization, stepwise regression

Procedia PDF Downloads 261
7789 A Review on Water Models of Surface Water Environment

Authors: Shahbaz G. Hassan

Abstract:

Water quality models are very important for predicting changes in surface water quality for environmental management. The aim of this paper is to give an overview of water quality models and to provide directions for selecting models in specific situations. Water quality models include one kind of model based on a mechanistic approach, while other models simulate water quality without considering a mechanism. Mechanistic models can be widely applied and have capabilities for long-time simulation, but with high complexity; therefore, more space is devoted to explaining the principles and application experience of mechanistic models. Mechanistic models make certain assumptions about rivers, lakes, and estuaries, which limits their application range; this paper introduces the principles and applications of water quality models for these three scenarios. Empirical models, on the other hand, are easier to compute and are not limited by geographical conditions, but they cannot be used with confidence to simulate long-term changes. This paper divides the empirical models into two broad categories according to the difference in mathematical algorithm: models based on artificial intelligence and models based on statistical methods.

Keywords: empirical models, mathematical, statistical, water quality

Procedia PDF Downloads 231
7788 Management and Marketing Implications of Tourism Gravity Models

Authors: Clive L. Morley

Abstract:

Gravity models and panel data modelling of tourism flows are receiving renewed attention, after decades of general neglect. Such models have quite different underpinnings from conventional demand models derived from micro-economic theory. They operate at a different level of data and with different theoretical bases. These differences have important consequences for the interpretation of the results and their policy and managerial implications. This review compares and contrasts the two model forms, clarifying the distinguishing features and the estimation requirements of each. In general, gravity models are not recommended for use to address specific management and marketing purposes.

Keywords: gravity models, micro-economics, demand models, marketing

Procedia PDF Downloads 411
7787 Electric Models for Crosstalk Prediction: Analysis and Performance Evaluation

Authors: Kachout Mnaouer, Bel Hadj Tahar Jamel, Choubani Fethi

Abstract:

In this paper, three electric equivalent models for evaluating crosstalk between three-conductor transmission lines are proposed. First, electric equivalent models for three-conductor transmission lines are presented. Second, rigorous equations to calculate the per-unit-length inductive and capacitive parameters are developed. These models allow us to calculate crosstalk between conductors. Finally, to validate the presented models, we compare the theoretical results with simulation data. The obtained results show that the proposed models can be used to predict crosstalk performance.

Keywords: near-end crosstalk, inductive parameter, L, Π, T models

Procedia PDF Downloads 422
7786 Optimization-Based Design Improvement of Synchronizer in Transmission System for Efficient Vehicle Performance

Authors: Sanyka Banerjee, Saikat Nandi, P. K. Dan

Abstract:

The synchronizer, as an integral part of the gearbox, is a key element of the automotive transmission system. The performance of the synchronizer affects transmission efficiency and driving comfort. The synchronizing mechanism, as a major component of the transmission system, must be capable of preventing vibration and noise in the gears. Improving gear shifting efficiency, with the aim of achieving smooth, quick, and energy-efficient power transmission, remains a challenge for the automotive industry. The performance of the synchronizer depends on the features and characteristics of its sub-components, and therefore an analysis of the contribution of such characteristics is necessary. An important exercise is to identify all the characteristics or factors associated with the modeling and analysis; for this purpose, the literature was reviewed rather extensively to study the mathematical models formulated with such factors. It has been observed that certain factors are common across models; however, there are a few factors which have been selected specifically for individual models, as reported. In order to obtain a more realistic model, an attempt has been made here to identify and assimilate practically all possible factors which may be considered in formulating the model more comprehensively. A simulation study, formulated as a block model, has been carried out for this analysis in a reliable environment like MATLAB. Lower synchronization time is desirable and has therefore been considered here as the output factor in the simulation modeling for evaluating transmission efficiency. An improved synchronizer model requires optimized values of the sub-component design parameters. A parametric optimization utilizing Taguchi’s design of experiments, based on the response data and their analysis, has been carried out for this purpose. The effectiveness of the optimized parameters for the improved synchronizer performance has been validated by a simulation study of the synchronizer block model with the improved parameter values as inputs, showing better transmission efficiency and driver comfort.
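
A sketch of a Taguchi-style smaller-is-better analysis for synchronization time, assuming simulated times for each experimental run; the run layout and values are illustrative, not the study's design-of-experiments data.

```python
# Smaller-is-better signal-to-noise ratios for synchronization time across DOE runs.
import numpy as np

# rows: experimental runs from an orthogonal array; columns: repeated simulations (seconds)
sync_times = np.array([
    [0.42, 0.45], [0.38, 0.40], [0.51, 0.49],
    [0.36, 0.37], [0.44, 0.43], [0.47, 0.50],
])

# smaller-is-better S/N ratio: S/N = -10 * log10(mean(y^2))
sn_ratios = -10 * np.log10((sync_times ** 2).mean(axis=1))
for run, sn in enumerate(sn_ratios, start=1):
    print(f"run {run}: S/N = {sn:.2f} dB")
print("best run (highest S/N):", int(np.argmax(sn_ratios)) + 1)
```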

Keywords: design of experiments, modeling, parametric optimization, simulation, synchronizer

Procedia PDF Downloads 276
7785 The Grit in the Glamour: A Qualitative Study of the Well-Being of Fashion Models

Authors: Emily Fortune Super, Ameerah Khadaroo, Aurore Bardey

Abstract:

Fashion models are often assumed to have a glamorous job with limited consideration for their well-being. This study aims to assess the well-being of models through semi-structured interviews with six professional fashion models and six industry professionals. Thematic analysis revealed that although models experienced improved self-confidence, they also reported heightened anxiety levels, body image issues, and the negative influence of modelling on their self-esteem. By contrast, industry professionals reported no or minimum concerns about anxious behaviours or the general well-being of fashion models. Being resilient as a model was perceived as an essential attribute to have by both models and industry professionals as they face recurrent rejection in this industry. These results demonstrate a significant gap in the current understanding of the well-being of fashion models between industry professionals and the models themselves. Findings imply that there is an inherent need for change in the modelling industry to promote and enhance their well-being.

Keywords: body image, fashion industry, modelling, well-being

Procedia PDF Downloads 141
7784 Topology Optimization of Composite Structures with Material Nonlinearity

Authors: Mengxiao Li, Johnson Zhang

Abstract:

Currently, the topology optimization technique is widely used to define the layout of structures that are presented as truss-like topologies. However, due to the difficulty of combining the optimization technique with more realistic material models, where nonlinear properties should be considered, the resulting optimized topologies often cannot be applied directly to practical design problems. This study presents an optimization procedure for composite structures in which different elastic stiffnesses, yield criteria, and hardening models are assumed for the candidate materials. From the results, it can be concluded that more explicit material modeling has a significant influence on the resulting topologies. Also, the choice between isotropic and kinematic hardening is important for elastoplastic structural optimization design. The capability of the proposed optimization procedure is shown through several cases.

Keywords: topology optimization, material composition, nonlinear modeling, hardening rules

Procedia PDF Downloads 454
7783 Modeling Water Resources Carrying Capacity, Optimizing Water Treatment, Smart Water Management, and Conceptualizing a Watershed Management Approach

Authors: Pius Babuna

Abstract:

Sustainable water use is important for the existence of the human race. Water resources carrying capacity (WRCC) measures the sustainability of water use; however, the calculation and optimization of WRCC remain challenging. This study used a mathematical model (the Logistic Growth of Water Resources, LGWR) and a linear objective function to model water sustainability. We tested the validity of the models using data from Ghana. Total freshwater resources, water withdrawal, and population data were used in MATLAB. The results show that the WRCC remains sustainable until the year 2132 ± 18, when half of the total annual water resources will be used. The optimized water treatment cost suggests that Ghana currently wastes GHȼ 1115.782 ± 50 cedis (~$182.21 ± 50) per water treatment plant per month, or ~0.67 million gallons of water, in avoidable losses. Adopting an optimized water treatment scheme and a watershed management approach will help sustain the WRCC.
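
A minimal sketch of a logistic growth model for water withdrawal, used to estimate the year at which withdrawal reaches half of the total annual water resources; the carrying capacity, base-year withdrawal, and growth rate below are illustrative assumptions, not Ghana's actual figures or the LGWR parameters.

```python
# Logistic growth of water withdrawal and the year it reaches half of total resources.
import numpy as np

K = 56.2          # carrying capacity: assumed total renewable water resources (km^3/yr)
W0 = 2.0          # assumed withdrawal in the base year (km^3/yr)
r = 0.025         # assumed intrinsic growth rate of withdrawal
base_year = 2011

def withdrawal(t):
    """Logistic growth of water withdrawal t years after the base year."""
    return K / (1 + (K - W0) / W0 * np.exp(-r * t))

years = np.arange(0, 300)
half_capacity_year = base_year + int(years[withdrawal(years) >= K / 2][0])
print("withdrawal reaches half of total resources around:", half_capacity_year)
```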

Keywords: water resources carrying capacity, smart water management, optimization, sustainable water use, water withdrawal

Procedia PDF Downloads 57
7782 Lean Models Classification: Towards a Holistic View

Authors: Y. Tiamaz, N. Souissi

Abstract:

The purpose of this paper is to present a classification of Lean models which aims to capture all the concepts related to this approach and thus facilitate its implementation. This classification allows the identification of the most relevant models according to several dimensions. From this perspective, we present a review and an analysis of the Lean models literature, and we propose dimensions for the classification of the current proposals while respecting, among others, the axes of the Lean approach, the maturity of the models, and their application domains. This classification allowed us to conclude that researchers essentially consider the Lean approach as a toolbox and that they design their models to solve problems related to a specific environment. Since the Lean approach is no longer intended only for the automotive sector, where it was invented, but for all fields (IT, hospitals, ...), we consider that this approach requires a generic model that is capable of being implemented in all areas.

Keywords: lean approach, lean models, classification, dimensions, holistic view

Procedia PDF Downloads 406
7781 The Effect of Particle Porosity in Mixed Matrix Membrane Permeation Models

Authors: Z. Sadeghi, M. R. Omidkhah, M. E. Masoomi

Abstract:

The purpose of this paper is to examine the gas transport behavior of mixed matrix membranes (MMMs) containing porous particles. The main existing models are categorized into two main groups: two-phase (ideal contact) and three-phase (non-ideal contact). A new coefficient, J, was obtained to express equations for estimating the effect of particle porosity in the two-phase and three-phase models. The modified models were evaluated against existing models and experimental data using MATLAB software. Comparison of the gas permeability predicted by the proposed modified models with that of existing models for different MMMs shows a better prediction of gas permeability in MMMs.
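
A sketch of the classical Maxwell two-phase model for MMM permeability with a simple porosity adjustment of the dispersed-phase permeability. The paper's J coefficient is not reproduced here; the volume-weighted porosity correction and the permeability values are assumptions made for illustration.

```python
# Maxwell two-phase MMM model with an assumed porosity-adjusted dispersed permeability.
def maxwell_permeability(p_cont, p_disp, phi):
    """Maxwell model: effective permeability of a continuous phase (p_cont)
    loaded with volume fraction phi of a dispersed phase (p_disp)."""
    num = p_disp + 2 * p_cont - 2 * phi * (p_cont - p_disp)
    den = p_disp + 2 * p_cont + phi * (p_cont - p_disp)
    return p_cont * num / den

def porous_particle_permeability(p_solid, p_pore, porosity):
    """Assumed correction: solid and pore permeabilities averaged by porosity."""
    return (1 - porosity) * p_solid + porosity * p_pore

p_polymer = 10.0      # Barrer, continuous polymer phase (illustrative)
p_particle = 50.0     # Barrer, dense particle (illustrative)
p_pore = 5000.0       # Barrer, effectively unhindered transport in the pores (illustrative)

for porosity in (0.0, 0.2, 0.4):
    p_d = porous_particle_permeability(p_particle, p_pore, porosity)
    print(f"porosity={porosity:.1f}: P_eff = {maxwell_permeability(p_polymer, p_d, 0.3):.1f} Barrer")
```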

Keywords: mixed matrix membrane, permeation models, porous particles, porosity

Procedia PDF Downloads 347
7780 Credit Card Fraud Detection with Ensemble Model: A Meta-Heuristic Approach

Authors: Gong Zhilin, Jing Yang, Jian Yin

Abstract:

The purpose of this paper is to develop a novel system for credit card fraud detection based on sequential modeling of data using hybrid deep learning models. The projected model encapsulates five major phases: pre-processing, imbalanced-data handling, feature extraction, optimal feature selection, and fraud detection with an ensemble classifier. The collected raw data (input) are pre-processed to enhance the quality of the data by alleviating missing data, noisy data, and null values. The pre-processed data are class-imbalanced in nature, and therefore they are handled effectively with the K-means clustering-based SMOTE model. From the balanced class data, the most relevant features are extracted, including improved Principal Component Analysis (PCA) features, statistical features (mean, median, standard deviation), and higher-order statistical features (skewness and kurtosis). Among the extracted features, the most optimal features are selected with the Self-improved Arithmetic Optimization Algorithm (SI-AOA). This SI-AOA model is a conceptual improvement of the standard Arithmetic Optimization Algorithm. The deep learning models used are Long Short-Term Memory (LSTM), Convolutional Neural Network (CNN), and an optimized Quantum Deep Neural Network (QDNN). The LSTM and CNN are trained with the extracted optimal features. The outcomes from the LSTM and CNN enter as input to the optimized QDNN, which provides the final detection outcome. Since the QDNN is the ultimate detector, its weight function is fine-tuned with the Self-improved Arithmetic Optimization Algorithm (SI-AOA).
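
A hedged sketch of the K-means-based SMOTE step: minority-class (fraud) samples are clustered and synthetic samples are interpolated between neighbours inside each cluster. The kmeans_smote helper and the toy data are illustrative simplifications, not the paper's exact balancing model.

```python
# Cluster the minority class, then SMOTE-style interpolate within each cluster.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import NearestNeighbors

def kmeans_smote(X_min, n_synthetic, n_clusters=4, k=3, seed=0):
    rng = np.random.default_rng(seed)
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit_predict(X_min)
    synthetic = []
    for _ in range(n_synthetic):
        c = rng.choice(np.unique(labels))
        cluster = X_min[labels == c]
        if len(cluster) < 2:
            continue
        nn = NearestNeighbors(n_neighbors=min(k + 1, len(cluster))).fit(cluster)
        i = rng.integers(len(cluster))
        neighbours = nn.kneighbors(cluster[i:i + 1], return_distance=False)[0][1:]
        j = rng.choice(neighbours)
        synthetic.append(cluster[i] + rng.random() * (cluster[j] - cluster[i]))
    return np.array(synthetic)

rng = np.random.default_rng(0)
X_fraud = rng.normal(loc=2.0, size=(60, 5))        # toy minority (fraud) transactions
X_new = kmeans_smote(X_fraud, n_synthetic=200)
print("synthetic fraud samples generated:", X_new.shape)
```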

Keywords: credit card, data mining, fraud detection, money transactions

Procedia PDF Downloads 102
7779 Production Optimization under Geological Uncertainty Using Distance-Based Clustering

Authors: Byeongcheol Kang, Junyi Kim, Hyungsik Jung, Hyungjun Yang, Jaewoo An, Jonggeun Choe

Abstract:

It is important to figure out reservoir properties for better production management. Due to limited information, there are geological uncertainties in very heterogeneous or channel reservoirs. One of the solutions is to generate multiple equi-probable realizations using geostatistical methods. However, some models have wrong properties, which need to be excluded for simulation efficiency and reliability. We propose a novel model selection scheme, based on distance-based clustering, for reliable application of a production optimization algorithm. Distance is defined as a degree of dissimilarity between the data. We calculate the Hausdorff distance to classify the models based on their similarity; the Hausdorff distance is useful for shape matching of the reservoir models. We use multi-dimensional scaling (MDS) to describe the models in two-dimensional space and group them by K-means clustering. Rather than simulating all models, we choose one representative model from each cluster and find the best model, which has production rates similar to the true values. From this process, we can select good reservoir models near the best model with high confidence. We make 100 channel reservoir models using single normal equation simulation (SNESIM). Since oil and gas prefer to flow through the sand facies, it is critical to characterize the pattern and connectivity of the channels in the reservoir. After calculating the Hausdorff distances and projecting the models by MDS, we can see that the models assemble depending on their channel patterns. These channel distributions affect the operation controls of each production well, so the model selection scheme improves the management optimization process. We use one of the useful global search algorithms, particle swarm optimization (PSO), for our production optimization. PSO is good at finding the global optimum of an objective function, but it takes too much time due to its usage of many particles and iterations; in addition, if we use multiple reservoir models, the simulation time for PSO soars. By using the proposed method, we can select good and reliable models that already match the production data. Considering the geological uncertainty of the reservoir, we can obtain well-optimized production controls for maximum net present value. The proposed method represents a novel solution for selecting good cases among the various possibilities. The model selection scheme can be applied not only to production optimization but also to history matching or other ensemble-based methods for efficient simulations.
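
A sketch of the distance-based selection scheme: pairwise Hausdorff distances between realizations, a 2-D MDS embedding, K-means clustering, and one representative model per cluster. The toy point-set "models" and the choice of five clusters are assumptions for the sketch, not the study's 100 SNESIM realizations.

```python
# Hausdorff distances -> MDS embedding -> K-means -> representative per cluster.
import numpy as np
from scipy.spatial.distance import directed_hausdorff
from sklearn.manifold import MDS
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
# each "reservoir model" is represented here by a 2-D point set (e.g., channel cells)
models = [rng.normal(loc=rng.uniform(0, 5, 2), size=(100, 2)) for _ in range(30)]

n = len(models)
D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        d = max(directed_hausdorff(models[i], models[j])[0],
                directed_hausdorff(models[j], models[i])[0])   # symmetric Hausdorff distance
        D[i, j] = D[j, i] = d

coords = MDS(n_components=2, dissimilarity="precomputed", random_state=0).fit_transform(D)
km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(coords)

representatives = []
for c in range(5):
    idx = np.where(km.labels_ == c)[0]
    dist_to_centre = np.linalg.norm(coords[idx] - km.cluster_centers_[c], axis=1)
    representatives.append(int(idx[np.argmin(dist_to_centre)]))  # model closest to cluster centre
print("representative model indices:", representatives)
```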

Keywords: distance-based clustering, geological uncertainty, particle swarm optimization (PSO), production optimization

Procedia PDF Downloads 112
7778 Exploring Students’ Visual Conception of Matter and Its Implications to Teaching and Learning Chemistry

Authors: Allen A. Espinosa, Arlyne C. Marasigan, Janir T. Datukan

Abstract:

The study explored how students visualize the states and classifications of matter using scientific models. It also identified misconceptions of students in using scientific models. In general, a high percentage of students were able to use scientific models correctly, and only minor misconceptions were identified. From the results of the study, a teaching framework was formulated in which scientific models should be employed in classroom instruction to visualize abstract concepts in chemistry and support better conceptual understanding.

Keywords: visual conception, scientific models, mental models, states of matter, classification of matter

Procedia PDF Downloads 368
7777 Adaptive Architecture: Reformulation of Socio-Ecological Systems

Authors: Pegah Zamani

Abstract:

This multidisciplinary study interrogates the reformulation of socio-ecological systems by bringing different disciplines together and incorporating ecological, social, and technological components into sustainable design. The study seeks a holistic sustainable system to understand the multidimensional impact of evolving innovative technologies on responding to variable socio-environmental conditions. Through a range of cases, from vernacular built spaces to sophisticated optimized systems, the research unfolds how far the environmental elements impact the performance of a sustainable building, its micro-climatic ecological requirements, and its human inhabitation. As a product of advancing technologies, an optimized and environmentally responsive building offers a new identification and realization of the built space by reformulating its connection to its internal and external environments (such as solar, thermal, and airflow), as well as to its dwellers. The study inquires into the properties of optimized buildings by bringing into the equation not only the environmental but also the socio-cultural, morphological, and phenomenal factors. Thus, the research underlines optimized built space as a product and practice which would not be meaningful without addressing and dynamically adjusting to the diversity and complexity of socio-ecological systems.

Keywords: ecology, morphology, socio-ecological systems, sustainability

Procedia PDF Downloads 178
7776 Evaluation of Central Nervous System Activity of Synthesized 5, 5-Diphenylimidazolidine-2, 4-Dione Derivatives

Authors: Shweta Verma

Abstract:

Background: Epilepsy is a chronic non-communicable central nervous system (CNS) disorder which affects a large population of all ages. Different classes of drugs are used for the treatment of this neurological disorder, but due to increasing drug resistance and side effects, these drugs become ineffective. Therefore, we designed the synthesis of ten new derivatives of phenytoin. The moiety of phenytoin was hybridized with different phenols by using a three-step approach. The synthesized molecules were then investigated for different physicochemical parameters, such as Log P values, using diverse software programs to predict their potential to cross the blood-brain barrier. Objective: The phenytoin derivatives were designed, synthesized, and characterized to meet the structural requirements indispensable for antiepileptic activity. Method: Firstly, the chloroacetylation of the 5,5-diphenylhydantoin was carried out, and then various substituted phenols were added to it. The synthesized compounds were characterized and evaluated for anti-anxiety activity by the elevated plus maze method, for antiepileptic activity by using the subcutaneous pentylenetetrazole (scPTZ) and maximal electroshock (MES) models, and for neurotoxicity. Result: A number of derivatives of 5,5-diphenylhydantoin were developed and optimized. A number of parameters were optimized, which revealed that the compounds containing a chloro group, such as C3 and C6, showed notable potential when compared with the standard drug diazepam. Other compounds containing nitro and methyl groups were also found to possess activity. Conclusion: In summary, new compounds of the 5,5-diphenylhydantoin series were synthesized. The results show that the compounds containing a chloro group are more potent for CNS activity. The new compounds have the potential to be optimized further to generate new scaffolds to treat various CNS disorders.

Keywords: phenytoin, parameters, CNS activity, blood-brain barrier, Log P, CNS active

Procedia PDF Downloads 36
7775 Preserving Privacy in Workflow Delegation Models

Authors: Noha Nagy, Hoda Mokhtar, Mohamed El Sherkawi

Abstract:

The popularity of workflow delegation models and the increasing number of workflow provenance-aware systems motivate the need for stricter delegation models. Such models combine different approaches for enhanced security while respecting workflow privacy. Although modern enterprises seek conformance to workflow constraints to ensure the correctness of their work, these constraints pose a threat to security, because they can be good seeds for attacking privacy even in secure models. This paper introduces a comprehensive Workflow Delegation Model (WFDM) that utilizes provenance and workflow constraints to prevent a malicious delegate from attacking workflow privacy, as well as extending the delegation functionalities. In addition, we argue for the need to exploit workflow constraints to improve workflow security models.

Keywords: workflow delegation models, secure workflow, workflow privacy, workflow provenance

Procedia PDF Downloads 305
7774 A Method for Saturation Modeling of Synchronous Machines in d-q Axes

Authors: Mohamed Arbi Khlifi, Badr M. Alshammari

Abstract:

This paper discusses general methods for modeling saturation in steady-state, two-axis (d and q) frame models of synchronous machines. In particular, the important role of the magnetic coupling between the d-q axes (the cross-magnetizing phenomenon) is demonstrated. For that purpose, distinct methods of saturation modeling of a damper synchronous machine with cross-saturation are identified, and detailed model synthesis in the d-q axes is presented. A number of models are given in their final developed form. The procedure and the novel models are verified by a critical application to prove the validity of the method, and the equivalence between all developed models is reported. Advantages of some of the models over the existing ones and their applicability are discussed.

Keywords: cross-magnetizing, models synthesis, synchronous machine, saturated modeling, state-space vectors

Procedia PDF Downloads 425
7773 Robot Spatial Reasoning via 3D Models

Authors: John Allard, Alex Rich, Iris Aguilar, Zachary Dodds

Abstract:

With this paper we present several experiences deploying novel, low-cost resources for computing with 3D spatial models. Certainly, computing with 3D models undergirds some of our field’s most important contributions to the human experience. Most often, those are contrived artifacts. This work extends that tradition by focusing on novel resources that deliver uncontrived models of a system’s current surroundings. Atop this new capability, we present several projects investigating the student-accessibility of the computational tools for reasoning about the 3D space around us. We conclude that, with current scaffolding, real-world 3D models are now an accessible and viable foundation for creative computational work.

Keywords: 3D vision, matterport model, real-world 3D models, mathematical and computational methods

Procedia PDF Downloads 510
7772 Exploratory Study of Contemporary Models of Leadership

Authors: Gadah Alkeniah

Abstract:

Leadership is acknowledged internationally as fundamental to school efficiency and school enhancement; nevertheless, there are various understandings of what leadership is and how it is realised in practice. There are a number of educational leadership models that are considered important. The present study uses a systematic review method to examine and compare five of the most well-known contemporary models of leadership and introduces the dimensions of each model. Our results reveal that distributed leadership has recently grown in popularity within the field of education. The study concludes by suggesting future directions in leadership development and education research.

Keywords: distributed leadership, instructional leadership, leadership models, moral leadership, strategic leadership, transformational leadership

Procedia PDF Downloads 176
7771 Determining the Number of Single Models in a Combined Forecast

Authors: Serkan Aras, Emrah Gulay

Abstract:

Combining various forecasting models is an important tool for researchers to attain more accurate forecasts. A great number of papers have shown that selecting single models that are as dissimilar as possible, or methods based on different information, leads to better forecasting performance. However, there is no established rule regarding the number of single models to be used in a combining method. This study focuses on determining the optimal or near-optimal number of single models with the help of statistical tests. An extensive experiment is carried out utilizing some well-known time series data sets from diverse fields. Furthermore, many rival forecasting methods and some of the commonly used combining methods are employed. The obtained results indicate that some statistically significant performance differences can be found regarding the number of single models in the combining methods under investigation.
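
An illustrative sketch of the underlying question: forecasts from an increasing number of single models are combined by simple averaging and the resulting errors compared. The toy forecasters and the equal-weight combination are assumptions for the sketch, not the study's methods or statistical tests.

```python
# Error of an equal-weight forecast combination as the number of single models grows.
import numpy as np

rng = np.random.default_rng(5)
actual = np.sin(np.linspace(0, 6, 60)) * 10 + 50
# ten single models = truth plus model-specific bias and noise
forecasts = np.array([actual + rng.normal(b, 1.5, actual.size)
                      for b in rng.normal(0, 1.0, 10)])

def rmse(pred):
    return float(np.sqrt(np.mean((pred - actual) ** 2)))

for m in range(1, 11):
    combined = forecasts[:m].mean(axis=0)          # equal-weight combination of m models
    print(f"{m:2d} models combined -> RMSE = {rmse(combined):.3f}")
```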

Keywords: combined forecast, forecasting, M-competition, time series

Procedia PDF Downloads 328
7770 Operator Efficiency Study for Assembly Line Optimization at Semiconductor Assembly and Test

Authors: Rohana Abdullah, Md Nizam Abd Rahman, Seri Rahayu Kamat

Abstract:

The operator efficiency aspect is gaining importance in ensuring the optimized usage of resources, especially in a semi-automated manufacturing environment. This paper addresses a case study done to solve operator efficiency and line balancing issues at a semiconductor assembly and test manufacturer. A Man-to-Machine (M2M) work study technique is used to study current operator utilization and determine the optimum allocation of the operators to the machines. Critical factors such as operator activity, activity frequency, and operator competency level are considered to gain insight into the parameters that affect operator utilization. Equipment standard time and overall equipment efficiency (OEE) information are also gathered and analyzed to achieve a balanced and optimized production.
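
A sketch of the standard OEE calculation referenced above (availability × performance × quality), using illustrative shift figures rather than data from the case study.

```python
# Overall equipment efficiency from illustrative shift data.
planned_time_min = 480          # planned production time per shift
downtime_min = 45               # breakdowns, setup, operator waiting
ideal_cycle_time_s = 12         # standard time per unit
units_produced = 1900
units_good = 1840

run_time_min = planned_time_min - downtime_min
availability = run_time_min / planned_time_min
performance = (ideal_cycle_time_s * units_produced) / (run_time_min * 60)
quality = units_good / units_produced

oee = availability * performance * quality
print(f"Availability={availability:.2%}, Performance={performance:.2%}, "
      f"Quality={quality:.2%}, OEE={oee:.2%}")
```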

Keywords: operator efficiency, optimized production, line balancing, industrial and manufacturing engineering

Procedia PDF Downloads 699
7769 A Study of Population Growth Models and Future Population of India

Authors: Sheena K. J., Jyoti Badge, Sayed Mohammed Zeeshan

Abstract:

This is a comparative study of exponential and logistic population growth models in India. India is the second most populous country in the world, just behind China, and is projected to take first place by next year. The Indian population has grown at a remarkably higher rate than that of other countries over the past 20 years. Many scientists and demographers have formulated various models of population growth in order to study and predict the future population. Some of the models are the Fibonacci population growth model, the exponential growth model, the logistic growth model, the Lotka-Volterra model, etc. These models have been effective to an extent in predicting the population in the past. However, it is essential to have a detailed comparative study of the population models to arrive at a more accurate one. This research study therefore analyzes and compares the two population models under consideration, the exponential and logistic growth models, thereby identifying the more effective one. Using the census data of 2011, the approximate population for 2016 to 2031 is calculated for 20 Indian states using both models, and the results are compared with the actual population. On comparing the results of both models, it is found that the logistic population model is more accurate than the exponential model, and using this model, we can predict the future population in a more effective way. This will give researchers insight into effective population models and how well such models predict the future population.
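
A sketch comparing exponential and logistic fits with scipy's curve_fit, using approximate all-India census totals (in millions) for illustration; in practice the 2011 state-level census data would be substituted, and the forecast years below are only examples.

```python
# Fit exponential and logistic growth curves to census totals and forecast forward.
import numpy as np
from scipy.optimize import curve_fit

years = np.array([1951, 1961, 1971, 1981, 1991, 2001, 2011], dtype=float)
population = np.array([361, 439, 548, 683, 846, 1029, 1211], dtype=float)  # approx. India totals, millions
t = years - years[0]

def exponential(t, p0, r):
    return p0 * np.exp(r * t)

def logistic(t, K, p0, r):
    return K / (1 + (K - p0) / p0 * np.exp(-r * t))

exp_params, _ = curve_fit(exponential, t, population, p0=[360, 0.02])
log_params, _ = curve_fit(logistic, t, population, p0=[2000, 360, 0.03], maxfev=10000)

t_future = np.arange(2016, 2032, 5) - years[0]
print("exponential forecast:", np.round(exponential(t_future, *exp_params)))
print("logistic forecast:   ", np.round(logistic(t_future, *log_params)))
```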

Keywords: population growth, population models, exponential model, logistic model, Fibonacci model, Lotka-Volterra model, future population prediction, demographers

Procedia PDF Downloads 88
7768 Feature Analysis of Predictive Maintenance Models

Authors: Zhaoan Wang

Abstract:

Research in predictive maintenance modeling has improved in recent years to predict failures and needed maintenance with high accuracy, saving cost and improving manufacturing efficiency. However, classic prediction models provide little valuable insight into the most important features contributing to the failure. By analyzing and quantifying feature importance in predictive maintenance models, cost saving can be optimized based on business goals. First, multiple classifiers are evaluated with cross-validation to predict the multiple classes of failures. Second, predictive performance with features provided by different feature selection algorithms is further analyzed. Third, features selected by different algorithms are ranked and combined based on their predictive power. Finally, the linear explainer SHAP (SHapley Additive exPlanations) is applied to interpret classifier behavior and provide further insight into the specific roles of features in both local predictions and global model behavior. The results of the experiments suggest that certain features play dominant roles in the predictive models while others have significantly less impact on the overall performance. Moreover, for multi-class prediction of machine failures, the most important features vary with the type of machine failure. The results may lead to improved productivity and cost saving by prioritizing sensor deployment, data collection, and data processing of more important features over less important features.
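
A sketch of the explanation step: a classifier is trained on synthetic machine data and a SHAP linear explainer (as named in the abstract) quantifies per-feature importance. The sensor feature names and data are made up, and the shap package is assumed to be installed.

```python
# Global feature importance from mean absolute SHAP values of a linear classifier.
import numpy as np
import shap
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=8, n_informative=4, random_state=0)
feature_names = [f"sensor_{i}" for i in range(8)]          # hypothetical sensor channels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

explainer = shap.LinearExplainer(clf, X_tr)
shap_values = explainer.shap_values(X_te)

mean_abs = np.abs(shap_values).mean(axis=0)                # global importance per feature
for name, importance in sorted(zip(feature_names, mean_abs), key=lambda x: -x[1]):
    print(f"{name}: {importance:.3f}")
```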

Keywords: automated supply chain, intelligent manufacturing, predictive maintenance, machine learning, feature engineering, model interpretation

Procedia PDF Downloads 105
7767 Gene Names Identity Recognition Using Siamese Network for Biomedical Publications

Authors: Micheal Olaolu Arowolo, Muhammad Azam, Fei He, Mihail Popescu, Dong Xu

Abstract:

As the quantity of biological articles rises, so does the number of biological pathway figures. Each pathway figure shows gene names and relationships. Annotating pathway diagrams manually is time-consuming. Advanced image understanding models could speed up curation, but they must be more precise. There is rich information in biological pathway figures, and the first step in performing image understanding of these figures is to recognize gene names automatically. Classical optical character recognition methods have been employed for gene name recognition, but they are not optimized for literature mining data. This study devised a method to recognize gene-name bounding boxes in figure images using deep Siamese neural network models built on ResNet, DenseNet, and Inception architectures to outperform the existing methods; the results obtained about 84% accuracy.
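
A minimal Siamese-network sketch in PyTorch: two image crops pass through a shared encoder and a contrastive loss pulls same-gene pairs together. The small CNN encoder, random tensors, and labels are placeholders; the study's ResNet/DenseNet/Inception backbones and real pathway-figure crops would replace them.

```python
# Shared-encoder Siamese network trained with a contrastive loss on image pairs.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    def __init__(self, embed_dim=64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1))
        self.fc = nn.Linear(32, embed_dim)

    def forward(self, x):
        return F.normalize(self.fc(self.features(x).flatten(1)), dim=1)

def contrastive_loss(z1, z2, same, margin=1.0):
    d = F.pairwise_distance(z1, z2)
    return (same * d.pow(2) + (1 - same) * F.relu(margin - d).pow(2)).mean()

encoder = Encoder()
optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-3)

# toy batch: 8 pairs of 32x32 grayscale crops with same/different labels
a, b = torch.randn(8, 1, 32, 32), torch.randn(8, 1, 32, 32)
same = torch.randint(0, 2, (8,)).float()

loss = contrastive_loss(encoder(a), encoder(b), same)
loss.backward()
optimizer.step()
print("contrastive loss:", float(loss))
```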

Keywords: biological pathway, gene identification, object detection, Siamese network

Procedia PDF Downloads 243
7766 Dynamic vs. Static Bankruptcy Prediction Models: A Dynamic Performance Evaluation Framework

Authors: Mohammad Mahdi Mousavi

Abstract:

Bankruptcy prediction models have been implemented for the continuous evaluation and monitoring of firms. Given the huge number of bankruptcy models, an extensive number of studies have focused on answering the question of which of these models are superior in performance. In practice, one of the drawbacks of existing comparative studies is that the relative assessment of alternative bankruptcy models remains an exercise that is mono-criterion in nature. Further, a very restricted number of criteria and measures have been applied to compare the performance of competing bankruptcy prediction models. In this research, we overcome these methodological gaps by implementing an extensive range of criteria and measures for comparison between dynamic and static bankruptcy models, and by proposing a multi-criteria framework to compare the relative performance of bankruptcy models in forecasting firm distress for UK firms.

Keywords: bankruptcy prediction, data envelopment analysis, performance criteria, performance measures

Procedia PDF Downloads 219