Search results for: organizing models
5505 Modelling Operational Risk Using Extreme Value Theory and Skew t-Copulas via Bayesian Inference
Authors: Betty Johanna Garzon Rozo, Jonathan Crook, Fernando Moreira
Abstract:
Operational risk losses are heavy tailed and are likely to be asymmetric and exhibit extreme dependence among business lines/event types. We propose a new methodology to assess, in a multivariate way, the asymmetry and extreme dependence between severity distributions, and to calculate the capital for operational risk. This methodology simultaneously uses (i) several parametric distributions and an alternative mixture distribution (the Lognormal for the body of losses and the Generalized Pareto Distribution for the tail) via extreme value theory using SAS®, (ii) the multivariate skew t-copula, applied for the first time to operational losses, and (iii) Bayesian theory to estimate new n-dimensional skew t-copula models via Markov chain Monte Carlo (MCMC) simulation. This paper analyses a new operational loss data set, SAS Global Operational Risk Data [SAS OpRisk], to model operational risk at international financial institutions. All the severity models are constructed in SAS® 9.2 using the procedures PROC SEVERITY and PROC NLMIXED, and this paper focuses on describing this implementation.
Keywords: operational risk, loss distribution approach, extreme value theory, copulas
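To make the spliced severity model concrete, here is a minimal Python sketch of the Lognormal-body/GPD-tail fit. The paper itself works in SAS®; the synthetic losses and the 90th-percentile threshold below are illustrative assumptions, and the skew t-copula/MCMC layer is beyond this sketch.

```python
# A body-tail severity fit: lognormal below a threshold u, GPD above it.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
losses = rng.lognormal(mean=10, sigma=2, size=5000)  # stand-in for operational losses

u = np.quantile(losses, 0.90)                 # tail threshold (assumed: 90th percentile)
body, tail = losses[losses <= u], losses[losses > u]

shape, loc, scale = stats.lognorm.fit(body, floc=0)       # lognormal body
xi, _, beta = stats.genpareto.fit(tail - u, floc=0)       # GPD for exceedances

def severity_cdf(x, p_u=0.90):
    """Spliced CDF: rescaled lognormal below u, GPD-based tail above u."""
    x = np.asarray(x, dtype=float)
    below = p_u * stats.lognorm.cdf(x, shape, loc, scale) / stats.lognorm.cdf(u, shape, loc, scale)
    above = p_u + (1 - p_u) * stats.genpareto.cdf(x - u, xi, 0, beta)
    return np.where(x <= u, below, above)

print(severity_cdf([np.median(losses), u, 10 * u]))
```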
Procedia PDF Downloads 603
5504 Explaining the Impact of Poverty Risk on Frailty Trajectories in Old Age Using Growth Curve Models
Authors: Erwin Stolz, Hannes Mayerl, Anja Waxenegger, Wolfgang Freidl
Abstract:
Research has often found poverty to be associated with adverse health outcomes, but it is unclear which (interplay of) mechanisms actually translate low economic resources into poor physical health. The goal of this study was to assess the impact of educational, material, psychosocial and behavioural factors in explaining the poverty-health association in old age. We analysed 28,360 observations from 11,390 community-dwelling respondents (65+) from the Survey of Health, Ageing and Retirement in Europe (SHARE, 2004-2013, 10 countries). We used multilevel growth curve models to assess the impact of combined income and asset poverty risk on old-age frailty index levels and trajectories. In total, 61.8% of the effect of poverty risk on frailty levels could be explained by direct and indirect effects, highlighting the role of material and particularly psychosocial factors, such as perceived control and social isolation. We suggest strengthening social policy and public health efforts in order to fight poverty and its deleterious effects from an early age on, and to broaden the scope of interventions with regard to psychosocial factors.
Keywords: frailty, health inequality, old age, poverty
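As a rough illustration of the modeling strategy, a two-level growth curve model (random intercept and slope over time, with poverty as a person-level predictor) can be fit in Python as below; the synthetic data and variable names are placeholders, not SHARE variables.

```python
# A minimal multilevel growth curve sketch with statsmodels MixedLM.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_id, n_obs = 300, 3
df = pd.DataFrame({
    "pid": np.repeat(np.arange(n_id), n_obs),
    "years": np.tile(np.arange(n_obs) * 2.0, n_id),          # years since baseline
    "poverty": np.repeat(rng.binomial(1, 0.3, n_id), n_obs),  # combined poverty risk
})
u = np.repeat(rng.normal(0, 0.05, n_id), n_obs)               # person-level intercepts
df["frailty"] = 0.15 + 0.01 * df.years + 0.04 * df.poverty + u + rng.normal(0, 0.03, len(df))

# Random intercept and random slope for time; poverty enters the fixed part
m = smf.mixedlm("frailty ~ years * poverty", df, groups="pid", re_formula="~years").fit()
print(m.summary())
```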
Procedia PDF Downloads 333
5503 Simulation of Flow through Dam Foundation by FEM and ANN Methods, Case Study: Shahid Abbaspour Dam
Authors: Mehrdad Shahrbanozadeh, Gholam Abbas Barani, Saeed Shojaee
Abstract:
In this study, a finite element (Seep3D) model and an artificial neural network (ANN) model were developed to simulate flow through a dam foundation. The Seep3D model is capable of simulating three-dimensional flow through heterogeneous and anisotropic, saturated and unsaturated porous media. Flow through the Shahid Abbaspour dam foundation was used as a case study. The FEM, with 24,960 triangular elements and 28,707 nodes, was applied to model flow through the foundation of this dam, with the mesh made denser in the neighborhood of the curtain screen. The ANN model developed for the Shahid Abbaspour dam is a feedforward four-layer network employing the sigmoid function as an activator and the back-propagation algorithm for network learning. The water level elevations upstream and downstream of the dam were used as input variables and the piezometric heads as the target outputs in the ANN model. The two models were calibrated and verified using the Shahid Abbaspour dam's piezometric data. Results of the models were compared with piezometer measurements and were in good agreement. The model results also revealed that the ANN model performed as well as, and in some cases better than, the FEM.
Keywords: seepage, dam foundation, finite element method, neural network, Seep3D model
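A minimal sketch of the ANN component described above, assuming synthetic data: the paper's network is a four-layer feedforward net with sigmoid activations trained by back-propagation, which scikit-learn's MLPRegressor approximates here. The water-level ranges are placeholder assumptions.

```python
# Feedforward ANN mapping (upstream, downstream) water levels to piezometric head.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform([520.0, 360.0], [540.0, 380.0], size=(200, 2))  # reservoir levels (m)
y = 0.6 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(0, 0.5, 200)     # stand-in piezometric head

ann = MLPRegressor(hidden_layer_sizes=(8, 8),   # two hidden layers -> four layers total
                   activation="logistic",        # sigmoid activator, as in the paper
                   solver="adam", max_iter=5000, random_state=0)
ann.fit(X, y)
print(ann.predict([[530.0, 370.0]]))             # predicted head for given levels
```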
Procedia PDF Downloads 473
5502 Deep-Learning to Generation of Weights for Image Captioning Using Part-of-Speech Approach
Authors: Tiago do Carmo Nogueira, Cássio Dener Noronha Vinhal, Gélson da Cruz Júnior, Matheus Rudolfo Diedrich Ullmann
Abstract:
Generating automatic image descriptions through natural language is a challenging task. Image captioning is a task that describes an image consistently by combining computer vision and natural language processing techniques. To accomplish this task, cutting-edge models use encoder-decoder structures: Convolutional Neural Networks (CNNs) extract the characteristics of the images, and Recurrent Neural Networks (RNNs) generate the descriptive sentences of the images. However, cutting-edge approaches still suffer from generating incorrect captions and accumulating errors in the decoders. To solve this problem, we propose a model based on the encoder-decoder structure, introducing a module that generates weights according to the importance of each word in forming the sentence, using its part-of-speech (PoS). The results demonstrate that our model surpasses state-of-the-art models.
Keywords: gated recurrent units, caption generation, convolutional neural network, part-of-speech
Procedia PDF Downloads 102
5501 Machine Learning-Driven Prediction of Cardiovascular Diseases: A Supervised Approach
Authors: Thota Sai Prakash, B. Yaswanth, Jhade Bhuvaneswar, Marreddy Divakar Reddy, Shyam Ji Gupta
Abstract:
Across the globe there are many chronic diseases, and heart disease stands out as one of the most perilous. Sadly, many lives are lost to this condition, even though early intervention could prevent such tragedies. However, identifying heart disease in its initial stages is not easy. To address this challenge, we propose an automated system aimed at predicting the presence of heart disease using advanced techniques. By doing so, we hope to empower individuals with the knowledge needed to take proactive measures against this potentially fatal illness. Our approach involves meticulous data preprocessing and the development of predictive models utilizing classification algorithms such as Support Vector Machines (SVM), Decision Tree, and Random Forest. We assess the efficiency of every model based on metrics like accuracy, ensuring that we select the most reliable option. Additionally, we conduct thorough data analysis to reveal the importance of different attributes. Among the models considered, Random Forest emerges as the standout performer with an accuracy rate of 96.04% in our study.
Keywords: support vector machines, decision tree, random forest
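A hedged sketch of this model-comparison workflow follows; the dataset is a placeholder (the authors' heart-disease data and the 96.04% figure are not reproduced here), and cross-validated accuracy stands in for their evaluation.

```python
# Compare SVM, Decision Tree, and Random Forest by cross-validated accuracy.
from sklearn.datasets import load_breast_cancer  # placeholder tabular dataset
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
models = {
    "SVM": make_pipeline(StandardScaler(), SVC()),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Random Forest": RandomForestClassifier(random_state=0),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name}: {acc:.4f}")
```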
Procedia PDF Downloads 40
5500 Comparative Analysis of Predictive Models for Customer Churn Prediction in the Telecommunication Industry
Authors: Deepika Christopher, Garima Anand
Abstract:
To determine the best model for churn prediction in the telecom industry, this paper compares 11 machine learning algorithms, namely Logistic Regression, Support Vector Machine, Random Forest, Decision Tree, XGBoost, LightGBM, CatBoost, AdaBoost, Extra Trees, Deep Neural Network, and a Hybrid Model (MLPClassifier). It also aims to pinpoint the top three factors that lead to customer churn and conducts customer segmentation to identify vulnerable groups. According to the data, the Logistic Regression model performs the best, with an F1 score of 0.6215, 81.76% accuracy, 68.95% precision, and 56.57% recall. The top three attributes that cause churn are found to be tenure, Internet Service Fiber optic, and Internet Service DSL; the three best-performing models in this article are Logistic Regression, Deep Neural Network, and AdaBoost. The K-means algorithm is applied to establish and analyze four different customer clusters. This study has effectively identified customers at risk of churn, and its findings may be utilized to develop and execute strategies that lower customer attrition.
Keywords: attrition, retention, predictive modeling, customer segmentation, telecommunications
Procedia PDF Downloads 57
5499 A Predictive MOC Solver for Water Hammer Waves Distribution in Network
Authors: A. Bayle, F. Plouraboué
Abstract:
Water distribution networks (WDNs) still suffer from a lack of knowledge about the prediction of fast pressure transient events, although such events may considerably impact their durability. Accidental or planned operating activities give rise to complex pressure interactions and may drastically modify local pressure values, generating leaks and, in rare cases, pipe breaks. In this context, a numerical predictive analysis is conducted to prevent such events and optimize network management. A coupled Python/FORTRAN 90 in-house software package has been developed, using the Method of Characteristics (MOC) to solve the water-hammer equations. The solver is validated by direct comparison with theoretical and experimental measurements in simple configurations and afterwards extended to network analysis. The algorithm's most costly steps are designed for parallel computation. A varied set of boundary conditions and energetic loss models is considered for the network simulations. The results are analyzed in both the time and frequency domains and provide crucial information on the pressure distribution behavior within the network.
Keywords: energetic losses models, method of characteristics, numerical predictive analysis, water distribution network, water hammer
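For orientation, here is a compact sketch of the MOC interior-node update for a single pipe under standard assumptions (constant wave speed, Darcy-Weisbach friction); the pipe parameters are placeholders, and boundary conditions and network junctions, central to the paper's solver, are omitted.

```python
# One MOC time step: C+ and C- compatibility equations at interior nodes.
import numpy as np

g, a = 9.81, 1000.0             # gravity (m/s^2), wave speed (m/s) -- assumed
L, D, f = 1000.0, 0.5, 0.02     # pipe length, diameter, friction factor -- assumed
n = 51                          # grid nodes
dx = L / (n - 1)
dt = dx / a                     # Courant number C = 1
A = np.pi * D**2 / 4
B = a / (g * A)                 # characteristic impedance
R = f * dx / (2 * g * D * A**2) # friction resistance coefficient

H = np.full(n, 100.0)           # initial head (m)
Q = np.full(n, 0.1)             # initial flow (m^3/s)

def moc_step(H, Q):
    Hn, Qn = H.copy(), Q.copy()
    Cp = H[:-2] + B * Q[:-2] - R * Q[:-2] * np.abs(Q[:-2])  # along C+ from node i-1
    Cm = H[2:]  - B * Q[2:]  + R * Q[2:]  * np.abs(Q[2:])   # along C- from node i+1
    Hn[1:-1] = 0.5 * (Cp + Cm)
    Qn[1:-1] = (Cp - Cm) / (2 * B)
    return Hn, Qn               # boundary nodes set by reservoir/valve models (not shown)

H, Q = moc_step(H, Q)
```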
Procedia PDF Downloads 232
5498 The Musician as the Athlete: Psychological Response to Injury
Authors: Shulamit Sternin
Abstract:
Athletes experience injuries that can have both a physical and psychological impact on the individual. In such instances, athletes are able to rely on the established field of sports psychology to facilitate holistic rehabilitation. Musicians, like athletes, rely on their bodies to perform in much the same way athletes do and are also susceptible to injury. Because succeeding as an athlete or a musician is similarly performative in nature, these careers share many of the same primary psychological concerns, and it is therefore reasonable that athletes and musicians may require similar rehabilitation post-injury. However, musicians face their own unique psychological challenges, and understanding the needs of an injured athlete can serve as a foundation for understanding the injured musician but is not enough to fully rehabilitate an injured musician. The current research surrounding musicians and their injuries is primarily focused on physiological aspects of injury and rehabilitation; the psychological aspects have not yet received adequate attention, resulting in poor musician rehabilitation post-injury. This review paper uses current models of psychological response to injury in athletes to draw parallels with the psychological response to injury in musicians. Search engines such as Medline and PsycInfo were systematically searched using specific key words, such as psychological response, injury, athlete, and musician. Studies that focused on the post-injury psychology of either the musician or the athlete were included. Within the literature there is evidence to support psychological responses, unique to the musician, that are not accounted for by current models of response in athletes. The models of psychological response to injury in athletes are inadequate tools for application to the musician. Future directions for performing arts research that can fill the gaps in our understanding and modeling of musicians' response to injury are discussed. A better understanding of the psychological impact of injuries on musicians holds significant implications for health care practitioners working with injured musicians. Understanding the unique barriers musicians face post-injury, and how support for this population must be tailored to properly suit musicians' needs, will aid in more holistic rehabilitation and a higher likelihood of musicians returning to pre-injury performance levels.
Keywords: athlete, injury, musician, psychological response
Procedia PDF Downloads 205
5497 The Impact and Performances of Controlled Ventilation Strategy on Thermal Comfort and Indoor Atmosphere in Building
Authors: Selma Bouasria, Mahi Abdelkader, Abbès Azzi, Herouz Keltoum
Abstract:
Ventilation in buildings is a key element in providing high indoor air quality. Its efficiency appears as one of the most important factors in maintaining thermal comfort for occupants of buildings. Personal displacement ventilation is a new ventilation concept that combines the positive features of displacement ventilation with those of task conditioning or personalized ventilation. This work aims to study numerically the supply air flow in a room to optimize a comfortable microclimate for an occupant. The room is heated, and a dummy is designed to simulate the occupant. Two configurations were studied: the first consists of a room without windows, and the second is a room equipped with a window. The influence of the blowing speed and the solar radiation coming from the window on the thermal comfort of the occupant is studied. To conduct this study we used three turbulence models, namely the high-Reynolds k-ε, RNG, and SST models. The numerical tool used is based on the finite volume method. The numerical simulation of the supply air flow in a room can predict and provide significant information about indoor comfort.
Keywords: room, comfort, thermal, ventilation, internal environment
Procedia PDF Downloads 412
5496 Effects of Directivity and Fling Step on Buildings Equipped with J-Hook Sandwich Composite Walls and Reinforced Concrete Shear Walls
Authors: Majid Saaly, Shahriar Tavousi Tafreshi, Mehdi Nazari Afshar
Abstract:
Structural systems with sandwich composite walls (SCSSC) are very popular due to their ductility and capacity to absorb more energy than standard reinforced concrete shear walls. Applications of this enhanced system in high-rise buildings, nuclear power plant facilities, and bridge slabs are numerous. SCSSCs showed acceptable seismic performance under experimental tests and cyclic loading from the points of view of in-plane and out-of-plane shear and flexural interaction, in-plane punching shear, and compressive behavior. The use of sandwich composite walls with J-hook connectors has a significant effect on energy dissipation and the reduction of dynamic responses of mid-rise and high-rise structural models. By changing the building's system from SW to SCWJ, the maximum inter-story drift values of ten- and fifteen-story models are reduced by up to 25% and 35%, respectively.
Keywords: J-hook sandwich composite walls, fling step, directivity, IDA analyses, fractile curves
Procedia PDF Downloads 156
5495 Research and Application of Multi-Scale Three Dimensional Plant Modeling
Authors: Weiliang Wen, Xinyu Guo, Ying Zhang, Jianjun Du, Boxiang Xiao
Abstract:
Reconstructing and analyzing three-dimensional (3D) models from in situ measured data is important for a number of research areas and applications in plant science, including plant phenotyping, functional-structural plant modeling (FSPM), plant germplasm resources protection, and agricultural technology popularization. Plant modeling spans many scales, from micro to macroscopic: cell, tissue, organ, plant, and canopy. The techniques currently used for data capture, feature analysis, and 3D reconstruction differ considerably across these scales. In this context, morphological data acquisition and 3D analysis and modeling of plants at different scales are introduced systematically, together with the data capture equipment commonly used at each scale. Hot issues and difficulties at the different scales are then described. Some examples are also given, such as micron-scale phenotyping quantification and 3D microstructure reconstruction of vascular bundles within maize stalks based on micro-CT scanning; 3D reconstruction of leaf surfaces and feature extraction from point clouds acquired with a 3D handheld scanner; and plant modeling by combining parameter-driven 3D organ templates. Several application examples using the 3D models and analysis results of plants are also introduced. A 3D maize canopy was constructed and light distribution was simulated within the canopy, which was used for the design of ideal plant types. A grape tree model was constructed from 3D digital and point cloud data, which was used to produce scientific content for the 11th International Conference on Grapevine Breeding and Genetics. Using the tissue models of plants, Google Glass was used to look around visually inside the plant to understand its internal structure. With the development of information technology, 3D data acquisition and data processing techniques will play a greater role in plant science.
Keywords: plant, three dimensional modeling, multi-scale, plant phenotyping, three dimensional data acquisition
Procedia PDF Downloads 277
5494 The Relationship between Organizations' Acquired Skills, Knowledge, Abilities and Shareholders (SKAS) Wealth Maximization: The Mediating Role of Training Investment
Authors: Gabriel Dwomoh, Williams Kwasi Boachie, Kofi Kwarteng
Abstract:
The study looked at the relationship between organizations' acquired knowledge, skills, and abilities and shareholders' wealth, with training playing the mediating role. The sample consisted of organizations that spent 10% or more of their annual budget on training and those whose training budget was less than 10% of the annual budget. A total of 620 questionnaires were distributed to employees working in various organizations, out of which 580, representing 93.5%, were retrieved. The respondents that constitute the sample were drawn using convenience sampling. The researchers used regression models for their analyses with the help of SPSS 16.0. Analyzing multiple models, it was discovered that organizations' training investment has a considerable direct and indirect effect, with partial mediation, on the relationship between organizations' acquired skills, knowledge, and abilities and shareholders' wealth. Shareholders should allow their agents to invest part of their holdings to develop the human capital of the organization, but this should be done with caution, since shareholders' returns do not depend much on how much organizations spend on developing their human resource capital.
Keywords: skills, knowledge, abilities, shareholders wealth, training investment
Procedia PDF Downloads 240
5493 Count Data Regression Modeling: An Application to Spontaneous Abortion in India
Authors: Prashant Verma, Prafulla K. Swain, K. K. Singh, Mukti Khetan
Abstract:
Objective: In India, around 20,000 women die every year due to abortion-related complications. In the modelling of count variables, there is sometimes a preponderance of zero counts. This article concerns the estimation of various count regression models to predict the average number of spontaneous abortions among women in the Punjab state of India. It also assesses the factors associated with the number of spontaneous abortions. Materials and methods: The study included 27,173 married women of Punjab obtained from the DLHS-4 survey (2012-13). Poisson regression (PR), negative binomial (NB) regression, zero hurdle negative binomial (ZHNB), and zero-inflated negative binomial (ZINB) models were employed to predict the average number of spontaneous abortions and to identify the determinants affecting it. Results: Statistical comparisons among the four estimation methods revealed that the ZINB model provides the best prediction for the number of spontaneous abortions. Antenatal care (ANC) place, place of residence, total children born to a woman, woman's education, and economic status were found to be the most significant factors affecting the occurrence of spontaneous abortion. Conclusions: The study offers a practical demonstration of techniques designed to handle count variables. Statistical comparisons among the four estimation models revealed that the ZINB model provided the best prediction for the number of spontaneous abortions and is recommended for this purpose. The study suggests that women receive institutional antenatal care to attain limited parity. It also advocates promoting higher education among women in Punjab, India.
Keywords: count data, spontaneous abortion, Poisson model, negative binomial model, zero hurdle negative binomial, zero-inflated negative binomial, regression
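A hedged sketch of fitting and comparing such count models in Python follows, using statsmodels on synthetic data (the DLHS-4 covariates are not reproduced, and the zero hurdle variant is omitted for brevity).

```python
# Poisson, NB, and ZINB fits compared by AIC on zero-inflated synthetic counts.
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

rng = np.random.default_rng(0)
n = 2000
X = sm.add_constant(rng.normal(size=(n, 2)))     # stand-in covariates
mu = np.exp(0.2 + 0.3 * X[:, 1])
y = rng.poisson(mu) * rng.binomial(1, 0.6, n)    # excess zeros by construction

poisson = sm.Poisson(y, X).fit(disp=0)
negbin = sm.NegativeBinomial(y, X).fit(disp=0)
zinb = ZeroInflatedNegativeBinomialP(y, X, exog_infl=X).fit(disp=0, maxiter=200)

for name, res in [("Poisson", poisson), ("NB", negbin), ("ZINB", zinb)]:
    print(name, round(res.aic, 1))               # lower AIC is better
```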
Procedia PDF Downloads 155
5492 Investigations into the in situ Enterococcus faecalis Biofilm Removal Efficacies of Passive and Active Sodium Hypochlorite Irrigant Delivered into Lateral Canal of a Simulated Root Canal Model
Authors: Saifalarab A. Mohmmed, Morgana E. Vianna, Jonathan C. Knowles
Abstract:
The issue of apical periodontitis has received considerable critical attention. Bacteria integrate into communities, attach to surfaces, and consequently form biofilms. The biofilm structure provides bacteria with a series of protective mechanisms against antimicrobial agents and enhances pathogenicity (e.g. apical periodontitis). Sodium hypochlorite (NaOCl) has become the irrigant of choice for the elimination of bacteria from the root canal system based on its antimicrobial properties. The aim of the study was to investigate the effect of different agitation techniques on the efficacy of 2.5% NaOCl in eliminating biofilm from the surface of the lateral canal, using residual biofilm and biofilm removal rate as outcome measures. The effect of canal complexity (lateral canal) on the efficacy of the irrigation procedure was also assessed. Forty root canal models (n = 10 per group) were manufactured using 3D printing and resin materials. Each model consisted of two halves of an 18 mm length root canal with apical size 30 and taper 0.06, and a lateral canal of 3 mm length and 0.3 mm diameter located 3 mm from the apical terminus. E. faecalis biofilms were grown on the apical 3 mm and lateral canal of the models for 10 days in Brain Heart Infusion broth. Biofilms were stained using crystal violet for visualisation. The model halves were reassembled, attached to an apparatus and tested under a fluorescence microscope. A syringe-and-needle irrigation protocol was performed using 9 mL of 2.5% NaOCl irrigant for 60 seconds. The irrigant was either left stagnant in the canal or activated for 30 seconds using manual (gutta-percha), sonic and ultrasonic methods. Images were then captured every second using an external camera. The percentages of residual biofilm were measured using image analysis software. The data were analysed using generalised linear mixed models. The greatest removal was associated with the ultrasonic group (66.76%), followed by the sonic (45.49%), manual (43.97%), and passive irrigation (control) (38.67%) groups, respectively. No marked reduction in the efficiency of NaOCl to remove biofilm was found between the simple and complex anatomy models (p = 0.098). The removal efficacy of NaOCl on the biofilm was limited to the 1 mm level of the lateral canal. Agitation of NaOCl results in better penetration of the irrigant into the lateral canals, and ultrasonic agitation improved the removal of bacterial biofilm.
Keywords: 3D printing, biofilm, root canal irrigation, sodium hypochlorite
Procedia PDF Downloads 229
5491 Improving Chest X-Ray Disease Detection with Enhanced Data Augmentation Using Novel Approach of Diverse Conditional Wasserstein Generative Adversarial Networks
Authors: Malik Muhammad Arslan, Muneeb Ullah, Dai Shihan, Daniyal Haider, Xiaodong Yang
Abstract:
Chest X-rays are instrumental in the detection and monitoring of a wide array of diseases, including viral infections such as COVID-19, tuberculosis, pneumonia, lung cancer, and various cardiac and pulmonary conditions. To enhance the accuracy of diagnosis, artificial intelligence (AI) algorithms, particularly deep learning models like Convolutional Neural Networks (CNNs), are employed. However, these deep learning models demand a substantial and varied dataset to attain optimal precision. Generative Adversarial Networks (GANs) can be employed to create new data, thereby supplementing the existing dataset and enhancing the accuracy of deep learning models. Nevertheless, GANs have their limitations, such as issues related to stability, convergence, and the ability to distinguish between authentic and fabricated data. In order to overcome these challenges and advance the detection and classification of normal and abnormal CXR images, this study introduces a distinctive technique known as DCWGAN (Diverse Conditional Wasserstein GAN) for generating synthetic chest X-ray (CXR) images. The study evaluates the effectiveness of the DCWGAN technique using the ResNet50 model and compares its results with those obtained using the traditional GAN approach. The findings reveal that the ResNet50 model trained on the DCWGAN-generated dataset outperformed the model trained on the classic GAN-generated dataset. Specifically, the ResNet50 model utilizing DCWGAN synthetic images achieved impressive performance metrics, with an accuracy of 0.961, precision of 0.955, recall of 0.970, and F1-measure of 0.963. These results indicate the promising potential of this approach for the early detection of diseases in CXR images.
Keywords: CNN, classification, deep learning, GAN, ResNet50
Procedia PDF Downloads 88
5490 Parameter Tuning of Complex Systems Modeled in Agent Based Modeling and Simulation
Authors: Rabia Korkmaz Tan, Şebnem Bora
Abstract:
The major problem encountered when modeling complex systems with agent-based modeling and simulation techniques is the existence of large parameter spaces. A complex system model cannot be expected to reflect the whole of the real system, but by specifying the most appropriate parameters, the actual system can be represented by the model under certain conditions. A review of studies conducted in recent years shows that there are few studies on the parameter tuning problem in agent-based simulations, and these studies have focused on tuning the parameters of a single model. In this study, a parameter tuning approach is proposed using metaheuristic algorithms such as the Genetic Algorithm (GA), Particle Swarm Optimization (PSO), Artificial Bee Colony (ABC), and Firefly (FA) algorithms. With this hybrid-structured study, the parameter tuning problems of models in different fields were solved. The new approach was tested on two different models, and its achievements on the different problems were compared. The simulations and the results reveal that the proposed approach outperforms existing parameter tuning studies.
Keywords: parameter tuning, agent based modeling and simulation, metaheuristic algorithms, complex systems
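To illustrate the tuning idea with one of the named metaheuristics, here is a minimal PSO sketch that searches a model's parameter space so its output matches reference data; `run_simulation` and `target` are hypothetical stand-ins for an actual agent-based model run and observed system behaviour.

```python
# Particle Swarm Optimization over a 2-parameter simulation model.
import numpy as np

def run_simulation(params):                  # placeholder for the ABM
    return params[0] * 2.0 + params[1] ** 2

target = 7.0                                 # observed system behaviour (assumed)

def fitness(params):                         # discrepancy to minimize
    return abs(run_simulation(params) - target)

rng = np.random.default_rng(0)
n_particles, dim, iters = 30, 2, 100
pos = rng.uniform(-5, 5, (n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    vals = np.array([fitness(p) for p in pos])
    better = vals < pbest_val
    pbest[better], pbest_val[better] = pos[better], vals[better]
    gbest = pbest[pbest_val.argmin()].copy()

print("tuned parameters:", gbest)
```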
Procedia PDF Downloads 226
5489 A Hierarchical Bayesian Calibration of Data-Driven Models for Composite Laminate Consolidation
Authors: Nikolaos Papadimas, Joanna Bennett, Amir Sakhaei, Timothy Dodwell
Abstract:
Composite modeling of consolidation processes plays an important role in process and part design by indicating the formation of possible unwanted defects prior to expensive experimental iterative trial-and-development programs. Composite materials in their uncured state display complex constitutive behavior, which has received much academic interest, with different models proposed. Modeling and statistical errors which arise from this fitting will propagate through any simulation in which the material model is used. A general hyperelastic polynomial representation was proposed, which can be readily implemented in various nonlinear finite element packages; in our case, FEniCS was chosen. The coefficients are assumed uncertain, and therefore the distribution of parameters is learned using Markov chain Monte Carlo (MCMC) methods. In engineering, the approach often followed is to select a single set of model parameters which, on average, best fits a set of experiments. There are good statistical reasons why this is not a rigorous approach to take. To overcome these challenges, a hierarchical Bayesian framework was proposed, in which the population distribution of model parameters is inferred from an ensemble of experimental tests. The resulting sampled distribution of hyperparameters is approximated using maximum entropy methods so that the distribution of samples can be readily sampled when embedded within a stochastic finite element simulation. The methodology is validated and demonstrated on a set of consolidation experiments of AS4/8852 with various stacking sequences. The resulting distributions are then applied to stochastic finite element simulations of the consolidation of curved parts, leading to a distribution of possible model outputs. With this, the paper, as far as the authors are aware, represents the first stochastic finite element implementation in composite process modelling.
Keywords: data-driven, material consolidation, stochastic finite elements, surrogate models
Procedia PDF Downloads 146
5488 Optimum Design of Alkali Activated Slag Concretes for Low Chloride Ion Permeability and Water Absorption Capacity
Authors: Müzeyyen Balçikanli, Erdoğan Özbay, Hakan Tacettin Türker, Okan Karahan, Cengiz Duran Atiş
Abstract:
In this research, the effects of curing time (TC), curing temperature (CT), sodium concentration (SC) and silicate modulus (SM) on the compressive strength, chloride ion permeability, and water absorption capacity of alkali activated slag (AAS) concretes were investigated. The best possible combination of TC, CT, SC and SM was determined to maximize compressive strength while minimizing chloride ion permeability and water absorption capacity of AAS concretes. An experimental program was conducted using the central composite design method. The alkali solution-slag ratio was kept constant at 0.53 in all mixtures. The effects of the independent parameters were characterized and analyzed using statistically significant quadratic regression models on the measured properties (dependent parameters). The proposed regression models are valid for AAS concretes with SC from 0.1% to 7.5%, SM from 0.4 to 3.2, CT from 20 °C to 94 °C and TC from 1.2 hours to 25 hours. The results of testing and analysis indicate that the most effective parameter for compressive strength, chloride ion permeability and water absorption capacity is the sodium concentration.
Keywords: alkali activation, slag, rapid chloride permeability, water absorption capacity
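The statistical side of this design can be sketched as a quadratic (second-order) response surface over the four factors; the synthetic data below are placeholders, not the paper's measurements, and only the compressive-strength response is shown.

```python
# Quadratic response-surface fit over TC, CT, SC, SM (central composite style).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
# Columns: TC (h), CT (deg C), SC (%), SM -- ranges taken from the validity domain above
X = rng.uniform([1.2, 20, 0.1, 0.4], [25, 94, 7.5, 3.2], size=(30, 4))
strength = 20 + 2.5 * X[:, 2] - 0.15 * X[:, 2] ** 2 + rng.normal(0, 1, 30)  # toy response

quad = PolynomialFeatures(degree=2, include_bias=False)   # linear + interaction + squared terms
model = LinearRegression().fit(quad.fit_transform(X), strength)
print(model.predict(quad.transform([[12.0, 60.0, 4.0, 1.5]])))
```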
Procedia PDF Downloads 312
5487 Recognition of Gene Names from Gene Pathway Figures Using Siamese Network
Authors: Muhammad Azam, Micheal Olaolu Arowolo, Fei He, Mihail Popescu, Dong Xu
Abstract:
The number of biological papers is growing quickly, which means that the number of biological pathway figures in those papers is also increasing quickly. Each pathway figure conveys extensive biological information, such as the names of genes and how the genes are related. However, manually annotating pathway figures takes a lot of time and work. Even though using advanced image understanding models could speed up the process of curation, these models still need to be made more accurate. To improve gene name recognition from pathway figures, we applied a Siamese network to map image segments to a library of pictures containing known genes, in a similar way to person recognition from photos in many photo applications. We used a triplet loss function and a triplet spatial pyramid pooling network, combining a triplet convolutional neural network with spatial pyramid pooling (TSPP-Net). We compared VGG19 and VGG16 as the Siamese network backbone. VGG16 achieved better performance, with an accuracy of 93%, which is much higher than OCR results.
Keywords: biological pathway, image understanding, gene name recognition, object detection, Siamese network, VGG
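The triplet loss at the heart of this Siamese setup compares embeddings of an anchor image segment, a matching gene picture, and a non-matching one; a minimal sketch follows, with toy embeddings standing in for the VGG-backbone features.

```python
# Triplet hinge loss: push d(anchor, positive) + margin below d(anchor, negative).
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    d_pos = np.sum((anchor - positive) ** 2)   # squared distance to the match
    d_neg = np.sum((anchor - negative) ** 2)   # squared distance to the non-match
    return max(0.0, d_pos - d_neg + margin)

rng = np.random.default_rng(0)
a, p, n = rng.normal(size=(3, 128))            # toy 128-d embeddings
print(triplet_loss(a, p, n))
```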
Procedia PDF Downloads 291
5486 A Comparison of Convolutional Neural Network Architectures for the Classification of Alzheimer's Disease Patients Using MRI Scans
Authors: Tomas Premoli, Sareh Rowlands
Abstract:
In this study, we investigate the impact of various convolutional neural network (CNN) architectures on the accuracy of diagnosing Alzheimer's disease (AD) using patient MRI scans. Alzheimer's disease is a debilitating neurodegenerative disorder that affects millions worldwide. Early, accurate, and non-invasive diagnostic methods are required for providing optimal care and symptom management. Deep learning techniques, particularly CNNs, have shown great promise in enhancing this diagnostic process. We aim to contribute to the ongoing research in this field by comparing the effectiveness of different CNN architectures and providing insights for future studies. Our methodology involved preprocessing MRI data, implementing multiple CNN architectures, and evaluating the performance of each model. We employed intensity normalization, linear registration, and skull stripping for our preprocessing. The selected architectures included VGG, ResNet, and DenseNet models, all implemented using the Keras library. We employed transfer learning and trained models from scratch to compare their effectiveness. Our findings demonstrated significant differences in performance among the tested architectures, with DenseNet201 achieving the highest accuracy of 86.4%. Transfer learning proved to be helpful in improving model performance. We also identified potential areas for future research, such as experimenting with other architectures, optimizing hyperparameters, and employing fine-tuning strategies. By providing a comprehensive analysis of the selected CNN architectures, we offer a solid foundation for future research in Alzheimer's disease diagnosis using deep learning techniques. Our study highlights the potential of CNNs as a valuable diagnostic tool and emphasizes the importance of ongoing research to develop more accurate and effective models.
Keywords: Alzheimer's disease, convolutional neural networks, deep learning, medical imaging, MRI
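A hedged sketch of the transfer-learning setup using the Keras API named above; the input shape, frozen-base strategy, and binary head are assumptions for illustration, and the MRI data pipeline is not shown.

```python
# DenseNet201 transfer learning with a frozen ImageNet base and a small head.
import tensorflow as tf

base = tf.keras.applications.DenseNet201(weights="imagenet", include_top=False,
                                         input_shape=(224, 224, 3))
base.trainable = False                          # freeze pretrained features first

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # AD vs. control (binary head)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(train_mri_slices, labels, ...)  # data loading omitted
```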
Procedia PDF Downloads 73
5485 Performance Improvement of Information System of a Banking System Based on Integrated Resilience Engineering Design
Authors: S. H. Iranmanesh, L. Aliabadi, A. Mollajan
Abstract:
Integrated resilience engineering (IRE) is capable of returning banking systems to the normal state under adverse economic circumstances. In this study, the information system of a large bank (with several branches) is assessed and optimized under severe economic conditions. Data envelopment analysis (DEA) models are employed to achieve the objective of this study. Nine IRE factors are considered to be the outputs, and a dummy variable is defined as the input of the DEA models. A standard questionnaire is designed and distributed among executive managers, who are considered the decision-making units (DMUs). Reliability and validity of the questionnaire are examined based on Cronbach's alpha and the t-test. The most appropriate DEA model is determined based on average efficiency and a normality test. It is shown that the proposed integrated design provides higher efficiency than the conventional RE design. Results of sensitivity and perturbation analysis indicate that self-organization, fault tolerance, and reporting culture respectively compose about 50 percent of the total weight.
Keywords: banking system, data envelopment analysis (DEA), integrated resilience engineering (IRE), performance evaluation, perturbation analysis
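The DEA formulation behind such an assessment can be sketched as a small linear program; the single dummy input and nine outputs below mirror the design described above, with random scores standing in for the questionnaire data.

```python
# Input-oriented CCR DEA via linear programming (a sketch on synthetic DMU scores).
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n_dmu, n_out = 20, 9
X = np.ones((n_dmu, 1))                  # dummy input for every DMU
Y = rng.uniform(1, 5, (n_dmu, n_out))    # stand-in IRE factor scores

def ccr_efficiency(j0):
    """min theta s.t. a lambda-composite of DMUs dominates DMU j0."""
    m, s = X.shape[1], Y.shape[1]
    c = np.r_[1.0, np.zeros(n_dmu)]                  # objective: minimize theta
    A_in = np.c_[-X[j0].reshape(m, 1), X.T]          # sum(lambda * x) <= theta * x0
    A_out = np.c_[np.zeros((s, 1)), -Y.T]            # sum(lambda * y) >= y0
    b_ub = np.r_[np.zeros(m), -Y[j0]]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=b_ub,
                  bounds=[(0, None)] * (1 + n_dmu))
    return res.x[0]

print([round(ccr_efficiency(j), 3) for j in range(3)])
```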
Procedia PDF Downloads 188
5484 Exploration and Evaluation of the Effect of Multiple Countermeasures on Road Safety
Authors: Atheer Al-Nuaimi, Harry Evdorides
Abstract:
Every day many people die or are disabled or injured on roads around the world, which necessitates more specific treatments for transportation safety issues. The International Road Assessment Program (iRAP) model is one of the comprehensive road safety models, accounting for many factors that affect road safety in a cost-effective way in low- and middle-income countries. In the iRAP model, road safety is divided into five star ratings, from 1 star (the lowest level) to 5 stars (the highest level). These star ratings are based on a star rating score, which is calculated by the iRAP methodology depending on road attributes, traffic volumes and operating speeds. The outcomes of the iRAP methodology are the treatments that can be used to improve road safety and reduce fatality and serious injury (FSI) numbers. These countermeasures can be used separately, as a single countermeasure, or mixed, as multiple countermeasures, for a location. There is general agreement that the effectiveness of a countermeasure is liable to consistent reductions when it is utilized in combination with other countermeasures; that is, crash reduction estimates of individual countermeasures cannot simply be added together. The iRAP methodology therefore makes use of multiple countermeasure adjustment factors to predict reductions in the effectiveness of road safety countermeasures when more than one countermeasure is chosen. Multiple countermeasure correction factors are computed for every 100-meter segment and for every crash type. However, limitations of this methodology include a probable over-estimation of the predicted crash reduction. This study aims to adjust this correction factor by developing new models to calculate the effect of using multiple countermeasures on the number of fatalities for a location or an entire road. Regression models were used to establish relationships between crash frequencies and the factors that affect their rates. Multiple linear regression, negative binomial regression, and Poisson regression techniques were used to develop models that can address the effectiveness of using multiple countermeasures. Analyses conducted using The R Project for Statistical Computing showed that a model developed with the negative binomial regression technique gives more reliable predictions of the number of fatalities after the implementation of multiple road safety countermeasures than the iRAP model. The results also showed that the negative binomial regression approach gives more precise results than the multiple linear and Poisson regression techniques because of overdispersion and standard error issues.
Keywords: international road assessment program, negative binomial, road multiple countermeasures, road safety
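The study's analysis is in R, but the negative binomial crash-frequency model can be sketched in Python as below; the covariates (traffic volume and number of countermeasures applied) are synthetic stand-ins for iRAP road attributes.

```python
# Negative binomial regression of fatality counts on road attributes.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
n_cm = rng.integers(0, 4, n)                        # number of countermeasures applied
aadt = rng.uniform(1000, 20000, n)                  # traffic volume
X = sm.add_constant(np.column_stack([np.log(aadt), n_cm]))
mu = np.exp(-6 + 1.0 * np.log(aadt) - 0.3 * n_cm)   # fewer FSI with more countermeasures
y = rng.negative_binomial(2, 2 / (2 + mu))          # overdispersed counts with mean mu

nb = sm.NegativeBinomial(y, X).fit(disp=0)
print(nb.params)   # exp(coef of n_cm) ~ multiplicative effect per added countermeasure
```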
Procedia PDF Downloads 240
5483 An Investigation into the Ideological Facets Involved in Western Interpretations of the History of Communism
Authors: Anna Stoutenburg
Abstract:
With the rise of the so-called 'new left' within the United States, marked by social democratic figures such as Bernie Sanders and Alexandria Ocasio-Cortez, significant questions have been raised in response to those who would identify with the term 'socialist'. These queries typically revolve around the negatively perceived legacy of past and present countries that share the term in question, with the stark conclusion that not only is socialism a structure that does not work economically, but that it also tends to inflict more harm on those living under it than would be endured in a country functioning under capitalism. In order to examine these claims, the goal of this paper is to examine the legacy of anti-communist historiography in a western context, with the Union of Soviet Socialist Republics, China, and modern Venezuela used as case studies for how this phenomenon operates. Not only will key portions of each nation's history be re-examined, but there will also be a critical analysis of source cultivation and usage among western historians. The intent of this paper is not merely to deride previous attempts at historicizing and reporting on the events of these nations, but rather to attempt to glean a clearer picture that is free from anti-communist sentiments. Theoretical works that will be consulted in order to define this project are 'The Historiography of Communism' by Michael Brown, as well as 'Beyond Philosophy: Ethics, History, Marxism, and Liberation Theology' by Enrique Dussel. The latter will provide insights concerning why these questions are relevant in a larger context, namely by articulating how the means of liberation understood through an analectic method can be achieved structurally. For a majority of leftists, this question is integral: by using history as a tool, the ways that political organizing can be deployed are better understood, bridging the gap between the common assumption that the communist legacy is a dire one and the idea that it is something which needs to be completely lauded.
Keywords: anti-communism, history, ideology, Marxism
Procedia PDF Downloads 123
5482 What 4th-Year Primary-School Students are Thinking: A Paper Airplane Problem
Authors: Neslihan Şahin Çelik, Ali Eraslan
Abstract:
In recent years, mathematics educators have frequently stressed the necessity of instructing students about models and modeling approaches that encompass cognitive and metacognitive thought processes, starting from the first years of school and continuing on through the years of higher education. The purpose of this study is to examine the thought processes of 4th-grade primary school students in their modeling activities and to explore the difficulties encountered in these processes, if any. The study, of qualitative design, was conducted in the 2015-2016 academic year at a public state school located in a central city in the Black Sea Region of Turkey. A preliminary study was first implemented with designated 4th-grade students, after which the criterion sampling method was used to select three students that would be recruited into the focus group. The focus group thus formed was asked to work on the model-eliciting activity of the Paper Airplane Problem, and the entire process was recorded on video. The Paper Airplane Problem required the students to determine the winner with respect to: (a) the plane that stays in the air for the longest time; (b) the plane that travels the greatest distance in a straight-line path; and (c) the overall winner of the contest. A written transcript was made of the video recording, after which the recording and the students' worksheets were analyzed using the Blum and Ferri modeling cycle. The results of the study revealed that the students tested the hypotheses related to daily life that they had set up, generated ideas of their own, verified their models by making connections with real life, and tried to make their models generalizable. On the other hand, the students had some difficulties in interpreting the table of data and in their ways of operating on the data during the modeling processes.
Keywords: primary school students, model eliciting activity, mathematical modeling, modeling process, paper airplane problem
Procedia PDF Downloads 358
5481 Creativity and Innovation in Postgraduate Supervision
Authors: Rajendra Chetty
Abstract:
The paper aims to address two aspects of postgraduate studies: interdisciplinary research and creative models of supervision. Interdisciplinary research can be viewed as a key imperative to solve complex problems. While excellent research requires a context of disciplinary strength, the cutting edge is often found at the intersection between disciplines. Interdisciplinary research foregrounds a team approach, and information, methodologies, designs, and theories from different disciplines are integrated to advance fundamental understanding or to solve problems whose solutions are beyond the scope of a single discipline. Our aim should also be to generate research that transcends the original disciplines, i.e., transdisciplinary research. Complexity is characteristic of the knowledge economy; hence, postgraduate research and engaged scholarship should be viewed by universities as primary vehicles through which knowledge can be generated to have a meaningful impact on society. There are far too many 'ordinary' studies that fall into the realm of credentialism and certification, as opposed to significant studies that generate new knowledge and provide a trajectory for further academic discourse. Secondly, the paper will look at models of supervision that are different from the dominant 'apprentice' or individual approach. A reflective practitioner approach is used to discuss a range of supervision models that resonate well with the principles of interdisciplinarity, growth in the postgraduate sector, and a commitment to engaged scholarship. The global demand for postgraduate education has resulted in increased intake and new demands on limited supervision capacity at institutions. Team supervision lodged within large-scale research projects, working with a cohort of students within a research theme, the journal article route to doctoral studies, and the professional PhD are some of the models that provide an alternative to the traditional approach. International cooperation should be encouraged in the production of high-impact research, and institutions should be committed to stimulating international linkages, which would result in co-supervision and mobility of postgraduate students and the global significance of postgraduate research. International linkages are also valuable in increasing the capacity for supervision at new and developing universities. Innovative co-supervision and joint-degree options with global partners should be explored within strategic planning for innovative postgraduate programmes. Co-supervision of PhD students is probably the strongest driver (besides funding) for collaborative research, as it provides the glue of shared interest, advantage and commitment between supervisors. The students' field serves and informs the co-supervisors' own research agendas and helps to shape over-arching research themes through shared research findings.
Keywords: interdisciplinarity, internationalisation, postgraduate, supervision
Procedia PDF Downloads 238
5480 Long Short-Term Memory Stream Cruise Control Method for Automated Drift Detection and Adaptation
Authors: Mohammad Abu-Shaira, Weishi Shi
Abstract:
Adaptive learning, a commonly employed solution to drift, involves updating predictive models online during their operation to react to concept drifts, thereby serving as a critical component and natural extension of online learning systems that learn incrementally from each example. This paper introduces LSTM-SCCM (Long Short-Term Memory Stream Cruise Control Method), a drift-adaptation-as-a-service framework for online learning. LSTM-SCCM automates drift adaptation through prompt detection, drift magnitude quantification, dynamic hyperparameter tuning, short-term optimization and model recalibration for immediate adjustments, and, when necessary, long-term model recalibration to ensure deeper enhancements in model performance. LSTM-SCCM is incorporated into a suite of cutting-edge online regression models, and their performance is assessed across various types of concept drift using diverse datasets with varying characteristics. The findings demonstrate that LSTM-SCCM represents a notable advancement in both model performance and efficacy in handling concept drift occurrences. LSTM-SCCM stands out as the sole framework adept at effectively tackling concept drifts within regression scenarios. Its proactive approach to drift adaptation distinguishes it from conventional reactive methods, which typically rely on retraining after drifts have already significantly degraded model performance. Additionally, LSTM-SCCM employs an in-memory approach combined with the Self-Adjusting Memory (SAM) architecture to enhance real-time processing and adaptability. The framework incorporates variable thresholding techniques and does not assume any particular data distribution, making it an ideal choice for managing high-dimensional datasets and efficiently handling large-scale data. Our experiments, which include abrupt, incremental, and gradual drifts across both low- and high-dimensional datasets with varying noise levels, applied to four state-of-the-art online regression models, demonstrate that LSTM-SCCM is versatile and effective, rendering it a valuable solution for online regression models facing concept drift.
Keywords: automated drift detection and adaptation, concept drift, hyperparameters optimization, online and adaptive learning, regression
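A simplified sketch of the detect-then-adapt loop follows (not the authors' LSTM-SCCM internals): monitor a window of online errors and signal recalibration when the incoming error drifts beyond a variable threshold derived from recent statistics.

```python
# Windowed error monitor with a variable (mean + k*std) drift threshold.
from collections import deque

class DriftMonitor:
    def __init__(self, window=100, factor=2.0):
        self.errors = deque(maxlen=window)
        self.factor = factor              # threshold multiplier (tunable)

    def update(self, error):
        """Return True when drift is suspected for the incoming error."""
        if len(self.errors) == self.errors.maxlen:
            mean = sum(self.errors) / len(self.errors)
            var = sum((e - mean) ** 2 for e in self.errors) / len(self.errors)
            if error > mean + self.factor * var ** 0.5:
                self.errors.clear()       # reset statistics after signalling drift
                return True
        self.errors.append(error)
        return False

# Usage inside an online loop:
#   if monitor.update(abs(y - y_hat)): recalibrate(model)
```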
Procedia PDF Downloads 11
5479 Transformation of Industrial Policy towards Industry 4.0 and Its Impact on Firms' Competition
Authors: Arūnas Burinskas
Abstract:
Although Europe is on the threshold of a new industrial revolution called Industry 4.0, which many believe will increase the flexibility of production, the mass adaptation of products to consumers, and the speed of service, as well as improve product quality and dramatically increase productivity, these benefits come with inevitable changes and challenges. One of them is the inevitable transformation of current competition and business models. This article examines the possible results of the conversion of competition from the classic Bertrand and Cournot models to qualitatively new competition based on innovation. The ability to deliver a new product quickly and the possibility of producing individual designs (through flexible and quickly configurable factories), while reducing equipment failures and increasing process automation and control, are highly important. This study shows that the ongoing transformation of the competition model is changing the game. This, together with the creation of complex value networks, implies huge investments that are particularly difficult for small and medium-sized enterprises. In addition, the ongoing digitalization of data raises new concerns regarding legal obligations, intellectual property, and security.
Keywords: Bertrand and Cournot competition, competition model, Industry 4.0, industrial organisation, monopolistic competition
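The Cournot baseline the article starts from can be made concrete in a few lines: with linear inverse demand P = a - b(q1 + q2) and constant marginal cost c, each firm's best response is q_i = (a - c - b*q_j) / (2b), and iterating the best responses converges to the Nash equilibrium q* = (a - c) / (3b). The parameter values below are arbitrary assumptions.

```python
# Cournot duopoly via best-response dynamics.
a, b, c = 100.0, 1.0, 10.0     # demand intercept, slope, marginal cost (assumed)

q1 = q2 = 0.0
for _ in range(100):           # alternate best responses until convergence
    q1 = (a - c - b * q2) / (2 * b)
    q2 = (a - c - b * q1) / (2 * b)

print(q1, q2)                  # both approach (a - c) / (3 * b) = 30.0
```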
Procedia PDF Downloads 138
5478 Automated Transformation of 3D Point Cloud to BIM Model: Leveraging Algorithmic Modeling for Efficient Reconstruction
Authors: Radul Shishkov, Orlin Davchev
Abstract:
The digital era has revolutionized architectural practices, with building information modeling (BIM) emerging as a pivotal tool for architects, engineers, and construction professionals. However, the transition from traditional methods to BIM-centric approaches poses significant challenges, particularly in the context of existing structures. This research introduces a technical approach to bridge this gap through the development of algorithms that facilitate the automated transformation of 3D point cloud data into detailed BIM models. The core of this research lies in the application of algorithmic modeling and computational design methods to interpret and reconstruct point cloud data (a collection of data points in space, typically produced by 3D scanners) into comprehensive BIM models. This process involves complex stages of data cleaning, feature extraction, and geometric reconstruction, which are traditionally time-consuming and prone to human error. By automating these stages, our approach significantly enhances the efficiency and accuracy of creating BIM models for existing buildings. The proposed algorithms are designed to identify key architectural elements within point clouds, such as walls, windows, doors, and other structural components, and to translate these elements into their corresponding BIM representations. This includes the integration of parametric modeling techniques to ensure that the generated BIM models are not only geometrically accurate but also embedded with essential architectural and structural information. Our methodology has been tested on several real-world case studies, demonstrating its capability to handle diverse architectural styles and complexities. The results showcase a substantial reduction in time and resources required for BIM model generation while maintaining high levels of accuracy and detail. This research contributes significantly to the field of architectural technology by providing a scalable and efficient solution for the integration of existing structures into the BIM framework. It paves the way for more seamless and integrated workflows in renovation and heritage conservation projects, where the accuracy of existing conditions plays a critical role. The implications of this study extend beyond architectural practices, offering potential benefits in urban planning, facility management, and historic preservation.
Keywords: BIM, 3D point cloud, algorithmic modeling, computational design, architectural reconstruction
Procedia PDF Downloads 63
5477 Prediction of Gully Erosion with Stochastic Modeling by using Geographic Information System and Remote Sensing Data in North of Iran
Authors: Reza Zakerinejad
Abstract:
Gully erosion is a serious problem threatening the sustainability of agricultural areas, rangelands, and water resources in a large part of Iran. This type of water erosion is the main source of sedimentation in many catchment areas in the north of Iran. Since many national assessment approaches have applied only qualitative models, the aim of this study is to predict the spatial distribution of gully erosion processes by means of detailed terrain analysis and GIS-based logistic regression in the loess deposits, in a case study in the Golestan Province. In this study, a DEM with 25 meter resolution derived from ASTER data was used, and Landsat ETM data were used to map land use. The TreeNet model, a stochastic modeling approach, was applied to predict the areas susceptible to gully erosion; in this model, 20% of the data were set aside for learning and 20% for testing (evaluated via ROC). GIS and satellite image analysis techniques were thus used to derive the input information for the stochastic models. The result of this study is a highly accurate map of gully erosion potential.
Keywords: TreeNet model, terrain analysis, Golestan Province, Iran
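TreeNet is a commercial stochastic gradient boosting implementation; the sketch below uses scikit-learn's gradient boosting as an open stand-in, with synthetic terrain attributes in place of the ASTER-derived layers and AUC as the ROC-style evaluation.

```python
# Gradient-boosted susceptibility model on synthetic terrain attributes.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([rng.uniform(0, 45, n),      # slope (deg)
                     rng.uniform(0, 15, n),      # log flow accumulation
                     rng.integers(0, 5, n)])     # land-use class
p = 1 / (1 + np.exp(-(0.08 * X[:, 0] + 0.3 * X[:, 1] - 4)))
y = rng.binomial(1, p)                           # gully presence/absence

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
gb = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, gb.predict_proba(X_te)[:, 1]))
```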
Procedia PDF Downloads 535
5476 Day of the Week Patterns and the Financial Trends' Role: Evidence from the Greek Stock Market during the Euro Era
Authors: Nikolaos Konstantopoulos, Aristeidis Samitas, Vasileiou Evangelos
Abstract:
The purpose of this study is to examine whether financial trends influence not only stock markets' returns but also their anomalies. We choose to study the day of the week (DOW) effect in the Greek stock market during the Euro period (2002-12), because during this specific period there are no significant structural changes and there are long-term financial trends. Moreover, in order to avoid possible methodological counterarguments that usually arise in the literature, we apply several linear (OLS) and nonlinear (GARCH-family) models to our sample, reaching the conclusion that the TGARCH model fits our sample better than any other. Our results suggest that in the Greek stock market there is a long-term predisposition for positive/negative returns depending on the weekday. However, the statistical significance is influenced by the financial trend, and this influence may be the reason why there are conflicting findings in the literature over time. Finally, we combine the DOW empirical findings from 1985-2012, and we may assume that in the Greek case there is a tendency for a long-lived turn-of-the-week effect.
Keywords: day of the week effect, GARCH family models, Athens stock exchange, economic growth, crisis
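A hedged sketch of such a specification with Python's `arch` package follows; the returns are placeholder data rather than Athens Stock Exchange series, and the weekday-dummy design mirrors the usual DOW setup (p=1, o=1, q=1 gives the threshold/GJR variant commonly labeled TGARCH).

```python
# Day-of-the-week dummies in the mean equation with a TGARCH (GJR) variance.
import numpy as np
import pandas as pd
from arch import arch_model

rng = np.random.default_rng(0)
dates = pd.bdate_range("2002-01-01", "2012-12-31")
returns = pd.Series(rng.normal(0, 1.2, len(dates)), index=dates)  # stand-in returns

dummies = pd.get_dummies(returns.index.dayofweek).iloc[:, 1:].astype(float)  # Tue..Fri vs. Monday
dummies.index = returns.index

# mean='LS' regresses returns on the dummies; o=1 adds the asymmetry (threshold) term
model = arch_model(returns, x=dummies, mean="LS", p=1, o=1, q=1)
res = model.fit(disp="off")
print(res.params.head())
```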
Procedia PDF Downloads 410